Machine Learning: Classification using Python and Oracle ATP

Continuing from the last article, where we created a Jupyter Notebook and used Python to connect to an Oracle Autonomous Transaction Processing database instance, it’s time to run a classification with the machine learning library Scikit-Learn. This is a simple demonstration using the Iris dataset; in the near future I intend to show a more realistic use case. You can download the notebook here: https://github.com/waslleysouza/oracle_autonomous_jupyter/blob/master/atp_classification.ipynb. […]
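The notebook in the post pulls the Iris data from ATP before training; as a minimal, self-contained sketch (the classifier choice here, k-nearest neighbors, is an assumption and may differ from the notebook's), the Scikit-Learn classification step looks like this, using the library's bundled copy of the Iris dataset:

```python
# Minimal sketch: train and evaluate a classifier on the Iris dataset.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

# Load the Iris dataset (150 samples, 4 features, 3 classes).
X, y = load_iris(return_X_y=True)

# Hold out 30% of the data for evaluation.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y)

# Fit a k-nearest-neighbors classifier and score it on the held-out set.
model = KNeighborsClassifier(n_neighbors=5)
model.fit(X_train, y_train)
accuracy = accuracy_score(y_test, model.predict(X_test))
print(f"Test accuracy: {accuracy:.2f}")
```

In a notebook connected to ATP, the only change is that `X` and `y` come from a SQL query instead of `load_iris`.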

Connect to Oracle ATP through Jupyter Notebook

Do you know Jupyter Notebook? Jupyter Notebook is an open-source web application that allows you to create and share documents containing live code, equations, visualizations, and narrative text. You can use it for data cleaning and transformation, numerical simulation, statistical modeling, data visualization, machine learning, and much more. I like to use Jupyter Notebook for my machine learning projects because it is a very useful tool. In one of my projects I needed to […]

Face Recognition API with OCI – Part 1

You may have noticed that I’m studying Machine Learning (including Deep Learning). In my studies I like to implement code from other developers or write some from scratch. If you also study facial recognition, you may have heard of Facenet. Facenet is a TensorFlow implementation of face recognition that you can integrate into your projects, and I used it to create my Face Recognition API. In this first article, you will learn how to use my facial recognition API with […]

Exposing Keras as REST API

In my last blog post about Keras, you learned how to use the Kaggle dogs-vs-cats dataset. But would you like your friends to use your model to identify dogs and cats in pictures? That is exactly what this blog post is about! You’ll learn how to expose your model as a REST API in a simple way. Let’s go! We’ll use the code created in “Using Kaggle datasets”, adding a modification to save the model. Then, download all the files and run the Jupyter notebook to train and save your model. […]
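The general shape of serving a model over REST can be sketched with nothing but Python's standard library. Note this is not the post's actual code: `classify_image` below is a hypothetical stub standing in for loading the saved Keras model and calling its predict method, and the `/predict` route and port are assumptions for illustration:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def classify_image(image_bytes):
    # Stub standing in for the trained dogs-vs-cats model; a real
    # handler would decode the image bytes and run model.predict().
    return {"label": "dog", "confidence": 0.93}

class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/predict":
            self.send_error(404)
            return
        # Read the raw image bytes from the request body.
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        # Run the (stub) classifier and return its result as JSON.
        payload = json.dumps(classify_image(body)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):
        pass  # silence per-request logging

def run_server(port=5000):
    HTTPServer(("0.0.0.0", port), PredictHandler).serve_forever()
```

Calling `run_server()` and POSTing an image to `/predict` then returns the prediction as JSON; in practice you would swap the stub for the loaded model and, for anything beyond a demo, use a proper framework such as Flask.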

Installing Anaconda in OCI GPU instance

Now that you know how to create an Oracle Cloud Infrastructure GPU instance, the next steps are to install Anaconda and use Jupyter Notebook to develop or test your AI projects. First, go to your Oracle Cloud account and add the following ingress rule to your Security List (Networking > Virtual Cloud Networks > Virtual Cloud Network Details > Security Lists > Security List Details). Using a terminal, access your Ubuntu instance and download the latest version of Anaconda. […]

CPU vs GPU in Oracle Cloud

If you read my blog post called “Optimizing TensorFlow for CPU”, you learned that you can improve TensorFlow performance on CPU just by choosing the correct distribution, in this case the Anaconda distribution. CPU instances will do the job for simple AI projects, but if you need more computing power to reduce the execution or training time of your project, you need to use GPU instances. Since many people have asked me to run the same test using GPU instances, in this post you will see the […]
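The original comparison runs a TensorFlow training script on both instance types; as a rough, hedged proxy (not the post's actual benchmark), timing a large matrix multiplication shows the general shape of how such a comparison is measured:

```python
import time
import numpy as np

def benchmark_matmul(n=1024, repeats=5):
    """Time repeated n x n matrix multiplications, a compute-bound
    stand-in for the kind of workload compared across instance types.
    Returns the average seconds per multiplication."""
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)
    start = time.perf_counter()
    for _ in range(repeats):
        a @ b
    return (time.perf_counter() - start) / repeats

avg = benchmark_matmul()
print(f"Average matmul time: {avg * 1000:.1f} ms")
```

Running the same measurement on a CPU instance and on a GPU instance (with a GPU-backed library such as TensorFlow doing the multiplication) is what makes the execution-time difference concrete.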

Getting Started with GPU instances in Oracle Cloud

If you are working on artificial intelligence (machine learning or deep learning) projects, at some point you will need to switch from CPU instances to GPU instances to speed up the training of your models. Nowadays, most cloud providers offer GPUs as a service, and you can use them to speed up your projects. Oracle Cloud offers two of the most advanced GPU models: the NVIDIA V100 and the NVIDIA P100. In this blog post, you’ll learn how to request and create GPU instances in the Oracle […]