AI in Python with 5 Lines of Code

These days, machine learning and computer vision are all the rage. We’ve all seen the news about self-driving cars and facial recognition and probably imagined how cool it’d be to build our own computer vision models. However, it’s not always easy to break into the field, especially without a strong math background. Libraries like PyTorch and TensorFlow can be tedious to learn if all you want to do is experiment with something small.
In this tutorial, I present a simple way for anyone to build fully-functional object detection models with just a few lines of code. More specifically, we’ll be using Detecto, a Python package built on top of PyTorch that makes the process easy and accessible to programmers at all levels.

Quick and easy example

To demonstrate how simple it is to use Detecto, let’s load in a pre-trained model and run inference on the following image:

First, download the Detecto package using pip:

pip3 install detecto

Then, save the image above as “fruit.jpg” and create a Python file in the same folder as the image. Inside the Python file, write these 5 lines of code:
from detecto import core, utils, visualize
image = utils.read_image('fruit.jpg')
model = core.Model()
labels, boxes, scores = model.predict_top(image)
visualize.show_labeled_image(image, boxes, labels)

After running this file (it may take a few seconds if you don’t have a CUDA-enabled GPU on your computer; more on that later), you should see something similar to the plot below:
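If you’re curious whether your machine will use the GPU, PyTorch (Detecto’s backend) exposes a simple check. Here’s a minimal sketch that falls back gracefully if PyTorch isn’t installed:

```python
# Check whether a CUDA-capable GPU is available to PyTorch.
# If torch isn't installed, assume inference will run on CPU.
try:
    import torch
    on_gpu = torch.cuda.is_available()
except ImportError:
    on_gpu = False

device = "GPU (CUDA)" if on_gpu else "CPU"
print(f"Inference will run on: {device}")
```

On a CPU-only machine this prints `Inference will run on: CPU`, which is why the first run can take a few seconds.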

Awesome! We did all that with just 5 lines of code. Here’s what we did in each:
* Imported Detecto’s modules
* Read in an image
* Initialized a pre-trained model
* Generated the top predictions on our image
* Plotted our predictions
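To make the fourth step more concrete: `predict_top` returns parallel sequences of labels, bounding boxes, and confidence scores, one entry per detected object. The sketch below uses hypothetical hard-coded values (not real model output) to show how you might filter such predictions by confidence before plotting them:

```python
# Hypothetical predict_top-style output: parallel sequences of labels,
# bounding boxes ([x_min, y_min, x_max, y_max]), and confidence scores.
labels = ["apple", "orange", "banana"]
boxes = [[10, 20, 110, 140], [130, 25, 220, 150], [15, 160, 200, 230]]
scores = [0.98, 0.91, 0.42]

# Keep only detections above a confidence threshold.
threshold = 0.8
kept = [(label, box, score)
        for label, box, score in zip(labels, boxes, scores)
        if score >= threshold]

for label, box, score in kept:
    print(f"{label}: {score:.2f} at {box}")
```

With these values, the low-confidence "banana" detection is dropped and only the two confident predictions remain.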

