Instance Segmentation Using Mask R-CNN

In this article I am going to perform instance segmentation of cars using the Mask R-CNN model. I will be using Supervisely, a web UI for building different object detection and segmentation models.

Let's begin by logging into Supervisely and creating a workspace. I have created a workspace named "mlops_work"; you can name yours as you wish.

Now click on the workspace and, inside a project, start importing the dataset. Supervisely lets you upload a folder of images by simple drag and drop.

Next comes the data preprocessing part, which will also be done in Supervisely. We need to outline the required objects/instances, a step known as annotation. To do that, click on the imported dataset in Supervisely and it will take you to the image annotation page. Now start annotating the images; for this I will use the polygon tool from the left-side panel of Supervisely.
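Under the hood, each polygon drawn in the annotation editor is stored as JSON. The Python sketch below is illustrative only, loosely following Supervisely's exterior/interior point layout (the class name and coordinates are made up): it shows what one annotated car might look like and how a bounding box can be derived from the polygon points.

```python
# Hypothetical Supervisely-style annotation for one polygon-labelled car.
annotation = {
    "objects": [
        {
            "classTitle": "car",           # assumed class name
            "geometryType": "polygon",
            "points": {
                "exterior": [[120, 80], [340, 75], [360, 210], [110, 220]],
                "interior": [],            # holes in the polygon; none here
            },
        }
    ]
}

def polygon_bbox(points):
    """Axis-aligned bounding box [x_min, y_min, x_max, y_max] of a polygon."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return [min(xs), min(ys), max(xs), max(ys)]

car = annotation["objects"][0]
print(polygon_bbox(car["points"]["exterior"]))  # [110, 75, 360, 220]
```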

Now we will perform DTL (Data Transformation Language), which will expand the dataset and also tag the images as "train" and "val".
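A DTL query in Supervisely is a JSON list of layers. The sketch below is a rough, unverified outline (project names are placeholders, and the exact layer settings may differ in your Supervisely version): it reads the source project, splits the items roughly 95/5, tags them "train" and "val", and writes a new project.

```json
[
  {"action": "data", "src": ["cars_project/*"], "dst": "$data",
   "settings": {"classes_mapping": "default"}},
  {"action": "if", "src": ["$data"], "dst": ["$train", "$val"],
   "settings": {"condition": {"probability": 0.95}}},
  {"action": "tag", "src": ["$train"], "dst": "$train_tagged",
   "settings": {"tag": "train", "action": "add"}},
  {"action": "tag", "src": ["$val"], "dst": "$val_tagged",
   "settings": {"tag": "val", "action": "add"}},
  {"action": "supervisely", "src": ["$train_tagged", "$val_tagged"],
   "dst": "cars_project_dtl", "settings": {}}
]
```

Augmentation layers (for example, a flip layer) can be chained in before the split to expand the dataset.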

After DTL is done, go to Neural Networks in the left panel of Supervisely and add Mask R-CNN (Keras+TensorFlow) (COCO) from the available neural networks. Then comes the part of training the model, but before that we need to add an agent to the Supervisely cluster on the Cluster page with the required configuration. You can see more in the image below.

I am using a Deep Learning AMI (Ubuntu 18.04) Version 30.2 from Amazon's cloud as the agent, but you can use whatever machine you want, as long as it has the required configuration. The highlighted command is used to connect the agent to Supervisely.

Then go to Neural Networks again and click the Train button. The training process will start.
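Clicking Train opens an editable JSON training configuration. The values below are purely illustrative (the exact field names and defaults depend on the model and Supervisely version), just to show the kind of knobs you can expect: epoch count, learning rate, batch sizes, input resolution, and the classes to train on.

```json
{
  "epochs": 15,
  "lr": 0.001,
  "batch_size": {"train": 1, "val": 1},
  "input_size": {"width": 800, "height": 600},
  "gpu_devices": [0],
  "classes": ["car"]
}
```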

After training we will test the model on a testing dataset, so upload your testing dataset as done previously. Remember that the testing data does not need to be annotated, since it is only used for testing, so we will use it as is. After uploading the data, click the Test button to begin testing. You will get output like the following in your dataset.
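For each test image the model produces class ids, confidence scores, and masks/boxes. The Python sketch below assumes a made-up result layout (loosely modelled on common Keras/TensorFlow Mask R-CNN conventions, not Supervisely's actual output) to show the typical post-processing step of keeping only confident car detections.

```python
# Assumed detection result layout: parallel lists of class ids, scores, boxes.
CLASS_NAMES = ["BG", "car"]  # background + our single class

detections = {
    "class_ids": [1, 1, 1],
    "scores": [0.98, 0.91, 0.43],
    "rois": [[34, 50, 210, 300], [240, 60, 400, 330], [10, 10, 40, 55]],
}

def keep_confident_cars(result, threshold=0.5):
    """Return the boxes whose class is 'car' and whose score >= threshold."""
    kept = []
    for cls, score, box in zip(result["class_ids"], result["scores"], result["rois"]):
        if CLASS_NAMES[cls] == "car" and score >= threshold:
            kept.append(box)
    return kept

print(len(keep_confident_cars(detections)))  # 2
```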

We can also create the model from different checkpoints with different losses and choose the optimal one. We can also check the graphs of the training process.
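Picking the optimal checkpoint simply means taking the one with the lowest validation loss. A tiny Python sketch with made-up checkpoint names and loss values:

```python
# Hypothetical checkpoints saved during training: (name, validation loss).
checkpoints = [
    ("epoch_05", 1.42),
    ("epoch_10", 0.87),
    ("epoch_15", 0.93),  # loss can rise again if the model overfits
]

# Choose the checkpoint with the minimum validation loss.
best = min(checkpoints, key=lambda c: c[1])
print(best[0])  # epoch_10
```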

This is the overall workflow; for any queries, feel free to DM me.
