# Auto Hyper-Parameter Optimizer
This is a small project built to demonstrate the power of MLOps. It offers an approach that automates the tedious process of tuning an ML/DL model to a desired accuracy without human intervention, making the whole workflow more efficient.
I explained all the details of this project, with a step-by-step implementation, in my previous article: https://www.garudax.id/posts/himani-agarwal-6b5ba418a_mlops-devops-integration-activity-6672573725944246272-oj5
Here I mention only the changes I made to perform this task more effectively.
In this approach I use some basic Linux commands to change the hyper-parameter values, and for training the model I use transfer learning.
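A minimal sketch of the Linux-commands part, assuming the hyper-parameters live as plain assignments in a Python file (the file name `train_config.py` and the variable names here are my own examples, not from the repository):

```shell
# Assumed layout: hyper-parameters are plain assignments in a config file.
cat > train_config.py <<'EOF'
epochs = 5
neurons = 64
EOF

# Rewrite the values in place with sed -- no human editing required.
sed -i 's/^epochs = .*/epochs = 10/' train_config.py
sed -i 's/^neurons = .*/neurons = 128/' train_config.py

cat train_config.py
```

A Jenkins job can run exactly these commands between training attempts, so each build trains with a fresh set of hyper-parameter values.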
Hyper-parameters: parameters the user must specify manually; to reach higher accuracy, they have to be adjusted again and again.
Transfer learning: reusing knowledge a model has already learned on a previous task, so that when something new comes up we extend the model instead of retraining it from scratch and throwing away its past experience.
Here I use VGG16, a pretrained model, and add new knowledge on top of it.
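A sketch of what "adding new knowledge to VGG16" typically looks like in Keras: freeze the pretrained convolutional base and bolt a new trainable head on top. The layer sizes and the two-class output are my own illustrative assumptions, not values from the repository:

```python
from tensorflow.keras.applications import VGG16
from tensorflow.keras.layers import Dense, Flatten
from tensorflow.keras.models import Sequential

# Load VGG16 pretrained on ImageNet, dropping its original classifier head.
base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))

# Freeze the base so the previously learned knowledge is kept intact.
for layer in base.layers:
    layer.trainable = False

# New trainable head -- these are the hyper-parameters the pipeline tweaks.
model = Sequential([
    base,
    Flatten(),
    Dense(128, activation="relu"),    # assumed dense-layer size
    Dense(2, activation="softmax"),   # assumed two-class problem
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```

Because only the head is trainable, retraining after each hyper-parameter change is fast, which is what makes the automated loop practical.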
For reference, see my GitHub repository: https://github.com/17ejtcs031/ml_update_task3.git
1. Dockerfile
2. After creating the job that downloads the code from SCM, I created the following jobs; this is where the core changes are made:
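The idea behind these jobs can be sketched as a simple retrain loop: check the last accuracy, and if it is below the target, bump a hyper-parameter and let the next build retrain. The file names (`accuracy.txt`, `train_config.py`), the target value, and the increment are all hypothetical stand-ins for whatever the real jobs use:

```shell
TARGET=90

# Stand-ins for the artifacts a real training build would produce.
echo "epochs = 2" > train_config.py
echo "85" > accuracy.txt

ACC=$(cat accuracy.txt)
if [ "$ACC" -lt "$TARGET" ]; then
    # Accuracy too low: tweak a hyper-parameter for the next build.
    OLD=$(grep -oP 'epochs = \K[0-9]+' train_config.py)
    NEW=$((OLD + 2))
    sed -i "s/^epochs = .*/epochs = $NEW/" train_config.py
fi
cat train_config.py
```

In Jenkins this would be split across jobs (train, check accuracy, tweak and retrigger), but the control flow is the same.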
That's it! Everything else stays the same as in the previous approach (details in the article linked above), and it works great!