From the course: NVIDIA Certified Associate AI Infrastructure and Operations (NCA-AIIO) Cert Prep


Model training vs. model inference

When you build your world-class AI infrastructure, you will focus on using it for two activities: model training and model inference. Both will keep happening regularly in your environment. Obviously, once your model is deployed, you will use it for inference, so lots of requests will come to it. Say it is a fraud detection model, a recommendation system, or a market prediction model: it will be performing inferences constantly, and over a period of time the model will require retraining. Retraining ensures that the model learns from whatever new data has been collected, that any errors are corrected, and that it keeps providing accurate inferences for your application. So these two activities will keep happening on a regular basis on your AI infrastructure, but there is a difference in their use cases and in the way they are optimized. So let's talk about differences between model…
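The train-then-serve-then-retrain cycle described above can be sketched in a few lines of plain Python. This is a hypothetical toy example, not from the course: a one-parameter linear model stands in for a real network, but the contrast it shows is the real one; training updates weights from labeled data, inference only runs the forward pass, and retraining folds newly collected data back in.

```python
def train(data, w=0.0, lr=0.01, epochs=200):
    """Training phase: fit y ~ w * x by gradient descent on squared error."""
    for _ in range(epochs):
        for x, y in data:
            pred = w * x
            grad = 2 * (pred - y) * x   # gradient of (pred - y)^2 w.r.t. w
            w -= lr * grad              # weight update: happens only in training
    return w

def infer(w, x):
    """Inference phase: forward pass only, no weight updates."""
    return w * x

# Initial training on historical data (here the true relationship is y = 3x).
history = [(1, 3), (2, 6), (3, 9)]
w = train(history)

# The deployed model then serves many inference requests like this one.
print(round(infer(w, 4), 2))  # close to 12.0

# Later, retraining starts from the current weights and adds new data,
# so the model stays accurate as conditions change.
new_data = history + [(4, 12), (5, 15)]
w = train(new_data, w=w)
```

Note the asymmetry that drives the infrastructure differences the course goes on to discuss: training loops over a whole dataset many times and mutates state, while each inference call is a single cheap forward pass.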
