Day 6 of Solving ML Problems From Scratch: Adam Optimizer

Today I worked on implementing the Adam optimizer from scratch. What I like about Adam is that it combines the benefits of momentum and adaptive learning rates in a very practical way: instead of taking the same kind of step every time, it adjusts based on both past gradients and gradient magnitude, which makes optimization more stable and efficient.

While solving this, I got a better understanding of:
- how momentum helps smooth the update direction
- how the second-moment (velocity) term adapts the step size
- why bias correction is important, especially in the early steps
- how Adam can converge faster than plain SGD in many cases

Building these concepts from scratch is helping me understand what is really happening behind the libraries we use every day. It is one thing to call an optimizer in code, but quite another to implement and reason through each update step yourself. Small daily practice like this is making machine learning feel much more intuitive.

#MachineLearning #DeepLearning #ArtificialIntelligence #Python #DataScience
Implementing Adam Optimizer from Scratch in Machine Learning
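The update rule described in the post can be sketched in a few lines of NumPy. This is a minimal single-tensor version, not the post's actual code; `lr`, `beta1`, `beta2`, and `eps` use the common defaults from the Adam paper.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update; returns new parameters and updated moment estimates."""
    m = beta1 * m + (1 - beta1) * grad      # first moment: momentum-style average of gradients
    v = beta2 * v + (1 - beta2) * grad**2   # second moment: running average of squared gradients
    m_hat = m / (1 - beta1**t)              # bias correction: both moments start at zero,
    v_hat = v / (1 - beta2**t)              # so early averages are biased toward zero
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Minimize f(x) = x^2 starting from x = 5
theta = np.array([5.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 2001):
    grad = 2 * theta                        # gradient of x^2
    theta, m, v = adam_step(theta, grad, m, v, t, lr=0.05)
print(theta)  # has moved from 5.0 to near the minimum at 0
```

Without the two bias-correction lines, `m` and `v` start near zero and the first few updates would be far too small, which is exactly why the correction matters most in the early steps.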
More Relevant Posts
-
Headline: Logic meets Code. 🧩💻

I just wrapped up another challenge on HackerRank focusing on Probability & Statistics, specifically calculating outcomes across multiple independent events.

The task: determining the exact probability of drawing a specific color combination from two different bags. While the math can be done on paper, translating these permutations and combinations into clean, efficient code is where the real fun is. Steps like these are small but vital foundations for building more complex machine learning models later on.

Excited to keep this momentum going!

Solution: https://lnkd.in/gC9j7RgS

#DataScience #Python #HackerRank #Statistics #ContinuousLearning #AI
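The post doesn't give the bag contents, so here is a minimal sketch of the same kind of calculation with made-up numbers: because the two draws are independent, the probability of the combination is the product of the individual probabilities.

```python
from fractions import Fraction

# Hypothetical setup (the actual HackerRank numbers are not in the post):
# bag 1 holds 4 red and 3 black balls, bag 2 holds 5 red and 4 black.
p_red_bag1 = Fraction(4, 4 + 3)
p_red_bag2 = Fraction(5, 5 + 4)

# Independent events: probabilities multiply.
p_both_red = p_red_bag1 * p_red_bag2
print(p_both_red)  # 20/63
```

Using `Fraction` keeps the result exact, which matters on problems that ask for the probability as a reduced fraction.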
-
🔗 GitHub Repository: https://lnkd.in/gXa9zEBs

Strengthening Machine Learning concepts with Logistic Regression.

Covered practical implementation of:
✔ Binary Classification (Single & Multiple Inputs)
✔ Polynomial Logistic Regression
✔ Multiclass Classification (OVR & Multinomial)
✔ Decision Boundaries & Model Evaluation

using Python and scikit-learn.

Understanding how logistic regression predicts probabilities and solves classification problems gives deeper insight into real-world ML applications. From theory to implementation, every project adds more clarity and confidence to the learning journey.

#MachineLearning #LogisticRegression #Python #DataScience #ScikitLearn #GitHub
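None of the repository's code appears in the post, but the multiclass case it mentions can be sketched with scikit-learn's `LogisticRegression`, which handles multinomial classification out of the box (iris is used here as a stand-in dataset, not the repository's own data):

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Three-class classification: the model learns one probability per class.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

clf = LogisticRegression(max_iter=1000)  # multinomial softmax with the default lbfgs solver
clf.fit(X_train, y_train)

proba = clf.predict_proba(X_test[:1])    # class probabilities, one row per sample
print(proba)                             # three probabilities summing to 1
print(clf.score(X_test, y_test))         # accuracy on the held-out set
```

`predict_proba` is where the "predicts probabilities" part of the post becomes concrete: the decision boundary is just where the largest class probability changes.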
-
Just completed the Gradient Descent lab in Andrew Ng's ML Specialization, and it genuinely clicked for me here.

The concept: instead of guessing the best values for w and b in a linear model, gradient descent finds them automatically by repeatedly moving in the direction that reduces error.

What I built from scratch in Python:
✅ compute_cost() — measures how wrong the model is
✅ compute_gradient() — calculates which direction to move
✅ gradient_descent() — runs 10,000 iterations to find optimal parameters

What surprised me most:
→ Starting from w=0, b=0, the algorithm found w≈200, b≈100 for a house price dataset
→ The cost dropped rapidly at first, then slowed as it approached the minimum, exactly like rolling a ball to the bottom of a bowl
→ Setting the learning rate too high (α = 0.8) caused the model to completely diverge: cost shot up instead of down

That last point was the most valuable. Seeing divergence visually made the theory real. Building these functions line by line beats reading about them any day.

#MachineLearning #Python #AndrewNg #LearningInPublic #DataScience #GradientDescent
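The three functions named above can be sketched like this, assuming the lab's usual squared-error cost for f(x) = w·x + b (the post doesn't show the actual lab code, and the two-point dataset is the course's standard example):

```python
import numpy as np

def compute_cost(x, y, w, b):
    """Squared-error cost, averaged over m examples (halved by convention)."""
    m = len(x)
    return np.sum((w * x + b - y) ** 2) / (2 * m)

def compute_gradient(x, y, w, b):
    """Partial derivatives of the cost with respect to w and b."""
    m = len(x)
    err = w * x + b - y
    return np.sum(err * x) / m, np.sum(err) / m

def gradient_descent(x, y, w, b, alpha, iters):
    """Repeatedly step downhill: w and b move against their gradients."""
    for _ in range(iters):
        dj_dw, dj_db = compute_gradient(x, y, w, b)
        w -= alpha * dj_dw
        b -= alpha * dj_db
    return w, b

x_train = np.array([1.0, 2.0])      # house size in 1000s of sqft
y_train = np.array([300.0, 500.0])  # price in 1000s of dollars
w, b = gradient_descent(x_train, y_train, 0.0, 0.0, alpha=0.01, iters=10_000)
print(w, b)  # approaches w ≈ 200, b ≈ 100, as described in the post
```

Raising `alpha` toward 0.8 on this data makes each step overshoot the minimum by more than it gained, so the cost grows every iteration, which is the divergence the post describes.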
-
Day 72. Spent time going deeper into XGBoost today. Covered classification and worked through the math:
- gradients & Hessians
- leaf weights
- similarity score & gain

Some questions I tried to answer while learning:
- Why do we need the Taylor expansion here?
- Why can't we directly differentiate the objective?
- What makes decision trees non-smooth / non-differentiable?

The key realization: since trees produce piecewise constant outputs, the loss surface isn't smooth — which is why the second-order approximation becomes necessary.

Still revising, but things are starting to connect.

Notes: https://lnkd.in/gCqHUeK9

#MachineLearning #XGBoost #LearningInPublic #Python #DataScience
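The quantities listed above have closed forms. For a leaf collecting gradient sum G and Hessian sum H, the optimal leaf weight is -G/(H + λ) and the similarity score is G²/(H + λ); the gain of a split is the children's similarity minus the parent's. A small sketch for squared-error loss, where each gradient is the residual and each Hessian is 1 (the ½ factor and the γ complexity penalty from the XGBoost paper are omitted for simplicity):

```python
import numpy as np

def leaf_weight(g, h, lam=1.0):
    """Optimal leaf output: w* = -G / (H + lambda)."""
    return -np.sum(g) / (np.sum(h) + lam)

def similarity(g, h, lam=1.0):
    """Similarity score G^2 / (H + lambda), proportional to the loss reduction."""
    return np.sum(g) ** 2 / (np.sum(h) + lam)

def gain(g_left, h_left, g_right, h_right, lam=1.0):
    """Gain of a split: child similarities minus the parent's similarity."""
    parent = similarity(np.concatenate([g_left, g_right]),
                        np.concatenate([h_left, h_right]), lam)
    return similarity(g_left, h_left, lam) + similarity(g_right, h_right, lam) - parent

# Residuals at a tiny node, split into two candidate children.
# For squared error, g_i is the residual and h_i = 1 for every example.
g_left, g_right = np.array([-10.0, -8.0]), np.array([6.0, 7.0])
h_left, h_right = np.ones(2), np.ones(2)
print(gain(g_left, h_left, g_right, h_right))  # ≈ 159.33, positive: the split helps
```

Note that only G and H enter the formulas: the second-order Taylor expansion reduces the whole objective to these two sums per leaf, which is exactly why it is needed when the tree itself cannot be differentiated.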
-
Day 2 of learning Machine Learning.

Today I worked on a simple linear regression model using Python in Jupyter Notebook.

The idea was straightforward:
- Input (x): house size
- Output (y): price

Model used: f(x) = wx + b

I understood how:
- Training data is structured (x_train, y_train)
- Parameters (w, b) define the relationship
- The model uses this to make predictions on new inputs

Also got hands-on with NumPy and basic plotting using Matplotlib. Still very early, but it's becoming clearer how data is converted into predictions.

#MachineLearning #AI #Python #LearningInPublic
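The model above in code, with illustrative values for w and b (the post doesn't give its actual numbers, so these are made up for the example):

```python
import numpy as np

x_train = np.array([1.0, 2.0])      # house size (1000s of sqft)
y_train = np.array([300.0, 500.0])  # price (1000s of dollars)

w, b = 200.0, 100.0                 # parameters of f(x) = w*x + b

def predict(x, w, b):
    """Linear model: works on a scalar or a whole NumPy array at once."""
    return w * x + b

print(predict(x_train, w, b))  # [300. 500.] matches the training targets
print(predict(1.2, w, b))      # 340.0 for a 1200 sqft house
```

With these values every training point lies exactly on the line, which is why the predictions reproduce `y_train`; on real data, w and b would come from training rather than being chosen by hand.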
-
Claude just diagnosed me with a classic developer bug 😂 After hours of learning Python — functions, loops, dictionaries, if/else, and AI agent architecture — I started asking the same questions twice. Claude's response? ``` while awake == True: ask_questions() if questions == repeat: print("Go to sleep Anil! 😄") break ``` Turns out even humans need a break statement. 😄 The grind is real. But so is the progress. 💪 #Python #AI #MachineLearning #CareerChange #AIAgent #LearningToCode #Claude #100DaysOfCode
-
🚀 Machine Learning Journey (Prime 2.0): Day-2

Continuing my Python learning journey, today I focused on control flow and problem-solving concepts that are essential for building logic in Machine Learning 🧠💻

I covered:
• Conditional statements (if-else, nesting, and match-case)
• Solving problems like checking odd/even numbers
• Loops in Python (while & for loops)
• Practicing loop-based problems like the multiplication table and sum of N numbers
• Understanding break and continue statements
• Using the range() function effectively
• Solving string-based problems like vowel count
• Introduction to functions in Python

One interesting insight from today: loops and conditionals are the core of logical thinking in programming, and most real-world ML problems rely heavily on these fundamentals.

This session helped me improve my problem-solving approach using Python. Still need more practice to write optimized logic, but the basics are getting stronger 📈

Excited to move closer to actual Machine Learning concepts soon 🚀

#MachineLearning #Python #AI #DataScience #LearningInPublic #DeveloperJourney #ApnaCollege #MLJourney #prime2.0
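One of the practice problems listed above, vowel counting, as a short sketch combining a for loop, a conditional, and a function:

```python
def count_vowels(text: str) -> int:
    """Count vowels with a loop and a membership test."""
    count = 0
    for ch in text.lower():   # lowercase first so 'A' and 'a' both match
        if ch in "aeiou":
            count += 1
    return count

print(count_vowels("Machine Learning"))  # 6
```

The same logic fits in one line as `sum(ch in "aeiou" for ch in text.lower())`, but the explicit loop version makes the control flow visible, which is the point at this stage.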
-
Not every day is about solving problems; some days are about understanding concepts.

Day 38/100 — Data Structures & Algorithms Journey

Today I focused on learning the Sliding Window technique instead of solving problems, taking time to understand the pattern deeply before jumping into implementation.

Today's Focus:
- Understanding how sliding window works
- Learning when to expand and shrink the window
- Studying problem patterns where it applies
- Building intuition step by step

Why this matters: strong concepts make problem-solving faster and more efficient.

Key Takeaways:
- Learning is also progress
- Clarity builds confidence
- Patterns simplify complex problems
- Consistency matters more than intensity

Taking it slow, but moving forward.

#Day38 #DSA #LeetCode #ProblemSolving #CodingJourney #100DaysOfCode #SoftwareEngineering #Python #InterviewPreparation #LearnInPublic #Consistency
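The expand-and-shrink pattern described above, sketched on a classic sliding-window problem (length of the longest substring without repeating characters):

```python
def longest_unique_substring(s: str) -> int:
    """Variable-size sliding window: expand right, shrink left on a repeat."""
    seen = set()   # characters currently inside the window
    left = 0
    best = 0
    for right, ch in enumerate(s):  # expand the window by one character
        while ch in seen:           # shrink from the left until ch is unique
            seen.remove(s[left])
            left += 1
        seen.add(ch)
        best = max(best, right - left + 1)
    return best

print(longest_unique_substring("abcabcbb"))  # 3, for the window "abc"
```

Each character enters and leaves the window at most once, so the whole scan is O(n) even though there is a nested while loop: that amortized argument is the core intuition behind the technique.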
-
Excited to share my latest project: LinearRegression-ML

This is a beginner-friendly Machine Learning project focused on understanding and implementing Linear Regression from scratch. It includes practical notebooks like profit analysis and medical data predictions, along with clear explanations of loss and cost functions.

What I learned:
=> Fundamentals of Linear Regression
=> Cost & loss function implementation
=> Real-world dataset analysis using Python

Repository: https://lnkd.in/guCQQdNe

#MachineLearning #Python_Jupyter_Notebook #DataScience
-
Starting my journey in Machine Learning!

Today, I worked on a simple Linear Regression model using Python and Scikit-learn.

🔹 Created a dataset with input (X) and output (y)
🔹 Trained the model using Linear Regression
🔹 Predicted the output for a new input value

This small step helped me understand how machines can learn patterns from data and make predictions.

Key takeaway: even a simple model can give powerful insights when the relationship between data is clear.

Looking forward to exploring more concepts like classification, model evaluation, and real-world datasets!

#MachineLearning #Python #DataScience #LearningJourney #AI #StudentLife
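The three steps in the post, sketched with a made-up dataset (the post doesn't include its data; the points here lie exactly on y = 2x + 1 so the fit is easy to check):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Step 1: create a dataset; scikit-learn expects a 2-D feature matrix
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([3.0, 5.0, 7.0, 9.0])   # exactly y = 2x + 1

# Step 2: train the model
model = LinearRegression()
model.fit(X, y)

# Step 3: predict the output for a new input value
print(model.coef_, model.intercept_)  # close to [2.] and 1.0
print(model.predict([[5.0]]))         # close to [11.]
```

Because the data is perfectly linear, the learned slope and intercept match the generating line; real datasets have noise, which is where model evaluation comes in.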
-
The bias correction part is interesting—it's a detail that's easy to miss when you're just using the library, but it explains why Adam doesn't stall at the start.