Optimization Techniques
Article 2 | Optimization Series by Eman Tora

Follow this series of articles on optimization to build a comprehensive understanding step by step.

In this article, I’ve compiled a broad spectrum of optimization techniques, each accompanied by a concise definition to make them accessible across disciplines. ***If any term seems difficult or unclear, rest assured it will be covered in detail in upcoming articles.*** In the next installment, I’ll delve deeper into selected methods, linking them to practical applications in thermal energy and the chemical engineering industry.

______________________________________

Classical Optimization

  • Linear Programming (LP): Optimizes a linear objective function subject to linear constraints. Note: “linear” in this context means that each variable appears to the first power only, such as x, y, T, or P. There are no squared terms like x², nor inverse terms like 1/T.
  • Nonlinear Programming (NLP): Deals with nonlinear objective functions or constraints.
  • Integer Programming (IP): Variables are restricted to integers, common in scheduling and resource allocation.
  • Quadratic Programming (QP): Objective function is quadratic, constraints are linear. Here, “quadratic” means the function includes variables raised to the second power, such as x², T², or any squared term.
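To make the LP “corner point” intuition concrete, here is a tiny hand-made sketch in Python. The objective, constraints, and vertex list are illustrative, not from any real problem; production work would use a proper solver such as the simplex method.

```python
# Toy LP (numbers are illustrative): maximize 3x + 2y
# subject to x + y <= 4, x <= 2, x >= 0, y >= 0.
# For a bounded, feasible LP an optimum lies at a vertex of the
# feasible polygon, so for a tiny problem we can check corners directly.
vertices = [(0, 0), (2, 0), (2, 2), (0, 4)]  # corners of the feasible region

def objective(x, y):
    return 3 * x + 2 * y

best = max(vertices, key=lambda v: objective(*v))
print(best, objective(*best))  # → (2, 2) 10
```

Note that the optimum lands on a corner, never strictly inside the region; that geometric fact is what simplex-type solvers exploit.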

Gradient-Based Methods

  • Gradient Descent: Iteratively moves toward the minimum using the gradient. Here, the gradient is a vector of partial derivatives, showing the rate of change of a function with respect to each variable.

∇f = (∂f/∂x, ∂f/∂y, ∂f/∂z)

∇ (nabla): the gradient operator

  • Newton’s Method: Uses second-order derivatives for faster convergence.
  • Conjugate Gradient: Efficient for large-scale problems without storing full Hessians.
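The gradient descent update rule can be sketched in a few lines of Python; the bowl-shaped function, starting point, and learning rate below are illustrative choices, not a recommendation.

```python
# Gradient descent on f(x, y) = (x - 3)^2 + (y + 1)^2,
# whose gradient is (2(x - 3), 2(y + 1)) and whose minimum is at (3, -1).
def grad(x, y):
    return 2 * (x - 3), 2 * (y + 1)

x, y = 0.0, 0.0          # starting point
lr = 0.1                 # learning rate (step size)
for _ in range(200):
    gx, gy = grad(x, y)
    x, y = x - lr * gx, y - lr * gy   # step opposite the gradient

print(round(x, 4), round(y, 4))  # → 3.0 -1.0
```

Each iteration moves against the gradient, so the error shrinks geometrically for this simple quadratic; real objectives need care in choosing the step size.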

Heuristic & Metaheuristic Techniques

  • Genetic Algorithms (GA): Inspired by natural selection, useful for complex search spaces.
  • Simulated Annealing (SA): Mimics cooling processes in metallurgy to escape local minima.
  • Particle Swarm Optimization (PSO): Models social behavior of swarms to explore solutions.
  • Ant Colony Optimization (ACO): Inspired by ant foraging, effective in routing and network problems.
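As one concrete example from this family, here is a minimal simulated annealing sketch in Python. The test function, cooling schedule, and step size are illustrative, untuned choices.

```python
import math
import random

# Simulated annealing on a 1-D function with several local minima:
# f(x) = x^2 + 10*sin(3x).
def f(x):
    return x * x + 10 * math.sin(3 * x)

random.seed(0)
x = 5.0                      # start far from the best basin
best_x, best_f = x, f(x)
T = 2.0                      # initial "temperature"
for step in range(5000):
    cand = x + random.uniform(-0.5, 0.5)      # random neighbor
    delta = f(cand) - f(x)
    # Always accept improvements; accept uphill moves with a
    # probability that shrinks as the temperature cools.
    if delta < 0 or random.random() < math.exp(-delta / T):
        x = cand
    if f(x) < best_f:
        best_x, best_f = x, f(x)
    T *= 0.999               # geometric cooling
print(round(best_x, 3), round(best_f, 3))
```

The occasional uphill acceptance is what lets the search escape local minima, which plain gradient descent cannot do.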

Convex Optimization

  • Focuses on convex functions where local minima are also global minima. Widely used in machine learning and signal processing.
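One way to see why convexity helps: a convex one-dimensional function has a single basin, so even a simple ternary search finds the global minimum. The function and search interval below are illustrative.

```python
import math

# f is convex (a convex quadratic plus exp), so its only local
# minimum is the global one; ternary search exploits this.
def f(x):
    return (x - 1) ** 2 + math.exp(x)

lo, hi = -10.0, 10.0
for _ in range(200):              # shrink the bracket around the minimum
    m1 = lo + (hi - lo) / 3
    m2 = hi - (hi - lo) / 3
    if f(m1) < f(m2):
        hi = m2
    else:
        lo = m1
x_star = (lo + hi) / 2
print(round(x_star, 4))
```

On a non-convex function the same search could converge to the wrong basin; convexity is what makes the guarantee possible.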


Multi-Objective Optimization

  • Balances trade-offs between conflicting objectives (e.g., cost vs. efficiency).
  • Techniques include Pareto optimality and weighted sum approaches.
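A small sketch of the weighted sum approach, assuming made-up candidate designs and equal weights; the normalization step keeps the two objectives on comparable scales.

```python
# Weighted-sum sketch: combine two conflicting objectives (cost and
# inefficiency, both to be minimized) into one score. The candidate
# designs and weights below are invented for illustration.
candidates = {
    "A": {"cost": 100, "inefficiency": 0.30},
    "B": {"cost": 150, "inefficiency": 0.15},
    "C": {"cost": 220, "inefficiency": 0.05},
}
w_cost, w_eff = 0.5, 0.5   # relative importance of each objective

def score(c):
    # Normalize each objective to [0, 1] before weighting so the
    # two scales are comparable.
    costs = [v["cost"] for v in candidates.values()]
    effs = [v["inefficiency"] for v in candidates.values()]
    nc = (c["cost"] - min(costs)) / (max(costs) - min(costs))
    ne = (c["inefficiency"] - min(effs)) / (max(effs) - min(effs))
    return w_cost * nc + w_eff * ne

best = min(candidates, key=lambda k: score(candidates[k]))
print(best)  # → B
```

Changing the weights slides the chosen compromise along the Pareto front: heavier weight on cost favors A, heavier weight on efficiency favors C.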



Specialized Techniques

  • Dynamic Programming: Breaks a problem into overlapping subproblems, solving each one once and reusing the results.
  • Stochastic Optimization: Accounts for randomness in data or constraints.
  • Constraint Programming: Focuses on satisfying logical constraints rather than optimizing a function.
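A classic illustration of the dynamic programming idea: the minimum-cost path down a triangle of numbers (the numbers are arbitrary). Each subproblem, the best cost from a given cell downward, is solved once, bottom-up, and reused.

```python
# Dynamic programming sketch: minimum-cost path from the top of a
# triangle to its base, moving to an adjacent cell at each step.
triangle = [
    [2],
    [3, 4],
    [6, 5, 7],
    [4, 1, 8, 3],
]
# Work bottom-up: fold each row into the one above it, so every
# cell's best-cost-downward is computed exactly once.
best = triangle[-1][:]
for row in reversed(triangle[:-1]):
    best = [v + min(best[i], best[i + 1]) for i, v in enumerate(row)]
print(best[0])  # → 11  (the path 2 + 3 + 5 + 1)
```

A naive recursion would re-explore the same cells exponentially many times; the bottom-up table makes the cost linear in the number of cells.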

ML / AI

In machine learning and AI, the most widely used optimization techniques are gradient-based methods like Stochastic Gradient Descent (SGD), adaptive variants like Adam, and second-order methods like Newton’s Method. Heuristic approaches such as Genetic Algorithms and Particle Swarm Optimization are also used for hyperparameter tuning and model architecture search.
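A bare-bones SGD sketch, fitting a one-parameter line y = w·x to toy data generated with w = 2; the data, learning rate, and iteration count are illustrative choices.

```python
import random

# Stochastic gradient descent: update on ONE random sample at a time,
# rather than the full dataset as plain gradient descent would.
random.seed(42)
data = [(x, 2.0 * x) for x in range(1, 11)]   # true slope is 2
w = 0.0
lr = 0.005                            # learning rate
for _ in range(2000):
    x, y = random.choice(data)        # one sample per update
    err = w * x - y                   # prediction error
    w -= lr * 2 * err * x             # gradient of (w*x - y)^2 w.r.t. w
print(round(w, 3))  # → 2.0
```

The per-sample updates are noisy but cheap, which is why SGD and its adaptive variants such as Adam dominate large-scale model training.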

[Figure: Optimization Techniques used in ML & AI]

🧪 Emerging and Advanced Techniques

  • Bayesian Optimization is gaining traction for hyperparameter tuning in expensive-to-evaluate models.
  • Differentiable Architecture Search (DARTS) uses gradient-based methods to optimize neural network structures.
  • Reinforcement Learning-based Optimization is used in AutoML and policy learning.

This overview has laid the groundwork by summarizing key optimization techniques with brief definitions. If any term appears challenging or unclear, it will be explained thoroughly in future articles. In the next piece, I’ll explore specific methods in greater depth, highlighting their relevance to thermal energy systems and chemical engineering processes. Keep following this series of articles on optimization to gain a complete picture of how these techniques shape industry and innovation.

𝐁𝐲 𝐄𝐦𝐚𝐧 𝐓𝐨𝐫𝐚 Founder of 𝐄𝐂𝐀𝐃𝐎 𝐈𝐧𝐧𝐨𝐯𝐚𝐭𝐢𝐨𝐧 | Chemical Engineering & Thermal Energy Expert | Developer of 𝐒𝐮𝐧𝐟𝐥𝐨𝐰𝐞𝐫 𝐂𝐨𝐨𝐥𝐢𝐧𝐠 | Lead in Scientific & Industrial Research & Innovation

📧 emantora@gmail.com

Every choice is an optimization problem. Want to know how to make smarter decisions?
