Free University of Bozen-Bolzano

Optimisation

Semester 2 · 42169 · Bachelor in Industrial and Mechanical Engineering · 6CP · EN


The course aims to acquaint students with practical continuous nonlinear optimization models and algorithms, as well as with optimization in MATLAB. By the end of the course, students are expected to be able to formulate a real-world optimization problem as a nonlinear programming model, analyze the optimality properties of the model, suggest suitable algorithms for solving it, and finally determine an approximation of its optimal solution using MATLAB (or other software).
- Applied nonlinear optimization models
- Mathematical preliminaries and topological aspects of nonlinear optimization
- Optimality conditions
- Convexity and convex optimization
- Unconstrained optimization: theory and algorithms
- Constrained optimization: theory and algorithms
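
As an illustration of the workflow described in the course aims (formulate a model, then compute an approximate solution in MATLAB), here is a minimal sketch assuming the MATLAB Optimization Toolbox is available. The objective function, constraint, and starting point are invented for illustration and are not taken from the course material.

% Minimal example: formulate and solve a small nonlinear program with fmincon.
% Objective and constraint are illustrative only.
f = @(x) (x(1) - 1)^2 + 2*(x(2) + 0.5)^2;      % smooth nonlinear objective
nonlcon = @(x) deal(x(1)^2 + x(2)^2 - 1, []);  % inequality x1^2 + x2^2 <= 1, no equalities
x0 = [0; 0];                                   % starting point
opts = optimoptions('fmincon', 'Display', 'final');
[xstar, fstar] = fmincon(f, x0, [], [], [], [], [], [], nonlcon, opts);

The empty bracket arguments stand for the (unused) linear constraints and bounds; fmincon returns an approximate minimizer xstar and the corresponding objective value fstar.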

Lecturers: Saman Babaiekafaki

Teaching Hours: 40
Lab Hours: 20
Mandatory Attendance: Highly recommended (not compulsory).

Course Topics
- Nonlinear Optimization Modelling: Formulaic Structure of Nonlinear Optimization Models, Fundamental Models in Production Planning, Support Vector Machine, Energy Capacity Planning, Portfolio/Inventory Optimization, Facility Location, Engineering Design (Geometric Optimization), Regression, Control Systems, Optimal Control, and Robotics Motion Planning
- Mathematical Preliminaries and Topological Aspects of Nonlinear Optimization: Vector/Matrix Norms and Nonlinear Multivariable Function Approximation
- Optimality Conditions for Unconstrained Optimization Models: Formulaic Structure of Unconstrained Optimization Models, First/Second Order Analysis of Optimality, and Necessary and Sufficient Optimality Conditions
- Least Squares Models: Data Fitting, Noise Cancellation, and Circle Fitting
- First Order Algorithms for Unconstrained Optimization: Line Search, Gradient Method with Convergence Analysis, Gauss–Newton Method for Nonlinear Least Squares Models, and Fermat–Weber Problem (a MATLAB sketch of the gradient method is given after this list)
- Second Order Algorithms for Unconstrained Optimization: Newton’s Method with Convergence Analysis
- Convex Sets: Definition, Convex Balls in Various Norms, Algebraic Operations Preserving Convexity, Convex Hulls, Convex Cones, Conic Hulls, and Topological Properties of Convex Sets
- Convex Functions: Definition, Jensen’s Inequality, First/Second Order Analysis of Convexity, Global Optimality, Well-Known Convex Functions, Operations Preserving Functional Convexity, Level Sets and Epigraphs, and Quasi-Convex Functions
- Convex Optimization: Formulaic Structure, Global Optimality, Convex Quadratic Models, Chebyshev Center of a Set of Points, Analysis of the Markowitz Portfolio Optimization Model, Orthogonal Projection Models, Analysis of Linear Classification Models, and Convex Form of the Trust Region Subproblem
- Optimization Over Convex Sets: Stationarity and Optimality Conditions, Gradient Projection Method with Convergence Analysis, Sparsity Constrained Optimization Models, and Iterative Hard-Thresholding Method
- Linearly Constrained Nonlinear Optimization Models: Formulaic Structure, Karush–Kuhn–Tucker (KKT) Conditions, Lagrangian Function, Orthogonal Projection onto Half-Spaces, and Orthogonal Regression
- KKT Conditions for Equality/Inequality Constrained Nonlinear Optimization Models: Feasible Descent Directions, Fritz–John Conditions, KKT Conditions, Sufficiency of KKT Conditions for Convex Optimization, Analysis of Constrained Least Squares Models, Second Order Optimality Conditions, and Total Least Squares Models
- Duality Theory in Nonlinear Optimization: Dual Model Definition, Weak and Strong Duality, Duality Gap and Optimality Bounds, Dual Models of Well-Known Nonlinear Optimization Problems, and Regularization and Denoising
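
To give a concrete flavour of the first order algorithms listed above, the following is a minimal MATLAB sketch of the gradient method with a backtracking (Armijo) line search. The quadratic test function, starting point, step-size parameters, and tolerance are illustrative assumptions, not the course's reference implementation.

% Gradient method with backtracking (Armijo) line search -- illustrative sketch.
% Test problem (assumed): f(x) = 0.5*x'*Q*x - b'*x with a positive definite Q.
Q = [3 1; 1 2];  b = [1; -1];
f    = @(x) 0.5*x'*Q*x - b'*x;
grad = @(x) Q*x - b;

x = [5; 5];                          % starting point (assumed)
alpha0 = 1; beta = 0.5; sigma = 1e-4; tol = 1e-6;
for k = 1:1000
    g = grad(x);
    if norm(g) <= tol, break; end    % stop when the gradient is (nearly) zero
    t = alpha0;
    % Backtracking: shrink t until the sufficient decrease condition holds.
    while f(x - t*g) > f(x) - sigma*t*(g'*g)
        t = beta*t;
    end
    x = x - t*g;                     % take the gradient step
end
fprintf('Approximate minimizer: (%.4f, %.4f) after %d iterations\n', x(1), x(2), k);

For this small quadratic the exact minimizer is Q\b, which can be used to check the computed approximation.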

Teaching format
Lectures + Exercises + Software Lab

Educational objectives
Intended Learning Outcomes (ILOs)

Knowledge and Understanding:
1. Knowledge of the main concepts of nonlinear optimization theory
2. Understanding of the analytical origins of optimization algorithms
3. Knowledge of optimization applications in data mining and machine learning

Applying Knowledge and Understanding:
4. Ability to formulate real-world problems in the framework of nonlinear optimization models
5. Ability to deal with selected problems in data mining and machine learning

Making Judgments:
6. Ability to evaluate the reliability of nonlinear optimization models
7. Ability to assess the efficiency of nonlinear optimization algorithms

Communication Skills:
8. Ability to interpret the different parts of classic optimization models
9. Ability to analyse the performance of nonlinear optimization algorithms based on computational results
10. Ability to conduct post-optimal analysis

Learning Skills:
11. Ability to modify classic nonlinear optimization models for specific real-world problems
12. Ability to adapt classic nonlinear optimization algorithms to high-dimensional optimization models
13. Ability to design, or use, software to solve practical optimization models

Assessment
- Formative assessment: weekly exercises assigned to students, which support their understanding of the course material.
- Summative assessment: a final examination consisting of a written exam, an optional oral exam, and an optional course project.

The detailed structure of the assessment is as follows:
- 40% Weekly exercises; ILOs assessed: 1-12
- 40% Final exam, computational part; duration: 2 hours or more; ILOs assessed: 5, 6, 7, 9, 10
- 20% Final exam, theory part; duration: 1 hour or less; ILOs assessed: 1, 4
- Oral exam (optional); ILOs assessed: 2, 8
- Course project (optional); ILOs assessed: 3, 11, 12, 13

Evaluation criteria
- Weekly Exercises: Exercises are assigned on a weekly basis, closely aligned with the course content of the corresponding week; solutions should be submitted within approximately one week.
- Final (Written) Exam: The main part of the final exam is devoted to numerical problems in which students are required to implement algorithmic approaches for selected problems. In addition, the exam includes theoretical questions asking students to analyze the convergence behavior of algorithms, discuss specific aspects of the mathematical models, and evaluate the accuracy of the solutions.
- Oral Exam (Optional): Students may additionally take an oral exam in which their understanding of the general concepts of the course is assessed.
- Course Project (Optional): Students are encouraged to address a well-known real-world problem in order to gain practical experience with optimization models. The project must be presented, and a written report must be submitted.

Required readings

- Amir Beck, Introduction to Nonlinear Optimization: Theory, Algorithms, and Applications with MATLAB, 2nd Edition, SIAM: Philadelphia, 2023.

https://sites.google.com/site/amirbeck314/books



Supplementary readings

- Amir Beck, First Order Methods in Optimization, SIAM: Philadelphia, 2017.

- Jorge Nocedal and Stephen J. Wright, Numerical Optimization, Springer: New York, 2006.



Further information
Software: MATLAB



Sustainable Development Goals
This teaching activity contributes to the achievement of the following Sustainable Development Goals.

Goals 4 (Quality Education), 7 (Affordable and Clean Energy), 8 (Decent Work and Economic Growth), and 9 (Industry, Innovation and Infrastructure).
