Optimization and Machine Learning
- Instructor:
Chih-Jen Lin, Room 413, CSIE building.
The best way to contact me is via e-mail.
- TA: Chien-Chih Wang d98922007 at ntu.edu.tw (TA hour: Th 10-12 in lab 528)
Your HW/exam scores
- Time: Monday 10:20-13:10, room 110
We have a 20-minute break at around 11:30, so we will finish at 13:00.
From October 17 on, we will start at 10:10 to make up for the canceled class
on September 26.
- This course will be taught in English.
We will use slides as well as the blackboard.
You are expected to take notes.
- Textbook:
- First part: Convex Optimization by
Boyd and Vandenberghe
Slides are available at the book web site.
We will cover the following slides. This list is
always tentative; please check this web page
before printing your slides.
- Chap 1: 1-2, 4-8, 14-15
- Chap 2: 1-14, 16-20
- Chap 3: 1-4, 7-10, 12-16, 19, 21-23, 26-27
- Chap 4: 1-13, 17-18, 22-25, 35-43
- Chap 5: 1-5, 8-13, 15-19, 24-30
- Chap 7: 1-6
- Chap 8: 8-14. We will also use some SVM slides
here
- Chap 10: 1-30
- Sparse Representation and Optimization Methods for
L1-regularized Problems. We will use slides
here (not available yet). See detailed descriptions of
optimization methods in this paper.
- The FAQ of this course is here
- No course on October 10 (a national holiday).
Also no course on September 26 (because I must attend an important event). Sorry for the late notice; I have had no
Internet access since Saturday noon.
We will make it up by changing the starting time to 10:10am.
Course Outline
Optimization techniques are used in all kinds of machine learning
problems because in general we would like to minimize the testing
error. This course will contain two parts. The first part focuses on
convex optimization techniques. We discuss methods for least-squares,
linear and quadratic programs, semidefinite programming, and others.
We also touch on the theory behind these methods (e.g., optimality conditions
and duality theory). In the second part of this course we will
investigate how optimization techniques are applied to various machine
learning problems (e.g., SVM, maximum entropy, conditional random
fields, sparse reconstruction for signal processing applications). We
further discuss how to choose the right optimization methods for
different machine learning applications.
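As a small taste of the first part, a least-squares problem min ||Ax - b||^2 can be solved in a few lines via the normal equations; the data below is made up purely for illustration:

```python
import numpy as np

# Made-up data for illustration: 5 observations, 2 features
# (an intercept column and one input variable).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
b = np.array([0.1, 1.1, 1.9, 3.2, 3.9])

# Solve min_x ||Ax - b||^2 via the normal equations: A^T A x = A^T b.
x = np.linalg.solve(A.T @ A, A.T @ b)
print(x)  # fitted intercept and slope
```

For larger or ill-conditioned problems one would use `np.linalg.lstsq` instead of forming A^T A explicitly, but the normal-equations view matches the optimality conditions discussed in the first part of the course.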
Course Objective
- Learn how to use optimization techniques
for solving machine learning problems.
- Convex sets, convex functions
- Linear, quadratic programming
- Convex optimization
- Duality
- Unconstrained minimization
- Equality constrained minimization
- SVM
- Maximum entropy, CRF
- Applications
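For instance, the unconstrained-minimization topic covers iterative methods such as gradient descent. A minimal sketch on a simple convex quadratic (the matrix Q, vector c, and step size are made-up illustrative values):

```python
import numpy as np

# Minimize the convex quadratic f(x) = (1/2) x^T Q x - c^T x
# by gradient descent. Q, c, and the step size are illustrative only.
Q = np.array([[3.0, 1.0],
              [1.0, 2.0]])  # symmetric positive definite
c = np.array([1.0, 1.0])

x = np.zeros(2)
step = 0.2  # constant step size, small enough for this Q
for _ in range(200):
    grad = Q @ x - c      # gradient of f at x
    x = x - step * grad   # descent step

# At the minimizer the gradient vanishes, i.e., Q x = c.
print(x)
```

The course treats the theory behind such methods (convergence conditions, step-size choices like backtracking line search) in the unconstrained-minimization chapter.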
Homework
Once every two weeks. Please write your homework/reports in English.
For late homework, the score decreases exponentially with lateness. See
the FAQ for how to submit your
homework.
- HW1: 2.2, 2.3, 2.7, due on October 17.
- HW2: 2.20, 2.24, 3.19, due on October 31.
- HW3: 3.21, 3.36(a),(b), 4.5, due on November 28.
- HW4: 4.11(a), 4.39, 5.3, due on December 12.
- HW5: 5.22(c), 5.26, 5.28, due on December 26.
- HW6: 9.1, 9.10, due on January 9.
Exams
You can bring notes and the textbook.
Other books or
electronic devices are not allowed.
- Midterm 1: October 31
- Midterm 2: December 5
- Final: January 9. Discussion: January 10 12:20pm at room 107.
If you cannot come to the discussion, you can get your answer sheet from the TA.
Exams last year: exam1,
exam2, exam3.
For the midterms, discussions will be in the following week. For the final exam, the discussion will be on January 10.
Grading
30% homework, 70% exams (tentative).
Some students (usually about 10%) may fail if they do not work hard.
Last modified: Mon Dec 12 12:04:14 CST 2011