ATCS – Selected Topics in Learning, Prediction, and Optimization (with applications in Finance)
2021 Spring
Lecturer: Jian Li ( lapordge at gmail dot com)
TAs: Yitong Duan, Liang Zeng
Time: every Monday, 9:50am-12:15pm
Room: 4303, Fourth Teaching Building (四教)
We intend to cover a subset of the following topics (tentative):
(1) I assume you already know the basics (convex optimization and machine learning: stochastic gradient descent, gradient boosting, deep learning basics, CNNs, RNNs; please see my undergrad course). If you don't know much machine learning (e.g., you do not know how to derive the dual of the SVM yet), please do NOT take this course. I will recall some concepts briefly when necessary.
(2) online learning/optimization, multi-armed bandits, statistical learning theory, theory of deep learning
I won't strictly follow the above order. I may skip some topics mentioned above and cover some that are not mentioned. It is a graduate course.
I will discuss several applications of ML and optimization in finance (trading, pricing derivatives, etc.), and of course in typical CS areas such as vision, NLP, and social networks as well.
I will teach about 2/3 of the classes. For the rest, I will choose some topics and students will give class presentations.
Tentative topics for class presentations: generative models (GANs), adversarial learning and robustness, unsupervised learning (co-training, pseudo-labeling, contrastive learning), meta-learning, AutoML, and various financial applications.
Some knowledge about convex optimization may be useful. See this course (by S. Boyd) and a previous course by myself. But it will be fine if you didn't take those courses. Basic machine learning knowledge is a must; see Andrew Ng's undergrad lecture notes.
The course is a blending of theory and practice. We will cover both the underlying mathematics as well as interesting heuristics.
Grading:
Schedule:
1. Feb 22: Basics of convex optimization; strong convexity, smoothness; convergence analysis of gradient descent; basics of online learning, regret; online gradient descent
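As a quick illustration of online gradient descent and regret, here is a toy sketch; the quadratic losses, the interval constraint, and the step-size constant below are all invented for the example:

```python
import numpy as np

# Toy online gradient descent (OGD): at each round t we play x_t, suffer a
# loss f_t(x_t), then step against the gradient and project back onto the
# feasible set.  Here f_t(x) = (x - z_t)^2 on [-1, 1], with hypothetical
# targets z_t; the step size eta_t ~ 1/sqrt(t) gives O(sqrt(T)) regret.

def ogd(targets, eta0=0.1):
    x, total_loss = 0.0, 0.0
    for t, z in enumerate(targets, start=1):
        total_loss += (x - z) ** 2
        grad = 2.0 * (x - z)
        x = np.clip(x - (eta0 / np.sqrt(t)) * grad, -1.0, 1.0)  # project
    return total_loss

rng = np.random.default_rng(0)
targets = rng.uniform(-1, 1, size=1000)
alg_loss = ogd(targets)
# regret: algorithm's loss minus the best fixed point in hindsight
best_fixed = min(((x - targets) ** 2).sum() for x in np.linspace(-1, 1, 201))
regret = alg_loss - best_fixed
```

The regret grows like sqrt(T), so the average regret per round vanishes as T grows.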

2. Mar 1: The expert problem
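For the expert problem, the multiplicative-weights (Hedge) update can be sketched as follows; the loss matrix, horizon, and the "good expert" are synthetic, chosen only to make the regret visible:

```python
import numpy as np

# Toy Hedge / multiplicative weights for the expert problem: N experts,
# losses in [0, 1]; keep a weight per expert, play the normalized weights,
# and multiply each weight by exp(-eta * loss).

def hedge(loss_matrix, eta):
    T, N = loss_matrix.shape
    w = np.ones(N)
    alg_loss = 0.0
    for t in range(T):
        p = w / w.sum()                  # distribution over experts
        alg_loss += p @ loss_matrix[t]   # (expected) loss this round
        w *= np.exp(-eta * loss_matrix[t])
    return alg_loss

rng = np.random.default_rng(0)
T, N = 1000, 10
losses = rng.uniform(0.0, 1.0, size=(T, N))
losses[:, 3] *= 0.5                      # expert 3 is consistently better
alg = hedge(losses, eta=np.sqrt(2 * np.log(N) / T))  # standard tuning
best = losses.sum(axis=0).min()          # best expert in hindsight
# the regret alg - best should be O(sqrt(T log N)), far below T
```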

3. Mar 8: Some concepts in convex optimization: Fenchel conjugate, Bregman divergence; mirror descent; online mirror descent; basics of stock and futures markets
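Instantiating online mirror descent with the negative-entropy mirror map on the probability simplex gives the familiar multiplicative update. A toy sketch with synthetic linear losses (all numbers are made up for illustration):

```python
import numpy as np

# Toy online mirror descent on the simplex with the negative-entropy map:
# the mirror step through the dual becomes x <- x * exp(-eta * g), and the
# Bregman projection back onto the simplex is a simple renormalization.

def omd_entropy(grads, eta):
    N = grads.shape[1]
    x = np.full(N, 1.0 / N)          # uniform start
    total = 0.0
    for g in grads:                  # linear loss <g, x> each round
        total += g @ x
        x = x * np.exp(-eta * g)     # exponentiated-gradient step
        x /= x.sum()                 # projection (renormalize)
    return total

rng = np.random.default_rng(1)
T, N = 500, 5
grads = rng.uniform(0.0, 1.0, size=(T, N))
grads[:, 0] *= 0.6                   # coordinate 0 is best on average
loss = omd_entropy(grads, eta=np.sqrt(np.log(N) / T))
best = grads.sum(axis=0).min()       # best fixed vertex in hindsight
```

With this mirror map, online mirror descent recovers exactly the Hedge update from the expert problem.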

4. Mar 15: Online mirror descent; Follow the Regularized Leader; adversarial bandits (EXP3); Upper Confidence Bound (UCB)
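A minimal sketch of the UCB idea in the stochastic setting (a UCB1-style index; the Bernoulli arm means below are made up):

```python
import numpy as np

# Toy UCB1: pull each arm once, then always pull the arm maximizing
# empirical mean + sqrt(2 ln t / n_i).  Suboptimal arms are pulled only
# O(log T) times, so the best arm dominates the pull counts.

def ucb1(means, T, rng):
    K = len(means)
    counts = np.zeros(K)
    sums = np.zeros(K)
    for t in range(1, T + 1):
        if t <= K:
            i = t - 1                                   # initialization
        else:
            idx = sums / counts + np.sqrt(2 * np.log(t) / counts)
            i = int(np.argmax(idx))
        sums[i] += float(rng.random() < means[i])       # Bernoulli reward
        counts[i] += 1
    return counts

rng = np.random.default_rng(0)
counts = ucb1([0.3, 0.5, 0.7], T=5000, rng=rng)
# arm 2 (mean 0.7) should receive the large majority of the 5000 pulls
```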

5. Mar 22: Conjugate priors; Thompson sampling; median elimination; contextual bandits (EXP4); eps-greedy (Langford and Zhang)
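The conjugate-prior structure makes Thompson sampling very short for Bernoulli bandits: the Beta prior stays Beta after each observation. A toy sketch (the arm means are invented):

```python
import numpy as np

# Toy Thompson sampling for Bernoulli bandits with the Beta conjugate
# prior: after s successes and f failures the posterior is
# Beta(1 + s, 1 + f); each round, sample one theta per arm and pull argmax.

def thompson(means, T, rng):
    K = len(means)
    succ = np.ones(K)                    # Beta(1, 1) uniform priors
    fail = np.ones(K)
    counts = np.zeros(K, dtype=int)
    for _ in range(T):
        theta = rng.beta(succ, fail)     # one posterior sample per arm
        i = int(np.argmax(theta))
        if rng.random() < means[i]:      # Bernoulli reward, update posterior
            succ[i] += 1
        else:
            fail[i] += 1
        counts[i] += 1
    return counts

rng = np.random.default_rng(0)
counts = thompson([0.3, 0.5, 0.7], T=5000, rng=rng)
# the posterior of the best arm concentrates, so it gets almost all pulls
```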

6. Mar 29: Gaussian processes; Bayesian optimization; a sketch of the regret analysis of GP-UCB; Brownian motion, Ito integral
Selected reading: 
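To give a feel for GP-based Bayesian optimization before the formal regret analysis, here is a toy GP-UCB loop on a 1-D grid; the objective, kernel length-scale, and the beta coefficient are all invented for the example:

```python
import numpy as np

# Toy Bayesian optimization with a GP-UCB acquisition rule: fit a GP
# posterior (RBF kernel) to the points observed so far, then query the
# grid point maximizing posterior mean + beta * posterior std.

def rbf(a, b, ell=0.3):
    return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2 * ell ** 2))

def gp_posterior(Xobs, yobs, Xgrid, jitter=1e-4):
    K = rbf(Xobs, Xobs) + jitter * np.eye(len(Xobs))
    Ks = rbf(Xgrid, Xobs)
    mean = Ks @ np.linalg.solve(K, yobs)
    var = 1.0 - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
    return mean, np.sqrt(np.maximum(var, 0.0))

f = lambda x: -(x - 0.7) ** 2            # unknown objective, maximum at 0.7
grid = np.linspace(0.0, 1.0, 200)
X, y = [0.1, 0.9], [f(0.1), f(0.9)]      # two initial observations
for _ in range(15):
    mean, std = gp_posterior(np.array(X), np.array(y), grid)
    x_next = float(grid[np.argmax(mean + 2.0 * std)])   # UCB, beta = 2
    X.append(x_next)
    y.append(f(x_next))
best_x = X[int(np.argmax(y))]            # best point found so far
```

The UCB term trades off exploration (high posterior std) against exploitation (high posterior mean), which is exactly the quantity the GP-UCB regret analysis controls.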
7. Apr 5: Holiday, no class
8. Apr 12: Ito calculus; introduction to options; binomial trees; delta hedging
Corresponding chapters in [Book] Options, Futures and Other Derivatives. Optional reading: a rigorous and thorough treatment of SDEs (including the construction of Brownian motion): [Book] Stochastic Calculus, Filtering, and Stochastic Control
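The binomial-tree (Cox-Ross-Rubinstein) pricing of a European call can be sketched as follows; the market parameters are purely illustrative:

```python
import numpy as np

# Toy Cox-Ross-Rubinstein binomial tree for a European call: build the
# terminal payoffs, then roll back by discounted risk-neutral expectation.

def crr_call(S0, K, r, sigma, T, n):
    dt = T / n
    u = np.exp(sigma * np.sqrt(dt))         # up factor
    d = 1.0 / u                             # down factor
    p = (np.exp(r * dt) - d) / (u - d)      # risk-neutral up probability
    disc = np.exp(-r * dt)
    # terminal stock prices S0 * u^j * d^(n-j), j = 0..n
    S = S0 * u ** np.arange(n + 1) * d ** np.arange(n, -1, -1)
    V = np.maximum(S - K, 0.0)              # call payoff at maturity
    for _ in range(n):                      # backward induction
        V = disc * (p * V[1:] + (1 - p) * V[:-1])
    return float(V[0])

price = crr_call(S0=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0, n=500)
# for these parameters the price converges to the Black-Scholes value, ~10.45
```

As n grows, the tree price converges to the Black-Scholes price, which is one way to motivate the continuous-time formula in the next lecture.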

9. Apr 19: The Black-Scholes-Merton (BSM) formula; the BS formula for a European call; risk-neutral valuation; volatility smile; Feynman-Kac, Fokker-Planck, Langevin dynamics
Corresponding chapters in [Book] Options, Futures and Other Derivatives. SDEs and PDEs: [Book] Arbitrage Theory in Continuous Time, Chapter 5. Optional reading: the Fokker-Planck equation
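For reference, the BS call formula in a few lines, using only the standard library; the parameters are illustrative:

```python
from math import erf, exp, log, sqrt

# The Black-Scholes price of a European call:
#   C = S0 * N(d1) - K * exp(-r T) * N(d2),
#   d1 = (ln(S0/K) + (r + sigma^2/2) T) / (sigma sqrt(T)),
#   d2 = d1 - sigma sqrt(T).

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))   # standard normal CDF

def bs_call(S0, K, r, sigma, T):
    d1 = (log(S0 / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S0 * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

price = bs_call(S0=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0)  # ~10.45
```

Inverting this formula in sigma for observed market prices is what produces the implied-volatility surface, and its non-flatness is the volatility smile.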
10. Apr 26: Maxima of Gaussian processes; sub-Gaussian processes; Dudley's theorem (chaining); symmetrization, Rademacher complexity
[Book] Probability in High Dimension, Chapter 5; [Book] Foundations of Machine Learning, Chapter 3. Advanced reading: Talagrand's generic chaining, [Book] Probability in High Dimension, Chapter 6
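Empirical Rademacher complexity is easy to estimate by Monte Carlo for a small finite class; a toy sketch with threshold functions (the class, sample, and sizes are all invented):

```python
import numpy as np

# Toy Monte Carlo estimate of empirical Rademacher complexity
#   R_hat = E_sigma [ sup_{f in F} (1/n) sum_i sigma_i f(x_i) ],
# for a finite class F of threshold functions f_t(x) = sign(x - t).

rng = np.random.default_rng(0)
n = 100
x = rng.uniform(0.0, 1.0, size=n)
thresholds = np.linspace(0.0, 1.0, 21)
F = np.sign(x[None, :] - thresholds[:, None])   # F[j, i] = f_{t_j}(x_i)

vals = []
for _ in range(2000):
    sigma = rng.choice([-1.0, 1.0], size=n)     # Rademacher signs
    vals.append((F @ sigma).max() / n)          # sup over the class
rad = float(np.mean(vals))
# Massart's finite-class lemma bounds this by sqrt(2 ln 21 / n) ~ 0.25
```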
11. May 6: VC dimension, pseudo-dimension, fat-shattering dimension; algorithmic stability, generalization of GD (convex, strongly convex)
[Book] Probability in High Dimension, Chapter 5; Train faster, generalize better: Stability of stochastic gradient descent
12. May 10: The PAC-Bayesian framework; generalization of SGLD; generalization bounds based on PAC-Bayes


13. May 17: Learnability of convex functions; a brief survey of convex optimization (in particular, the finite-sum form); RKHS
The convex parts can be found in [Book] Understanding Machine Learning: From Theory to Algorithms. See also: A Primer on Reproducing Kernel Hilbert Spaces
14. May 24: RKHS, universal kernels; Maximum Mean Discrepancy (MMD); equivalence between kernel ridge regression (KRR) and Gaussian process regression; integral operators; local Rademacher complexity
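The KRR/GP equivalence is concrete enough to see in code: both predictors take the form k(x)^T (K + sigma2 I)^{-1} y, with the GP noise variance playing the role of the ridge parameter. A toy check on synthetic data (kernel, length-scale, and data are made up):

```python
import numpy as np

# The shared predictor behind kernel ridge regression and GP regression:
# with kernel matrix K, noise variance sigma2 (= ridge parameter), and
# test-train kernel vector k(x), both predict k(x)^T (K + sigma2 I)^{-1} y.

def rbf(a, b, ell=1.0):
    return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2 * ell ** 2))

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3.0, 3.0, 30))         # training inputs
y = np.sin(X) + 0.1 * rng.standard_normal(30)   # noisy targets
Xs = np.linspace(-3.0, 3.0, 50)                 # test inputs
sigma2 = 0.01

alpha = np.linalg.solve(rbf(X, X) + sigma2 * np.eye(len(X)), y)
pred = rbf(Xs, X) @ alpha    # = GP posterior mean = KRR prediction
```

The two views differ only in interpretation: KRR arrives at this formula via the representer theorem, the GP via conditioning a Gaussian; the GP additionally supplies a posterior variance.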

15. May 31: Margin-based semi-supervised learning
References:
[Book] Introduction to Online Convex Optimization
[Book] Prediction, Learning, and Games
[Book] Options, Futures and Other Derivatives
[Book] Advances in Financial Machine Learning
[Book] Convex Optimization
[Book] Foundations of Machine Learning
[Book] Understanding Machine Learning: From Theory to Algorithms
Python is the default programming language we will use in the course.
If you haven't used it before, don't worry. It is very easy to learn (if you know any other programming language) and very efficient, especially for prototyping things related to scientific computing and numerical optimization. Python code is usually much shorter than C/C++ code (the language does a lot for you). It is also more flexible than Matlab, and development is generally faster.
A standard combination for this class is Python + numpy (a numerics library for Python) + scipy (a scientific computing library for Python) + matplotlib (for generating nice plots).
Another, somewhat easier way is to install Anaconda (a free Python distribution that bundles most popular packages).
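As a tiny taste of the numpy style used throughout the course, here is an ordinary least-squares line fit in a few vectorized lines (the data are synthetic):

```python
import numpy as np

# Fit y ~ slope * x + intercept by least squares on synthetic noisy data,
# using numpy's vectorized operations instead of explicit loops.

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 100)
y = 2.0 * x + 1.0 + 0.05 * rng.standard_normal(100)

A = np.stack([x, np.ones_like(x)], axis=1)      # design matrix [x, 1]
(slope, intercept), *_ = np.linalg.lstsq(A, y, rcond=None)
# slope is close to 2 and intercept close to 1, up to the noise
```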