ATCS – Selected Topics in Learning, Prediction, and Optimization (with applications in Finance)
Lecturer: Jian Li ( lapordge at gmail dot com)
TA: Yitong Duan, Liang Zeng
Time: every Monday, 9:50am-12:15pm
We intend to cover a subset of the following topics (tentative):
(1) I assume you already know all the basics (convex optimization and machine learning: stochastic gradient descent, gradient boosting, deep learning basics, CNNs, RNNs; please see my undergrad course). If you don't know much machine learning (e.g., you do not yet know how to derive the dual of the SVM), please do NOT take this course. I will recall some concepts briefly when necessary.
(2) online learning/optimization, multi-armed bandit, statistical learning theory, theory of deep learning
I won't strictly follow the above order. I may skip some topics mentioned above and cover others not mentioned. It is a graduate course.
I will discuss several applications of ML and optimization in finance (trading, pricing derivatives, etc.), and of course in typical CS areas such as vision, NLP, and social networks as well.
I will teach about 2/3 of the classes. For the rest, I will choose some topics and students need to do class presentation.
Tentative topics for class presentation: generative models (GANs), adversarial learning and robustness, unsupervised learning (co-training, pseudo-labeling, contrastive learning), meta-learning, AutoML, various financial applications.
Knowledge about convex optimization may be useful. See this course (by S. Boyd) and a previous course by myself. But it will be fine if you didn't take those courses. Basic machine learning knowledge is a must. See also:
Andrew Ng's undergrad lecture notes
The course is a blending of theory and practice. We will cover both the underlying mathematics as well as interesting heuristics.
|1.Feb 22||Basics of Convex Optimization
Strong Convexity, Smoothness
Convergence analysis of Gradient Descent
Basics of Online learning, regret
Online gradient descent
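To make the first lecture concrete, here is a minimal sketch of online gradient descent in the unconstrained one-dimensional case; the quadratic losses, step size, and function names below are illustrative choices of mine, not from the lecture:

```python
import numpy as np

def online_gradient_descent(loss_grads, x0, eta=0.5):
    """Online gradient descent with step size eta / sqrt(t) (unconstrained)."""
    x = float(x0)
    iterates = [x]
    for t, grad in enumerate(loss_grads, start=1):
        x = x - (eta / np.sqrt(t)) * grad(x)  # step on the loss just revealed
        iterates.append(x)
    return iterates

# The adversary plays quadratic losses f_t(x) = (x - z_t)^2.
rng = np.random.default_rng(0)
zs = rng.normal(loc=2.0, scale=0.5, size=500)
grads = [lambda x, z=z: 2.0 * (x - z) for z in zs]
xs = online_gradient_descent(grads, x0=0.0)

# Regret against the best fixed action in hindsight (the mean of the z_t).
best = zs.mean()
regret = sum((x - z) ** 2 - (best - z) ** 2 for x, z in zip(xs[:-1], zs))
print(regret / len(zs))   # average regret; should be small
```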
|2.Mar 1||The expert problem
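A standard algorithm for the expert problem is multiplicative weights (Hedge). A toy numpy sketch (the random loss matrix and the learning rate are my own illustrative choices):

```python
import numpy as np

def hedge(loss_matrix, eta):
    """Multiplicative weights over N experts; losses assumed in [0, 1]."""
    T, N = loss_matrix.shape
    w = np.ones(N)
    total = 0.0
    for t in range(T):
        p = w / w.sum()                     # play a distribution over experts
        total += p @ loss_matrix[t]         # expected loss this round
        w *= np.exp(-eta * loss_matrix[t])  # exponentially downweight bad experts
    return total

rng = np.random.default_rng(1)
T, N = 1000, 10
losses = rng.uniform(size=(T, N))
losses[:, 3] *= 0.5                         # expert 3 is consistently better
alg = hedge(losses, eta=np.sqrt(2.0 * np.log(N) / T))
best = losses.sum(axis=0).min()
print(alg - best)                           # regret is O(sqrt(T log N))
```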
|3.Mar 8||Some concepts in convex
optimization: Fenchel conjugate, Bregman divergence
Online Mirror Descent
Basics of stock and futures markets
|4.Mar 15||Online Mirror Descent, Follow the Regularized Leader
Adversarial Bandit (EXP3)
Upper Confidence Bound
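The UCB idea fits in a few lines. Below is a sketch of UCB1 on Bernoulli arms (the arm means and horizon are toy choices of mine); the pseudo-regret it accumulates grows only logarithmically in the horizon:

```python
import numpy as np

def ucb1(means, T, rng):
    """UCB1: pull the arm maximizing empirical mean + exploration bonus."""
    K = len(means)
    counts = np.zeros(K)
    sums = np.zeros(K)
    regret = 0.0
    for t in range(1, T + 1):
        if t <= K:
            arm = t - 1                             # pull each arm once first
        else:
            bonus = np.sqrt(2.0 * np.log(t) / counts)
            arm = int(np.argmax(sums / counts + bonus))
        reward = rng.random() < means[arm]          # Bernoulli reward
        counts[arm] += 1
        sums[arm] += reward
        regret += max(means) - means[arm]           # pseudo-regret of this pull
    return regret

rng = np.random.default_rng(0)
reg = ucb1([0.3, 0.5, 0.7], T=5000, rng=rng)
print(reg)   # far below the ~1000 a uniformly random policy would accumulate
```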
|5.Mar 22||Conjugate Prior
Contextual Bandit (EXP4)
The Epoch-Greedy Algorithm for Contextual Multi-armed Bandits
|6.Mar 29||Gaussian process
A sketch of the regret analysis of GP-UCB
Brownian Motion, Ito integral
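Two facts from this lecture are easy to check numerically: W_T ~ N(0, T), and the quadratic variation of a Brownian path over [0, T] concentrates around T (the starting point of Ito calculus). A small simulation sketch (all parameters are my own choices):

```python
import numpy as np

# Simulate Brownian motion by summing independent N(0, dt) increments.
rng = np.random.default_rng(0)
T, n, paths = 1.0, 1000, 2000
dt = T / n
dW = rng.normal(scale=np.sqrt(dt), size=(paths, n))  # increments of each path
W_T = dW.sum(axis=1)          # terminal value W_T of each path
QV = (dW ** 2).sum(axis=1)    # quadratic variation sum (dW)^2 of each path
print(W_T.mean(), W_T.var(), QV.mean())   # ~0, ~T, ~T
```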
|7.Apr 5||Holiday, no class|
|8.Apr 12||Ito's calculus.
Introduction to options
Binomial tree, Delta-hedge
Corresponding chapters in [Book] Options, Futures and Other Derivatives
A rigorous and thorough treatment of SDEs (including the construction of Brownian motion):
[Book]Stochastic Calculus, Filtering, and Stochastic Control
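Binomial-tree pricing can be sketched in a few lines of numpy (a Cox-Ross-Rubinstein tree; the contract parameters below are illustrative). With many steps the price converges to the Black-Scholes value for the same inputs:

```python
import numpy as np

def binomial_call(S0, K, r, sigma, T, n):
    """European call priced on an n-step Cox-Ross-Rubinstein binomial tree."""
    dt = T / n
    u = np.exp(sigma * np.sqrt(dt))        # up factor
    d = 1.0 / u                            # down factor
    q = (np.exp(r * dt) - d) / (u - d)     # risk-neutral probability of an up move
    j = np.arange(n + 1)                   # number of up moves at maturity
    payoff = np.maximum(S0 * u**j * d**(n - j) - K, 0.0)
    for _ in range(n):                     # backward induction through the tree
        payoff = np.exp(-r * dt) * (q * payoff[1:] + (1 - q) * payoff[:-1])
    return float(payoff[0])

price = binomial_call(S0=100, K=100, r=0.05, sigma=0.2, T=1.0, n=500)
print(price)   # close to the Black-Scholes price for these inputs
```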
|9.Apr 19||BSM formula, BS formula for European call
Risk-neutral valuation.
Feynman-Kac, Fokker-Planck, Langevin Dynamics
Corresponding chapters in [Book]
Options, Futures and Other Derivatives
SDE and PDE: [Book]Arbitrage Theory in Continuous Time, Chapter 5
Optional reading: Fokker Planck Equation
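The BS formula for a European call, together with a Monte Carlo check of risk-neutral valuation (the price is the discounted expected payoff under the risk-neutral measure); the contract parameters are illustrative:

```python
import math
import numpy as np

def Phi(x):
    """Standard normal CDF via the error function (stdlib only)."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, r, sigma, T):
    """Black-Scholes price of a European call on a non-dividend-paying stock."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * Phi(d1) - K * math.exp(-r * T) * Phi(d2)

# Risk-neutral valuation: simulate S_T under the risk-neutral measure and
# discount the average payoff; it should match the closed-form price.
S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
Z = np.random.default_rng(0).normal(size=1_000_000)
S_T = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * Z)
mc = math.exp(-r * T) * np.maximum(S_T - K, 0.0).mean()
print(bs_call(S0, K, r, sigma, T), mc)   # the two numbers nearly agree
```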
|10.Apr 26||Maxima of Gaussian processes.
Dudley's theorem (chaining)
Symmetrization, Rademacher complexity
[Book] Probability in High Dimension, Chapter 5
[Book] Foundations of Machine Learning, Chapter 3
Advanced Reading:Talagrand's Generic Chaining, [Book] Probability in High Dimension, Chapter 6
|11.May 6||VC-dimension, Pseudo-dimension,
Algorithm Stability, generalization of GD (convex, strongly convex)
|[Book] Probability in High Dimension, Chapter 5|
|12.May 10||PAC-Bayesian framework
Generalization of SGLD
Generalization bounds based on the PAC-Bayesian framework
|13.May 17||Learnability of convex functions.
A brief survey of convex optimization (in particular, the finite-sum form)
Parts can be found in [Book] A Primer on Reproducing Kernel Hilbert Spaces
|14.May 24||RKHS, Universal Kernel
Max Mean Discrepancy (MMD)
Equivalence between KRR and Gaussian process regression
Local Rademacher complexity
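As a quick numerical illustration of MMD: the (biased) two-sample estimate under an RBF kernel is near zero for samples from the same distribution and clearly positive otherwise (the kernel bandwidth and sample sizes below are my own choices):

```python
import numpy as np

def rbf(X, Y, ell=1.0):
    """Gram matrix of the RBF kernel k(x, y) = exp(-(x - y)^2 / (2 ell^2))."""
    d2 = (X[:, None] - Y[None, :]) ** 2
    return np.exp(-d2 / (2.0 * ell**2))

def mmd2(X, Y, ell=1.0):
    """Biased estimate of squared MMD between one-dimensional samples X and Y."""
    return rbf(X, X, ell).mean() + rbf(Y, Y, ell).mean() - 2.0 * rbf(X, Y, ell).mean()

rng = np.random.default_rng(0)
same = mmd2(rng.normal(size=500), rng.normal(size=500))          # same law
diff = mmd2(rng.normal(size=500), rng.normal(loc=1.0, size=500)) # shifted law
print(same, diff)   # near zero vs. clearly positive
```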
|15.May 31||Margin-related bounds
[Book] Introduction to online convex optimization
[Book] Prediction, Learning, and Games
[Book] Options, Futures and Other Derivatives
[Book] Advances in Financial Machine Learning
[Book] Convex Optimization
[Book] Foundations of Machine Learning
Python is the default programming language we will use in the course.
If you haven't used it before, don't worry. It is very easy to learn (if you know any other programming language), and it is a very efficient language, especially for prototyping things related to scientific computing and numerical optimization. Python code is usually much shorter than C/C++ code (the language does a lot for you). It is also more flexible and generally faster than MATLAB.
A standard combination for this class is Python + numpy (a numerical library for Python) + scipy (a scientific computing library for Python) + matplotlib (for generating nice plots).
Another, somewhat easier, way is to install Anaconda (a free Python distribution bundled with most popular packages).
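For instance, a least-squares fit is one line with numpy, and generic minimization one more with scipy (a toy example, not course material):

```python
import numpy as np
from scipy import optimize

rng = np.random.default_rng(0)
A = rng.normal(size=(100, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true                                  # noiseless linear observations

x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)    # closed-form least squares
res = optimize.minimize(lambda x: np.sum((A @ x - b) ** 2), np.zeros(3))
print(x_ls, res.x)                              # both recover x_true
```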