ATCS - Selected Topics in Learning, Prediction, and Optimization (with Applications in Finance)

2019 Spring

Lecturer: Jian Li ( lapordge at gmail dot com)

TA: Xu Jin ( jxu3425 at gmail dot com )

Time: every Tue 9:50am-12:15pm

Room: 6B307 (Teaching Building 6)


We intend to cover a subset of the following topics (tentative):

(1) I assume you already know the basics (convex optimization, machine learning, stochastic gradient descent, gradient boosting, deep learning basics, CNNs, RNNs; see my undergrad course). I will recall some details when necessary.

(2) Online learning and sequential prediction: multi-armed bandits, universal portfolios, the multiplicative weights method, online convex optimization, and (maybe) online learning for pricing derivatives.

(3) Learning theory: VC dimension, margin theory, generalization.

I won't strictly follow the above order; I may skip some topics mentioned above and cover some that are not. It is a graduate course.

I will be talking about several applications of ML and optimization in finance (trading, pricing derivatives, etc.), and of course in typical CS areas like vision, NLP, and social networks as well.

Some knowledge of convex optimization may be useful; see http://itcs.tsinghua.edu.cn/~john/convex2013.html and a previous course of mine. But it is fine if you didn't take those courses. Basic machine learning knowledge will be very useful. If you don't know any machine learning, I suggest reading Andrew Ng's undergrad lecture notes.

The course is a blend of theory and practice. We will cover both the underlying mathematics and interesting heuristics.

A similar previous course.


Grading:

  1. Homework (30 pts; one assignment every two or three weeks; some assignments may include small coding problems)
  2. Scribe notes (10 pts): each student should take notes for at least one lecture (maybe two), using LaTeX (use this template: sample.tex, algorithm2e.sty).
  3. Course project (60 pts: 10 pts for the mid-term report, 10 pts for the final presentation, 40 pts for the final report)
  4. No exam. 

     


Schedule:

 

Feb 26 The expert problem.

Multiplicative weights method

Using MW to solve zero-sum games

Basics of stock and futures markets

The Multiplicative Weights Update Method: A Meta-Algorithm and Applications (a survey). Sanjeev Arora, Elad Hazan, and Satyen Kale. [pdf]


The method can be used to approximately solve zero-sum games and linear programs, and it is also closely related to AdaBoost.
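
For concreteness, here is a minimal Python sketch of the MW update in the (1 - eta)^loss form used in the survey; the function name and interface are illustrative, not from the course materials:

    import numpy as np

    def multiplicative_weights(losses, eta=0.1):
        """Sketch of the multiplicative weights (MW) meta-algorithm.

        losses: T x N array; losses[t, i] is the loss of expert i in
        round t, assumed to lie in [0, 1].
        Returns the sequence of distributions played over the experts.
        """
        T, N = losses.shape
        w = np.ones(N)                     # one weight per expert
        plays = []
        for t in range(T):
            p = w / w.sum()                # play the normalized weights
            plays.append(p)
            w *= (1.0 - eta) ** losses[t]  # penalize each expert by its loss
        return np.array(plays)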


Mar 5 Cover's universal portfolio.

Gradient descent for online learning

Basics of forward contracts and futures

Online Convex Programming and Generalized Infinitesimal Gradient Ascent. M. Zinkevich.


Universal Portfolios


Universal Portfolios With and Without Transaction Costs
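
Cover's strategy is the wealth-weighted average of all constant-rebalanced portfolios (CRPs); the integral over the simplex can be approximated by sampling. A rough Monte Carlo sketch, with illustrative names and parameters:

    import numpy as np

    def universal_portfolio(price_relatives, n_samples=10000, seed=0):
        """Monte Carlo approximation of Cover's universal portfolio.

        price_relatives: T x N array; entry [t, i] is the ratio
        price_{t+1}/price_t of asset i.
        """
        rng = np.random.default_rng(seed)
        T, N = price_relatives.shape
        b = rng.dirichlet(np.ones(N), size=n_samples)  # uniform samples from the simplex
        wealth = np.ones(n_samples)        # wealth of each sampled CRP so far
        portfolios = []
        for t in range(T):
            # portfolio for round t: wealth-weighted average of the sampled CRPs
            portfolios.append(wealth @ b / wealth.sum())
            wealth *= b @ price_relatives[t]  # update each CRP's wealth
        return np.array(portfolios)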


Mar 12 Online Mirror Descent

Basics of options, binomial tree

Mar 19 Online Mirror Descent and Follow the Regularized Leader

Blackwell's approachability theorem

Basics of Stochastic calculus

Mar 26 Blackwell's approachability theorem and online learning

Online to Batch Conversion

Itô's formula, the idea of delta hedging

Blackwell Approachability and No-Regret Learning are Equivalent

Online to batch

https://ttic.uchicago.edu/~tewari/lectures/lecture13.pdf
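
For reference, Itô's formula for a smooth function f(t, W_t) of Brownian motion W_t (the second-order correction term comes from (dW_t)^2 = dt):

    df(t, W_t) = \frac{\partial f}{\partial t}\,dt + \frac{\partial f}{\partial x}\,dW_t + \frac{1}{2}\,\frac{\partial^2 f}{\partial x^2}\,dt

Applying this to the value of an option and choosing the stock position that cancels the dW_t term is the idea behind delta hedging.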

Apr 2 Rademacher complexity, VC-dimension

BSM formula

[Book] Foundations of Machine Learning, Chapter 3

[Book] Options, Futures, and Other Derivatives, Chapter 14
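
Once d_1 and d_2 are in hand, the BSM call price is a few lines of code. A minimal sketch using scipy, for a European call on a non-dividend-paying stock:

    import numpy as np
    from scipy.stats import norm

    def bsm_call(S0, K, T, r, sigma):
        """Black-Scholes-Merton price of a European call (no dividends)."""
        d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
        d2 = d1 - sigma * np.sqrt(T)
        return S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

    # example: an at-the-money one-year call
    print(bsm_call(S0=100, K=100, T=1.0, r=0.05, sigma=0.2))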

Apr 9 The relaxation technique

Cover's bit prediction and its connection with Rademacher complexity

The material can be found in the lecture notes for STAT928: Statistical Learning and Sequential Prediction.

Apr 16 Interval regret, sleeping experts, dynamic regret.

Pricing derivatives through online learning


Online Trading Algorithms and Robust Option Pricing


Robust Option Pricing: Hannan and Blackwell Meet Black and Scholes

Apr 23 Pricing derivatives through online learning

Online learnability and the Littlestone dimension

May 7 Online learnability and the Littlestone dimension

Multi-armed bandits: UCB, EXP3, and the contextual bandit algorithm EXP4

A very brief mention of portfolio theory

https://haipeng-luo.net/courses/CSCI699/

Portfolio theory:

https://en.wikipedia.org/wiki/Modern_portfolio_theory

Fama and French Three Factor Model   https://www.investopedia.com/terms/f/famaandfrenchthreefactormodel.asp
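
As a concrete instance of the index policies above, a minimal UCB1 sketch (the pull(i) callback standing in for the environment is hypothetical, not part of any course code):

    import numpy as np

    def ucb1(pull, n_arms, horizon):
        """UCB1: play the arm with the highest optimistic mean estimate."""
        counts = np.zeros(n_arms)
        means = np.zeros(n_arms)
        for t in range(horizon):
            if t < n_arms:
                arm = t               # pull each arm once to initialize
            else:
                bonus = np.sqrt(2.0 * np.log(t) / counts)
                arm = int(np.argmax(means + bonus))
            r = pull(arm)             # observe a reward in [0, 1]
            counts[arm] += 1
            means[arm] += (r - means[arm]) / counts[arm]  # running mean
        return means, counts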

May 14 Generalization, Algorithmic Stability. Stability for SGD (convex).

Some basics of the stock markets.

Train faster, generalize better: Stability of stochastic gradient descent

The WorldQuant paper I mentioned: 101 Formulaic Alphas
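
As a pointer to the main stability result in the Hardt-Recht-Singer paper above (stated here from memory, so check the paper for the exact statement and constants): for a convex, L-Lipschitz, \beta-smooth loss and step sizes \alpha_t \le 2/\beta, SGD run for T steps on n samples is uniformly stable with

    \epsilon_{stab} \le \frac{2 L^2}{n} \sum_{t=1}^{T} \alpha_t

and uniform stability upper-bounds the expected generalization gap.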

May 21 Stability of SGD (convex), convergence of SGD (convex),

Stability of SGLD (nonconvex).

On Generalization Error Bounds of Noisy Gradient Methods for Non-Convex Learning

May 28 Generalization via the compression framework

Stronger generalization bounds for deep nets via a compression approach


Jun 4 Neural tangent kernel (NTK) and overparameterized neural networks, based on the paper On Exact Computation with an Infinitely Wide Neural Net

See the references in this paper for several previous papers on this topic.
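
Roughly, the NTK of a network f(x; \theta) is the kernel

    \Theta(x, x') = \langle \nabla_\theta f(x; \theta), \nabla_\theta f(x'; \theta) \rangle,

and the point of the paper is that in the infinite-width limit (with the right scaling) this kernel is deterministic at initialization and stays fixed during gradient-descent training, so the trained network behaves like kernel regression with \Theta.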


Jun 11 Project presentations



References:

[Book] Introduction to Online Convex Optimization. Elad Hazan.

[Book] Prediction, Learning, and Games. Nicolò Cesa-Bianchi and Gábor Lugosi.

[Book] Options, Futures, and Other Derivatives. John C. Hull.

[Book] Foundations of Machine Learning. Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar.

[Book] Understanding Machine Learning: From Theory to Algorithms. Shai Shalev-Shwartz and Shai Ben-David.

Lecture notes for STAT928: Statistical Learning and Sequential Prediction. Alexander Rakhlin and Karthik Sridharan.


Python is the default programming language we will use in the course.

If you haven't used it before, don't worry. It is very easy to learn (if you know any other programming language) and is a very efficient language, especially for prototyping things related to scientific computing and numerical optimization. Python code is usually much shorter than C/C++ code (the language does a lot for you). It is also more flexible than Matlab, and development is generally faster.

A standard combination for this class is Python + numpy (a numerical library for Python) + scipy (a scientific computing library for Python) + matplotlib (for generating nice plots).

A somewhat easier way is to install Anaconda (a free Python distribution that bundles the most popular packages).
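
To illustrate the combination, a toy example (not a course assignment) that minimizes a simple quadratic with scipy and plots it with matplotlib:

    import numpy as np
    from scipy.optimize import minimize
    import matplotlib.pyplot as plt

    def f(x):
        """A simple quadratic, minimized at x = 2."""
        return (x - 2.0) ** 2 + 1.0

    res = minimize(lambda v: f(v[0]), x0=[0.0])  # scipy passes a vector argument
    print(res.x)                                 # approximately [2.]

    xs = np.linspace(-1.0, 5.0, 200)
    plt.plot(xs, f(xs))                          # numpy broadcasts f over the grid
    plt.scatter(res.x, f(res.x))
    plt.xlabel("x")
    plt.ylabel("f(x)")
    plt.show()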
