
# PA - C8 - MAP658: Time Series for Financial Data

## Description

### General objectives

The main goal of this course is to introduce and explain the statistical methods used for the analysis and forecasting of certain financial time series. This application domain has given rise to substantial modeling efforts in recent decades, which allow one to consider many types of financial time series (price returns, rates, transaction data): linear time series, conditionally heteroscedastic time series, multivariate time series, discrete time series, and so on. The main classes of linear and non-linear models will be introduced, as well as the statistical methods associated with them.

The main prerequisites for this course are the basics of linear algebra, Hilbert space geometry, probability, and statistics.
### Core information

- Schedule: mid-March to end of March.
- ECTS: 2.5
- Evaluation: case study report with executable R code in the form of an R Markdown document, and a presentation.

### Outline

The precise outline is given in the detailed program below.


Grading: numerical grade out of 20.

## For students in the Echanges PEI program

A resit is allowed (the higher of the two grades is kept).
The course unit is validated if the final grade is >= 10.
• ECTS credits earned: 2.5 ECTS

## For students in the Data Sciences program

A resit is allowed (the higher of the two grades is kept).
The course unit is validated if the final grade is >= 10.
• ECTS credits earned: 2.5 ECTS

### Detailed program

Day I: Crash course on financial time series.
Lecture 1 An introduction using the likelihood as a guideline. In this first lesson, we use the likelihood function as a guideline for statistical modeling of time series. The likelihood function is essential for statistical inference, as it fully exploits the way one models the data. The goal of this lesson is to understand why and how, in the context of time series, the dynamics contained in the data appear in the likelihood. The outline is as follows:
1) Examples of financial time series.
2) Reminders: i.i.d. models.
a) Univariate models.
b) Multivariate models.
c) Regression model.
d) Hidden variables.
3) Introducing dynamics.
a) What's wrong with i.i.d. models?
b) Univariate models.
c) Multivariate models.
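To make the guiding idea of Lecture 1 concrete, here is a minimal sketch of how the dynamics of the data appear in the likelihood: for a Gaussian AR(1), the likelihood factorizes into one-step conditional densities, and maximizing the resulting conditional likelihood recovers the autoregressive coefficient. This is an illustrative NumPy sketch (the course's case studies use R); the model, parameter values, and function names are chosen here for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a Gaussian AR(1): X_t = phi * X_{t-1} + eps_t, eps_t ~ N(0, sigma^2)
phi_true, sigma = 0.6, 1.0
n = 500
x = np.empty(n)
x[0] = rng.normal(scale=sigma / np.sqrt(1 - phi_true**2))  # stationary start
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + rng.normal(scale=sigma)

def neg_log_lik(phi, x, sigma=1.0):
    """Conditional negative log-likelihood: the dynamics enter only
    through the one-step predictive densities p(x_t | x_{t-1})."""
    resid = x[1:] - phi * x[:-1]
    return 0.5 * np.sum(resid**2) / sigma**2 + (len(x) - 1) * np.log(sigma)

# Minimizing over a grid of phi values should recover phi_true (up to noise)
grid = np.linspace(-0.95, 0.95, 381)
phi_hat = grid[np.argmin([neg_log_lik(p, x) for p in grid])]
print(round(phi_hat, 2))
```

The key point is that, unlike in the i.i.d. case, the likelihood terms are conditional on the past, which is exactly how the dynamics enter the inference.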
Lecture 2 Stationary and weakly stationary time series. Stationarity is an assumption underlying all statistical inference procedures. This notion has to be understood in the context of time series, where the dynamics of the data are essential to the modeling, as seen in Lesson 1. The goal of this lesson is to define stationary and weakly stationary time series and to understand these definitions through examples and practical questions such as detrending. We also introduce the main tools for the statistical analysis of linear L2 time series.

1) Stationary Time series
a) The statistical approach
b) Classical steps of statistical inference
c) Random processes in a nutshell
d) Examples
e) Stationary time series
2) Weakly stationary time series
a) L2 processes
b) Weak stationarity
c) Spectral measure
d) Empirical estimation
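As a small illustration of item 2d (empirical estimation for weakly stationary series), one can estimate the autocovariance function from a single realization. A minimal NumPy sketch, using white noise as the simplest weakly stationary example (the course itself works in R; the helper name `sample_acov` is ours):

```python
import numpy as np

rng = np.random.default_rng(1)

# White noise is weakly stationary: constant mean, and an autocovariance
# function gamma(h) that vanishes for every lag h != 0.
n = 2000
x = rng.normal(size=n)

def sample_acov(x, h):
    """Empirical autocovariance at lag h (biased version, divides by n)."""
    xc = x - x.mean()
    return np.dot(xc[: len(x) - h], xc[h:]) / len(x)

gamma0 = sample_acov(x, 0)          # should be close to the variance, 1
rho1 = sample_acov(x, 1) / gamma0   # sample autocorrelation at lag 1
print(round(gamma0, 2), round(rho1, 2))
```

Dividing by n rather than n - h is the standard choice because it keeps the estimated autocovariance sequence non-negative definite, a property used by the spectral theory of item 2c.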
Day II: Linear prediction.
Lecture 3 Linear prediction and ARMA models. Linear prediction relies only on the second-order properties of the process. Practical algorithms such as the Levinson algorithm or the innovation algorithm are derived in this context. The objects developed for linear prediction, such as correlation, partial correlation, and innovation, are also used for defining and understanding AR, MA, and ARMA models. In order to understand the properties of these models, some preliminary work is required on general ℓ1 convolution filters. ARMA models are widespread parametric linear models for time series. They can be characterized easily using the autocovariance function and the partial autocorrelation function.
1) Linear prediction
a) Prediction vs. linear prediction
b) Linear prediction for weakly stationary processes
c) Innovation process
2) Composition and inversion of ℓ1 convolution filters
a) Example
b) General results
c) Inversion of a finite-order filter
3) ARMA processes
a) ARMA equations, stationary solutions
b) Innovations of ARMA processes
c) Characterization of MA processes
d) Characterization of AR processes
Case Study 1 ARMA modeling
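The Levinson algorithm mentioned in Lecture 3 turns the autocovariance function into both the best linear predictor coefficients and the partial autocorrelations (the reflection coefficients), which is precisely what makes AR processes easy to characterize. A minimal sketch, assuming the exact autocovariances of an AR(1) as input (illustrative Python rather than the course's R; the function name is ours):

```python
import numpy as np

def levinson_durbin(gamma, p):
    """Levinson-Durbin recursion: from autocovariances gamma(0..p), compute
    the order-p linear predictor coefficients, the partial autocorrelations
    (reflection coefficients), and the prediction error variance."""
    phi = np.zeros(p + 1)
    pacf = np.zeros(p + 1)
    v = gamma[0]                      # prediction error variance at order 0
    for k in range(1, p + 1):
        acc = gamma[k] - np.dot(phi[1:k], gamma[1:k][::-1])
        kappa = acc / v               # reflection coefficient = PACF at lag k
        new_phi = phi.copy()
        new_phi[k] = kappa
        new_phi[1:k] = phi[1:k] - kappa * phi[1:k][::-1]
        phi = new_phi
        pacf[k] = kappa
        v = v * (1 - kappa**2)        # variance shrinks at each order
    return phi[1:], pacf[1:], v

# Exact autocovariances of an AR(1) with coefficient 0.5 and unit noise:
# gamma(h) = 0.5**h / (1 - 0.5**2)
g = np.array([0.5**h for h in range(4)]) / 0.75
coefs, pacf, v = levinson_durbin(g, 3)
print(np.round(pacf, 3))  # the PACF of an AR(1) cuts off after lag 1
```

The cut-off of the partial autocorrelation function after lag p is exactly the characterization of AR(p) processes announced in item 3d.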
Day III: Heteroscedastic models.
Lecture 4 Modeling volatility in financial data. Volatility is essentially used as a measure of the risk in financial time series. The main limitation of linear models is that their volatility is constant. Introducing heteroscedastic models while preserving stationarity can be done by conditioning. The main part of the lesson is dedicated to the class of GARCH processes, which are built on the idea that the conditional volatility can be made random in a way similar to the conditional mean of ARMA processes. We will compare such models with the stochastic volatility model, where the volatility is exogenous.
1) Standard models for financial time series
a) Statistical properties of returns
b) What's wrong with ARMA models?

c) Stochastic volatility models
d) ARCH and GARCH models
2) Explicit construction of GARCH processes
a) Construction from an IID sequence
b) Stochastic autoregressive models
c) Stationary non-anticipative solutions
d) Empirical study
Case Study 2 GARCH & EGARCH modeling of log returns
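Item 2a above (construction of a GARCH process from an i.i.d. sequence) can be sketched directly: the recursion below builds a GARCH(1,1) whose squared observations feed back into the conditional variance. An illustrative NumPy sketch (the case studies themselves are in R; the parameter values are ours):

```python
import numpy as np

rng = np.random.default_rng(2)

# GARCH(1,1): eps_t = sigma_t * z_t with z_t i.i.d. N(0,1), and
#   sigma_t^2 = omega + alpha * eps_{t-1}^2 + beta * sigma_{t-1}^2.
# A stationary solution with finite variance requires alpha + beta < 1.
omega, alpha, beta = 0.1, 0.1, 0.8
n = 20000
eps = np.zeros(n)
sig2 = np.full(n, omega / (1 - alpha - beta))  # start at the stationary variance
for t in range(1, n):
    sig2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sig2[t - 1]
    eps[t] = np.sqrt(sig2[t]) * rng.normal()

# The marginal variance should be close to omega / (1 - alpha - beta) = 1,
# while eps is (approximately) uncorrelated despite volatility clustering.
print(round(eps.var(), 2))
```

This is the sense in which GARCH mimics ARMA: the conditional variance follows an autoregressive-type recursion, just as the conditional mean does in an ARMA model.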
Day IV: Multivariate financial time series.
Lecture 5 Multivariate time series analysis. It has long been known that investment requires diversification. To optimize a portfolio, it is indeed necessary to model the joint behavior of a panel of assets. The main goal of this lecture is to introduce the main tools for modeling and estimating the second-order statistics of multivariate time series. Most classical linear models, such as ARMA or VARMA models, can be embedded in the class of dynamic linear models, where the dynamics are essentially carried by a vector state variable that is not directly observed. Efficient algorithms for filtering, forecasting, and computing the likelihood will be presented.
1) Second order statistics
a) Basics of portfolio management
b) Autocovariance matrices
c) Spectral and cross-spectral density functions
2) Dynamic linear models
a) General setting
b) Main algorithms
c) Illustrative example
Case Study 3 Fitting and forecasting realized volatility. 
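The "main algorithms" of item 2b for dynamic linear models are the Kalman recursions, which deliver filtering, forecasting, and the likelihood in one pass. A minimal sketch on the simplest state-space model, a local level observed in noise (illustrative Python, with parameter values and the function name chosen here for the example):

```python
import numpy as np

rng = np.random.default_rng(3)

# Local-level dynamic linear model:
#   state:       mu_t = mu_{t-1} + w_t,   w_t ~ N(0, q)   (hidden)
#   observation: y_t  = mu_t + v_t,       v_t ~ N(0, r)   (observed)
q, r, n = 0.1, 1.0, 300
mu = np.cumsum(rng.normal(scale=np.sqrt(q), size=n))
y = mu + rng.normal(scale=np.sqrt(r), size=n)

def kalman_filter(y, q, r, m0=0.0, p0=10.0):
    """Kalman filter: filtered state estimates and the Gaussian
    log-likelihood via the prediction error decomposition."""
    m, p, loglik = m0, p0, 0.0
    filtered = np.empty(len(y))
    for t, obs in enumerate(y):
        # Predict: random-walk state, so the mean is unchanged
        # and the variance grows by q.
        p = p + q
        # Update with the innovation obs - m.
        s = p + r                     # innovation variance
        k = p / s                     # Kalman gain
        innov = obs - m
        loglik += -0.5 * (np.log(2 * np.pi * s) + innov**2 / s)
        m = m + k * innov
        p = (1 - k) * p
        filtered[t] = m
    return filtered, loglik

filtered, ll = kalman_filter(y, q, r)
# The filtered estimate should track the hidden level much better
# than the raw observations do.
print(np.mean((filtered - mu) ** 2) < np.mean((y - mu) ** 2))
```

The same recursion, run over a parameter grid or inside an optimizer, yields the maximum likelihood estimates, which is how fitting is typically carried out for the realized-volatility case study.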