Video Transcript:

Hello, viewers. Welcome to this NPTEL-MOOC course on Mathematical Portfolio Theory. In this week's classes we will look at the basics of probability theory. So, in the first lecture today, we will do probability theory in general, in both the discrete and the continuous settings, and we will talk about random variables. This will be followed by a discussion of expectation, variance, covariance and correlation coefficients, and then we will talk about two important distributions, namely, the binomial distribution and the normal distribution. We will conclude the week with a discussion on linear regression.
So, we start off lecture number 1 with the basics of probability theory, and the first thing we will do is look at probability spaces and their properties. We will first look at the finite discrete probability space and then at a general probability space. So, finite discrete probability space.
We start off with the definition of a finite probability space. A finite probability space is defined as a pair (Ω, P), where the first component Ω is a finite non-empty set (called the sample space) and the second component P is a real-valued function defined on the set of all subsets of Ω; this P is called a probability measure on the sample space Ω. Next, we will look at some of the properties of this measure P. The probability measure P (as defined above) satisfies the following three properties.
The first property is that for every A ⊆ Ω, the probability P(A) lies between 0 and 1, both included; here I make a note that all such A, which are subsets of Ω, are henceforth going to be called events. The second property is that P(Ω) = 1. And the last property of the probability measure is that if A1, A2, ..., An are pairwise disjoint events, then P(A1 ∪ A2 ∪ ... ∪ An) is simply going to be the sum P(A1) + P(A2) + ... + P(An).
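To see the definition in action, here is a minimal Python sketch, assuming a hypothetical fair six-sided die (the example and all names are illustrative, not from the lecture), that checks the three properties:

```python
from itertools import chain, combinations

# Hypothetical example: a fair six-sided die.
omega = {1, 2, 3, 4, 5, 6}
p = {w: 1 / 6 for w in omega}          # equal weight on each outcome

def prob(event):
    """P(A): total weight of the outcomes in the event A."""
    return sum(p[w] for w in event)

# Property 1: 0 <= P(A) <= 1 for every subset A of omega.
all_subsets = chain.from_iterable(
    combinations(omega, r) for r in range(len(omega) + 1))
assert all(0 <= prob(A) <= 1 + 1e-12 for A in all_subsets)

# Property 2: P(omega) = 1.
assert abs(prob(omega) - 1) < 1e-12

# Property 3 (additivity): pairwise disjoint events add.
A1, A2 = {1, 2}, {5, 6}
assert abs(prob(A1 | A2) - (prob(A1) + prob(A2))) < 1e-12
print("all three properties hold")
```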
This brings me to the next definition, which is at a more elementary level than a general event A. Let Ω be a finite sample space (remember, we are talking about the finite probability space). Then for each ω ∈ Ω, that is, each element of the sample space, the singleton event {ω} is called an elementary event. Accordingly, once we have defined what an elementary event is, we can assign a probability (remember, an elementary event is also an event), which we will denote by p_ω, to each elementary event {ω}, such that, obviously, it satisfies the property of lying between 0 and 1, both inclusive, and the p_ω sum to 1, where the sum runs over all ω belonging to the sample space Ω.
So, once we have both these things set up, we can define the probability measure P, which we have already introduced along with its three properties, in terms of elementary events: the probability of the elementary event {ω} is the p_ω we have already introduced, that is, P({ω}) = p_ω. Further, the probability of a generic event A is the sum of the probabilities of all the elementary events in A, and is accordingly given by P(A) = sum of p_ω over all ω that are members of the event A.
This brings me naturally to two more definitions. First, now that I have defined these probabilities for each elementary event, the set {p_ω : ω ∈ Ω} is called a probability distribution; please do not confuse this with the distribution function that we will discuss in the later part of this lecture. The other definition: once I have all the p_ω's, I am in a position to define what is known as the probability mass function. The real-valued function f: Ω → R defined as f(ω) = p_ω (it is defined on the sample space) is called the probability mass function.
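A small illustrative sketch (the loaded-die weights below are assumptions, not from the lecture) showing how the probability distribution {p_ω} and the pmf f determine P(A) for any event A:

```python
# Hypothetical loaded die: the probability distribution {p_w : w in omega}.
p = {1: 0.1, 2: 0.1, 3: 0.1, 4: 0.1, 5: 0.1, 6: 0.5}
assert abs(sum(p.values()) - 1) < 1e-12   # the p_w sum to 1 over omega

def f(w):
    """Probability mass function: f(w) = p_w."""
    return p[w]

def prob(A):
    """P(A) = sum of p_w over all w in the event A."""
    return sum(f(w) for w in A)

print(prob({6}))         # elementary event {6}: 0.5
print(prob({2, 4, 6}))   # the event "even outcome": 0.7
```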
Continuing with our definitions, let us move on to the notion of independence of events. The collection of events {A1, A2, ..., An} is said to be independent if any subcollection of these n events, which I will denote as {A_s1, A_s2, ..., A_sk} (including, of course, the possibility of the collection of all n events itself), satisfies the following property: P(A_s1 ∩ A_s2 ∩ ... ∩ A_sk) is simply the product P(A_s1) · P(A_s2) · ... · P(A_sk), alright.
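Note that the product rule must hold for every subcollection, not just the full collection. A hedged sketch, assuming three hypothetical fair coin tosses (my example), that checks all subcollections of size two or more:

```python
from itertools import combinations, product

# Sample space for three hypothetical fair coin tosses; each outcome weighs 1/8.
omega = set(product("HT", repeat=3))
prob = lambda A: len(A) / len(omega)

# A_i = "toss i came up heads"
events = [{w for w in omega if w[i] == "H"} for i in range(3)]

# Independence demands P(A_s1 ∩ ... ∩ A_sk) = P(A_s1)···P(A_sk)
# for every subcollection of size k >= 2.
for k in range(2, len(events) + 1):
    for sub in combinations(events, k):
        lhs = prob(set.intersection(*sub))
        rhs = 1.0
        for A in sub:
            rhs *= prob(A)
        assert abs(lhs - rhs) < 1e-12
print("all subcollections satisfy the product rule")
```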
So, once we are done with this definition of independence, we move on to the definition of what is known as a random variable. This is a very important component in the context of the course. One of the main random variables that we will look at during the course is the random variable representing the return of an asset, which in turn drives the notion of the expected return of an asset and the risk of an asset; this will then be extended to the expected return of a portfolio and the risk of a portfolio, with the return of each asset over several time intervals being treated as a random variable. So, accordingly, we need to give a great amount of importance to the definition of the random variable, both in the case of the finite discrete probability space and in the case of a general probability space.
So, accordingly, we will now start off with the notion of a random variable. A random variable is essentially a real-valued function, and typically we denote a random variable by X. A real-valued function X: Ω → R, that is, one defined on a finite sample space Ω, is called a random variable on Ω. Accordingly, let X be a random variable on a finite probability space, which we denote by (Ω, P), the sample space and the probability measure, and let the finite sample space Ω have n elements, ω1 through ωn, with X taking a finite number of possible values, say x1 through xm, which we collect in a set 𝒜 = {x1, ..., xm}. Remember that X is a function from Ω to R, so for each of ω1 through ωn the random variable X takes some value; that means the random variable X can take only a finite number of values, and here we are looking at a setting where it takes m values, designated x1 through xm. That means each of x1 through xm is equal to X(ωj) for at least one of the ωj's.
So, once we have set up this set 𝒜 of values the random variable can take, then for each of the xi's, that is, x1 through xm, we have an event (why I am calling it an event will become clear soon). The event {X = xi} is nothing but the collection of all those ωj such that X(ωj) = xi; that means all those ωj's in the sample space at which X takes the value xi are bundled together and represented, in curly brackets, as the event {X = xi}. Similarly, the event {X ≤ xi} is the collection of all those ωj's such that X(ωj) ≤ xi, alright.
To wind this up: further, the random variable X describes a probability measure (so I am bringing the probability measure into the picture again), which we denote P_X, on the set 𝒜 (remember, 𝒜 is now working like some sort of sample space), given by P_X(xi) = P(X = xi); this is called the probability measure defined by the random variable X. Accordingly, the function f: 𝒜 → R, just like the f we had defined earlier as a probability mass function, defined by f(xi) = P(X = xi), is called, just like before, the probability mass function of the random variable X.
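A brief illustrative sketch (the two-toss space and the "number of heads" variable are my assumptions, not from the lecture) of a random variable on a finite space, the events {X = xi}, and the induced pmf:

```python
from itertools import product

# Hypothetical space: two fair coin tosses, each outcome with weight 1/4.
omega = list(product("HT", repeat=2))
p = {w: 0.25 for w in omega}

def X(w):
    """Random variable: the number of heads in the outcome w."""
    return sum(1 for toss in w if toss == "H")

values = sorted({X(w) for w in omega})     # the value set {x1, ..., xm}

def event(xi):
    """The event {X = xi}: all outcomes w with X(w) = xi."""
    return [w for w in omega if X(w) == xi]

def f(xi):
    """pmf of X: f(xi) = P(X = xi), summing p_w over the event {X = xi}."""
    return sum(p[w] for w in event(xi))

for xi in values:
    print(xi, event(xi), f(xi))
# 0 [('T', 'T')] 0.25
# 1 [('H', 'T'), ('T', 'H')] 0.5
# 2 [('H', 'H')] 0.25
```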
So, accordingly, now that we have this probability measure and the probability mass function for X, we naturally have to talk about independence. The collection of random variables X1, ..., Xn, some n number of random variables, is said to be independent if P(X1 = x1, ..., Xn = xn) is simply the product of P(Xi = xi), i = 1 to n, alright.
So, what we have done so far is: we have looked at what a finite probability space is, we have looked at what a probability measure is, and we have talked about random variables, as well as independence of events and independence of random variables. Now we need to move on from a finite discrete probability space to a general probability space, to get a more general picture, with particular emphasis on a continuous probability space. Accordingly, we now start on the general theory of probability, and we will move on to two things: the infinite discrete probability space and the continuous probability space. So, we begin with a definition.
As before, let Ω be a non-empty set. Now, in the previous case we had talked about a probability space only in terms of the sample space Ω and the probability measure P. However, we now need an additional component here, and accordingly we introduce what is known as a sigma algebra. A non-empty collection, which we denote by Σ, of subsets of the sample space Ω is a sigma algebra if it satisfies the following properties. The first property is that Ω belongs to Σ. The second property is closure under countable union: if A1, A2, ... is a sequence of elements of Σ, then the union of all these Ai's, i = 1 to ∞, also belongs to Σ. And the third property is closure under complement, which says that if A belongs to Σ, then the complement of A also belongs to Σ, alright.
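A minimal sketch, assuming a small hand-picked collection of subsets of a four-point Ω (my example; on a finite Ω, countable unions reduce to finite ones), that checks the three sigma-algebra properties:

```python
omega = frozenset({1, 2, 3, 4})
# A candidate sigma algebra on omega: coarser than the full power set.
sigma = {frozenset(), frozenset({1, 2}), frozenset({3, 4}), omega}

# Property 1: omega itself belongs to sigma.
assert omega in sigma

# Property 3: closure under complement.
assert all(omega - A in sigma for A in sigma)

# Property 2: closure under union (finite here, since omega is finite).
assert all(A | B in sigma for A in sigma for B in sigma)
print("sigma is a sigma algebra on omega")
```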
This brings us to the definition of a measurable space. We will first define what a measurable space is, and then we will define what a probability space is, just as we did in the finite discrete case. So, the first definition, that of a measurable space: a measurable space is the pair (Ω, Σ), where Ω is a non-empty set and Σ is a sigma algebra of subsets of Ω, as just defined. This takes care of what a measurable space is, and now we are in a position to talk about what a probability space is. A probability space is a triple; recollect that the earlier probability space had only Ω and P, but now we have (Ω, Σ, P), with the non-empty set Ω being the sample space, Σ being a sigma algebra of subsets of Ω whose elements are called events, and P being a real-valued function defined on Σ, called a probability measure.
So, basically, the probability space is the triple (Ω, Σ, P), where Ω is the sample space, Σ is a sigma algebra and P is a probability measure. Now that we have introduced the probability measure, we need to talk about properties similar to those we had in the case of the finite probability space. So, what are going to be the properties of P? The probability measure P, as defined above in the case of a general probability space, satisfies the following properties. The first property is the range: as before, for every event A in Σ, 0 ≤ P(A) ≤ 1, that is, the probability lies between 0 and 1, both inclusive. Second, P(Ω) = 1. These two properties are the same as before. However, for the last one, since we can now have an infinite sequence: if A1, A2, ... is a sequence of pairwise disjoint events, then the probability of the union of the Ai, i = 1 to ∞ (which was earlier i = 1 to n), is the summation of P(Ai), i = 1 to ∞.
So, basically, we can now say that a probability space is given as (Ω, Σ, P), where Ω is the non-empty sample space, Σ is a sigma algebra whose three properties have been enumerated, and P is a probability measure, also enumerated in terms of its three properties, okay.
Let us now come to the topic of the distribution function. We will first of all begin with the motivation for why one must make use of the distribution function, and then we will move on to the definition of the distribution function in the paradigm of a continuous probability space. So, to begin with the motivation: what happens is that in the case of a finite or discrete probability space, the probability measure is typically described using the probability mass function (pmf), which, you would recall, was f(ω) = P({ω}); this approach is not easily extendable to continuous probability spaces. It is for this reason that we introduce the concept of the probability distribution function.
This naturally brings us to the definition. A real-valued function F: R → R is called a probability distribution function if it satisfies the following three properties. The first property is that F is non-decreasing; this means that if s < t, then F(s) ≤ F(t). The second property is that F is what is known as right-continuous; that is, the limit of F(t) as t tends to a from the right is F(a). And the last property is that F satisfies: the limit of F(t) as t tends to −∞ is 0, and the limit of F(t) as t tends to +∞ is 1.
Based on this definition, with these three properties of being non-decreasing, right-continuous and having the stated limits at −∞ and +∞, we are now in a position to state two results. The first result states the following: let P be a probability measure on R. Then the function F_P: R → R (the subscript P identifies it with the probability measure), defined as F_P(t) = P((−∞, t]), qualifies as a probability distribution function and is called the distribution function of P. The second result that I want to state is the following: let F: R → R be a distribution function as defined above; then there exists a unique probability measure P_F on R whose distribution function is F. That is, if you are given a distribution function F, then there exists a unique probability measure whose distribution function is F; in other words, given F(t), you can find a probability measure P_F such that P_F((−∞, t]) = F(t).
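As a concrete instance of both results (my example, not the lecturer's): the rate-1 exponential distribution function F(t) = 1 − e^(−t) for t ≥ 0, and 0 otherwise, satisfies the three properties, and its measure assigns P((a, b]) = F(b) − F(a):

```python
import math

def F(t):
    """Distribution function of a hypothetical rate-1 exponential law."""
    return 1.0 - math.exp(-t) if t >= 0 else 0.0

# Non-decreasing on a grid of test points.
ts = [-2.0, -0.5, 0.0, 0.3, 1.0, 2.5, 10.0]
assert all(F(s) <= F(t) for s, t in zip(ts, ts[1:]))

# Limits: F(t) -> 0 as t -> -infinity and F(t) -> 1 as t -> +infinity.
assert F(-1e9) == 0.0 and abs(F(1e9) - 1.0) < 1e-12

# The measure P_F of an interval (a, b] is F(b) - F(a).
a, b = 1.0, 2.0
print(F(b) - F(a))   # P((1, 2]) = e^{-1} - e^{-2} ≈ 0.2325
```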
The next thing is that we can move on to the concept of density functions. The definition is the following. A probability measure P, or equivalently, in light of results 1 and 2, a distribution function F_P, is absolutely continuous if it has a density function f: R → R which is non-negative (to conform with the non-negativity of probability), such that F_P(t) = integral from −∞ to t of f(x) dx, and accordingly the probability of the interval [a, b] is going to be the integral from a to b of f(x) dx, alright.
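A small numerical check (mine; a simple midpoint rule and the rate-1 exponential density are assumptions) that integrating a density recovers the distribution function and interval probabilities:

```python
import math

def f(x):
    """Hypothetical density: rate-1 exponential, f(x) = e^{-x} for x >= 0."""
    return math.exp(-x) if x >= 0 else 0.0

def integrate(g, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return h * sum(g(a + (i + 0.5) * h) for i in range(n))

# F_P(t) = integral of f from -infinity to t (lower limit 0 suffices here).
t = 2.0
print(integrate(f, 0.0, t))     # ≈ F(2) = 1 - e^{-2} ≈ 0.8647

# P([a, b]) = integral of f from a to b.
print(integrate(f, 1.0, 2.0))   # ≈ e^{-1} - e^{-2} ≈ 0.2325
```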
Now that we have defined the distribution function and the probability density function, we are in a position to start talking about what a random variable is in the context of a continuous distribution. Accordingly, we revisit random variables in this setup. Let (Ω, Σ) be a measurable space. A function X: Ω → R, that is, a real-valued function defined on the sample space Ω, is said to be Σ-measurable if the inverse image of every open interval is in Σ; remember that Σ is a collection of subsets of Ω. So, what I am saying is that a function X from Ω to R is Σ-measurable if, for any open interval in R, its inverse image belongs to Σ; in other words, X⁻¹((a, b)) belongs to Σ for every open interval (a, b), okay. A measurable function on (Ω, Σ) is also called a random variable; that is our definition of a random variable in the continuous setup. Further, consider the probability space (Ω, Σ, P), with X being a random variable on the measurable space (Ω, Σ). Then this random variable X defines a distribution function, which we denote by F_X, and a corresponding probability measure P_X on R, just as we had done in the finite case, given by F_X(t) = P_X((−∞, t]), which is the same as the probability that X is less than or equal to t.
This brings us to the definition of independence of random variables, in this case in the continuous setting. The collection of random variables X1 through Xn defined on the measurable space (Ω, Σ) is said to be independent if the probability of X1 ≤ t1, ..., Xn ≤ tn is nothing but the product of the probabilities P(Xi ≤ ti), i = 1 to n.
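A Monte Carlo sketch (illustrative only; the standard normal distribution function is computed via math.erf, and the sample size is arbitrary) of the product condition for two independently sampled random variables:

```python
import math
import random

def Phi(t):
    """Standard normal distribution function via the error function."""
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

random.seed(0)
N = 200_000
t1, t2 = 0.5, -0.3

# Sample X1, X2 independently; estimate P(X1 <= t1, X2 <= t2).
hits = 0
for _ in range(N):
    x1, x2 = random.gauss(0, 1), random.gauss(0, 1)   # independent draws
    if x1 <= t1 and x2 <= t2:
        hits += 1
joint = hits / N

# Independence predicts the product of the marginal probabilities.
print(joint, Phi(t1) * Phi(t2))   # the two numbers should nearly agree
```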
So, just to sum up what we have discussed today: we talked about probability spaces, starting with the finite discrete probability space and then extending to the general probability space; we talked about the probability mass function, the probability density function and the distribution function; and we discussed the definition of random variables in both setups, along with the definition of independence in both the discrete and the continuous settings. In the next class, we will talk about moments in the probability space framework, namely the first two moments, the expectation and the variance, and we will talk about covariance and correlation coefficients and discuss some of the other properties pertaining to them. Thank you for watching.