Probability (statistics square one)
I put all the blame on my procrastination! It has been so long since I first wanted to learn what the Kalman filter is and how to use it, but all the fun I had with my games kept me from doing so. To cut it short, I did some browsing on the net and found that the Kalman filter is actually based on Bayesian inference, which is a statistical method. Knowing that, I was both happy and sad. Happy, because now I know one of the uses of the statistics I learned years ago as an undergraduate. Sad, because I have already forgotten all of it! So, in this post I will share what I learned about statistics, from square one. I hope you will get something from it.
In most of the statistics books I glanced through, probability is always one of the first topics, so I will start by learning what it actually is. Let $x$ be the set of possible outcomes, and let $F$ be a collection of subsets of $x$. A probability on $(x, F)$ is a map $\mu : F\rightarrow [0, 1]$. In other words, $\mu$ assigns to every set in $F$ a probability between 0 and 1. $\mu$ is called a set function because its domain is a collection of sets. To be a probability, $\mu$ must satisfy:
- $\mu (\varnothing )=0$, where $\varnothing$ is the empty set,
- $\mu (x)=1$,
- if $A_1$ and $A_2$ are disjoint, then $\mu (A_1 \cup A_2) = \mu(A_1) + \mu(A_2)$.
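These three axioms can be checked concretely. Here is a minimal sketch in Python, assuming a fair six-sided die as the example, with $\mu$ stored as a dictionary of exact fractions (the names `outcomes`, `mu`, and `prob` are my own, not standard notation):

```python
from fractions import Fraction

# Sample space x: the faces of a six-sided die (assumed example).
outcomes = {1, 2, 3, 4, 5, 6}

# A fair die assigns probability 1/6 to each single outcome.
mu = {i: Fraction(1, 6) for i in outcomes}

def prob(event):
    """Probability of an event, i.e. a subset of the outcomes."""
    return sum(mu[i] for i in event)

# Axiom 1: the empty set has probability 0.
assert prob(set()) == 0
# Axiom 2: the whole sample space has probability 1.
assert prob(outcomes) == 1
# Axiom 3: additivity for disjoint events.
A1, A2 = {1, 2}, {5}
assert prob(A1 | A2) == prob(A1) + prob(A2)

print(prob({1}))  # the probability that the die lands on 1
```

Using `Fraction` instead of floats keeps the arithmetic exact, so the axioms hold with `==` rather than only approximately.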
As an example, take a six-sided die, so $x = \{1, 2, 3, 4, 5, 6\}$. It is important to remember that setting $\mu (i) = 1/6$ for each face $i$ is not simply because the die has six faces. We set $\mu (i) = 1/6$ because we believe the die to be fair. It seems that in probability there are a lot of things to consider.
The following phrases will be used interchangeably when we talk about probability:
- The probability that the die lands on $1$,
- can be written as $P(1)$,
- can be written as $P[\text{the die lands on } 1]$,
- can be written as $\mu (\{1\})$,
- can be written as $\mu (1)$.
We also use the word distribution in place of probability or probability measure.
The next article will discuss the density of probability. But before that, I need to refresh my memory on basic mathematical notation, since most of the references I read use tons of it.