[7/50] #Probability and Statistics# A First Course in Probability

Recently, I finished the book A First Course in Probability. As its name suggests, the book is fairly easy, but it does a very good job of motivating the theory and concepts through examples. Here I want to summarize some of the parts I found most interesting.

[Image: A First Course in Probability]

PART A: Some techniques in conditioning

1. Bayes's formula

We can use Bayes's formula to relate an initial condition (a hypothesis) to the observed result of a trial: given the outcome, we can compute the probability of each possible initial condition. Breaking a probability into conditional pieces in this way is reminiscent of dynamic programming.
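A minimal sketch of Bayes's formula applied to a positive test result (the prevalence, sensitivity, and false-positive numbers below are made up for illustration):

```python
def bayes_posterior(prior, p_pos_given_h, p_pos_given_not_h):
    """P(H | positive) via Bayes's formula and the law of total probability."""
    p_pos = p_pos_given_h * prior + p_pos_given_not_h * (1 - prior)
    return p_pos_given_h * prior / p_pos

# Hypothetical numbers: 1% prevalence, 95% sensitivity, 2% false-positive rate
print(bayes_posterior(prior=0.01, p_pos_given_h=0.95, p_pos_given_not_h=0.02))
# ≈ 0.32: even after a positive result, the hypothesis is still more likely false
```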

2. Conditional probability as a probability

As we know, probability is a kind of measure. In abstract measure theory, conditional probability can also be regarded as a measure, and more importantly, it is itself a probability measure. Therefore, we can condition again inside a conditional probability.

This technique can be used to determine the probability of an event that becomes clear once we condition on two or more other events.
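For example, since $P(\,\cdot \mid G)$ is itself a probability measure, the law of total probability can be applied inside it:

$$
P(E \mid G) = P(E \mid F \cap G)\,P(F \mid G) + P(E \mid F^{c} \cap G)\,P(F^{c} \mid G).
$$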

PART B: Interpretation and relationships between some common random variables

I think this is the most attractive part of the book.

We start with the Bernoulli random variable: the sum of n independent Bernoulli trials gives the binomial random variable. The Poisson random variable is a good approximation to the binomial when n is large and p is small such that np is of moderate size; a Poisson random variable can be interpreted as the number of events occurring in an interval of length t. If we consider the amount of time one has to wait until a total of n events have occurred, that waiting time follows a gamma distribution (in particular, the wait until just one event is exponential). Moreover, by the central limit theorem (De Moivre-Laplace), a suitably standardized binomial random variable converges to a normal random variable.
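A minimal sketch of the Poisson approximation to the binomial (the values of n, p, and the range of k are arbitrary choices for illustration):

```python
from math import comb, exp, factorial

def binom_pmf(k, n, p):
    # P(X = k) for X ~ Binomial(n, p)
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

def poisson_pmf(k, lam):
    # P(Y = k) for Y ~ Poisson(lam)
    return exp(-lam) * lam ** k / factorial(k)

n, p = 1000, 0.005          # n large, p small, np = 5 of moderate size
for k in range(10):
    print(k, round(binom_pmf(k, n, p), 5), round(poisson_pmf(k, n * p), 5))
# the two columns should agree closely
```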

PART C: Some techniques of conditional expectation

1. Computing expectation by conditioning

                                                E[ X ] = E[ E[ X|Y ] ]
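
A minimal simulation sketch (the coin/die setup is just an illustrative choice): let N be the number of heads in 3 fair coin flips and, given N, let X be the sum of N fair die rolls. Then E[X | N] = 3.5 N, so E[X] = E[E[X | N]] = 3.5 * 1.5 = 5.25.

```python
import random

random.seed(0)

def sample_x():
    n = sum(random.random() < 0.5 for _ in range(3))     # N ~ Binomial(3, 1/2)
    return sum(random.randint(1, 6) for _ in range(n))   # X | N = sum of N die rolls

trials = 200_000
print(sum(sample_x() for _ in range(trials)) / trials)   # ≈ 5.25 = E[E[X | N]]
```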

2. Computing probability by conditioning

                         P( E ) = E[ indicator of E ] = E[ P( E | Y ) ]
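
A minimal sketch, conditioning on which coin Y is chosen (the two biases are hypothetical):

```python
# Coin 0 lands heads with probability 0.5, coin 1 with probability 0.9;
# Y picks one of the two coins uniformly at random.
p_y = {0: 0.5, 1: 0.5}
p_heads_given_y = {0: 0.5, 1: 0.9}

# P(heads) = E[ P(heads | Y) ]
p_heads = sum(p_heads_given_y[y] * p_y[y] for y in p_y)
print(p_heads)   # 0.7
```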

PART D: Deriving specific definitions and distributions under reasonable assumptions

1. Poisson processes

Investigating the Poisson process by deriving the distributions of its inter-arrival times (exponential) and waiting times (gamma).
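A minimal simulation sketch (rate and window length are arbitrary): building the process from exponential inter-arrival times, the number of events in a window of length t should be Poisson with mean rate * t.

```python
import random

random.seed(1)
rate, horizon = 2.0, 10.0               # events per unit time, length of the window

def count_events():
    t, n = 0.0, 0
    while True:
        t += random.expovariate(rate)   # exponential inter-arrival time
        if t > horizon:
            return n
        n += 1

counts = [count_events() for _ in range(50_000)]
print(sum(counts) / len(counts))        # ≈ rate * horizon = 20, the Poisson mean
```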

2. Surprise (Entropy)

Deriving the formula that describes the surprise (information) carried by an event from a small set of reasonable axioms, in the same spirit as the axiomatic derivation of the Shapley value from its principles.
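The resulting formulas, with logarithm base 2 so that surprise is measured in bits:

$$
S(E) = -\log_2 P(E), \qquad H(X) = -\sum_i p_i \log_2 p_i .
$$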

3. Statistics model for designing the physical experiment

Designing a physical experiment to estimate a half-life, using the theory of probability and statistics.
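A minimal sketch under simplifying assumptions (exponential lifetimes, every decay observed, and a made-up true half-life): estimate the decay rate by maximum likelihood and convert it to a half-life.

```python
import random
from math import log

random.seed(2)
true_half_life = 5.0                  # hypothetical value, arbitrary time units
lam = log(2) / true_half_life         # decay rate of the exponential lifetime

# Observe many independent decay times; the MLE of lam is 1 / (sample mean)
lifetimes = [random.expovariate(lam) for _ in range(50_000)]
lam_hat = 1 / (sum(lifetimes) / len(lifetimes))
print(log(2) / lam_hat)               # estimated half-life, close to 5.0
```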


