Probability Crash Course 04: Probability Axioms, Total Probability, Bayes' Theorem, Event Independence

Introduction:

The Law of Total Probability


[Figure: a partition {B_1, B_2, ..., B_n} of the sample space, with an event A overlapping each piece]

The law of total probability states that if \left\{{B_n : n = 1, 2, 3, \ldots}\right\} is a finite or countably infinite partition of a sample space (in other words, a set of pairwise disjoint events whose union is the entire sample space) and each event B_n is measurable, then for any event A of the same probability space:

             \Pr(A)=\sum_n \Pr(A\mid B_n)\,\Pr(B_n)

Example:

Two factories, 甲 and 乙, both produce a certain model of lathe, with defect rates of 20% and 5%, respectively. Factory 甲 produces twice as many lathes per month as factory 乙. One lathe is drawn at random from a month's combined output; what is the probability that it is qualified (non-defective)?

Let A denote "the product is qualified" and B denote "the product comes from factory 甲".

Then P(B) = 2/3, P(B^c) = 1/3, P(A\mid B) = 0.8, and P(A\mid B^c) = 0.95, so by the law of total probability:

\Pr(A) = \Pr(A\mid B)\Pr(B) + \Pr(A\mid B^c)\Pr(B^c) = 0.8 \times \tfrac{2}{3} + 0.95 \times \tfrac{1}{3} = 0.85
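As a quick sketch, the computation above can be checked with a few lines of Python (the 2:1 production split and the defect rates come from the example; the function name is ours):

```python
# Law of total probability applied to the factory example above.

def total_probability(priors, conditionals):
    """P(A) = sum_n P(A|B_n) * P(B_n) over a partition {B_n}."""
    assert abs(sum(priors) - 1.0) < 1e-9, "priors must form a partition"
    return sum(p * c for p, c in zip(priors, conditionals))

# B = "from factory 甲", B^c = "from factory 乙"
priors = [2/3, 1/3]           # 甲 produces twice as much as 乙
conditionals = [0.80, 0.95]   # P(qualified | factory) = 1 - defect rate

p_qualified = total_probability(priors, conditionals)
print(round(p_qualified, 2))  # 0.85
```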

 

Bayes' Theorem


Often the problem is given or conceptualized in terms of P(B_j) and P(A\mid B_j) for some partition \{B_j\} of the sample space. Computing P(A) with the law of total probability then yields the extended form of Bayes' theorem:

                               P(B_j\mid A) = \frac{P(A\mid B_j)\,P(B_j)}{\sum_k P(A\mid B_k)\,P(B_k)}

 

Example:

An entomologist spots what might be a rare subspecies of beetle, due to the pattern on its back. In the rare subspecies, 98% have the pattern, i.e. P(Pattern|Rare) = 98%. In the common subspecies, 5% have the pattern. The rare subspecies accounts for only 0.1% of the population. How likely is a beetle with this pattern to belong to the rare subspecies? That is, what is P(Rare|Pattern)?

From the extended form of Bayes' theorem (since any beetle can be only rare or common),

\begin{align}
P(\text{Rare}\mid\text{Pattern}) &= \frac{P(\text{Pattern}\mid\text{Rare})\,P(\text{Rare})}{P(\text{Pattern}\mid\text{Rare})\,P(\text{Rare}) + P(\text{Pattern}\mid\text{Common})\,P(\text{Common})} \\[8pt]
&= \frac{0.98 \times 0.001}{0.98 \times 0.001 + 0.05 \times 0.999} \\[8pt]
&\approx 1.9\%.
\end{align}
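The same posterior can be sketched in Python (the helper name is ours; the numbers are from the example):

```python
# Extended form of Bayes' theorem for a two-way partition {Rare, Common}.

def bayes_posterior(prior, likelihood, alt_likelihood):
    """P(Rare|Pattern) given P(Rare), P(Pattern|Rare), P(Pattern|Common)."""
    evidence = likelihood * prior + alt_likelihood * (1 - prior)  # P(Pattern)
    return likelihood * prior / evidence

p = bayes_posterior(prior=0.001, likelihood=0.98, alt_likelihood=0.05)
print(round(p, 3))  # 0.019, i.e. about 1.9%
```

Note how the tiny prior (0.1%) dominates: even with a 98% likelihood, the posterior stays under 2%.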

 


 

Independence


Two events

Two events A and B are independent if and only if their joint probability equals the product of their probabilities:

\mathrm{P}(A \cap B) = \mathrm{P}(A)\mathrm{P}(B).

Why this defines independence becomes clear by rewriting it with conditional probabilities (assuming \mathrm{P}(B) > 0):

\begin{align}
\mathrm{P}(A \cap B) = \mathrm{P}(A)\mathrm{P}(B) &\Leftrightarrow \mathrm{P}(A) = \frac{\mathrm{P}(A \cap B)}{\mathrm{P}(B)} \\
&\Leftrightarrow \mathrm{P}(A) = \mathrm{P}(A\mid B)
\end{align}

Three events

Three events A, B, and C are mutually independent if every pair is independent and, in addition, the triple-product condition holds:

           \mathrm{P}(A \cap B \cap C) = \mathrm{P}(A)\,\mathrm{P}(B)\,\mathrm{P}(C)

Pairwise independence alone is not enough.
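The gap between pairwise and mutual independence can be seen in the classic two-coin counterexample, sketched here with exact fractions (the event names A, B, C are ours):

```python
from fractions import Fraction
from itertools import product

# Sample space: two fair coin tosses, four equally likely outcomes.
omega = list(product("HT", repeat=2))

def prob(event):
    """Probability of an event under the uniform measure on omega."""
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

def A(w): return w[0] == "H"      # first toss is heads
def B(w): return w[1] == "H"      # second toss is heads
def C(w): return w[0] == w[1]     # the two tosses agree

# Every pair satisfies P(X ∩ Y) = P(X) P(Y):
for X, Y in [(A, B), (A, C), (B, C)]:
    assert prob(lambda w: X(w) and Y(w)) == prob(X) * prob(Y)

# ...but the triple-product condition fails:
print(prob(lambda w: A(w) and B(w) and C(w)))  # 1/4
print(prob(A) * prob(B) * prob(C))             # 1/8
```

So A, B, C are pairwise independent but not mutually independent.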

 

Sometimes it is easier to reason about the complement of an event than the event itself. Independence is preserved under complementation, and the complement rule \mathrm{P}(A^c) = 1 - \mathrm{P}(A) often untangles an otherwise messy computation.
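A typical use: for independent events, "at least one occurs" is the complement of "none occurs", so P(A_1 ∪ … ∪ A_n) = 1 − ∏(1 − p_i). A minimal sketch, with made-up numbers and a function name of our choosing:

```python
# Complement rule + independence: P(at least one) = 1 - P(none).

def prob_at_least_one(probs):
    """P(A_1 ∪ ... ∪ A_n) for independent events A_i with P(A_i) = probs[i]."""
    none = 1.0
    for p in probs:
        none *= (1 - p)   # independence: complements multiply
    return 1 - none

# e.g. three independent detectors, each catching a flaw with prob 0.6
print(round(prob_at_least_one([0.6, 0.6, 0.6]), 3))  # 0.936
```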
