
1 Bayesian Network Dr. Yodthong Rodkaew

2 Motivating example. Nodes: Gas, Start (the car starts), Battery. Query: P(Bad Battery | Has Gas, Won't Start)

3 Introduction
Bayesian networks are directed acyclic graphs (DAGs) with an associated set of probability tables.
–Deal with uncertainty in inference via probability (Bayes' rule).
–Handle incomplete data in tasks such as classification and regression.

4 Introduction
Independence assumptions
–Concern the probability of several events occurring that do not depend on one another.
Naïve Bayes method (naïve = simple-minded)
–Makes independence assumptions that may not always hold in practice.
Bayesian Network
–Models the independence relations in the data explicitly.
–Uses those independence relationships to compute the probabilities of related events.
–Also known as: Belief Net, Bayes Net, Causal Net, …

5 Introduction
Intuitive language
–A model can be built from common knowledge.
–Experts in various domains can easily draw the diagrams.
General-purpose "inference" algorithms for reasoning about related events
–e.g. P(Rain | Traffic Jam, Arrived Late), with nodes Traffic Jam, Arrived Late, Rain

6 Introduction
Example: Alarm
The story: in LA, burglary and earthquake are not uncommon, and both can cause the alarm to go off. In case of an alarm, two neighbors, John and Mary, may call.
Problem: estimate the probability of a burglary based on who has or has not called.
Variables: Burglary (B), Earthquake (E), Alarm (A), JohnCalls (J), MaryCalls (M)
Knowledge required to solve the problem: P(B, E, A, J, M)

7 (figure-only slide)

8 Introduction
What is the probability of a burglary given that Mary called, P(B = y | M = y)?
Compute the marginal probability: P(B, M) = Σ_{E, A, J} P(B, E, A, J, M)
Then use the definition of conditional probability: P(B | M) = P(B, M) / P(M)
Answer: P(B = y | M = y) ≈ 0.056 (using the CPTs on slide 10)

9 Bayesian Networks
Directed Acyclic Graph (DAG)
–Nodes are the variables
–Edges represent direct dependencies between them
Network: Burglary, Earthquake → Alarm → JohnCalls, MaryCalls

10 Conditional Probability Tables
Each node has a conditional probability table (CPT) that gives the probability of each of its values given every possible combination of values for its parents (the conditioning case).
–Roots (sources) of the DAG, which have no parents, are given prior probabilities.
Network: Burglary, Earthquake → Alarm → JohnCalls, MaryCalls
P(B) = .001    P(E) = .002
P(A | B, E):  B=T,E=T: .95   B=T,E=F: .94   B=F,E=T: .29   B=F,E=F: .001
P(J | A):  A=T: .90   A=F: .05
P(M | A):  A=T: .70   A=F: .01

11 CPT Comments
Probability of false is not given, since each row must add to 1.
The example requires 10 parameters rather than 2^5 − 1 = 31 for specifying the full joint distribution.
The number of parameters in the CPT for a node is exponential in the number of parents (fan-in).
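As a quick arithmetic check, the parameter counts above can be reproduced in a few lines of Python. The fan-ins are taken from the alarm network on slide 10; the dictionary layout is just one possible illustration:

```python
# Compare the number of free parameters in the alarm network's CPTs
# with the size of a full joint distribution over 5 binary variables.
parents = {"B": 0, "E": 0, "A": 2, "J": 1, "M": 1}  # fan-in of each node

# A node with k binary parents needs 2**k conditioning cases,
# one free parameter (P of "true") each.
cpt_params = sum(2 ** k for k in parents.values())
full_joint_params = 2 ** len(parents) - 1

print(cpt_params)         # 10
print(full_joint_params)  # 31
```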

12 Joint Distributions for Bayes Nets
A Bayesian network implicitly defines a joint distribution: each variable is conditioned only on its parents. Example (for the alarm network):
P(B, E, A, J, M) = P(B) P(E) P(A | B, E) P(J | A) P(M | A)
Therefore an inefficient approach to inference is:
–1) Compute the joint distribution using this equation.
–2) Compute any desired conditional probability from the joint distribution.
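The inefficient two-step approach can be sketched in Python. This is a minimal illustration, not the course's code: the CPT values are the ones from slide 10, and inference is done by brute-force enumeration of the joint distribution:

```python
import itertools

# CPTs from slide 10 (probabilities of "true"; false is the complement).
P_B = 0.001
P_E = 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}   # P(J = true | A)
P_M = {True: 0.70, False: 0.01}   # P(M = true | A)

def p(val, p_true):
    """Probability of `val` given the probability of true."""
    return p_true if val else 1.0 - p_true

def joint(b, e, a, j, m):
    # P(B, E, A, J, M) = P(B) P(E) P(A | B, E) P(J | A) P(M | A)
    return (p(b, P_B) * p(e, P_E) * p(a, P_A[(b, e)])
            * p(j, P_J[a]) * p(m, P_M[a]))

def prob(b, m):
    """P(B = b, M = m): sum the joint over the unobserved E, A, J."""
    return sum(joint(b, e, a, j, m)
               for e, a, j in itertools.product([True, False], repeat=3))

p_b_given_m = prob(True, True) / (prob(True, True) + prob(False, True))
print(round(p_b_given_m, 4))  # 0.0561
```

This enumerates all 2^5 joint entries, which is exactly why the approach does not scale; practical systems use algorithms such as variable elimination instead.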

13 Independence of Variables
An instantiation of a variable is an event. A set of variables is independent iff all possible instantiations of the variables are independent.
Example: X: patient blood pressure {high, medium, low}; Y: patient sneezes {yes, no}
P(X=high, Y=yes) = P(X=high) × P(Y=yes)
P(X=high, Y=no) = P(X=high) × P(Y=no)
...
P(X=low, Y=yes) = P(X=low) × P(Y=yes)
P(X=low, Y=no) = P(X=low) × P(Y=no)
Conditional independence between sets of variables holds iff the conditional independence between all possible instantiations of the variables holds.

14 Example
Use a DAG to model the causality.
Nodes: Train Strike, Martin Oversleep, Norman Oversleep, Boss Failure-in-Love, Martin Late, Norman Late, Office Dirty, Project Delay, Boss Angry

15 Example
Attach prior probabilities to all root nodes.
Train Strike: P(T) = 0.1, P(F) = 0.9
Martin Oversleep: P(T) = 0.01, P(F) = 0.99
Norman Oversleep: P(T) = 0.2, P(F) = 0.8
Boss Failure-in-Love: P(T) = 0.01, P(F) = 0.99

16 Example
Attach conditional probability tables to the non-root nodes.
P(Norman Untidy | Norman Oversleep):
  oversleep=T: Untidy=T 0.6, Untidy=F 0.4
  oversleep=F: Untidy=T 0.2, Untidy=F 0.8
P(Martin Late | Train Strike, Martin Oversleep):
  strike=T, oversleep=T: Late=T 0.95, Late=F 0.05
  strike=T, oversleep=F: Late=T 0.8, Late=F 0.2
  strike=F, oversleep=T: Late=T 0.7, Late=F 0.3
  strike=F, oversleep=F: Late=T 0.05, Late=F 0.95
Each column sums to 1.

17 Example
Attach conditional probability tables to the non-root nodes (continued).
P(Boss Angry | Boss Failure-in-Love (L), Project Delay (D), Office Dirty (O)); Boss Angry takes four values. Each column sums to 1.

(L, D, O): TTT   TTF   TFT   TFF   FTT   FTF   FFT   FFF
very:      0.98  0.85  0.6   0.5   0.3   0.2   0     0.01
mid:       0.02  0.15  0.3   0.25  0.5   0.5   0.2   0.02
little:    0     0     0.1   0.25  0.2   0.3   0.7   0.07
no:        0     0     0     0     0     0     0.1   0.9

Discussion: what is the difference between probability and fuzzy measurements?
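Since every column (conditioning case) of a CPT must sum to 1, a table like Boss Angry above is easy to sanity-check programmatically. A minimal Python sketch; `cpt_columns_valid` is a hypothetical helper, not part of any library:

```python
# The "Boss Angry" CPT: 8 conditioning cases
# (Boss Failure-in-Love x Project Delay x Office Dirty), 4 outcome rows.
boss_angry = {
    "very":   [0.98, 0.85, 0.6, 0.5,  0.3, 0.2, 0.0, 0.01],
    "mid":    [0.02, 0.15, 0.3, 0.25, 0.5, 0.5, 0.2, 0.02],
    "little": [0.0,  0.0,  0.1, 0.25, 0.2, 0.3, 0.7, 0.07],
    "no":     [0.0,  0.0,  0.0, 0.0,  0.0, 0.0, 0.1, 0.9],
}

def cpt_columns_valid(cpt, tol=1e-9):
    """True iff every column of the CPT sums to 1 (within float tolerance)."""
    n_cols = len(next(iter(cpt.values())))
    return all(abs(sum(rows[c] for rows in cpt.values()) - 1.0) <= tol
               for c in range(n_cols))

print(cpt_columns_valid(boss_angry))  # True
```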

18 Example
Subnetwork: Train Strike → Martin Late; Train Strike → Norman Late.
Train Strike: P(T) = 0.1, P(F) = 0.9
P(Norman Late | Train Strike): strike=T: Late=T 0.8, Late=F 0.2; strike=F: Late=T 0.1, Late=F 0.9
P(Martin Late | Train Strike): strike=T: Late=T 0.6, Late=F 0.4; strike=F: Late=T 0.5, Late=F 0.5
Questions:
P("Martin Late" | "Norman Late") = ?  (conditional distribution)
P("Martin Late") = ?  (marginal distribution)
P("Martin Late", "Norman Late", "Train Strike") = ?  (joint distribution)

19 Example
P("Martin Late", "Norman Late", "Train Strike") = ?  (joint distribution)
Let A = Martin Late, B = Norman Late, C = Train Strike (C is the parent of A and B).
P(A, B, C) = P(C) P(A | C) P(B | C), e.g. P(A=T, B=T, C=T) = 0.1 × 0.6 × 0.8 = 0.048

A B C  Probability
T T T  0.048
F T T  0.032
T F T  0.012
F F T  0.008
T T F  0.045
F T F  0.045
T F F  0.405
F F F  0.405
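The joint table for this three-node subnetwork can be reproduced by multiplying CPT entries. A short Python sketch, assuming (as on the slide) that Train Strike is the only parent of both Martin Late and Norman Late:

```python
import itertools

# C = Train Strike, A = Martin Late, B = Norman Late (CPTs from slide 18).
P_C = 0.1
P_A_given_C = {True: 0.6, False: 0.5}   # P(Martin Late = T | Train Strike)
P_B_given_C = {True: 0.8, False: 0.1}   # P(Norman Late = T | Train Strike)

def p(val, p_true):
    return p_true if val else 1.0 - p_true

# P(A, B, C) = P(C) P(A | C) P(B | C)
joint = {(a, b, c): p(c, P_C) * p(a, P_A_given_C[c]) * p(b, P_B_given_C[c])
         for a, b, c in itertools.product([True, False], repeat=3)}

print(round(joint[(True, True, True)], 3))    # 0.048
print(round(joint[(False, True, False)], 3))  # 0.045
```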

20 Example
P("Martin Late", "Norman Late") = ?  (marginal distribution)
Sum the joint distribution over C, e.g. P(A=T, B=T) = 0.048 + 0.045 = 0.093

A B  Probability
T T  0.093
F T  0.077
T F  0.417
F F  0.413

21 Example
P("Martin Late") = ?  (marginal distribution)
Sum over B as well, e.g. P(A=T) = 0.093 + 0.417 = 0.51

A  Probability
T  0.51
F  0.49

22 Example
P("Martin Late" | "Norman Late") = ?  (conditional distribution)
P(B=T) = 0.093 + 0.077 = 0.17
P(A=T | B=T) = P(A=T, B=T) / P(B=T) = 0.093 / 0.17 ≈ 0.547

A  Probability
T  0.51
F  0.49

B  Probability
T  0.17
F  0.83
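Marginalization and conditioning for this example can be sketched in Python as well, a minimal illustration using the CPT values from slide 18:

```python
import itertools

# C = Train Strike, A = Martin Late, B = Norman Late.
P_C = 0.1
P_A = {True: 0.6, False: 0.5}   # P(Martin Late = T | Train Strike)
P_B = {True: 0.8, False: 0.1}   # P(Norman Late = T | Train Strike)

def p(val, p_true):
    return p_true if val else 1.0 - p_true

def joint(a, b, c):
    return p(c, P_C) * p(a, P_A[c]) * p(b, P_B[c])

# Marginalize out C, then condition on B.
p_ab = {(a, b): sum(joint(a, b, c) for c in (True, False))
        for a, b in itertools.product([True, False], repeat=2)}
p_b_true = p_ab[(True, True)] + p_ab[(False, True)]   # P(Norman Late) = 0.17
p_a_given_b = p_ab[(True, True)] / p_b_true           # 0.093 / 0.17

print(round(p_a_given_b, 3))  # 0.547
```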

23 Exercise
Network: Dragon is Killed → Drop Rare Item; Dragon is Killed → Drop 1000 Gold.
Dragon is Killed: P(T = killed) = 0.1, P(F = flees) = 0.9
P(Drop 1000 Gold = T | Dragon is Killed): killed 0.6, flees 0.1
P(Drop Rare Item = T | Dragon is Killed): killed 0.2, flees 0.1
Questions:
P("Drop Rare Item" | not "Drop 1000 Gold") = ?  (conditional distribution)
P("Drop Rare Item") = ?  (marginal distribution)
P("Dragon is not Killed", "Drop Rare Item", "Drop 1000 Gold") = 0.009  (joint distribution)
Let A = Drop Rare Item, B = Drop 1000 Gold, C = Dragon is Killed.

A B C  Probability
T T T  0.012
F T T  0.048
T F T  0.008
F F T  0.032
T T F  0.009
F T F  0.081
T F F  0.081
F F F  0.729

A B  Probability
T T  0.021
F T  0.129
T F  0.089
F F  0.761

24 Exercise (answers)
P("Drop Rare Item" | not "Drop 1000 Gold") = 0.089 / 0.85 ≈ 0.105  (conditional distribution)
P("Drop Rare Item") = 0.021 + 0.089 = 0.11  (marginal distribution)
P("Dragon is not Killed", "Drop Rare Item", "Drop 1000 Gold") = 0.009  (joint distribution)

A (Drop Rare Item)  Probability
T  0.11
F  0.89

B (Drop 1000 Gold)  Probability
T  0.15
F  0.85
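The exercise answers can be verified with a short Python sketch. The CPT values are the ones given on the slide; the helper functions are illustrative, not from any particular library:

```python
# C = Dragon is Killed, A = Drop Rare Item, B = Drop 1000 Gold.
P_C = 0.1
P_A = {True: 0.2, False: 0.1}   # P(rare item | dragon killed?)
P_B = {True: 0.6, False: 0.1}   # P(1000 gold | dragon killed?)

def p(val, p_true):
    return p_true if val else 1.0 - p_true

def joint(a, b, c):
    return p(c, P_C) * p(a, P_A[c]) * p(b, P_B[c])

# Joint: P(not killed, rare item, 1000 gold)
p_not_killed_rare_gold = joint(True, True, False)

# Marginals and the conditional P(rare item | not 1000 gold)
p_rare = sum(joint(True, b, c) for b in (True, False) for c in (True, False))
p_not_gold = sum(joint(a, False, c) for a in (True, False) for c in (True, False))
p_rare_and_not_gold = sum(joint(True, False, c) for c in (True, False))
p_rare_given_not_gold = p_rare_and_not_gold / p_not_gold

print(round(p_not_killed_rare_gold, 3))  # 0.009
print(round(p_rare, 2))                  # 0.11
print(round(p_rare_given_not_gold, 3))   # 0.105
```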

