Bayes' theorem - Wikipedia
In probability theory and statistics, Bayes' theorem (alternatively Bayes' law or Bayes' rule; recently Bayes–Price theorem[1]: 44–46, 67), named after Thomas Bayes, describes the probability of an event, based on prior knowledge of conditions that might be related to the event.[2] For example, if the risk of developing health problems is known to increase with age, Bayes' theorem allows the risk to an individual of a known age to be assessed more accurately (by conditioning it on their age) than simply assuming that the individual is typical of the population as a whole.
One of the many applications of Bayes' theorem is Bayesian inference, a particular approach to statistical inference. When applied, the probabilities involved in the theorem may have different probability interpretations. With the Bayesian probability interpretation, the theorem expresses how a degree of belief, expressed as a probability, should rationally change to account for the availability of related evidence. Bayesian inference is fundamental to Bayesian statistics, being considered "to the theory of probability what Pythagoras's theorem is to geometry."[3]

Statement of theorem

Bayes' theorem is stated mathematically as the following equation:[4]

    P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)}

where A and B are events and P(B) ≠ 0.

P(A | B) is a conditional probability: the probability of event A occurring given that B is true. It is also called the posterior probability of A given B.
P(B | A) is also a conditional probability: the probability of event B occurring given that A is true. It can also be interpreted as the likelihood of A given a fixed B, because P(B | A) = L(A | B).

P(A) and P(B) are the probabilities of observing A and B respectively without any given conditions; they are known as the marginal probability or prior probability.

Proof

For events

Bayes' theorem may be derived from the definition of conditional probability:

    P(A \mid B) = \frac{P(A \cap B)}{P(B)}, \quad \text{if } P(B) \neq 0,

where P(A ∩ B) is the probability of both A and B being true. Similarly,

    P(B \mid A) = \frac{P(A \cap B)}{P(A)}, \quad \text{if } P(A) \neq 0.

Solving for P(A ∩ B) and substituting into the above expression for P(A | B) yields Bayes' theorem:

    P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)}, \quad \text{if } P(B) \neq 0.

For continuous random variables

For two continuous random variables X and Y, Bayes' theorem may be analogously derived from the definition of conditional density:

    f_{X \mid Y=y}(x) = \frac{f_{X,Y}(x,y)}{f_Y(y)}

    f_{Y \mid X=x}(y) = \frac{f_{X,Y}(x,y)}{f_X(x)}

Therefore,

    f_{X \mid Y=y}(x) = \frac{f_{Y \mid X=x}(y)\, f_X(x)}{f_Y(y)}.

General case

Let P_Y^x be the conditional distribution of Y given X = x, and let P_X be the distribution of X. The joint distribution is then P_{X,Y}(dx, dy) = P_Y^x(dy)\, P_X(dx). The conditional distribution P_X^y of X given Y = y is then determined by

    P_X^y(A) = E(1_A(X) \mid Y = y).

Existence and uniqueness of the needed conditional expectation is a consequence of the Radon–Nikodym theorem. This was formulated by Kolmogorov in his famous book from 1933. Kolmogorov underlines the importance of conditional probability by writing "I wish to call attention to ... and especially the theory of conditional probabilities and conditional expectations ..." in the preface.[5] Bayes' theorem determines the posterior distribution from the prior distribution. Bayes' theorem can be generalized to include improper prior distributions such as the uniform distribution on the real line.[6] Modern Markov chain Monte Carlo methods have boosted the importance of Bayes' theorem, including cases with improper priors.[7]

Examples

Recreational mathematics

Bayes' rule and computing conditional probabilities provide a solution method for a number of popular puzzles, such as the Three Prisoners problem, the Monty Hall problem, the Two Child problem and the Two Envelopes problem.

Drug testing

Figure 1: Using a frequency box to show P(User | Positive) visually by comparison of shaded areas.

Suppose a particular test for whether someone has been using cannabis is 90% sensitive, meaning the true positive rate (TPR) = 0.90. Therefore, it leads to 90% true positive results (correct identification of drug use) for cannabis users.
The test is also 80% specific, meaning true negative rate (TNR) = 0.80. Therefore, the test correctly identifies 80% of non-use for non-users, but also generates 20% false positives, or false positive rate (FPR) = 0.20, for non-users.

Assuming 0.05 prevalence, meaning 5% of people use cannabis, what is the probability that a random person who tests positive is really a cannabis user?

The positive predictive value (PPV) of a test is the proportion of persons who are actually positive out of all those testing positive, and can be calculated from a sample as:

    PPV = True positive / Tested positive

If sensitivity, specificity, and prevalence are known, PPV can be calculated using Bayes' theorem. Let P(User | Positive) mean "the probability that someone is a cannabis user given that they test positive," which is what is meant by PPV. We can write:

    \begin{aligned}
    P(\text{User} \mid \text{Positive}) &= \frac{P(\text{Positive} \mid \text{User})\, P(\text{User})}{P(\text{Positive})} \\
    &= \frac{P(\text{Positive} \mid \text{User})\, P(\text{User})}{P(\text{Positive} \mid \text{User})\, P(\text{User}) + P(\text{Positive} \mid \text{Non-user})\, P(\text{Non-user})} \\
    &= \frac{0.90 \times 0.05}{0.90 \times 0.05 + 0.20 \times 0.95} = \frac{0.045}{0.045 + 0.19} \approx 19\%
    \end{aligned}

The fact that

    P(\text{Positive}) = P(\text{Positive} \mid \text{User})\, P(\text{User}) + P(\text{Positive} \mid \text{Non-user})\, P(\text{Non-user})

is a direct application of the law of total probability. In this case, it says that the probability that someone tests positive is the probability that a user tests positive, times the probability of being a user, plus the probability that a non-user tests positive, times the probability of being a non-user. This is true because the classifications user and non-user form a partition of a set, namely the set of people who take the drug test. This combined with the definition of conditional probability results in the above statement.

In other words, even if someone tests positive, the probability that they are a cannabis user is only 19%: in this group, only 5% of people are users, and most positives are false positives coming from the remaining 95%.

If 1,000 people were tested:

    950 are non-users, and 190 of them give false positives (0.20 × 950)
    50 of them are users, and 45 of them give true positives (0.90 × 50)

The 1,000 people thus yield 235 positive tests, of which only 45 are genuine drug users, about 19%. See Figure 1 for an illustration using a frequency box, and note how small the pink area of true positives is compared to the blue area of false positives.

Sensitivity or specificity

The importance of specificity can be seen by showing that even if sensitivity is raised to 100% and specificity remains at 80%, the probability of someone testing positive really being a cannabis user only rises from 19% to 21%, but if the sensitivity is held at 90% and the specificity is increased to 95%, the probability rises to 49%.
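The frequency-count reasoning above (1,000 people, 5% prevalence) is easy to reproduce; a minimal sketch:

```python
N = 1000
prevalence, sensitivity, specificity = 0.05, 0.90, 0.80

users = prevalence * N                      # 50 users
non_users = N - users                       # 950 non-users
true_pos = sensitivity * users              # 45 true positives
false_pos = (1 - specificity) * non_users   # 190 false positives

ppv = true_pos / (true_pos + false_pos)
print(f"{true_pos + false_pos:.0f} positives, PPV = {ppv:.1%}")  # 235 positives, PPV = 19.1%
```

Changing `sensitivity` to 1.0 or `specificity` to 0.95 reproduces the 21% and 49% figures quoted in the "Sensitivity or specificity" comparison.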
    Actual \ Test   Positive   Negative   Total
    User                  45          5      50
    Non-user             190        760     950
    Total                235        765    1000

    90% sensitive, 80% specific, PPV = 45/235 ≈ 19%

    Actual \ Test   Positive   Negative   Total
    User                  50          0      50
    Non-user             190        760     950
    Total                240        760    1000

    100% sensitive, 80% specific, PPV = 50/240 ≈ 21%

    Actual \ Test   Positive   Negative   Total
    User                  45          5      50
    Non-user              47        903     950
    Total                 92        908    1000

    90% sensitive, 95% specific, PPV = 45/92 ≈ 49%

Cancer rate

Even if 100% of patients with pancreatic cancer have a certain symptom, when someone has the same symptom, it does not mean that this person has a 100% chance of getting pancreatic cancer. Assuming the incidence rate of pancreatic cancer is 1/100000, while 10/99999 healthy individuals have the same symptoms worldwide, the probability of having pancreatic cancer given the symptoms is only 9.1%, and the other 90.9% could be "false positives" (that is, falsely said to have cancer; "positive" is a confusing term when, as here, the test gives bad news).

Based on the incidence rate, the following table presents the corresponding numbers per 100,000 people.

    Cancer \ Symptom   Yes      No       Total
    Yes                  1       0           1
    No                  10   99989       99999
    Total               11   99989      100000

This can then be used to calculate the probability of having cancer when you have the symptoms:

    \begin{aligned}
    P(\text{Cancer} \mid \text{Symptoms}) &= \frac{P(\text{Symptoms} \mid \text{Cancer})\, P(\text{Cancer})}{P(\text{Symptoms})} \\
    &= \frac{P(\text{Symptoms} \mid \text{Cancer})\, P(\text{Cancer})}{P(\text{Symptoms} \mid \text{Cancer})\, P(\text{Cancer}) + P(\text{Symptoms} \mid \text{Non-Cancer})\, P(\text{Non-Cancer})} \\
    &= \frac{1 \times 0.00001}{1 \times 0.00001 + (10/99999) \times 0.99999} = \frac{1}{11} \approx 9.1\%
    \end{aligned}

Defective item rate

    Machine \ Condition   Defective   Flawless   Total
    A                            10        190     200
    B                             9        291     300
    C                             5        495     500
    Total                        24        976    1000
A factory produces an item using three machines (A, B, and C), which account for 20%, 30%, and 50% of its output, respectively. Of the items produced by machine A, 5% are defective; similarly, 3% of machine B's items and 1% of machine C's are defective. If a randomly selected item is defective, what is the probability it was produced by machine C?

Once again, the answer can be reached without using the formula by applying the conditions to a hypothetical number of cases. For example, if the factory produces 1,000 items, 200 will be produced by Machine A, 300 by Machine B, and 500 by Machine C. Machine A will produce 5% × 200 = 10 defective items, Machine B 3% × 300 = 9, and Machine C 1% × 500 = 5, for a total of 24. Thus, the likelihood that a randomly selected defective item was produced by machine C is 5/24 (~20.83%).

This problem can also be solved using Bayes' theorem: Let X_i denote the event that a randomly chosen item was made by the i-th machine (for i = A, B, C). Let Y denote the event that a randomly chosen item is defective. Then, we are given the following information:

    P(X_A) = 0.2, \quad P(X_B) = 0.3, \quad P(X_C) = 0.5.

If the item was made by the first machine, then the probability that it is defective is 0.05; that is, P(Y | X_A) = 0.05. Overall, we have

    P(Y \mid X_A) = 0.05, \quad P(Y \mid X_B) = 0.03, \quad P(Y \mid X_C) = 0.01.

To answer the original question, we first find P(Y). That can be done in the following way:

    P(Y) = \sum_i P(Y \mid X_i)\, P(X_i) = (0.05)(0.2) + (0.03)(0.3) + (0.01)(0.5) = 0.024.

Hence, 2.4% of the total output is defective.
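This two-step calculation (the law of total probability, then Bayes' rule) can be sketched with exact fractions:

```python
from fractions import Fraction

# Givens from the example: output shares and defect rates per machine.
share = {"A": Fraction(2, 10), "B": Fraction(3, 10), "C": Fraction(5, 10)}
defect_rate = {"A": Fraction(5, 100), "B": Fraction(3, 100), "C": Fraction(1, 100)}

# Step 1, law of total probability: overall defective fraction P(Y).
p_defective = sum(defect_rate[m] * share[m] for m in share)

# Step 2, Bayes' theorem: posterior that a defective item came from machine C.
posterior_c = defect_rate["C"] * share["C"] / p_defective

print(p_defective)   # 3/125 (= 0.024)
print(posterior_c)   # 5/24
```

Using `Fraction` avoids floating-point noise and returns the answer in the same 5/24 form as the counting argument.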
We are given that Y has occurred, and we want to calculate the conditional probability of X_C. By Bayes' theorem,

    P(X_C \mid Y) = \frac{P(Y \mid X_C)\, P(X_C)}{P(Y)} = \frac{0.01 \cdot 0.50}{0.024} = \frac{5}{24}.

Given that the item is defective, the probability that it was made by machine C is 5/24. Although machine C produces half of the total output, it produces a much smaller fraction of the defective items. Hence the knowledge that the item selected was defective enables us to replace the prior probability P(X_C) = 1/2 by the smaller posterior probability P(X_C | Y) = 5/24.

Interpretations

Figure 2: A geometric visualisation of Bayes' theorem.

The interpretation of Bayes' rule depends on the interpretation of probability ascribed to the terms. The two main interpretations are described below. Figure 2 shows a geometric visualization similar to Figure 1. Gerd Gigerenzer and co-authors have pushed hard for teaching Bayes' rule this way, with special emphasis on teaching it to physicians.[8] An example is Will Kurt's web page, "Bayes' Theorem with Lego," later turned into the book, Bayesian Statistics the Fun Way: Understanding Statistics and Probability with Star Wars, LEGO, and Rubber Ducks. Zhu and Gigerenzer found in 2006 that whereas 0% of 4th, 5th, and 6th-graders could solve word problems after being taught with formulas, 19%, 39%, and 53% could after being taught with frequency boxes, and that the learning was either thorough or zero.[9]

Bayesian interpretation

In the Bayesian (or epistemological) interpretation, probability measures a "degree of belief". Bayes' theorem links the degree of belief in a proposition before and after accounting for evidence. For example, suppose it is believed with 50% certainty that a coin is twice as likely to land heads as tails. If the coin is flipped a number of times and the outcomes observed, that degree of belief will probably rise or fall, but might even remain the same, depending on the results. For proposition A and evidence B,

    P(A), the prior, is the initial degree of belief in A.
    P(A | B), the posterior, is the degree of belief after incorporating news that B is true.
    The quotient P(B | A)/P(B) represents the support B provides for A.
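These three quantities can be illustrated numerically; the degrees of belief below are hypothetical, chosen only to exercise the definitions:

```python
# Hypothetical degrees of belief for proposition A and evidence B.
prior = 0.5          # P(A)
p_b_given_a = 0.8    # P(B|A)
p_b = 0.6            # P(B), the marginal probability of the evidence

support = p_b_given_a / p_b      # the support B provides for A
posterior = prior * support      # Bayes' theorem: P(A|B) = P(A) · P(B|A)/P(B)

print(round(support, 4), round(posterior, 4))
```

Here the support factor exceeds 1, so observing B raises the degree of belief in A from 0.5 to about 0.67; a support factor below 1 would lower it.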
For more on the application of Bayes' theorem under the Bayesian interpretation of probability, see Bayesian inference.

Frequentist interpretation

Figure 3: Illustration of the frequentist interpretation with tree diagrams.

In the frequentist interpretation, probability measures a "proportion of outcomes". For example, suppose an experiment is performed many times. P(A) is the proportion of outcomes with property A (the prior) and P(B) is the proportion with property B. P(B | A) is the proportion of outcomes with property B out of outcomes with property A, and P(A | B) is the proportion of those with A out of those with B (the posterior).

The role of Bayes' theorem is best visualized with tree diagrams such as Figure 3. The two diagrams partition the same outcomes by A and B in opposite orders, to obtain the inverse probabilities. Bayes' theorem links the different partitionings.

Example

Figure 4: Tree diagram illustrating the beetle example. R, C, P and ¬P are the events rare, common, pattern and no pattern. Percentages in parentheses are calculated. Three independent values are given, so it is possible to calculate the inverse tree.

An entomologist spots what might, due to the pattern on its back, be a rare subspecies of beetle. A full 98% of the members of the rare subspecies have the pattern, so P(Pattern | Rare) = 98%. Only 5% of members of the common subspecies have the pattern. The rare subspecies is 0.1% of the total population. How likely is the beetle having the pattern to be rare: what is P(Rare | Pattern)?
From the extended form of Bayes' theorem (since any beetle is either rare or common),

    \begin{aligned}
    P(\text{Rare} \mid \text{Pattern}) &= \frac{P(\text{Pattern} \mid \text{Rare})\, P(\text{Rare})}{P(\text{Pattern})} \\
    &= \frac{P(\text{Pattern} \mid \text{Rare})\, P(\text{Rare})}{P(\text{Pattern} \mid \text{Rare})\, P(\text{Rare}) + P(\text{Pattern} \mid \text{Common})\, P(\text{Common})} \\
    &= \frac{0.98 \times 0.001}{0.98 \times 0.001 + 0.05 \times 0.999} \approx 1.9\%
    \end{aligned}

Forms

Events

Simple form

For events A and B, provided that P(B) ≠ 0,

    P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)}.

In many applications, for instance in Bayesian inference, the event B is fixed in the discussion, and we wish to consider the impact of its having been observed on our belief in various possible events A. In such a situation the denominator of the last expression, the probability of the given evidence B, is fixed; what we want to vary is A. Bayes' theorem then shows that the posterior probabilities are proportional to the numerator, so the last equation becomes:

    P(A \mid B) \propto P(A) \cdot P(B \mid A).

In words, the posterior is proportional to the prior times the likelihood.[10]

If events A1, A2, ..., are mutually exclusive and exhaustive, i.e., one of them is certain to occur but no two can occur together, we can determine the proportionality constant by using the fact that their probabilities must add up to one. For instance, for a given event A, the event A itself and its complement ¬A are exclusive and exhaustive. Denoting the constant of proportionality by c, we have

    P(A \mid B) = c \cdot P(A) \cdot P(B \mid A) \quad \text{and} \quad P(\neg A \mid B) = c \cdot P(\neg A) \cdot P(B \mid \neg A).

Adding these two formulas, we deduce that

    1 = c \cdot \left( P(B \mid A) \cdot P(A) + P(B \mid \neg A) \cdot P(\neg A) \right),

or

    c = \frac{1}{P(B \mid A) \cdot P(A) + P(B \mid \neg A) \cdot P(\neg A)} = \frac{1}{P(B)}.

Alternative form

Contingency table:

    Proposition \ Background   B                               ¬B (not B)                        Total
    A                          P(B|A)·P(A) = P(A|B)·P(B)       P(¬B|A)·P(A) = P(A|¬B)·P(¬B)      P(A)
    ¬A (not A)                 P(B|¬A)·P(¬A) = P(¬A|B)·P(B)    P(¬B|¬A)·P(¬A) = P(¬A|¬B)·P(¬B)   P(¬A) = 1 − P(A)
    Total                      P(B)                            P(¬B) = 1 − P(B)                  1

Another form of Bayes' theorem for two competing statements or hypotheses is:

    P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B \mid A)\, P(A) + P(B \mid \neg A)\, P(\neg A)}.

For an epistemological interpretation, for proposition A and evidence or background B:[11]

    P(A) is the prior probability, the initial degree of belief in A.
    P(¬A) is the corresponding initial degree of belief in not-A, that A is false, where P(¬A) = 1 − P(A).
    P(B | A) is the conditional probability or likelihood, the degree of belief in B given that proposition A is true.
    P(B | ¬A) is the conditional probability or likelihood, the degree of belief in B given that proposition A is false.
    P(A | B) is the posterior probability, the probability of A after taking into account B.
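The normalization argument, that the proportionality constant c equals 1/P(B), can be checked numerically; the prior and likelihoods below are hypothetical:

```python
# Hypothetical prior and likelihoods for A and its complement.
p_a = 0.3
p_b_given_a, p_b_given_not_a = 0.7, 0.1

# Unnormalized posteriors: prior times likelihood.
u_a = p_a * p_b_given_a
u_not_a = (1 - p_a) * p_b_given_not_a

# The constant c makes the posteriors sum to one, and equals 1/P(B).
c = 1 / (u_a + u_not_a)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)
assert abs(c - 1 / p_b) < 1e-9

print(round(c * u_a, 2), round(c * u_not_a, 2))  # 0.75 0.25, summing to 1
```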
Extended form

Often, for some partition {A_j} of the sample space, the event space is given in terms of P(A_j) and P(B | A_j). It is then useful to compute P(B) using the law of total probability:

    P(B) = \sum_j P(B \mid A_j)\, P(A_j),

    \Rightarrow \quad P(A_i \mid B) = \frac{P(B \mid A_i)\, P(A_i)}{\sum_j P(B \mid A_j)\, P(A_j)}.

In the special case where A is a binary variable:

    P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B \mid A)\, P(A) + P(B \mid \neg A)\, P(\neg A)}.

Random variables

Figure 5: Bayes' theorem applied to an event space generated by continuous random variables X and Y. There exists an instance of Bayes' theorem for each point in the domain. In practice, these instances might be parametrized by writing the specified probability densities as a function of x and y.

Consider a sample space Ω generated by two random variables X and Y. In principle, Bayes' theorem applies to the events A = {X = x} and B = {Y = y}.

    P(X{=}x \mid Y{=}y) = \frac{P(Y{=}y \mid X{=}x)\, P(X{=}x)}{P(Y{=}y)}

However, terms become 0 at points where either variable has finite probability density. To remain useful, Bayes' theorem must be formulated in terms of the relevant densities (see Derivation).

Simple form

If X is continuous and Y is discrete,

    f_{X \mid Y=y}(x) = \frac{P(Y{=}y \mid X{=}x)\, f_X(x)}{P(Y{=}y)}

where each f is a density function.

If X is discrete and Y is continuous,

    P(X{=}x \mid Y{=}y) = \frac{f_{Y \mid X=x}(y)\, P(X{=}x)}{f_Y(y)}.

If both X and Y are continuous,

    f_{X \mid Y=y}(x) = \frac{f_{Y \mid X=x}(y)\, f_X(x)}{f_Y(y)}.

Extended form

Figure 6: A way to conceptualize event spaces generated by continuous random variables X and Y.
A continuous event space is often conceptualized in terms of the numerator terms. It is then useful to eliminate the denominator using the law of total probability. For f_Y(y), this becomes an integral:

    f_Y(y) = \int_{-\infty}^{\infty} f_{Y \mid X=\xi}(y)\, f_X(\xi)\, d\xi.

Bayes' rule in odds form

Bayes' theorem in odds form is:

    O(A_1 : A_2 \mid B) = O(A_1 : A_2) \cdot \Lambda(A_1 : A_2 \mid B)

where

    \Lambda(A_1 : A_2 \mid B) = \frac{P(B \mid A_1)}{P(B \mid A_2)}

is called the Bayes factor or likelihood ratio. The odds between two events is simply the ratio of the probabilities of the two events. Thus

    O(A_1 : A_2) = \frac{P(A_1)}{P(A_2)},

    O(A_1 : A_2 \mid B) = \frac{P(A_1 \mid B)}{P(A_2 \mid B)}.

Thus, the rule says that the posterior odds are the prior odds times the Bayes factor, or in other words, the posterior is proportional to the prior times the likelihood.

In the special case that A_1 = A and A_2 = ¬A, one writes O(A) = O(A : ¬A) = P(A)/(1 − P(A)), and uses a similar abbreviation for the Bayes factor and for the conditional odds. The odds on A is by definition the odds for and against A. Bayes' rule can then be written in the abbreviated form

    O(A \mid B) = O(A) \cdot \Lambda(A \mid B),

or, in words, the posterior odds on A equals the prior odds on A times the likelihood ratio for A given information B. In short, posterior odds equals prior odds times likelihood ratio.
For example, if a medical test has a sensitivity of 90% and a specificity of 91%, then the positive Bayes factor is Λ+ = P(True Positive)/P(False Positive) = 90%/(100% − 91%) = 10. Now, if the prevalence of this disease is 9.09%, and if we take that as the prior probability, then the prior odds is about 1:10. So after receiving a positive test result, the posterior odds of actually having the disease becomes 1:1; in other words, the posterior probability of actually having the disease is 50%. If a second test is performed in serial testing, and that also turns out to be positive, then the posterior odds of actually having the disease becomes 10:1, which means a posterior probability of about 90.91%. The negative Bayes factor can be calculated to be 91%/(100% − 90%) = 9.1, so if the second test turns out to be negative, then the posterior odds of actually having the disease is 1:9.1, which means a posterior probability of about 9.9%.

The example above can also be understood with more solid numbers: Assume the patient taking the test is from a group of 1000 people, where 91 of them actually have the disease (prevalence of 9.1%). If all these 1000 people take the medical test, 82 of those with the disease will get a true positive result (sensitivity of 90.1%), 9 of those with the disease will get a false negative result (false negative rate of 9.9%), 827 of those without the disease will get a true negative result (specificity of 91.0%), and 82 of those without the disease will get a false positive result (false positive rate of 9.0%). Before taking any test, the patient's odds for having the disease is 91:909. After receiving a positive result, the patient's odds for having the disease is

    \frac{91}{909} \times \frac{90.1\%}{9.0\%} = \frac{91 \times 90.1\%}{909 \times 9.0\%} = 1:1,

which is consistent with the fact that there are 82 true positives and 82 false positives in the group of 1000 people.
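The odds-form bookkeeping in this example (sensitivity 90%, specificity 91%, prior odds 1:10) can be sketched directly; each test multiplies the current odds by the appropriate likelihood ratio:

```python
def update_odds(prior_odds, likelihood_ratio):
    """Posterior odds = prior odds × Bayes factor (likelihood ratio)."""
    return prior_odds * likelihood_ratio

def odds_to_prob(odds):
    return odds / (1 + odds)

sensitivity, specificity = 0.90, 0.91
positive_lr = sensitivity / (1 - specificity)        # ≈ 10
negative_lr = (1 - sensitivity) / specificity        # ≈ 1/9.1

prior_odds = 1 / 10                                  # prevalence 9.09% as odds

one_positive = update_odds(prior_odds, positive_lr)      # ≈ 1:1, i.e. 50%
two_positives = update_odds(one_positive, positive_lr)   # ≈ 10:1
pos_then_neg = update_odds(one_positive, negative_lr)    # ≈ 1:9.1

print(round(odds_to_prob(two_positives), 4))  # ≈ 0.9091
print(round(odds_to_prob(pos_then_neg), 4))   # ≈ 0.099
```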
Correspondence to other mathematical frameworks

Propositional logic

Using P(¬B | A) = 1 − P(B | A) twice, one may use Bayes' theorem to also express P(¬B | ¬A) in terms of P(A | B) and without negations:

    P(\neg B \mid \neg A) = 1 - \left(1 - P(A \mid B)\right) \frac{P(B)}{P(\neg A)},

when P(¬A) = 1 − P(A) ≠ 0. From this we can read off the inference

    P(A \mid B) = 1 \implies P(\neg B \mid \neg A) = 1.

In words: if certainly B implies A, we infer that certainly ¬A implies ¬B. Where P(B) ≠ 0, the two implications being certain are equivalent statements.

In the probability formulas, the conditional probability P(A | B) generalizes the logical implication B ⟹ A, where now beyond assigning true or false, we assign probability values to statements. The assertion of B ⟹ A is captured by certainty of the conditional, the assertion of P(A | B) = 1. Relating the directions of implication, Bayes' theorem represents a generalization of the contraposition law, which in classical propositional logic can be expressed as:

    (B \implies A) \iff (\neg A \implies \neg B).

Note that in this relation between implications, the positions of A and B get flipped.

The corresponding formula in terms of probability calculus is Bayes' theorem, which in its expanded form involving the prior probability/base rate a of only A, is expressed as:[12]

    P(A \mid B) = P(B \mid A)\, \frac{a(A)}{P(B \mid A)\, a(A) + P(B \mid \neg A)\, a(\neg A)}.
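The probabilistic contraposition identity above can be verified numerically on an arbitrary (hypothetical) joint distribution over A and B:

```python
# Hypothetical joint probabilities for the four combinations of A and B.
p = {("A", "B"): 0.40, ("A", "notB"): 0.25,
     ("notA", "B"): 0.10, ("notA", "notB"): 0.25}

p_a = p[("A", "B")] + p[("A", "notB")]
p_b = p[("A", "B")] + p[("notA", "B")]
p_a_given_b = p[("A", "B")] / p_b
p_notb_given_nota = p[("notA", "notB")] / (1 - p_a)

# Identity: P(¬B|¬A) = 1 − (1 − P(A|B)) · P(B) / P(¬A)
rhs = 1 - (1 - p_a_given_b) * p_b / (1 - p_a)
assert abs(p_notb_given_nota - rhs) < 1e-9
print("identity holds")
```

Setting the joint probability of ("notA", "B") to zero makes P(A|B) = 1 and, by the same identity, forces P(¬B|¬A) = 1, which is the probabilistic contraposition.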
Subjective logic

Bayes' theorem represents a special case of deriving inverted conditional opinions in subjective logic, expressed as:

    (\omega_{A \tilde{|} B}^S, \omega_{A \tilde{|} \lnot B}^S) = (\omega_{B \mid A}^S, \omega_{B \mid \lnot A}^S)\, \widetilde{\phi}\, a_A,

where \widetilde{\phi} denotes the operator for inverting conditional opinions. The argument (\omega_{B \mid A}^S, \omega_{B \mid \lnot A}^S) denotes a pair of binomial conditional opinions given by source S, and the argument a_A denotes the prior probability (aka the base rate) of A. The pair of derivative inverted conditional opinions is denoted (\omega_{A \tilde{|} B}^S, \omega_{A \tilde{|} \lnot B}^S). The conditional opinion \omega_{A \mid B}^S generalizes the probabilistic conditional P(A | B), i.e. in addition to assigning a probability, the source S can assign any subjective opinion to the conditional statement (A | B). A binomial subjective opinion \omega_A^S is the belief in the truth of statement A with degrees of epistemic uncertainty, as expressed by source S. Every subjective opinion has a corresponding projected probability P(\omega_A^S). The application of Bayes' theorem to projected probabilities of opinions is a homomorphism, meaning that Bayes' theorem can be expressed in terms of projected probabilities of opinions:

    P(\omega_{A \tilde{|} B}^S) = \frac{P(\omega_{B \mid A}^S)\, a(A)}{P(\omega_{B \mid A}^S)\, a(A) + P(\omega_{B \mid \lnot A}^S)\, a(\lnot A)}.

Hence, the subjective Bayes' theorem represents a generalization of Bayes' theorem.[13]

Generalizations

Conditioned version

A conditioned version of Bayes' theorem[14] results from the addition of a third event C on which all probabilities are conditioned:

    P(A \mid B \cap C) = \frac{P(B \mid A \cap C)\, P(A \mid C)}{P(B \mid C)}

Derivation

Using the chain rule,

    P(A \cap B \cap C) = P(A \mid B \cap C)\, P(B \mid C)\, P(C),

and, on the other hand,

    P(A \cap B \cap C) = P(B \cap A \cap C) = P(B \mid A \cap C)\, P(A \mid C)\, P(C).

The desired result is obtained by identifying both expressions and solving for P(A | B ∩ C).

Bayes' rule with 3 events

In the case of 3 events A, B, and C, it can be shown that:

    P(A \mid B, C) = \frac{P(B \mid A, C)\, P(A \mid C)}{P(B \mid C)}

Proof:[15]

    \begin{aligned}
    P(A \mid B, C) &= \frac{P(A, B, C)}{P(B, C)} \\
    &= \frac{P(B \mid A, C)\, P(A, C)}{P(B, C)} \\
    &= \frac{P(B \mid A, C)\, P(A \mid C)\, P(C)}{P(B, C)} \\
    &= \frac{P(B \mid A, C)\, P(A \mid C)\, P(C)}{P(B \mid C)\, P(C)} \\
    &= \frac{P(B \mid A, C)\, P(A \mid C)}{P(B \mid C)}
    \end{aligned}

History
Bayes' theorem is named after the Reverend Thomas Bayes (/beɪz/; c. 1701–1761), who first used conditional probability to provide an algorithm (his Proposition 9) that uses evidence to calculate limits on an unknown parameter, published as An Essay towards solving a Problem in the Doctrine of Chances (1763). He studied how to compute a distribution for the probability parameter of a binomial distribution (in modern terminology). On Bayes's death his family transferred his papers to his old friend, Richard Price (1723–1791), who over a period of two years significantly edited the unpublished manuscript, before sending it to a friend who read it aloud at the Royal Society on 23 December 1763.[1] Price edited[16] Bayes's major work "An Essay towards solving a Problem in the Doctrine of Chances" (1763), which appeared in Philosophical Transactions,[17] and contains Bayes' theorem. Price wrote an introduction to the paper which provides some of the philosophical basis of Bayesian statistics and chose one of the two solutions offered by Bayes. In 1765, Price was elected a Fellow of the Royal Society in recognition of his work on the legacy of Bayes.[18][19] On 27 April a letter sent to his friend Benjamin Franklin was read out at the Royal Society, and later published, where Price applies this work to population and computing "life-annuities".[20]

Independently of Bayes, Pierre-Simon Laplace in 1774, and later in his 1812 Théorie analytique des probabilités, used conditional probability to formulate the relation of an updated posterior probability from a prior probability, given evidence. He reproduced and extended Bayes's results in 1774, apparently unaware of Bayes's work.[note 1][21] The Bayesian interpretation of probability was developed mainly by Laplace.[22]

Sir Harold Jeffreys put Bayes's algorithm and Laplace's formulation on an axiomatic basis, writing that Bayes' theorem "is to the theory of probability what the Pythagorean theorem is to geometry".[23]

Stephen Stigler used a Bayesian argument to conclude that Bayes' theorem was discovered by Nicholas Saunderson, a blind English mathematician, some time before Bayes;[24][25] that interpretation, however, has been disputed.[26]
Martyn Hooper[27] and Sharon McGrayne[28] have argued that Richard Price's contribution was substantial:

By modern standards, we should refer to the Bayes–Price rule. Price discovered Bayes's work, recognized its importance, corrected it, contributed to the article, and found a use for it. The modern convention of employing Bayes's name alone is unfair but so entrenched that anything else makes little sense.[28]

Use in genetics[edit]

In genetics, Bayes' theorem can be used to calculate the probability of an individual having a specific genotype. Many people seek to approximate their chances of being affected by a genetic disease or their likelihood of being a carrier for a recessive gene of interest. A Bayesian analysis can be done based on family history or genetic testing, in order to predict whether an individual will develop a disease or pass one on to their children. Genetic testing and prediction is a common practice among couples who plan to have children but are concerned that they may both be carriers for a disease, especially within communities with low genetic variance.[29]

The first step in Bayesian analysis for genetics is to propose mutually exclusive hypotheses: for a specific allele, an individual either is or is not a carrier. Next, four probabilities are calculated: Prior Probability (the likelihood of each hypothesis considering information such as family history or predictions based on Mendelian inheritance), Conditional Probability (of a certain outcome), Joint Probability (the product of the first two), and Posterior Probability (a weighted product calculated by dividing the Joint Probability for each hypothesis by the sum of both joint probabilities). This type of analysis can be done based purely on family history of a condition or in concert with genetic testing.[citation needed]

Using pedigree to calculate probabilities[edit]

Hypothesis | Hypothesis 1: Patient is a carrier | Hypothesis 2: Patient is not a carrier
Prior Probability | 1/2 | 1/2
Conditional Probability that all four offspring will be unaffected | (1/2)·(1/2)·(1/2)·(1/2) = 1/16 | About 1
Joint Probability | (1/2)·(1/16) = 1/32 | (1/2)·1 = 1/2
Posterior Probability | (1/32)/(1/32 + 1/2) = 1/17 | (1/2)/(1/32 + 1/2) = 16/17
Example of a Bayesian analysis table for a female individual's risk for a disease, based on the knowledge that the disease is present in her siblings but not in her parents or any of her four children. Based solely on the status of the subject's siblings and parents, she is equally likely to be a carrier as to be a non-carrier (this likelihood is denoted by the Prior Hypothesis). However, the probability that the subject's four sons would all be unaffected is 1/16 (1/2 · 1/2 · 1/2 · 1/2) if she is a carrier, about 1 if she is a non-carrier (this is the Conditional Probability). The Joint Probability reconciles these two predictions by multiplying them together. The last line (the Posterior Probability) is calculated by dividing the Joint Probability for each hypothesis by the sum of both joint probabilities.[30]

Using genetic test results[edit]

Parental genetic testing can detect around 90% of known disease alleles in parents that can lead to carrier or affected status in their child. Cystic fibrosis is a heritable disease caused by an autosomal recessive mutation on the CFTR gene,[31] located on the q arm of chromosome 7.[32]

A Bayesian analysis of a female patient with a family history of cystic fibrosis (CF), who has tested negative for CF, demonstrates how this method can be used to determine her risk of having a child born with CF.

Because the patient is unaffected, she is either homozygous for the wild-type allele, or heterozygous. To establish prior probabilities, a Punnett square is used, based on the knowledge that neither parent was affected by the disease but both could have been carriers:

Mother \ Father | W: homozygous for the wild-type allele (a non-carrier) | M: heterozygous (a CF carrier)
W: homozygous for the wild-type allele (a non-carrier) | WW | MW
M: heterozygous (a CF carrier) | MW | MM (affected by cystic fibrosis)

Given that the patient is unaffected, there are only three possibilities. Within these three, there are two scenarios in which the patient carries the mutant allele. Thus the prior probabilities are 2/3 and 1/3.

Next, the patient undergoes genetic testing and tests negative for cystic fibrosis. This test has a 90% detection rate, so the conditional probabilities of a negative test are 1/10 and 1. Finally, the joint and posterior probabilities are calculated as before.
Hypothesis | Hypothesis 1: Patient is a carrier | Hypothesis 2: Patient is not a carrier
Prior Probability | 2/3 | 1/3
Conditional Probability of a negative test | 1/10 | 1
Joint Probability | 1/15 | 1/3
Posterior Probability | 1/6 | 5/6

After carrying out the same analysis on the patient's male partner (with a negative test result), the chance of their child being affected is equal to the product of the parents' respective posterior probabilities for being carriers times the chance that two carriers will produce an affected offspring (1/4).

Genetic testing done in parallel with other risk factor identification[edit]

Bayesian analysis can be done using phenotypic information associated with a genetic condition, and when combined with genetic testing this analysis becomes much more complicated. Cystic fibrosis, for example, can be identified in a fetus through an ultrasound looking for an echogenic bowel, meaning one that appears brighter than normal on a scan. This is not a foolproof test, as an echogenic bowel can be present in a perfectly healthy fetus. Parental genetic testing is very influential in this case, where a phenotypic facet can be overly influential in probability calculation. In the case of a fetus with an echogenic bowel, with a mother who has been tested and is known to be a CF carrier, the posterior probability that the fetus actually has the disease is very high (0.64). However, once the father has tested negative for CF, the posterior probability drops significantly (to 0.16).[30]

Risk factor calculation is a powerful tool in genetic counseling and reproductive planning, but it cannot be treated as the only important factor to consider. As above, incomplete testing can yield a falsely high probability of carrier status, and testing can be financially inaccessible or unfeasible when a parent is not present.
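The four-step procedure used in this section (prior → conditional → joint → posterior) can be sketched in a few lines of Python. The function name `bayes_update` is our own choice, and exact fractions are used so that the values in the pedigree and cystic-fibrosis tables above are reproduced exactly (reading the table's "About 1" as exactly 1).

```python
from fractions import Fraction as F

def bayes_update(priors, conditionals):
    """Turn priors and conditionals into posteriors: normalize the joints."""
    joints = [p * c for p, c in zip(priors, conditionals)]
    total = sum(joints)
    return [j / total for j in joints]

# Pedigree example: carrier vs. non-carrier, four unaffected offspring.
pedigree = bayes_update([F(1, 2), F(1, 2)], [F(1, 16), F(1)])
assert pedigree == [F(1, 17), F(16, 17)]

# CF example: Punnett-square priors 2/3 and 1/3; a negative test with a
# 90% detection rate gives conditional probabilities 1/10 and 1.
mother = bayes_update([F(2, 3), F(1, 3)], [F(1, 10), F(1)])
assert mother == [F(1, 6), F(5, 6)]

# Same analysis for the father (also a negative test result); the child's
# risk is P(mother carrier) x P(father carrier) x 1/4.
father = bayes_update([F(2, 3), F(1, 3)], [F(1, 10), F(1)])
child_affected = mother[0] * father[0] * F(1, 4)
print(child_affected)  # 1/144
```

The same helper applies unchanged to any number of mutually exclusive hypotheses, which is why the tables in this section always normalize by the sum of the joint probabilities.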
See also[edit]

Mathematics portal
Bayesian epistemology
Inductive probability
Quantum Bayesianism
Why Most Published Research Findings Are False, a 2005 essay in metascience by John Ioannidis

Notes[edit]

^ Laplace refined Bayes's theorem over a period of decades:
Laplace announced his independent discovery of Bayes' theorem in: Laplace (1774) "Mémoire sur la probabilité des causes par les événements," Mémoires de l'Académie royale des Sciences de MI (Savants étrangers), 4: 621–656. Reprinted in: Laplace, Oeuvres complètes (Paris, France: Gauthier-Villars et fils, 1841), vol. 8, pp. 27–65. Available on-line at: Gallica. Bayes' theorem appears on p. 29.
Laplace presented a refinement of Bayes' theorem in: Laplace (read: 1783 / published: 1785) "Mémoire sur les approximations des formules qui sont fonctions de très grands nombres," Mémoires de l'Académie royale des Sciences de Paris, 423–467. Reprinted in: Laplace, Oeuvres complètes (Paris, France: Gauthier-Villars et fils, 1844), vol. 10, pp. 295–338. Available on-line at: Gallica. Bayes' theorem is stated on page 301.
See also: Laplace, Essai philosophique sur les probabilités (Paris, France: Mme. Ve. Courcier [Madame veuve (i.e., widow) Courcier], 1814), page 10. English translation: Pierre Simon, Marquis de Laplace with F. W. Truscott and F. L. Emory, trans., A Philosophical Essay on Probabilities (New York, New York: John Wiley & Sons, 1902), page 15.

References[edit]

^ a b Frame, Paul (2015). Liberty's Apostle. Wales: University of Wales Press. ISBN 978-1-78316-216-1. Retrieved 23 February 2021.
^ Joyce, James (2003), "Bayes' Theorem", in Zalta, Edward N. (ed.), The Stanford Encyclopedia of Philosophy (Spring 2019 ed.), Metaphysics Research Lab, Stanford University, retrieved 2020-01-17.
^ Jeffreys, Sir Harold (1973). Scientific Inference. Cambridge: At the University Press. OCLC 764571529.
^ Stuart, A.; Ord, K. (1994), Kendall's Advanced Theory of Statistics: Volume I—Distribution Theory, Edward Arnold, §8.7.
^ Kolmogorov, A. N. (1933). Foundations of the Theory of Probability. Chelsea Publishing Company (1956 ed.).
^ Taraldsen, Gunnar; Tufto, Jarle; Lindqvist, Bo H. (2021-07-24). "Improper priors and improper posteriors". Scandinavian Journal of Statistics: sjos.12550. doi:10.1111/sjos.12550. ISSN 0303-6898. S2CID 237736986.
^ Robert, Christian P.; Casella, George (2004). Monte Carlo Statistical Methods. Springer. ISBN 978-1-4757-4145-2. OCLC 1159112760.
^ Gigerenzer, Gerd; Hoffrage, Ulrich (1995). "How to improve Bayesian reasoning without instruction: Frequency formats". Psychological Review. 102 (4): 684–704. CiteSeerX 10.1.1.128.3201. doi:10.1037/0033-295X.102.4.684.
^ Zhu, Liqi; Gigerenzer, Gerd (January 2006). "Children can solve Bayesian problems: the role of representation in mental computation". Cognition. 98 (3): 287–308. doi:10.1016/j.cognition.2004.12.003. hdl:11858/00-001M-0000-0024-FEFD-A. PMID 16399266. S2CID 1451338.
^ Lee, Peter M. (2012). "Chapter 1". Bayesian Statistics. Wiley. ISBN 978-1-1183-3257-3.
^ "Bayes' Theorem: Introduction". Trinity University. Archived from the original on 21 August 2004. Retrieved 5 August 2014.
^ Audun Jøsang (2016). Subjective Logic: A Formalism for Reasoning Under Uncertainty. Springer, Cham. ISBN 978-3-319-42337-1.
^ Audun Jøsang (2016). Generalising Bayes' Theorem in Subjective Logic. IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI 2016), Baden-Baden, September 2016.
^ Koller, D.; Friedman, N. (2009). Probabilistic Graphical Models. Massachusetts: MIT Press. p. 1208. ISBN 978-0-262-01319-2. Archived from the original on 2014-04-27.
^ Graham Kemp (https://math.stackexchange.com/users/135106/graham-kemp), Bayes' rule with 3 variables, URL (version: 2015-05-14): https://math.stackexchange.com/q/1281558
^ Allen, Richard (1999). David Hartley on Human Nature. SUNY Press. pp. 243–4. ISBN 978-0-7914-9451-6. Retrieved 16 June 2013.
^ Bayes, Thomas & Price, Richard (1763). "An Essay towards solving a Problem in the Doctrine of Chance. By the late Rev. Mr. Bayes, communicated by Mr. Price, in a letter to John Canton, A.M.F.R.S." Philosophical Transactions of the Royal Society of London. 53: 370–418. doi:10.1098/rstl.1763.0053.
^ Holland, pp. 46–7.
^ Price, Richard (1991). Price: Political Writings. Cambridge University Press. p. xxiii. ISBN 978-0-521-40969-8. Retrieved 16 June 2013.
^ Mitchell 1911, p. 314.
^ Daston, Lorraine (1988). Classical Probability in the Enlightenment. Princeton Univ Press. p. 268. ISBN 0-691-08497-1.
^ Stigler, Stephen M. (1986). "Inverse Probability". The History of Statistics: The Measurement of Uncertainty Before 1900. Harvard University Press. pp. 99–138. ISBN 978-0-674-40341-3.
^ Jeffreys, Harold (1973). Scientific Inference (3rd ed.). Cambridge University Press. p. 31. ISBN 978-0-521-18078-8.
^ Stigler, Stephen M. (1983). "Who Discovered Bayes' Theorem?". The American Statistician. 37 (4): 290–296. doi:10.1080/00031305.1983.10483122.
^ de Vaux, Richard; Velleman, Paul; Bock, David (2016). Stats, Data and Models (4th ed.). Pearson. pp. 380–381. ISBN 978-0-321-98649-8.
^ Edwards, A. W. F. (1986). "Is the Reference in Hartley (1749) to Bayesian Inference?". The American Statistician. 40 (2): 109–110. doi:10.1080/00031305.1986.10475370.
^ Hooper, Martyn (2013). "Richard Price, Bayes' theorem, and God". Significance. 10 (1): 36–39. doi:10.1111/j.1740-9713.2013.00638.x. S2CID 153704746.
^ a b McGrayne, S. B. (2011). The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines & Emerged Triumphant from Two Centuries of Controversy. Yale University Press. ISBN 978-0-300-18822-6.
^ Kraft, Stephanie A; Duenas, Devan; Wilfond, Benjamin S; Goddard, Katrina AB (24 September 2018). "The evolving landscape of expanded carrier screening: challenges and opportunities". Genetics in Medicine. 21 (4): 790–797. doi:10.1038/s41436-018-0273-4. PMC 6752283. PMID 30245516.
^ a b Ogino, Shuji; Wilson, Robert B; Gold, Bert; Hawley, Pamela; Grody, Wayne W (October 2004). "Bayesian analysis for cystic fibrosis risks in prenatal and carrier screening". Genetics in Medicine. 6 (5): 439–449. doi:10.1097/01.GIM.0000139511.83336.8F. PMID 15371910.
^ "Types of CFTR Mutations". Cystic Fibrosis Foundation, www.cff.org/What-is-CF/Genetics/Types-of-CFTR-Mutations/.
^ "CFTR Gene – Genetics Home Reference". U.S. National Library of Medicine, National Institutes of Health, ghr.nlm.nih.gov/gene/CFTR#location.
Bibliography[edit]

This article incorporates text from a publication now in the public domain: Mitchell, John Malcolm (1911). "Price, Richard". In Chisholm, Hugh (ed.). Encyclopædia Britannica. Vol. 22 (11th ed.). Cambridge University Press. pp. 314–315.

Further reading[edit]

Grunau, Hans-Christoph (24 January 2014). "Preface Issue 3/4-2013". Jahresbericht der Deutschen Mathematiker-Vereinigung. 115 (3–4): 127–128. doi:10.1365/s13291-013-0077-z.
Gelman, A.; Carlin, J. B.; Stern, H. S.; Rubin, D. B. (2003). Bayesian Data Analysis, Second Edition. CRC Press.
Grinstead, C. M.; Snell, J. L. (1997). Introduction to Probability (2nd edition). American Mathematical Society (free pdf available) [1].
"Bayes formula", Encyclopedia of Mathematics, EMS Press, 2001 [1994].
McGrayne, S. B. (2011). The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines & Emerged Triumphant from Two Centuries of Controversy. Yale University Press. ISBN 978-0-300-18822-6.
Laplace, Pierre Simon (1986). "Memoir on the Probability of the Causes of Events". Statistical Science. 1 (3): 364–378. doi:10.1214/ss/1177013621. JSTOR 2245476.
Lee, Peter M. (2012). Bayesian Statistics: An Introduction (4th edition). Wiley. ISBN 978-1-118-33257-3.
Puga, J. L.; Krzywinski, M.; Altman, N. (31 March 2015). "Bayes' theorem". Nature Methods. 12 (4): 277–278. doi:10.1038/nmeth.3335. PMID 26005726.
Rosenthal, Jeffrey S. (2005). Struck by Lightning: The Curious World of Probabilities. HarperCollins. (Granta, 2008. ISBN 9781862079960).
Stigler, Stephen M. (August 1986). "Laplace's 1774 Memoir on Inverse Probability". Statistical Science. 1 (3): 359–363. doi:10.1214/ss/1177013620.
Stone, J. V. (2013). Download chapter 1 of Bayes' Rule: A Tutorial Introduction to Bayesian Analysis. Sebtel Press, England.
Bayesian Reasoning for Intelligent People, an introduction and tutorial to the use of Bayes' theorem in statistics and cognitive science.
Morris, Dan (2016). Read first 6 chapters for free of Bayes' Theorem Examples: A Visual Introduction For Beginners. Blue Windmill. ISBN 978-1549761744. A short tutorial on how to understand problem scenarios and find P(B), P(A), and P(B|A).
External links[edit]

Visual explanation of Bayes using trees on YouTube
Bayes' frequentist interpretation explained visually on YouTube
Earliest Known Uses of Some of the Words of Mathematics (B). Contains origins of "Bayesian", "Bayes' Theorem", "Bayes Estimate/Risk/Solution", "Empirical Bayes", and "Bayes Factor".
A tutorial on probability and Bayes' theorem devised for Oxford University psychology students
An Intuitive Explanation of Bayes' Theorem by Eliezer S. Yudkowsky
Bayesian Clinical Diagnostic Model