🕷️ Crawler Inspector

URL Lookup

Direct Parameter Lookup

Raw Queries and Responses

1. Shard Calculation

Query:
Response:
Calculated Shard: 152 (from laksa107)

2. Crawled Status Check

Query:
Response:

3. Robots.txt Check

Query:
Response:

4. Spam/Ban Check

Query:
Response:

5. Seen Status Check

ℹ️ Skipped - page is already crawled
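Step 1's shard is typically obtained by hashing a key (the host or full URL) onto a fixed shard ring, but the actual hash function and shard count are not shown in the response above, so both are assumptions here. A minimal sketch:

```python
import hashlib

def shard_for_host(host: str, num_shards: int = 512) -> int:
    """Map a hostname onto a shard ring by hashing (illustrative scheme only)."""
    digest = hashlib.md5(host.encode("utf-8")).digest()
    # Interpret the first 8 bytes as an unsigned integer, then fold into the ring.
    return int.from_bytes(digest[:8], "big") % num_shards

shard = shard_for_host("en.wikipedia.org")
```

Any stable hash works here; the important property is that the same key always lands on the same shard, so lookups and writes agree.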

📄 INDEXABLE · CRAWLED (4 days ago) · 🤖 ROBOTS ALLOWED

Page Info Filters

| Filter | Status | Condition | Details |
|---|---|---|---|
| HTTP status | PASS | `download_http_code = 200` | HTTP 200 |
| Age cutoff | PASS | `download_stamp > now() - 6 MONTH` | 0.1 months ago (distributed domain, exempt) |
| History drop | PASS | `isNull(history_drop_reason)` | No drop reason |
| Spam/ban | PASS | `fh_dont_index != 1 AND ml_spam_score = 0` | ml_spam_score=0 |
| Canonical | PASS | `meta_canonical IS NULL OR = '' OR = src_unparsed` | Not set |
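The five PASS conditions can be replayed in code. A minimal sketch, with the record layout and field values assumed from the page details below rather than taken from the real pipeline (the distributed-domain exemption on the age cutoff is not modeled):

```python
from datetime import datetime, timedelta

def is_indexable(page: dict, now: datetime) -> bool:
    """Re-evaluate the page-info filters from the table above (illustrative)."""
    return (
        page["download_http_code"] == 200                       # HTTP status
        and page["download_stamp"] > now - timedelta(days=183)  # Age cutoff (~6 months)
        and page["history_drop_reason"] is None                 # History drop
        and page["fh_dont_index"] != 1                          # Spam/ban
        and page["ml_spam_score"] == 0
        and page["meta_canonical"] in (None, "", page["src_unparsed"])  # Canonical
    )

record = {
    "download_http_code": 200,
    "download_stamp": datetime(2026, 4, 14, 2, 10, 43),
    "history_drop_reason": None,
    "fh_dont_index": 0,
    "ml_spam_score": 0,
    "meta_canonical": None,
    "src_unparsed": "https://en.wikipedia.org/wiki/Bayesian_statistics",
}
indexable = is_indexable(record, now=datetime(2026, 4, 18))  # True for this record
```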

Page Details

| Property | Value |
|---|---|
| URL | https://en.wikipedia.org/wiki/Bayesian_statistics |
| Last Crawled | 2026-04-14 02:10:43 (4 days ago) |
| First Indexed | 2013-08-09 04:44:31 (12 years ago) |
| HTTP Status Code | 200 |
| Meta Title | Bayesian statistics - Wikipedia |
| Meta Description | null |
| Meta Canonical | null |
Boilerpipe Text
From Wikipedia, the free encyclopedia Bayesian statistics ( BAY -zee-ən or BAY -zhən ) [ 1 ] is a theory in the field of statistics based on the Bayesian interpretation of probability , where probability expresses a degree of belief in an event . The degree of belief may be based on prior knowledge about the event, such as the results of previous experiments, or on personal beliefs about the event. This differs from a number of other interpretations of probability , such as the frequentist interpretation, which views probability as the limit of the relative frequency of an event after many trials. [ 2 ] More concretely, analysis in Bayesian methods codifies prior knowledge in the form of a prior distribution . Bayesian statistical methods use Bayes' theorem to compute and update probabilities after obtaining new data. Bayes' theorem describes the conditional probability of an event based on data as well as prior information or beliefs about the event or conditions related to the event. [ 3 ] [ 4 ] For example, in Bayesian inference , Bayes' theorem can be used to estimate the parameters of a probability distribution or statistical model . Since Bayesian statistics treats probability as a degree of belief, Bayes' theorem can directly assign a probability distribution that quantifies the belief to the parameter or set of parameters. [ 2 ] [ 3 ] Bayesian statistics is named after Thomas Bayes , who formulated a specific case of Bayes' theorem in a paper published in 1763. In several papers spanning from the late 18th to the early 19th centuries, Pierre-Simon Laplace developed the Bayesian interpretation of probability. [ 5 ] Laplace used methods now considered Bayesian to solve a number of statistical problems. While many Bayesian methods were developed by later authors, the term "Bayesian" was not commonly used to describe these methods until the 1950s. 
Throughout much of the 20th century, Bayesian methods were viewed unfavorably by many statisticians due to philosophical and practical considerations. Many of these methods required much computation, and most widely used approaches during that time were based on the frequentist interpretation. However, with the advent of powerful computers and new algorithms like Markov chain Monte Carlo, Bayesian methods have gained increasing prominence in statistics in the 21st century. [2] [6]

Bayes' theorem is used in Bayesian methods to update probabilities, which are degrees of belief, after obtaining new data. Given two events A and B, the conditional probability of A given that B is true is expressed as follows: [7]

P(A | B) = P(B | A) P(A) / P(B), where P(B) ≠ 0.

Although Bayes' theorem is a fundamental result of probability theory, it has a specific interpretation in Bayesian statistics. In the above equation, A usually represents a proposition (such as the statement that a coin lands on heads fifty percent of the time) and B represents the evidence, or new data that is to be taken into account (such as the result of a series of coin flips). P(A) is the prior probability of A, which expresses one's beliefs about A before evidence is taken into account. The prior probability may also quantify prior knowledge or information about A. P(B | A) is the likelihood function, which can be interpreted as the probability of the evidence B given that A is true. The likelihood quantifies the extent to which the evidence B supports the proposition A. P(A | B) is the posterior probability, the probability of the proposition A after taking the evidence B into account. Essentially, Bayes' theorem updates one's prior beliefs P(A) after considering the new evidence B. [2] The probability of the evidence P(B) can be calculated using the law of total probability.
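The update rule described above, P(A | B) = P(B | A) P(A) / P(B), can be checked with a small worked example; the specific prior, rival hypothesis, and data below are invented for illustration:

```python
def bayes_update(prior: float, likelihood: float, evidence: float) -> float:
    """Posterior P(A | B) = P(B | A) * P(A) / P(B)."""
    return likelihood * prior / evidence

# Proposition A: the coin is fair (P(heads) = 0.5). A hypothetical rival
# hypothesis: the coin always lands heads. Evidence B: two heads in two flips.
prior_fair = 0.5
p_data_given_fair = 0.5 ** 2    # likelihood of the data under A
p_data_given_biased = 1.0 ** 2  # likelihood under the rival hypothesis
# Law of total probability over the two hypotheses gives P(B):
p_data = p_data_given_fair * prior_fair + p_data_given_biased * (1.0 - prior_fair)
posterior_fair = bayes_update(prior_fair, p_data_given_fair, p_data)  # 0.2
```

Two heads in a row are four times likelier under the always-heads hypothesis, so belief in the fair coin drops from 0.5 to 0.2.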
If {A_1, A_2, ..., A_n} is a partition of the sample space, which is the set of all outcomes of an experiment, then [2] [7]

P(B) = P(B | A_1) P(A_1) + ... + P(B | A_n) P(A_n) = Σ_i P(B | A_i) P(A_i).

When there are an infinite number of outcomes, it is necessary to integrate over all outcomes to calculate P(B) using the law of total probability. Often, P(B) is difficult to calculate, as the calculation would involve sums or integrals that would be time-consuming to evaluate, so often only the product of the prior and likelihood is considered, since the evidence does not change within the same analysis. The posterior is proportional to this product: [2]

P(A | B) ∝ P(B | A) P(A).

The maximum a posteriori, which is the mode of the posterior and is often computed in Bayesian statistics using mathematical optimization methods, remains the same. The posterior can be approximated even without computing the exact value of P(B) with methods such as Markov chain Monte Carlo or variational Bayesian methods. [2]

The classical textbook equation for the posterior in Bayesian statistics is usually stated as

π(θ | x) = f(x | θ) π(θ) / ∫_Θ f(x | θ) π(θ) dθ,

where π(θ | x) is the updated probability of θ being the true parameter after collecting the data x, f(x | θ) is the likelihood of the data x given the parameter θ, π(θ) is the prior belief about θ, and the integral in the denominator gives the probability of collecting the data x. Mathematically, this version of Bayes' theorem can be constructed from measure theory: starting from a parametric statistical model and a probability space over the parameter space, one forms a product measure on the joint space of data and parameters; conditioning this measure on each coordinate recovers the prior and the likelihood, and Bayes' theorem follows. If the resulting measures are absolutely continuous with respect to Lebesgue measure, densities exist and the integral form above is obtained; if they are absolutely continuous with respect to counting measure, the analogous form with sums is obtained. Identifying the events A and B with the parameter θ and the data x, respectively, recovers the classical equation stated above.
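The point that the posterior can be approximated without ever computing the evidence P(B) is what Markov chain Monte Carlo exploits: a Metropolis sampler only ever evaluates ratios of prior × likelihood, in which the evidence cancels. A minimal sketch for a coin's heads-probability under a flat prior (step size, data, and iteration counts are invented for illustration):

```python
import random

def metropolis_coin(heads: int, flips: int, steps: int = 20_000, seed: int = 0):
    """Sample theta ~ P(theta | data) using only the unnormalized posterior."""
    rng = random.Random(seed)

    def unnorm_posterior(theta: float) -> float:
        if not 0.0 < theta < 1.0:
            return 0.0  # outside the support of the flat prior
        # Flat prior on (0, 1) times the Bernoulli likelihood.
        return theta ** heads * (1.0 - theta) ** (flips - heads)

    theta = 0.5
    samples = []
    for _ in range(steps):
        proposal = theta + rng.gauss(0.0, 0.1)  # symmetric random-walk proposal
        # Accept with probability min(1, ratio); the evidence P(B) cancels here.
        if rng.random() < unnorm_posterior(proposal) / unnorm_posterior(theta):
            theta = proposal
        samples.append(theta)
    return samples

samples = metropolis_coin(heads=7, flips=10)
posterior_mean = sum(samples[5_000:]) / len(samples[5_000:])  # discard burn-in
```

With a flat prior the exact posterior is Beta(8, 4), whose mean is 2/3, so the sample mean should land nearby.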
The general set of statistical techniques can be divided into a number of activities, many of which have special Bayesian versions. Bayesian inference refers to statistical inference where uncertainty in inferences is quantified using probability. [ 8 ] In classical frequentist inference , model parameters and hypotheses are considered to be fixed. Probabilities are not assigned to parameters or hypotheses in frequentist inference. For example, it would not make sense in frequentist inference to directly assign a probability to an event that can only happen once, such as the result of the next flip of a fair coin. However, it would make sense to state that the proportion of heads approaches one-half as the number of coin flips increases. [ 9 ] Statistical models specify a set of statistical assumptions and processes that represent how the sample data are generated. Statistical models have a number of parameters that can be modified. For example, a coin can be represented as samples from a Bernoulli distribution , which models two possible outcomes. The Bernoulli distribution has a single parameter equal to the probability of one outcome, which in most cases is the probability of landing on heads. Devising a good model for the data is central in Bayesian inference. In most cases, models only approximate the true process, and may not take into account certain factors influencing the data. [ 2 ] In Bayesian inference, probabilities can be assigned to model parameters. Parameters can be represented as random variables . Bayesian inference uses Bayes' theorem to update probabilities after more evidence is obtained or known. [ 2 ] [ 10 ] Furthermore, Bayesian methods allow for placing priors on entire models and calculating their posterior probabilities using Bayes' theorem. 
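The coin-as-Bernoulli model described above can also be fit by brute force with a grid approximation: discretize the parameter, multiply prior by likelihood at each grid point, and normalize by the sum, a discrete stand-in for the integral over all outcomes. The grid size and data below are illustrative:

```python
def grid_posterior(heads: int, flips: int, grid_size: int = 1001):
    """Posterior over the Bernoulli heads-probability on a discrete grid."""
    thetas = [i / (grid_size - 1) for i in range(grid_size)]
    prior = [1.0] * grid_size  # flat prior over [0, 1]
    likelihood = [t ** heads * (1.0 - t) ** (flips - heads) for t in thetas]
    unnorm = [p * l for p, l in zip(prior, likelihood)]
    total = sum(unnorm)  # discrete stand-in for integrating over all outcomes
    return thetas, [u / total for u in unnorm]

thetas, posterior = grid_posterior(heads=7, flips=10)
map_estimate = thetas[posterior.index(max(posterior))]  # posterior mode
```

Under a flat prior the mode sits at heads/flips, so 7 heads in 10 flips puts the maximum a posteriori estimate at 0.7.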
These posterior probabilities are proportional to the product of the prior and the marginal likelihood, where the marginal likelihood is the integral of the sampling density over the prior distribution of the parameters. In complex models, marginal likelihoods are generally computed numerically. [ 11 ] Statistical modeling [ edit ] The formulation of statistical models using Bayesian statistics has the identifying feature of requiring the specification of prior distributions for any unknown parameters. Indeed, parameters of prior distributions may themselves have prior distributions, leading to Bayesian hierarchical modeling , [ 12 ] [ 13 ] [ 14 ] also known as multi-level modeling. A special case is Bayesian networks . For conducting a Bayesian statistical analysis, best practices are discussed by van de Schoot et al. [ 15 ] Design of experiments [ edit ] The Bayesian design of experiments includes a concept called 'influence of prior beliefs'. This approach uses sequential analysis techniques to include the outcome of earlier experiments in the design of the next experiment. This is achieved by updating 'beliefs' through the use of prior and posterior distribution . This allows the design of experiments to make good use of resources of all types. An example of this is the multi-armed bandit problem . Exploratory analysis of Bayesian models [ edit ] Exploratory analysis of Bayesian models is an adaptation or extension of the exploratory data analysis approach to the needs and peculiarities of Bayesian modeling. In the words of Persi Diaconis: [ 16 ] Exploratory data analysis seeks to reveal structure, or simple descriptions in data. We look at numbers or graphs and try to find patterns. 
We pursue leads suggested by background information, imagination, patterns perceived, and experience with other data analyses The inference process generates a posterior distribution, which has a central role in Bayesian statistics, together with other distributions like the posterior predictive distribution and the prior predictive distribution. The correct visualization, analysis, and interpretation of these distributions is key to properly answer the questions that motivate the inference process. [ 17 ] When working with Bayesian models there are a series of related tasks that need to be addressed besides inference itself: Diagnoses of the quality of the inference, this is needed when using numerical methods such as Markov chain Monte Carlo techniques Model criticism, including evaluations of both model assumptions and model predictions Comparison of models, including model selection or model averaging Preparation of the results for a particular audience All these tasks are part of the Exploratory analysis of Bayesian models approach and successfully performing them is central to the iterative and interactive modeling process. These tasks require both numerical and visual summaries. [ 18 ] [ 19 ] [ 20 ] Bayesian epistemology For a list of mathematical logic notation used in this article Notation in probability and statistics List of logic symbols ^ "Bayesian" . Merriam-Webster.com Dictionary . Merriam-Webster. OCLC   1032680871 . ^ a b c d e f g h i Gelman, Andrew ; Carlin, John B. ; Stern, Hal S.; Dunson, David B.; Vehtari, Aki; Rubin, Donald B. (2013). Bayesian Data Analysis (Third ed.). Chapman and Hall/CRC. ISBN   978-1-4398-4095-5 . ^ a b McElreath, Richard (2020). Statistical Rethinking : A Bayesian Course with Examples in R and Stan (2nd ed.). Chapman and Hall/CRC. ISBN   978-0-367-13991-9 . ^ Kruschke, John (2014). Doing Bayesian Data Analysis: A Tutorial with R, JAGS, and Stan (2nd ed.). Academic Press. ISBN   978-0-12-405888-0 . 
^ McGrayne, Sharon (2012). The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant from Two Centuries of Controversy (First ed.). Chapman and Hall/CRC. ISBN   978-0-3001-8822-6 . ^ Fienberg, Stephen E. (2006). "When Did Bayesian Inference Become "Bayesian"?" . Bayesian Analysis . 1 (1): 1– 40. doi : 10.1214/06-BA101 . ^ a b Grinstead, Charles M.; Snell, J. Laurie (2006). Introduction to probability (2nd ed.). Providence, RI: American Mathematical Society. ISBN   978-0-8218-9414-9 . ^ Lee, Se Yoon (2021). "Gibbs sampler and coordinate ascent variational inference: A set-theoretical review". Communications in Statistics - Theory and Methods . 51 (6): 1549– 1568. arXiv : 2008.01006 . doi : 10.1080/03610926.2021.1921214 . S2CID   220935477 . ^ Wakefield, Jon (2013). Bayesian and frequentist regression methods . New York, NY: Springer. ISBN   978-1-4419-0924-4 . ^ Congdon, Peter (2014). Applied Bayesian modelling (2nd ed.). Wiley. ISBN   978-1119951513 . ^ Chib, Siddhartha (1995). "Marginal Likelihood from the Gibbs Output". Journal of the American Statistical Association . 90 (432): 1313– 1321. doi : 10.1080/01621459.1995.10476635 . ^ Kruschke, J K ; Vanpaemel, W (2015). "Bayesian Estimation in Hierarchical Models". In Busemeyer, J R; Wang, Z; Townsend, J T; Eidels, A (eds.). The Oxford Handbook of Computational and Mathematical Psychology (PDF) . Oxford University Press. pp.  279– 299. ^ Hajiramezanali, E. & Dadaneh, S. Z. & Karbalayghareh, A. & Zhou, Z. & Qian, X. Bayesian multi-domain learning for cancer subtype discovery from next-generation sequencing count data. 32nd Conference on Neural Information Processing Systems (NIPS 2018), Montréal, Canada. arXiv : 1810.09433 ^ Lee, Se Yoon; Mallick, Bani (2021). "Bayesian Hierarchical Modeling: Application Towards Production Results in the Eagle Ford Shale of South Texas". Sankhya B . 84 : 1– 43. doi : 10.1007/s13571-020-00245-8 . 
^ van de Schoot, Rens; Depaoli, Sarah; King, Ruth; Kramer, Bianca; Märtens, Kaspar; Tadesse, Mahlet G. ; Vannucci, Marina; Gelman, Andrew; Veen, Duco; Willemsen, Joukje; Yau, Christopher (January 14, 2021). "Bayesian statistics and modelling" . Nature Reviews Methods Primers . 1 (1): 1– 26. doi : 10.1038/s43586-020-00001-2 . hdl : 1874/415909 . S2CID   234108684 . ^ Diaconis, Persi (2011) Theories of Data Analysis: From Magical Thinking Through Classical Statistics. John Wiley & Sons, Ltd 2:e55 doi : 10.1002/9781118150702.ch1 ^ Kumar, Ravin; Carroll, Colin; Hartikainen, Ari; Martin, Osvaldo (2019). "ArviZ a unified library for exploratory analysis of Bayesian models in Python" . Journal of Open Source Software . 4 (33): 1143. Bibcode : 2019JOSS....4.1143K . doi : 10.21105/joss.01143 . hdl : 11336/114615 . ^ Gabry, Jonah; Simpson, Daniel; Vehtari, Aki; Betancourt, Michael; Gelman, Andrew (2019). "Visualization in Bayesian workflow". Journal of the Royal Statistical Society, Series A (Statistics in Society) . 182 (2): 389– 402. arXiv : 1709.01449 . doi : 10.1111/rssa.12378 . S2CID   26590874 . ^ Vehtari, Aki; Gelman, Andrew; Simpson, Daniel; Carpenter, Bob; Bürkner, Paul-Christian (2021). "Rank-Normalization, Folding, and Localization: An Improved Rˆ for Assessing Convergence of MCMC (With Discussion)". Bayesian Analysis . 16 (2): 667. arXiv : 1903.08008 . Bibcode : 2021BayAn..16..667V . doi : 10.1214/20-BA1221 . S2CID   88522683 . ^ Martin, Osvaldo (2018). Bayesian Analysis with Python: Introduction to statistical modeling and probabilistic programming using PyMC3 and ArviZ . Packt Publishing Ltd. ISBN   9781789341652 . Bernardo, José M. ; Smith, Adrian F. M. (2000). Bayesian Theory . New York: Wiley. ISBN   0-471-92416-4 . Bolstad, William M.; Curran, James M. (2016). Introduction to Bayesian Statistics (3rd ed.). Wiley. ISBN   978-1-118-09156-2 . Downey, Allen B. (2021). Think Bayes: Bayesian Statistics in Python (2nd ed.). O'Reilly. ISBN   978-1-4920-8946-9 . 
Hoff, Peter D. (2009). A First Course in Bayesian Statistical Methods (2nd ed.). New York: Springer. ISBN   978-1-4419-2828-3 . Lee, Peter M. (2012). Bayesian Statistics: An Introduction (4th ed.). Wiley. ISBN   978-1-118-33257-3 . Robert, Christian P. (2007). The Bayesian Choice : From Decision-Theoretic Foundations to Computational Implementation (2nd ed.). New York: Springer. ISBN   978-0-387-71598-8 . Johnson, Alicia A.; Ott, Mies Q.; Dogucu, Mine (2022). Bayes Rules! An Introduction to Applied Bayesian Modeling . Boca Raton: Chapman & Hall/CRC Texts in Statistical Science. ISBN   978-0-367-25539-8 . Theo Kypraios. "A Gentle Tutorial in Bayesian Statistics" (PDF) . Retrieved 2013-11-03 . Jordi Vallverdu. Bayesians Versus Frequentists A Philosophical Debate on Statistical Reasoning . Bayesian statistics David Spiegelhalter , Kenneth Rice Scholarpedia 4(8):5230. doi:10.4249/scholarpedia.5230 Bayesian modeling book and examples available for downloading. Rens van de Schoot. "A Gentle Introduction to Bayesian Analysis" (PDF) . Bayesian A/B Testing Calculator Dynamic Yield
Markdown
**Bayesian statistics** ([/ˈbeɪziən/](https://en.wikipedia.org/wiki/Help:IPA/English "Help:IPA/English") [*BAY\-zee-ən*](https://en.wikipedia.org/wiki/Help:Pronunciation_respelling_key "Help:Pronunciation respelling key") or [/ˈbeɪʒən/](https://en.wikipedia.org/wiki/Help:IPA/English "Help:IPA/English") [*BAY\-zhən*](https://en.wikipedia.org/wiki/Help:Pronunciation_respelling_key "Help:Pronunciation respelling key"))[\[1\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-1) is a theory in the field of [statistics](https://en.wikipedia.org/wiki/Statistics "Statistics") based on the [Bayesian interpretation of probability](https://en.wikipedia.org/wiki/Bayesian_probability "Bayesian probability"), where [probability](https://en.wikipedia.org/wiki/Probability "Probability") expresses a *degree of belief* in an [event](https://en.wikipedia.org/wiki/Event_\(probability_theory\) "Event (probability theory)"). The degree of belief may be based on prior knowledge about the event, such as the results of previous experiments, or on personal beliefs about the event. 
This differs from a number of other [interpretations of probability](https://en.wikipedia.org/wiki/Probability_interpretations "Probability interpretations"), such as the [frequentist](https://en.wikipedia.org/wiki/Frequentist_probability "Frequentist probability") interpretation, which views probability as the [limit](https://en.wikipedia.org/wiki/Limit_of_a_sequence "Limit of a sequence") of the relative frequency of an event after many trials.[\[2\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-bda-2) More concretely, analysis in Bayesian methods codifies prior knowledge in the form of a [prior distribution](https://en.wikipedia.org/wiki/Prior_distribution "Prior distribution"). Bayesian statistical methods use [Bayes' theorem](https://en.wikipedia.org/wiki/Bayes%27_theorem "Bayes' theorem") to compute and update probabilities after obtaining new data. Bayes' theorem describes the [conditional probability](https://en.wikipedia.org/wiki/Conditional_probability "Conditional probability") of an event based on data as well as prior information or beliefs about the event or conditions related to the event.[\[3\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-rethinking-3)[\[4\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-4) For example, in [Bayesian inference](https://en.wikipedia.org/wiki/Bayesian_inference "Bayesian inference"), Bayes' theorem can be used to estimate the parameters of a [probability distribution](https://en.wikipedia.org/wiki/Probability_distribution "Probability distribution") or [statistical model](https://en.wikipedia.org/wiki/Statistical_model "Statistical model"). 
Since Bayesian statistics treats probability as a degree of belief, Bayes' theorem can directly assign a probability distribution that quantifies the belief to the parameter or set of parameters.[\[2\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-bda-2)[\[3\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-rethinking-3) Bayesian statistics is named after [Thomas Bayes](https://en.wikipedia.org/wiki/Thomas_Bayes "Thomas Bayes"), who formulated a specific case of Bayes' theorem in [a paper](https://en.wikipedia.org/wiki/An_Essay_Towards_Solving_a_Problem_in_the_Doctrine_of_Chances "An Essay Towards Solving a Problem in the Doctrine of Chances") published in 1763. In several papers spanning from the late 18th to the early 19th centuries, [Pierre-Simon Laplace](https://en.wikipedia.org/wiki/Pierre-Simon_Laplace "Pierre-Simon Laplace") developed the Bayesian interpretation of probability.[\[5\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-5) Laplace used methods now considered Bayesian to solve a number of statistical problems. While many Bayesian methods were developed by later authors, the term "Bayesian" was not commonly used to describe these methods until the 1950s. Throughout much of the 20th century, Bayesian methods were viewed unfavorably by many statisticians due to philosophical and practical considerations. Many of these methods were computationally demanding, and the most widely used approaches of the period were based on the frequentist interpretation. 
However, with the advent of powerful computers and new [algorithms](https://en.wikipedia.org/wiki/Algorithm "Algorithm") like [Markov chain Monte Carlo](https://en.wikipedia.org/wiki/Markov_chain_Monte_Carlo "Markov chain Monte Carlo"), Bayesian methods have gained increasing prominence in statistics in the 21st century.[\[2\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-bda-2)[\[6\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-6)

## Bayes' theorem

Main article: [Bayes' theorem](https://en.wikipedia.org/wiki/Bayes%27_theorem "Bayes' theorem")

Bayes' theorem is used in Bayesian methods to update probabilities, which are degrees of belief, after obtaining new data. Given two events $A$ and $B$, the conditional probability of $A$ given that $B$ is true is expressed as follows:[\[7\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-grinsteadsnell2006-7)

$$P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}$$

where $P(B) \neq 0$. Although Bayes' theorem is a fundamental result of [probability theory](https://en.wikipedia.org/wiki/Probability_theory "Probability theory"), it has a specific interpretation in Bayesian statistics. In the above equation, $A$ usually represents a [proposition](https://en.wikipedia.org/wiki/Proposition "Proposition") (such as the statement that a coin lands on heads fifty percent of the time) and $B$ represents the evidence, or new data that is to be taken into account (such as the result of a series of coin flips). $P(A)$ is the [prior probability](https://en.wikipedia.org/wiki/Prior_probability "Prior probability") of $A$, which expresses one's beliefs about $A$ before evidence is taken into account. The prior probability may also quantify prior knowledge or information about $A$. 
$P(B \mid A)$ is the [likelihood function](https://en.wikipedia.org/wiki/Likelihood_function "Likelihood function"), which can be interpreted as the probability of the evidence $B$ given that $A$ is true. The likelihood quantifies the extent to which the evidence $B$ supports the proposition $A$. $P(A \mid B)$ is the [posterior probability](https://en.wikipedia.org/wiki/Posterior_probability "Posterior probability"), the probability of the proposition $A$ after taking the evidence $B$ into account. 
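As a concrete illustration of these three quantities, consider two competing propositions about a coin. The numbers below are assumptions chosen for the sketch, not values from the article:

```python
# Bayes' theorem for two competing propositions about a coin:
#   A  = "the coin is fair"    (P(heads) = 0.5)
#   A' = "the coin is biased"  (P(heads) = 0.8)
# Evidence B = a single flip that lands heads.
# All numbers are illustrative assumptions.

prior_fair, prior_biased = 0.5, 0.5    # priors P(A), P(A')
like_fair, like_biased = 0.5, 0.8      # likelihoods P(B | A), P(B | A')

# Denominator P(B): weighted sum over the two hypotheses.
p_evidence = like_fair * prior_fair + like_biased * prior_biased

# Posteriors via Bayes' theorem: P(A | B) = P(B | A) P(A) / P(B).
posterior_fair = like_fair * prior_fair / p_evidence
posterior_biased = like_biased * prior_biased / p_evidence

print(round(posterior_fair, 4))    # belief in "fair" drops after seeing heads
print(round(posterior_biased, 4))  # belief in "biased" rises
```

After one observed head, belief shifts from an even split toward the biased hypothesis, and the two posteriors still sum to one.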
Essentially, Bayes' theorem updates one's prior beliefs $P(A)$ after considering the new evidence $B$.[\[2\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-bda-2) The probability of the evidence $P(B)$ can be calculated using the [law of total probability](https://en.wikipedia.org/wiki/Law_of_total_probability "Law of total probability"). If $\{A_1, A_2, \dots, A_n\}$ is a [partition](https://en.wikipedia.org/wiki/Partition_of_a_set "Partition of a set") of the [sample space](https://en.wikipedia.org/wiki/Sample_space "Sample space"), which is the set of all [outcomes](https://en.wikipedia.org/wiki/Outcome_\(probability\) "Outcome (probability)") of an experiment, then[\[2\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-bda-2)[\[7\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-grinsteadsnell2006-7)

$$P(B) = P(B \mid A_1)P(A_1) + P(B \mid A_2)P(A_2) + \dots + P(B \mid A_n)P(A_n) = \sum_i P(B \mid A_i)P(A_i)$$

When there are an infinite number of outcomes, it is necessary to [integrate](https://en.wikipedia.org/wiki/Integral "Integral") over all outcomes to calculate $P(B)$ using the law of total probability. Often, $P(B)$ is difficult to calculate, as the calculation would involve sums or integrals that would be time-consuming to evaluate, so often only the product of the prior and likelihood is considered, since the evidence does not change in the same analysis. The posterior is proportional to this product:[\[2\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-bda-2)

$$P(A \mid B) \propto P(B \mid A)P(A)$$

The [maximum a posteriori](https://en.wikipedia.org/wiki/Maximum_a_posteriori "Maximum a posteriori"), which is the [mode](https://en.wikipedia.org/wiki/Mode_\(statistics\) "Mode (statistics)") of the posterior and is often computed in Bayesian statistics using [mathematical optimization](https://en.wikipedia.org/wiki/Mathematical_optimization "Mathematical optimization") methods, remains the same. 
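That the maximum a posteriori estimate is unaffected by the normalizing constant can be checked directly. The sketch below assumes three candidate bias values for a coin and illustrative data of 6 heads in 10 flips:

```python
# The posterior is proportional to likelihood * prior, so the MAP estimate
# (the mode of the posterior) can be found without computing P(B).
# Candidate parameter values and data are illustrative assumptions.
from math import comb

thetas = [0.3, 0.5, 0.7]     # candidate values of P(heads)
prior = [1/3, 1/3, 1/3]      # uniform prior beliefs
heads, flips = 6, 10

def likelihood(theta):
    # Binomial likelihood of observing `heads` in `flips` given theta
    return comb(flips, heads) * theta**heads * (1 - theta)**(flips - heads)

unnormalized = [likelihood(t) * p for t, p in zip(thetas, prior)]
p_b = sum(unnormalized)                      # law of total probability
posterior = [u / p_b for u in unnormalized]  # normalized posterior

# The argmax is the same whether or not we divide by p_b:
map_unnorm = thetas[unnormalized.index(max(unnormalized))]
map_norm = thetas[posterior.index(max(posterior))]
assert map_unnorm == map_norm
print(map_norm)   # MAP estimate
```

Dividing by $P(B)$ rescales every candidate by the same constant, so the ranking of hypotheses, and hence the mode, is unchanged.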
The posterior can be approximated even without computing the exact value of $P(B)$ with methods such as [Markov chain Monte Carlo](https://en.wikipedia.org/wiki/Markov_chain_Monte_Carlo "Markov chain Monte Carlo") or [variational Bayesian methods](https://en.wikipedia.org/wiki/Variational_Bayesian_methods "Variational Bayesian methods").[\[2\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-bda-2)

### Construction

The classical textbook equation for the posterior in Bayesian statistics is usually stated as

$$\pi(\theta \mid x) = \frac{\mathcal{L}(x \mid \theta) \cdot \pi(\theta)}{\int_{\Theta} \mathcal{L}(x \mid \theta') \cdot \pi(\theta')\, d\theta'}$$

where $\pi(\theta \mid x)$ is the updated probability of $\theta$ being the true parameter after collecting the data $x$, $\mathcal{L}(x \mid \theta)$ is the likelihood of collecting the data $x$ given the parameter $\theta$, $\pi(\theta)$ is the prior belief about $\theta$, and the integral in the denominator gives the probability of collecting the data $x$.

Mathematically, this version of Bayes' theorem can be constructed in the following way. Suppose $(\Omega, \Sigma_{\Omega}, \{P_{\theta} \mid \theta \in \Theta\})$ is some parametric statistical model and $(\Theta, \Sigma_{\Theta}, \pi)$ is a probability space over the parameter space. 
We can construct a new probability space $(\Theta \times \Omega, \Sigma_{\Theta} \otimes \Sigma_{\Omega}, Q)$, where $Q$ is a sort of product measure defined as

$$Q(M) := (\pi \otimes P_{\cdot})(M) = \int_{\Theta} P_{\theta'}(M_{\theta'})\, d\pi(\theta')$$

Now, let $A_{\theta} := \{\theta\} \times \Omega$ and $B_x := \Theta \times \{x\}$; then we get

$$Q(\theta) = Q(A_{\theta}) = \int_{\{\theta\}} P_{\theta'}(\Omega)\, d\pi(\theta') = \pi(\{\theta\}) \cdot P_{\theta}(\Omega) = \pi(\theta)$$

and hence

$$Q(x \mid \theta) = \frac{Q(B_x \cap A_{\theta})}{Q(A_{\theta})} = \frac{\pi(\theta) \cdot P_{\theta}(\{x\})}{\pi(\theta)} = P_{\theta}(x)$$

both as might be expected. Thus, Bayes' theorem states:

$$Q(\theta \mid x) = P_{\theta}(x) \cdot \frac{\pi(\theta)}{Q(x)}$$

If $\pi \ll \lambda$ (absolutely continuous with respect to the Lebesgue measure), then there exists a density $\pi(\theta) = \frac{d\pi}{d\lambda}(\theta)$ and we can write

$$Q(x) = \int_{\Theta} P_{\theta'}(x)\, d\pi(\theta') = \int_{\Theta} P_{\theta'}(x) \cdot \pi(\theta')\, d\theta'$$

Otherwise, if $\pi \ll \nu$ (absolutely continuous with respect to the counting measure), we can analogously write

$$Q(x) = \int_{\Theta} P_{\theta'}(x) \cdot \pi(\theta')\, d\nu(\theta') = \sum_i P_{\theta_i}(x) \cdot \pi(\theta_i)$$

Thus, by identifying $Q(\theta \mid x)$ with $\pi(\theta \mid x)$ and $\mathcal{L}(x \mid \theta)$ with $P_{\theta}(x)$, we arrive at the classical equation stated above.

## Bayesian methods

The general set of statistical techniques can be divided into a number of activities, many of which have special Bayesian versions. 
### Bayesian inference

Main article: [Bayesian inference](https://en.wikipedia.org/wiki/Bayesian_inference "Bayesian inference")

Bayesian inference refers to [statistical inference](https://en.wikipedia.org/wiki/Statistical_inference "Statistical inference") where uncertainty in inferences is quantified using probability.[\[8\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-8) In classical [frequentist inference](https://en.wikipedia.org/wiki/Frequentist_inference "Frequentist inference"), model [parameters](https://en.wikipedia.org/wiki/Parameter "Parameter") and hypotheses are considered to be fixed. Probabilities are not assigned to parameters or hypotheses in frequentist inference. For example, it would not make sense in frequentist inference to directly assign a probability to an event that can only happen once, such as the result of the next flip of a fair coin. However, it would make sense to state that the proportion of heads [approaches one-half](https://en.wikipedia.org/wiki/Law_of_large_numbers "Law of large numbers") as the number of coin flips increases.[\[9\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-wakefield2013-9)

[Statistical models](https://en.wikipedia.org/wiki/Statistical_models "Statistical models") specify a set of statistical assumptions and processes that represent how the sample data are generated. Statistical models have a number of parameters that can be modified. For example, a coin can be represented as samples from a [Bernoulli distribution](https://en.wikipedia.org/wiki/Bernoulli_distribution "Bernoulli distribution"), which models two possible outcomes. The Bernoulli distribution has a single parameter equal to the probability of one outcome, which in most cases is the probability of landing on heads. Devising a good model for the data is central in Bayesian inference. 
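The coin model just described can be made concrete. A standard fact (not specific to this article) is that a Beta prior on the Bernoulli parameter is conjugate: the posterior after observing heads and tails is again a Beta distribution with updated counts. The prior and data below are illustrative assumptions:

```python
# Coin flips modeled as Bernoulli(theta) samples, theta = P(heads).
# In Bayesian inference theta is treated as a random variable; a Beta(a, b)
# prior is conjugate to the Bernoulli likelihood, so observing h heads and
# t tails gives the posterior Beta(a + h, b + t).

def update_beta(a, b, flips):
    """Update a Beta(a, b) prior with a sequence of flips ('H' or 'T')."""
    for f in flips:
        if f == "H":
            a += 1
        else:
            b += 1
    return a, b

a, b = 1, 1                            # Beta(1, 1): uniform prior belief
a, b = update_beta(a, b, "HHTHHTHH")   # illustrative data: 6 heads, 2 tails

posterior_mean = a / (a + b)           # mean of Beta(a, b)
print(a, b, round(posterior_mean, 3))  # 7 3 0.7
```

Each flip shifts the posterior slightly toward the observed outcome; with more data, the posterior concentrates around the empirical frequency of heads.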
In most cases, models only approximate the true process, and may not take into account certain factors influencing the data.[\[2\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-bda-2) In Bayesian inference, probabilities can be assigned to model parameters. Parameters can be represented as [random variables](https://en.wikipedia.org/wiki/Random_variable "Random variable"). Bayesian inference uses Bayes' theorem to update probabilities after more evidence is obtained or known.[\[2\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-bda-2)[\[10\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-congdon2014-10)

Furthermore, Bayesian methods allow for placing priors on entire models and calculating their posterior probabilities using Bayes' theorem. These posterior probabilities are proportional to the product of the prior and the marginal likelihood, where the marginal likelihood is the integral of the sampling density over the prior distribution of the parameters. In complex models, marginal likelihoods are generally computed numerically.[\[11\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-chib1995-11)

### Statistical modeling

The formulation of [statistical models](https://en.wikipedia.org/wiki/Statistical_model "Statistical model") using Bayesian statistics has the identifying feature of requiring the specification of [prior distributions](https://en.wikipedia.org/wiki/Prior_distribution "Prior distribution") for any unknown parameters. 
Indeed, parameters of prior distributions may themselves have prior distributions, leading to [Bayesian hierarchical modeling](https://en.wikipedia.org/wiki/Bayesian_hierarchical_modeling "Bayesian hierarchical modeling"),[\[12\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-KruschkeVanpaemel2015-12)[\[13\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-:bmdl-13)[\[14\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-14) also known as multi-level modeling. A special case is [Bayesian networks](https://en.wikipedia.org/wiki/Bayesian_networks "Bayesian networks"). Best practices for conducting a Bayesian statistical analysis are discussed by van de Schoot et al.[\[15\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-vandeShootEtAl2021-15)

### Design of experiments

The [Bayesian design of experiments](https://en.wikipedia.org/wiki/Bayesian_design_of_experiments "Bayesian design of experiments") includes a concept called the 'influence of prior beliefs'. This approach uses [sequential analysis](https://en.wikipedia.org/wiki/Sequential_analysis "Sequential analysis") techniques to include the outcome of earlier experiments in the design of the next experiment. This is achieved by updating 'beliefs' through the use of prior and [posterior distributions](https://en.wikipedia.org/wiki/Posterior_distribution "Posterior distribution"). This allows the design of experiments to make good use of resources of all types. An example of this is the [multi-armed bandit problem](https://en.wikipedia.org/wiki/Multi-armed_bandit_problem "Multi-armed bandit problem"). 
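A classic Bayesian treatment of the multi-armed bandit is Thompson sampling: maintain a posterior over each arm's success rate, and at each step play the arm whose posterior sample is largest. The sketch below assumes two arms with hypothetical success rates and Beta posteriors:

```python
# Thompson sampling for a two-armed Bernoulli bandit: each arm keeps a
# Beta posterior over its success rate; at each step we sample a plausible
# rate for every arm from its posterior, play the best-looking arm, and
# update that arm's posterior. The true rates below are assumptions.
import random

random.seed(0)
true_rates = [0.3, 0.6]     # hidden success probability of each arm
alpha = [1, 1]              # Beta posterior parameters per arm
beta = [1, 1]               # (Beta(1, 1) = uniform prior on each rate)

for _ in range(2000):
    # Draw one sample per arm from its current Beta posterior...
    samples = [random.betavariate(alpha[i], beta[i]) for i in range(2)]
    arm = samples.index(max(samples))    # ...and play the best-looking arm.
    reward = random.random() < true_rates[arm]
    # Conjugate update: Beta(a, b) -> Beta(a + reward, b + 1 - reward)
    alpha[arm] += reward
    beta[arm] += 1 - reward

plays = [alpha[i] + beta[i] - 2 for i in range(2)]
print(plays)   # plays should concentrate on the better arm (index 1)
```

Early on, wide posteriors make the algorithm explore both arms; as evidence accumulates, the posterior for the better arm dominates the sampling and play concentrates there, which is how the sequential design "makes good use of resources".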
### Exploratory analysis of Bayesian models

Exploratory analysis of Bayesian models is an adaptation or extension of the [exploratory data analysis](https://en.wikipedia.org/wiki/Exploratory_data_analysis "Exploratory data analysis") approach to the needs and peculiarities of Bayesian modeling. In the words of Persi Diaconis:[\[16\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-16)

> Exploratory data analysis seeks to reveal structure, or simple descriptions in data. We look at numbers or graphs and try to find patterns. We pursue leads suggested by background information, imagination, patterns perceived, and experience with other data analyses.

The [inference process](https://en.wikipedia.org/wiki/Bayesian_inference "Bayesian inference") generates a posterior distribution, which has a central role in Bayesian statistics, together with other distributions like the posterior predictive distribution and the prior predictive distribution. 
The correct visualization, analysis, and interpretation of these distributions is key to properly answering the questions that motivate the inference process.[\[17\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-17) When working with Bayesian models, there are a series of related tasks that need to be addressed besides inference itself:

- Diagnosis of the quality of the inference; this is needed when using numerical methods such as [Markov chain Monte Carlo](https://en.wikipedia.org/wiki/Markov_chain_Monte_Carlo "Markov chain Monte Carlo") techniques
- Model criticism, including evaluations of both model assumptions and model predictions
- Comparison of models, including model selection or model averaging
- Preparation of the results for a particular audience

All these tasks are part of the exploratory analysis of Bayesian models approach, and successfully performing them is central to the iterative and interactive modeling process. These tasks require both numerical and visual summaries.[\[18\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-18)[\[19\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-19)[\[20\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-Martin2018-20)

## See also

- [Bayesian epistemology](https://en.wikipedia.org/wiki/Bayesian_epistemology "Bayesian epistemology")
- For the mathematical notation used in this article, see [Notation in probability and statistics](https://en.wikipedia.org/wiki/Notation_in_probability_and_statistics "Notation in probability and statistics") and [List of logic symbols](https://en.wikipedia.org/wiki/List_of_logic_symbols "List of logic symbols")

## References
1. ["Bayesian"](https://www.merriam-webster.com/dictionary/Bayesian). *Merriam-Webster.com Dictionary*. Merriam-Webster. OCLC 1032680871.
2. [Gelman, Andrew](https://en.wikipedia.org/wiki/Andrew_Gelman "Andrew Gelman"); Carlin, John B.; Stern, Hal S.; Dunson, David B.; Vehtari, Aki; Rubin, Donald B. (2013). *Bayesian Data Analysis* (3rd ed.). Chapman and Hall/CRC. ISBN 978-1-4398-4095-5.
3. [McElreath, Richard](https://en.wikipedia.org/wiki/Richard_McElreath "Richard McElreath") (2020). *Statistical Rethinking: A Bayesian Course with Examples in R and Stan* (2nd ed.). Chapman and Hall/CRC. ISBN 978-0-367-13991-9.
4. Kruschke, John (2014). *Doing Bayesian Data Analysis: A Tutorial with R, JAGS, and Stan* (2nd ed.). Academic Press. ISBN 978-0-12-405888-0.
5. McGrayne, Sharon Bertsch (2012). *The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant from Two Centuries of Controversy* (1st ed.). Yale University Press. ISBN 978-0-3001-8822-6.
6. Fienberg, Stephen E. (2006). ["When Did Bayesian Inference Become 'Bayesian'?"](https://doi.org/10.1214%2F06-BA101). *Bayesian Analysis*. **1** (1): 1–40. doi:10.1214/06-BA101.
7. Grinstead, Charles M.; Snell, J. Laurie (2006). *Introduction to Probability* (2nd ed.). Providence, RI: American Mathematical Society. ISBN 978-0-8218-9414-9.
8. Lee, Se Yoon (2021). ["Gibbs sampler and coordinate ascent variational inference: A set-theoretical review"](https://doi.org/10.1080%2F03610926.2021.1921214). *Communications in Statistics - Theory and Methods*. **51** (6): 1549–1568. arXiv:2008.01006. doi:10.1080/03610926.2021.1921214. S2CID 220935477.
9. Wakefield, Jon (2013). *Bayesian and Frequentist Regression Methods*. New York, NY: Springer. ISBN 978-1-4419-0924-4.
10. Congdon, Peter (2014). *Applied Bayesian Modelling* (2nd ed.). Wiley. ISBN 978-1119951513.
11. 
**[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-chib1995_11-0)** Chib, Siddhartha (1995). "Marginal Likelihood from the Gibbs Output". *Journal of the American Statistical Association*. **90** (432): 1313–1321\. [doi](https://en.wikipedia.org/wiki/Doi_\(identifier\) "Doi (identifier)"):[10\.1080/01621459.1995.10476635](https://doi.org/10.1080%2F01621459.1995.10476635). 12. **[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-KruschkeVanpaemel2015_12-0)** [Kruschke, J K](https://en.wikipedia.org/wiki/John_K._Kruschke "John K. Kruschke"); Vanpaemel, W (2015). "Bayesian Estimation in Hierarchical Models". In Busemeyer, J R; Wang, Z; Townsend, J T; Eidels, A (eds.). [*The Oxford Handbook of Computational and Mathematical Psychology*](https://jkkweb.sitehost.iu.edu/articles/KruschkeVanpaemel2015.pdf) (PDF). Oxford University Press. pp. 279–299\. 13. **[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-:bmdl_13-0)** Hajiramezanali, E. & Dadaneh, S. Z. & Karbalayghareh, A. & Zhou, Z. & Qian, X. Bayesian multi-domain learning for cancer subtype discovery from next-generation sequencing count data. 32nd Conference on Neural Information Processing Systems (NIPS 2018), Montréal, Canada. [arXiv](https://en.wikipedia.org/wiki/ArXiv_\(identifier\) "ArXiv (identifier)"):[1810\.09433](https://arxiv.org/abs/1810.09433) 14. **[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-14)** Lee, Se Yoon; Mallick, Bani (2021). "Bayesian Hierarchical Modeling: Application Towards Production Results in the Eagle Ford Shale of South Texas". *Sankhya B*. **84**: 1–43\. [doi](https://en.wikipedia.org/wiki/Doi_\(identifier\) "Doi (identifier)"):[10\.1007/s13571-020-00245-8](https://doi.org/10.1007%2Fs13571-020-00245-8). 15. 
**[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-vandeShootEtAl2021_15-0)** van de Schoot, Rens; Depaoli, Sarah; King, Ruth; Kramer, Bianca; Märtens, Kaspar; [Tadesse, Mahlet G.](https://en.wikipedia.org/wiki/Mahlet_Tadesse "Mahlet Tadesse"); Vannucci, Marina; Gelman, Andrew; Veen, Duco; Willemsen, Joukje; Yau, Christopher (January 14, 2021). ["Bayesian statistics and modelling"](https://osf.io/wdtmc/). *[Nature Reviews Methods Primers](https://en.wikipedia.org/wiki/Nature_Reviews_Methods_Primers "Nature Reviews Methods Primers")*. **1** (1): 1–26\. [doi](https://en.wikipedia.org/wiki/Doi_\(identifier\) "Doi (identifier)"):[10\.1038/s43586-020-00001-2](https://doi.org/10.1038%2Fs43586-020-00001-2). [hdl](https://en.wikipedia.org/wiki/Hdl_\(identifier\) "Hdl (identifier)"):[1874/415909](https://hdl.handle.net/1874%2F415909). [S2CID](https://en.wikipedia.org/wiki/S2CID_\(identifier\) "S2CID (identifier)") [234108684](https://api.semanticscholar.org/CorpusID:234108684). 16. **[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-16)** Diaconis, Persi (2011) Theories of Data Analysis: From Magical Thinking Through Classical Statistics. John Wiley & Sons, Ltd 2:e55 [doi](https://en.wikipedia.org/wiki/Doi_\(identifier\) "Doi (identifier)"):[10\.1002/9781118150702.ch1](https://doi.org/10.1002%2F9781118150702.ch1) 17. **[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-17)** Kumar, Ravin; Carroll, Colin; Hartikainen, Ari; Martin, Osvaldo (2019). ["ArviZ a unified library for exploratory analysis of Bayesian models in Python"](https://doi.org/10.21105%2Fjoss.01143). *Journal of Open Source Software*. **4** (33): 1143. [Bibcode](https://en.wikipedia.org/wiki/Bibcode_\(identifier\) "Bibcode (identifier)"):[2019JOSS....4.1143K](https://ui.adsabs.harvard.edu/abs/2019JOSS....4.1143K). [doi](https://en.wikipedia.org/wiki/Doi_\(identifier\) "Doi (identifier)"):[10\.21105/joss.01143](https://doi.org/10.21105%2Fjoss.01143). 
[hdl](https://en.wikipedia.org/wiki/Hdl_\(identifier\) "Hdl (identifier)"):[11336/114615](https://hdl.handle.net/11336%2F114615). 18. **[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-18)** Gabry, Jonah; Simpson, Daniel; Vehtari, Aki; Betancourt, Michael; Gelman, Andrew (2019). "Visualization in Bayesian workflow". *Journal of the Royal Statistical Society, Series A (Statistics in Society)*. **182** (2): 389–402\. [arXiv](https://en.wikipedia.org/wiki/ArXiv_\(identifier\) "ArXiv (identifier)"):[1709\.01449](https://arxiv.org/abs/1709.01449). [doi](https://en.wikipedia.org/wiki/Doi_\(identifier\) "Doi (identifier)"):[10\.1111/rssa.12378](https://doi.org/10.1111%2Frssa.12378). [S2CID](https://en.wikipedia.org/wiki/S2CID_\(identifier\) "S2CID (identifier)") [26590874](https://api.semanticscholar.org/CorpusID:26590874). 19. **[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-19)** Vehtari, Aki; Gelman, Andrew; Simpson, Daniel; Carpenter, Bob; Bürkner, Paul-Christian (2021). "Rank-Normalization, Folding, and Localization: An Improved Rˆ for Assessing Convergence of MCMC (With Discussion)". *Bayesian Analysis*. **16** (2): 667. [arXiv](https://en.wikipedia.org/wiki/ArXiv_\(identifier\) "ArXiv (identifier)"):[1903\.08008](https://arxiv.org/abs/1903.08008). [Bibcode](https://en.wikipedia.org/wiki/Bibcode_\(identifier\) "Bibcode (identifier)"):[2021BayAn..16..667V](https://ui.adsabs.harvard.edu/abs/2021BayAn..16..667V). [doi](https://en.wikipedia.org/wiki/Doi_\(identifier\) "Doi (identifier)"):[10\.1214/20-BA1221](https://doi.org/10.1214%2F20-BA1221). [S2CID](https://en.wikipedia.org/wiki/S2CID_\(identifier\) "S2CID (identifier)") [88522683](https://api.semanticscholar.org/CorpusID:88522683). 20. **[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-Martin2018_20-0)** Martin, Osvaldo (2018). 
[*Bayesian Analysis with Python: Introduction to statistical modeling and probabilistic programming using PyMC3 and ArviZ*](https://books.google.com/books?id=1Z2BDwAAQBAJ). Packt Publishing Ltd. [ISBN](https://en.wikipedia.org/wiki/ISBN_\(identifier\) "ISBN (identifier)") [9781789341652](https://en.wikipedia.org/wiki/Special:BookSources/9781789341652 "Special:BookSources/9781789341652") . ## Further reading \[[edit](https://en.wikipedia.org/w/index.php?title=Bayesian_statistics&action=edit&section=10 "Edit section: Further reading")\] - [Bernardo, José M.](https://en.wikipedia.org/wiki/Jos%C3%A9-Miguel_Bernardo "José-Miguel Bernardo"); [Smith, Adrian F. M.](https://en.wikipedia.org/wiki/Adrian_Smith_\(statistician\) "Adrian Smith (statistician)") (2000). *Bayesian Theory*. New York: Wiley. [ISBN](https://en.wikipedia.org/wiki/ISBN_\(identifier\) "ISBN (identifier)") [0-471-92416-4](https://en.wikipedia.org/wiki/Special:BookSources/0-471-92416-4 "Special:BookSources/0-471-92416-4") . - Bolstad, William M.; Curran, James M. (2016). *Introduction to Bayesian Statistics* (3rd ed.). Wiley. [ISBN](https://en.wikipedia.org/wiki/ISBN_\(identifier\) "ISBN (identifier)") [978-1-118-09156-2](https://en.wikipedia.org/wiki/Special:BookSources/978-1-118-09156-2 "Special:BookSources/978-1-118-09156-2") . - [Downey, Allen B.](https://en.wikipedia.org/wiki/Allen_B._Downey "Allen B. Downey") (2021). [*Think Bayes: Bayesian Statistics in Python*](https://greenteapress.com/wp/think-bayes/) (2nd ed.). O'Reilly. [ISBN](https://en.wikipedia.org/wiki/ISBN_\(identifier\) "ISBN (identifier)") [978-1-4920-8946-9](https://en.wikipedia.org/wiki/Special:BookSources/978-1-4920-8946-9 "Special:BookSources/978-1-4920-8946-9") . - Hoff, Peter D. (2009). *A First Course in Bayesian Statistical Methods* (2nd ed.). New York: Springer. 
ISBN 978-1-4419-2828-3.
- Lee, Peter M. (2012). *Bayesian Statistics: An Introduction* (4th ed.). Wiley. ISBN 978-1-118-33257-3.
- Robert, Christian P. (2007). *The Bayesian Choice: From Decision-Theoretic Foundations to Computational Implementation* (2nd ed.). New York: Springer. ISBN 978-0-387-71598-8.
- Johnson, Alicia A.; Ott, Miles Q.; Dogucu, Mine (2022). [*Bayes Rules! An Introduction to Applied Bayesian Modeling*](https://www.bayesrulesbook.com/). Boca Raton: Chapman & Hall/CRC. ISBN 978-0-367-25539-8.

## External links

- Wikiversity has learning resources about [*Bayesian statistics*](https://en.wikiversity.org/wiki/Bayesian_statistics "v:Bayesian statistics")
- Theo Kypraios. ["A Gentle Tutorial in Bayesian Statistics"](https://kupdf.com/download/a-gentle-tutorial-in-bayesian-statisticspdf_59b0ed86dc0d602e3b568edc_pdf) (PDF). Retrieved 2013-11-03.
- Jordi Vallverdú. [*Bayesians Versus Frequentists: A Philosophical Debate on Statistical Reasoning*](https://www.springer.com/gp/book/9783662486368).
- [David Spiegelhalter](https://en.wikipedia.org/wiki/David_Spiegelhalter "David Spiegelhalter"); Kenneth Rice. ["Bayesian statistics"](http://www.scholarpedia.org/article/Bayesian_statistics). *[Scholarpedia](https://en.wikipedia.org/wiki/Scholarpedia "Scholarpedia")* 4(8): 5230. [doi:10.4249/scholarpedia.5230](https://doi.org/10.4249/scholarpedia.5230 "doi:10.4249/scholarpedia.5230").
- [Bayesian modeling book](http://bayesmodels.com/) and examples available for downloading.
- Rens van de Schoot. ["A Gentle Introduction to Bayesian Analysis"](https://www.statmodel.com/download/introBayes.pdf) (PDF).
- [Bayesian A/B Testing Calculator](https://marketing.dynamicyield.com/bayesian-calculator/), Dynamic Yield.
Readable Markdown
From Wikipedia, the free encyclopedia **Bayesian statistics** ( [*BAY\-zee-ən*](https://en.wikipedia.org/wiki/Help:Pronunciation_respelling_key "Help:Pronunciation respelling key") or [*BAY\-zhən*](https://en.wikipedia.org/wiki/Help:Pronunciation_respelling_key "Help:Pronunciation respelling key"))[\[1\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-1) is a theory in the field of [statistics](https://en.wikipedia.org/wiki/Statistics "Statistics") based on the [Bayesian interpretation of probability](https://en.wikipedia.org/wiki/Bayesian_probability "Bayesian probability"), where [probability](https://en.wikipedia.org/wiki/Probability "Probability") expresses a *degree of belief* in an [event](https://en.wikipedia.org/wiki/Event_\(probability_theory\) "Event (probability theory)"). The degree of belief may be based on prior knowledge about the event, such as the results of previous experiments, or on personal beliefs about the event. This differs from a number of other [interpretations of probability](https://en.wikipedia.org/wiki/Probability_interpretations "Probability interpretations"), such as the [frequentist](https://en.wikipedia.org/wiki/Frequentist_probability "Frequentist probability") interpretation, which views probability as the [limit](https://en.wikipedia.org/wiki/Limit_of_a_sequence "Limit of a sequence") of the relative frequency of an event after many trials.[\[2\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-bda-2) More concretely, analysis in Bayesian methods codifies prior knowledge in the form of a [prior distribution](https://en.wikipedia.org/wiki/Prior_distribution "Prior distribution"). Bayesian statistical methods use [Bayes' theorem](https://en.wikipedia.org/wiki/Bayes%27_theorem "Bayes' theorem") to compute and update probabilities after obtaining new data. 
Bayes' theorem describes the [conditional probability](https://en.wikipedia.org/wiki/Conditional_probability "Conditional probability") of an event based on data as well as prior information or beliefs about the event or conditions related to the event.[\[3\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-rethinking-3)[\[4\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-4) For example, in [Bayesian inference](https://en.wikipedia.org/wiki/Bayesian_inference "Bayesian inference"), Bayes' theorem can be used to estimate the parameters of a [probability distribution](https://en.wikipedia.org/wiki/Probability_distribution "Probability distribution") or [statistical model](https://en.wikipedia.org/wiki/Statistical_model "Statistical model"). Since Bayesian statistics treats probability as a degree of belief, Bayes' theorem can directly assign a probability distribution that quantifies the belief to the parameter or set of parameters.[\[2\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-bda-2)[\[3\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-rethinking-3) Bayesian statistics is named after [Thomas Bayes](https://en.wikipedia.org/wiki/Thomas_Bayes "Thomas Bayes"), who formulated a specific case of Bayes' theorem in [a paper](https://en.wikipedia.org/wiki/An_Essay_Towards_Solving_a_Problem_in_the_Doctrine_of_Chances "An Essay Towards Solving a Problem in the Doctrine of Chances") published in 1763. In several papers spanning from the late 18th to the early 19th centuries, [Pierre-Simon Laplace](https://en.wikipedia.org/wiki/Pierre-Simon_Laplace "Pierre-Simon Laplace") developed the Bayesian interpretation of probability.[\[5\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-5) Laplace used methods now considered Bayesian to solve a number of statistical problems. 
While many Bayesian methods were developed by later authors, the term "Bayesian" was not commonly used to describe these methods until the 1950s. Throughout much of the 20th century, Bayesian methods were viewed unfavorably by many statisticians due to philosophical and practical considerations. Many of these methods required much computation, and most widely used approaches during that time were based on the frequentist interpretation. However, with the advent of powerful computers and new [algorithms](https://en.wikipedia.org/wiki/Algorithm "Algorithm") like [Markov chain Monte Carlo](https://en.wikipedia.org/wiki/Markov_chain_Monte_Carlo "Markov chain Monte Carlo"), Bayesian methods have gained increasing prominence in statistics in the 21st century.[\[2\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-bda-2)[\[6\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-6)

## Bayes' theorem

Bayes' theorem is used in Bayesian methods to update probabilities, which are degrees of belief, after obtaining new data. Given two events $A$ and $B$, the conditional probability of $A$ given that $B$ is true is expressed as follows:[\[7\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-grinsteadsnell2006-7)

$$P(A\mid B)={\frac {P(B\mid A)P(A)}{P(B)}}$$

where $P(B)\neq 0$.
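The update in the formula above can be made concrete with a small numerical sketch. The numbers below (a 1% prior and 95%/5% test rates) are illustrative choices, not values from the article:

```python
# Bayes' theorem for two events: P(A | B) = P(B | A) * P(A) / P(B).
# Illustrative setup: A = "has the disease" with prior P(A) = 0.01,
# B = "test is positive", with P(B | A) = 0.95 and P(B | not A) = 0.05.

def bayes(p_a: float, p_b_given_a: float, p_b_given_not_a: float) -> float:
    """Posterior P(A | B), with P(B) obtained via the law of total probability."""
    p_b = p_b_given_a * p_a + p_b_given_not_a * (1.0 - p_a)  # P(B)
    return p_b_given_a * p_a / p_b

posterior = bayes(p_a=0.01, p_b_given_a=0.95, p_b_given_not_a=0.05)
print(round(posterior, 3))  # the 1% prior belief rises to roughly 16%
```

Because the evidence is far more likely under $A$ than under its complement, the posterior is well above the prior, yet the low prior keeps it far below 95%.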
Although Bayes' theorem is a fundamental result of [probability theory](https://en.wikipedia.org/wiki/Probability_theory "Probability theory"), it has a specific interpretation in Bayesian statistics. In the above equation, $A$ usually represents a [proposition](https://en.wikipedia.org/wiki/Proposition "Proposition") (such as the statement that a coin lands on heads fifty percent of the time) and $B$ represents the evidence, or new data that is to be taken into account (such as the result of a series of coin flips). $P(A)$ is the [prior probability](https://en.wikipedia.org/wiki/Prior_probability "Prior probability") of $A$, which expresses one's beliefs about $A$ before evidence is taken into account. The prior probability may also quantify prior knowledge or information about $A$.
$P(B\mid A)$ is the [likelihood function](https://en.wikipedia.org/wiki/Likelihood_function "Likelihood function"), which can be interpreted as the probability of the evidence $B$ given that $A$ is true. The likelihood quantifies the extent to which the evidence $B$ supports the proposition $A$. $P(A\mid B)$ is the [posterior probability](https://en.wikipedia.org/wiki/Posterior_probability "Posterior probability"), the probability of the proposition $A$ after taking the evidence $B$ into account.
Essentially, Bayes' theorem updates one's prior beliefs $P(A)$ after considering the new evidence $B$.[\[2\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-bda-2) The probability of the evidence $P(B)$ can be calculated using the [law of total probability](https://en.wikipedia.org/wiki/Law_of_total_probability "Law of total probability"). If $\{A_{1},A_{2},\dots ,A_{n}\}$ is a [partition](https://en.wikipedia.org/wiki/Partition_of_a_set "Partition of a set") of the [sample space](https://en.wikipedia.org/wiki/Sample_space "Sample space"), which is the set of all [outcomes](https://en.wikipedia.org/wiki/Outcome_\(probability\) "Outcome (probability)") of an experiment, then,[\[2\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-bda-2)[\[7\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-grinsteadsnell2006-7)

$$P(B)=P(B\mid A_{1})P(A_{1})+P(B\mid A_{2})P(A_{2})+\dots +P(B\mid A_{n})P(A_{n})=\sum _{i}P(B\mid A_{i})P(A_{i})$$

When there are an infinite number of outcomes, it is necessary to [integrate](https://en.wikipedia.org/wiki/Integral "Integral") over all outcomes to calculate $P(B)$ using the law of total probability.
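The partition formula above can be checked with a small worked example. The three-coin setup below is an illustrative assumption, not from the article:

```python
# P(B) = sum_i P(B | A_i) P(A_i) over a partition {A_1, ..., A_n}.
# Illustrative partition: a coin drawn from a bag is fair, two-headed,
# or biased toward 25% heads, each with prior probability 1/3.
# B is the event "the drawn coin lands heads on one flip".

priors = {"fair": 1 / 3, "two-headed": 1 / 3, "biased": 1 / 3}  # P(A_i)
heads_prob = {"fair": 0.5, "two-headed": 1.0, "biased": 0.25}   # P(B | A_i)

# Total probability of heads, marginalizing over which coin was drawn.
p_b = sum(heads_prob[coin] * priors[coin] for coin in priors)
print(p_b)  # (0.5 + 1.0 + 0.25) / 3
```

Each term weights the conditional probability of heads by how plausible that branch of the partition is a priori.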
Often, $P(B)$ is difficult to calculate, as doing so would involve sums or integrals that are time-consuming to evaluate, so only the product of the prior and likelihood is considered; since the evidence does not change within a given analysis, the posterior is proportional to this product:[\[2\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-bda-2)

$$P(A\mid B)\propto P(B\mid A)P(A)$$

The [maximum a posteriori](https://en.wikipedia.org/wiki/Maximum_a_posteriori "Maximum a posteriori"), which is the [mode](https://en.wikipedia.org/wiki/Mode_\(statistics\) "Mode (statistics)") of the posterior and is often computed in Bayesian statistics using [mathematical optimization](https://en.wikipedia.org/wiki/Mathematical_optimization "Mathematical optimization") methods, remains the same.
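A grid sketch can show why dropping P(B) leaves the mode unchanged: rescaling every value of the unnormalized posterior by the same constant does not move its argmax. The coin-flip data (7 heads in 10 flips) and uniform prior below are illustrative assumptions, not from the article:

```python
# Unnormalized posterior on a grid: posterior ∝ prior * likelihood.
# Illustrative model: theta = probability of heads, uniform prior,
# observed data = 7 heads and 3 tails (binomial likelihood up to a constant).

n_grid = 1001
grid = [i / (n_grid - 1) for i in range(n_grid)]        # candidate theta values
prior = [1.0] * n_grid                                  # flat prior, unnormalized
likelihood = [t**7 * (1 - t)**3 for t in grid]          # 7 heads, 3 tails
unnormalized = [p * l for p, l in zip(prior, likelihood)]

# The MAP is the grid point maximizing the unnormalized posterior;
# dividing by the (unknown) evidence would not change this argmax.
map_theta = grid[max(range(n_grid), key=lambda i: unnormalized[i])]
print(map_theta)  # 0.7, matching the maximum-likelihood value under a flat prior
```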
The posterior can be approximated even without computing the exact value of $P(B)$ with methods such as [Markov chain Monte Carlo](https://en.wikipedia.org/wiki/Markov_chain_Monte_Carlo "Markov chain Monte Carlo") or [variational Bayesian methods](https://en.wikipedia.org/wiki/Variational_Bayesian_methods "Variational Bayesian methods").[\[2\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-bda-2)

The classical textbook equation for the posterior in Bayesian statistics is usually stated as

$$\pi (\theta \mid x)={\mathcal {L}}(x\mid \theta )\cdot {\frac {\pi (\theta )}{\int _{\Theta }{\mathcal {L}}(x\mid \theta ')\cdot \pi (\theta ')\;d\theta '}}$$

where $\pi (\theta \mid x)$ is the updated probability of $\theta$ being the true parameter after collecting the data $x$, ${\mathcal {L}}(x\mid \theta )$ is the likelihood of collecting the data $x$ given the parameter $\theta$, $\pi (\theta )$ is the prior belief about $\theta$, and the integral in the denominator gives the probability of collecting the data $x$.

Mathematically, this version of Bayes' theorem can be constructed in the following way: suppose $(\Omega ,\Sigma _{\Omega },\lbrace P_{\theta }\mid \theta \in \Theta \rbrace )$ is some parametric statistical model and $(\Theta ,\Sigma _{\Theta },\pi )$ is a probability space over the parameter space. We can construct a new probability space $(\Theta \times \Omega ,\Sigma _{\Theta }\otimes \Sigma _{\Omega },Q)$, where $Q$ is a sort of product measure defined as

$$Q(M):=(\pi \otimes P_{\cdot })(M)=\int _{\Theta }P_{\theta '}(M_{\theta '})\;d\pi (\theta ')$$

Now, let $A_{\theta }:=\lbrace \theta \rbrace \times \Omega$ and $B_{x}:=\Theta \times \lbrace x\rbrace$; then we get

$$Q(\theta )=Q(A_{\theta })=\int _{\lbrace \theta \rbrace }P_{\theta '}(\Omega )\;d\pi (\theta ')=\pi (\lbrace \theta \rbrace )\cdot P_{\theta }(\Omega )=\pi (\theta )$$

and hence

$$Q(x\mid \theta )={\frac {Q(B_{x}\cap A_{\theta })}{Q(A_{\theta })}}={\frac {\pi (\theta )\cdot P_{\theta }(\lbrace x\rbrace )}{\pi (\theta )}}=P_{\theta }(x)$$

both as might be expected. Thus, Bayes' theorem states:

$$Q(\theta \mid x)=P_{\theta }(x)\cdot {\frac {\pi (\theta )}{Q(x)}}$$

If $\pi \ll \lambda$ (absolutely continuous with respect to the Lebesgue measure), then there exists a density $\pi (\theta )={\frac {d\pi }{d\lambda }}(\theta )$ and we can write

$$Q(x)=\int _{\Theta }P_{\theta '}(x)\;d\pi (\theta ')=\int _{\Theta }P_{\theta '}(x)\cdot \pi (\theta ')\;d\theta '$$

If instead $\pi \ll \nu$ (absolutely continuous with respect to the counting measure), we can analogously write

$$Q(x)=\int _{\Theta }P_{\theta '}(x)\cdot \pi (\theta ')\;d\nu (\theta ')=\sum _{i}P_{\theta _{i}}(x)\cdot \pi (\theta _{i})$$

Thus, by identifying $Q(\theta \mid x)$ with $\pi (\theta \mid x)$ and ${\mathcal {L}}(x\mid \theta )$ with $P_{\theta }(x)$, we arrive at the classical equation stated above.

## Outline of Bayesian methods

The general set of statistical techniques can be divided into a number of activities, many of which have special Bayesian versions.

### Bayesian inference

Bayesian inference refers to [statistical inference](https://en.wikipedia.org/wiki/Statistical_inference "Statistical inference") where uncertainty in inferences is quantified using probability.[\[8\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-8) In classical [frequentist inference](https://en.wikipedia.org/wiki/Frequentist_inference "Frequentist inference"), model [parameters](https://en.wikipedia.org/wiki/Parameter "Parameter") and hypotheses are considered to be fixed. Probabilities are not assigned to parameters or hypotheses in frequentist inference. For example, it would not make sense in frequentist inference to directly assign a probability to an event that can only happen once, such as the result of the next flip of a fair coin.
However, it would make sense to state that the proportion of heads [approaches one-half](https://en.wikipedia.org/wiki/Law_of_large_numbers "Law of large numbers") as the number of coin flips increases.[\[9\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-wakefield2013-9)

[Statistical models](https://en.wikipedia.org/wiki/Statistical_models "Statistical models") specify a set of statistical assumptions and processes that represent how the sample data are generated. Statistical models have a number of parameters that can be modified. For example, coin flips can be represented as samples from a [Bernoulli distribution](https://en.wikipedia.org/wiki/Bernoulli_distribution "Bernoulli distribution"), which models two possible outcomes. The Bernoulli distribution has a single parameter equal to the probability of one outcome, which in most cases is the probability of landing on heads. Devising a good model for the data is central in Bayesian inference. In most cases, models only approximate the true process, and may not take into account certain factors influencing the data.[\[2\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-bda-2)

In Bayesian inference, probabilities can be assigned to model parameters. Parameters can be represented as [random variables](https://en.wikipedia.org/wiki/Random_variable "Random variable"). Bayesian inference uses Bayes' theorem to update probabilities after more evidence is obtained or known.[\[2\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-bda-2)[\[10\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-congdon2014-10) Furthermore, Bayesian methods allow for placing priors on entire models and calculating their posterior probabilities using Bayes' theorem. These posterior probabilities are proportional to the product of the prior and the marginal likelihood, where the marginal likelihood is the integral of the sampling density over the prior distribution of the parameters.
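As a concrete illustration (a minimal sketch with invented data, not taken from the article), the discrete form of the posterior equation can be evaluated directly on a grid: the posterior mass at each grid point is the likelihood times the prior, normalized by the marginal likelihood $\sum_{i} P_{\theta_{i}}(x) \cdot \pi(\theta_{i})$, here for the Bernoulli coin model just described.

```python
import math

# Hypothetical data: 7 heads in 10 flips of a coin with unknown bias theta.
heads, flips = 7, 10

# Discretize the parameter space Theta into a grid (counting-measure case).
grid = [i / 100 for i in range(1, 100)]

# Uniform prior pi(theta_i) over the grid points.
prior = [1 / len(grid)] * len(grid)

# Binomial likelihood P_theta(x) of the observed data given theta.
def likelihood(theta):
    return math.comb(flips, heads) * theta**heads * (1 - theta)**(flips - heads)

# Marginal likelihood Q(x) = sum_i P_theta_i(x) * pi(theta_i).
marginal = sum(likelihood(t) * p for t, p in zip(grid, prior))

# Posterior pi(theta_i | x) = P_theta_i(x) * pi(theta_i) / Q(x).
posterior = [likelihood(t) * p / marginal for t, p in zip(grid, prior)]

# The posterior sums to one and peaks at the sample proportion of heads.
print(grid[posterior.index(max(posterior))])  # → 0.7
```

With more flips the posterior concentrates more tightly around the coin's bias, and replacing the uniform prior with an informative one shifts the result exactly as Bayes' theorem prescribes.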
In complex models, marginal likelihoods are generally computed numerically.[\[11\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-chib1995-11)

### Statistical modeling \[[edit](https://en.wikipedia.org/w/index.php?title=Bayesian_statistics&action=edit&section=5 "Edit section: Statistical modeling")\]

The formulation of [statistical models](https://en.wikipedia.org/wiki/Statistical_model "Statistical model") using Bayesian statistics has the identifying feature of requiring the specification of [prior distributions](https://en.wikipedia.org/wiki/Prior_distribution "Prior distribution") for any unknown parameters. Indeed, parameters of prior distributions may themselves have prior distributions, leading to [Bayesian hierarchical modeling](https://en.wikipedia.org/wiki/Bayesian_hierarchical_modeling "Bayesian hierarchical modeling"),[\[12\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-KruschkeVanpaemel2015-12)[\[13\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-:bmdl-13)[\[14\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-14) also known as multi-level modeling. A special case is [Bayesian networks](https://en.wikipedia.org/wiki/Bayesian_networks "Bayesian networks"). For conducting a Bayesian statistical analysis, best practices are discussed by van de Schoot et al.[\[15\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-vandeShootEtAl2021-15)

### Design of experiments \[[edit](https://en.wikipedia.org/w/index.php?title=Bayesian_statistics&action=edit&section=6 "Edit section: Design of experiments")\]

The [Bayesian design of experiments](https://en.wikipedia.org/wiki/Bayesian_design_of_experiments "Bayesian design of experiments") includes a concept called 'influence of prior beliefs'. This approach uses [sequential analysis](https://en.wikipedia.org/wiki/Sequential_analysis "Sequential analysis") techniques to include the outcome of earlier experiments in the design of the next experiment.
This is achieved by updating 'beliefs' through the use of the prior and [posterior distribution](https://en.wikipedia.org/wiki/Posterior_distribution "Posterior distribution"). This allows the design of experiments to make good use of resources of all types. An example of this is the [multi-armed bandit problem](https://en.wikipedia.org/wiki/Multi-armed_bandit_problem "Multi-armed bandit problem").

### Exploratory analysis of Bayesian models \[[edit](https://en.wikipedia.org/w/index.php?title=Bayesian_statistics&action=edit&section=7 "Edit section: Exploratory analysis of Bayesian models")\]

Exploratory analysis of Bayesian models is an adaptation or extension of the [exploratory data analysis](https://en.wikipedia.org/wiki/Exploratory_data_analysis "Exploratory data analysis") approach to the needs and peculiarities of Bayesian modeling. In the words of Persi Diaconis:[\[16\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-16)

> Exploratory data analysis seeks to reveal structure, or simple descriptions in data. We look at numbers or graphs and try to find patterns. We pursue leads suggested by background information, imagination, patterns perceived, and experience with other data analyses.

The [inference process](https://en.wikipedia.org/wiki/Bayesian_inference "Bayesian inference") generates a posterior distribution, which has a central role in Bayesian statistics, together with other distributions like the posterior predictive distribution and the prior predictive distribution.
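To make the posterior and posterior predictive distributions concrete, here is a minimal sketch (the data and prior are invented for illustration) using the conjugate Beta-Bernoulli model: the posterior over the heads probability is Beta(1 + heads, 1 + tails), and a posterior predictive draw first samples a plausible parameter from the posterior and then samples a new observation given that parameter.

```python
import random

random.seed(42)

# Hypothetical observed coin flips: 1 = heads, 0 = tails.
data = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]

# With a Beta(1, 1) (uniform) prior, the posterior over the heads
# probability theta is Beta(1 + heads, 1 + tails) by conjugacy.
a, b = 1 + sum(data), 1 + len(data) - sum(data)

# A posterior predictive draw: sample theta from the posterior, then
# sample the next flip conditional on that theta.
def posterior_predictive_draw():
    theta = random.betavariate(a, b)        # theta ~ posterior
    return 1 if random.random() < theta else 0  # next flip | theta

draws = [posterior_predictive_draw() for _ in range(10_000)]

# The predictive probability of heads is E[theta | data] = a / (a + b).
print(sum(draws) / len(draws))  # ≈ 8/12 ≈ 0.667
```

Averaging over posterior draws of the parameter, rather than plugging in a single estimate, is what distinguishes the posterior predictive distribution and makes it wider than any fixed-parameter Bernoulli model.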
The correct visualization, analysis, and interpretation of these distributions is key to properly answering the questions that motivate the inference process.[\[17\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-17) When working with Bayesian models there are a series of related tasks that need to be addressed besides inference itself:

- Diagnoses of the quality of the inference; this is needed when using numerical methods such as [Markov chain Monte Carlo](https://en.wikipedia.org/wiki/Markov_chain_Monte_Carlo "Markov chain Monte Carlo") techniques
- Model criticism, including evaluations of both model assumptions and model predictions
- Comparison of models, including model selection or model averaging
- Preparation of the results for a particular audience

All these tasks are part of the exploratory analysis of Bayesian models approach, and successfully performing them is central to the iterative and interactive modeling process. These tasks require both numerical and visual summaries.[\[18\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-18)[\[19\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-19)[\[20\]](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_note-Martin2018-20)

- [Bayesian epistemology](https://en.wikipedia.org/wiki/Bayesian_epistemology "Bayesian epistemology")
- For a list of mathematical logic notation used in this article, see:
  - [Notation in probability and statistics](https://en.wikipedia.org/wiki/Notation_in_probability_and_statistics "Notation in probability and statistics")
  - [List of logic symbols](https://en.wikipedia.org/wiki/List_of_logic_symbols "List of logic symbols")

1. **[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-1)** ["Bayesian"](https://www.merriam-webster.com/dictionary/Bayesian). *[Merriam-Webster.com Dictionary](https://en.wikipedia.org/wiki/Merriam-Webster "Merriam-Webster")*. Merriam-Webster.
[OCLC](https://en.wikipedia.org/wiki/OCLC_\(identifier\) "OCLC (identifier)") [1032680871](https://search.worldcat.org/oclc/1032680871).
2. ^ [***a***](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-bda_2-0) [***b***](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-bda_2-1) [***c***](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-bda_2-2) [***d***](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-bda_2-3) [***e***](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-bda_2-4) [***f***](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-bda_2-5) [***g***](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-bda_2-6) [***h***](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-bda_2-7) [***i***](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-bda_2-8) [Gelman, Andrew](https://en.wikipedia.org/wiki/Andrew_Gelman "Andrew Gelman"); [Carlin, John B.](https://en.wikipedia.org/wiki/John_Carlin_\(professor\) "John Carlin (professor)"); Stern, Hal S.; Dunson, David B.; Vehtari, Aki; [Rubin, Donald B.](https://en.wikipedia.org/wiki/Donald_Rubin "Donald Rubin") (2013). *Bayesian Data Analysis* (Third ed.). Chapman and Hall/CRC. [ISBN](https://en.wikipedia.org/wiki/ISBN_\(identifier\) "ISBN (identifier)") [978-1-4398-4095-5](https://en.wikipedia.org/wiki/Special:BookSources/978-1-4398-4095-5 "Special:BookSources/978-1-4398-4095-5").
3. ^ [***a***](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-rethinking_3-0) [***b***](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-rethinking_3-1) [McElreath, Richard](https://en.wikipedia.org/wiki/Richard_McElreath "Richard McElreath") (2020). *[Statistical Rethinking: A Bayesian Course with Examples in R and Stan](https://en.wikipedia.org/wiki/Statistical_Rethinking "Statistical Rethinking")* (2nd ed.). Chapman and Hall/CRC.
[ISBN](https://en.wikipedia.org/wiki/ISBN_\(identifier\) "ISBN (identifier)") [978-0-367-13991-9](https://en.wikipedia.org/wiki/Special:BookSources/978-0-367-13991-9 "Special:BookSources/978-0-367-13991-9").
4. **[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-4)** [Kruschke, John](https://en.wikipedia.org/wiki/John_K._Kruschke "John K. Kruschke") (2014). *Doing Bayesian Data Analysis: A Tutorial with R, JAGS, and Stan* (2nd ed.). Academic Press. [ISBN](https://en.wikipedia.org/wiki/ISBN_\(identifier\) "ISBN (identifier)") [978-0-12-405888-0](https://en.wikipedia.org/wiki/Special:BookSources/978-0-12-405888-0 "Special:BookSources/978-0-12-405888-0").
5. **[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-5)** [McGrayne, Sharon Bertsch](https://en.wikipedia.org/wiki/Sharon_Bertsch_McGrayne "Sharon Bertsch McGrayne") (2012). *The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant from Two Centuries of Controversy* (First ed.). Yale University Press. [ISBN](https://en.wikipedia.org/wiki/ISBN_\(identifier\) "ISBN (identifier)") [978-0-300-18822-6](https://en.wikipedia.org/wiki/Special:BookSources/978-0-300-18822-6 "Special:BookSources/978-0-300-18822-6").
6. **[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-6)** Fienberg, Stephen E. (2006). ["When Did Bayesian Inference Become "Bayesian"?"](https://doi.org/10.1214%2F06-BA101). *Bayesian Analysis*. **1** (1): 1–40. [doi](https://en.wikipedia.org/wiki/Doi_\(identifier\) "Doi (identifier)"):[10.1214/06-BA101](https://doi.org/10.1214%2F06-BA101).
7. ^ [***a***](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-grinsteadsnell2006_7-0) [***b***](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-grinsteadsnell2006_7-1) Grinstead, Charles M.; Snell, J. Laurie (2006). *Introduction to Probability* (2nd ed.). Providence, RI: American Mathematical Society.
[ISBN](https://en.wikipedia.org/wiki/ISBN_\(identifier\) "ISBN (identifier)") [978-0-8218-9414-9](https://en.wikipedia.org/wiki/Special:BookSources/978-0-8218-9414-9 "Special:BookSources/978-0-8218-9414-9").
8. **[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-8)** Lee, Se Yoon (2021). "Gibbs sampler and coordinate ascent variational inference: A set-theoretical review". *Communications in Statistics - Theory and Methods*. **51** (6): 1549–1568. [arXiv](https://en.wikipedia.org/wiki/ArXiv_\(identifier\) "ArXiv (identifier)"):[2008.01006](https://arxiv.org/abs/2008.01006). [doi](https://en.wikipedia.org/wiki/Doi_\(identifier\) "Doi (identifier)"):[10.1080/03610926.2021.1921214](https://doi.org/10.1080%2F03610926.2021.1921214). [S2CID](https://en.wikipedia.org/wiki/S2CID_\(identifier\) "S2CID (identifier)") [220935477](https://api.semanticscholar.org/CorpusID:220935477).
9. **[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-wakefield2013_9-0)** Wakefield, Jon (2013). *Bayesian and Frequentist Regression Methods*. New York, NY: Springer. [ISBN](https://en.wikipedia.org/wiki/ISBN_\(identifier\) "ISBN (identifier)") [978-1-4419-0924-4](https://en.wikipedia.org/wiki/Special:BookSources/978-1-4419-0924-4 "Special:BookSources/978-1-4419-0924-4").
10. **[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-congdon2014_10-0)** Congdon, Peter (2014). *Applied Bayesian Modelling* (2nd ed.). Wiley. [ISBN](https://en.wikipedia.org/wiki/ISBN_\(identifier\) "ISBN (identifier)") [978-1119951513](https://en.wikipedia.org/wiki/Special:BookSources/978-1119951513 "Special:BookSources/978-1119951513").
11. **[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-chib1995_11-0)** Chib, Siddhartha (1995). "Marginal Likelihood from the Gibbs Output". *Journal of the American Statistical Association*. **90** (432): 1313–1321.
[doi](https://en.wikipedia.org/wiki/Doi_\(identifier\) "Doi (identifier)"):[10.1080/01621459.1995.10476635](https://doi.org/10.1080%2F01621459.1995.10476635).
12. **[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-KruschkeVanpaemel2015_12-0)** [Kruschke, J K](https://en.wikipedia.org/wiki/John_K._Kruschke "John K. Kruschke"); Vanpaemel, W (2015). "Bayesian Estimation in Hierarchical Models". In Busemeyer, J R; Wang, Z; Townsend, J T; Eidels, A (eds.). [*The Oxford Handbook of Computational and Mathematical Psychology*](https://jkkweb.sitehost.iu.edu/articles/KruschkeVanpaemel2015.pdf) (PDF). Oxford University Press. pp. 279–299.
13. **[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-:bmdl_13-0)** Hajiramezanali, E.; Dadaneh, S. Z.; Karbalayghareh, A.; Zhou, Z.; Qian, X. "Bayesian multi-domain learning for cancer subtype discovery from next-generation sequencing count data". 32nd Conference on Neural Information Processing Systems (NIPS 2018), Montréal, Canada. [arXiv](https://en.wikipedia.org/wiki/ArXiv_\(identifier\) "ArXiv (identifier)"):[1810.09433](https://arxiv.org/abs/1810.09433).
14. **[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-14)** Lee, Se Yoon; Mallick, Bani (2021). "Bayesian Hierarchical Modeling: Application Towards Production Results in the Eagle Ford Shale of South Texas". *Sankhya B*. **84**: 1–43. [doi](https://en.wikipedia.org/wiki/Doi_\(identifier\) "Doi (identifier)"):[10.1007/s13571-020-00245-8](https://doi.org/10.1007%2Fs13571-020-00245-8).
15. **[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-vandeShootEtAl2021_15-0)** van de Schoot, Rens; Depaoli, Sarah; King, Ruth; Kramer, Bianca; Märtens, Kaspar; [Tadesse, Mahlet G.](https://en.wikipedia.org/wiki/Mahlet_Tadesse "Mahlet Tadesse"); Vannucci, Marina; Gelman, Andrew; Veen, Duco; Willemsen, Joukje; Yau, Christopher (January 14, 2021). ["Bayesian statistics and modelling"](https://osf.io/wdtmc/).
*[Nature Reviews Methods Primers](https://en.wikipedia.org/wiki/Nature_Reviews_Methods_Primers "Nature Reviews Methods Primers")*. **1** (1): 1–26. [doi](https://en.wikipedia.org/wiki/Doi_\(identifier\) "Doi (identifier)"):[10.1038/s43586-020-00001-2](https://doi.org/10.1038%2Fs43586-020-00001-2). [hdl](https://en.wikipedia.org/wiki/Hdl_\(identifier\) "Hdl (identifier)"):[1874/415909](https://hdl.handle.net/1874%2F415909). [S2CID](https://en.wikipedia.org/wiki/S2CID_\(identifier\) "S2CID (identifier)") [234108684](https://api.semanticscholar.org/CorpusID:234108684).
16. **[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-16)** Diaconis, Persi (2011). "Theories of Data Analysis: From Magical Thinking Through Classical Statistics". John Wiley & Sons, Ltd. 2:e55. [doi](https://en.wikipedia.org/wiki/Doi_\(identifier\) "Doi (identifier)"):[10.1002/9781118150702.ch1](https://doi.org/10.1002%2F9781118150702.ch1).
17. **[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-17)** Kumar, Ravin; Carroll, Colin; Hartikainen, Ari; Martin, Osvaldo (2019). ["ArviZ a unified library for exploratory analysis of Bayesian models in Python"](https://doi.org/10.21105%2Fjoss.01143). *Journal of Open Source Software*. **4** (33): 1143. [Bibcode](https://en.wikipedia.org/wiki/Bibcode_\(identifier\) "Bibcode (identifier)"):[2019JOSS....4.1143K](https://ui.adsabs.harvard.edu/abs/2019JOSS....4.1143K). [doi](https://en.wikipedia.org/wiki/Doi_\(identifier\) "Doi (identifier)"):[10.21105/joss.01143](https://doi.org/10.21105%2Fjoss.01143). [hdl](https://en.wikipedia.org/wiki/Hdl_\(identifier\) "Hdl (identifier)"):[11336/114615](https://hdl.handle.net/11336%2F114615).
18. **[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-18)** Gabry, Jonah; Simpson, Daniel; Vehtari, Aki; Betancourt, Michael; Gelman, Andrew (2019). "Visualization in Bayesian workflow". *Journal of the Royal Statistical Society, Series A (Statistics in Society)*. **182** (2): 389–402.
[arXiv](https://en.wikipedia.org/wiki/ArXiv_\(identifier\) "ArXiv (identifier)"):[1709.01449](https://arxiv.org/abs/1709.01449). [doi](https://en.wikipedia.org/wiki/Doi_\(identifier\) "Doi (identifier)"):[10.1111/rssa.12378](https://doi.org/10.1111%2Frssa.12378). [S2CID](https://en.wikipedia.org/wiki/S2CID_\(identifier\) "S2CID (identifier)") [26590874](https://api.semanticscholar.org/CorpusID:26590874).
19. **[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-19)** Vehtari, Aki; Gelman, Andrew; Simpson, Daniel; Carpenter, Bob; Bürkner, Paul-Christian (2021). "Rank-Normalization, Folding, and Localization: An Improved Rˆ for Assessing Convergence of MCMC (With Discussion)". *Bayesian Analysis*. **16** (2): 667. [arXiv](https://en.wikipedia.org/wiki/ArXiv_\(identifier\) "ArXiv (identifier)"):[1903.08008](https://arxiv.org/abs/1903.08008). [Bibcode](https://en.wikipedia.org/wiki/Bibcode_\(identifier\) "Bibcode (identifier)"):[2021BayAn..16..667V](https://ui.adsabs.harvard.edu/abs/2021BayAn..16..667V). [doi](https://en.wikipedia.org/wiki/Doi_\(identifier\) "Doi (identifier)"):[10.1214/20-BA1221](https://doi.org/10.1214%2F20-BA1221). [S2CID](https://en.wikipedia.org/wiki/S2CID_\(identifier\) "S2CID (identifier)") [88522683](https://api.semanticscholar.org/CorpusID:88522683).
20. **[^](https://en.wikipedia.org/wiki/Bayesian_statistics#cite_ref-Martin2018_20-0)** Martin, Osvaldo (2018). [*Bayesian Analysis with Python: Introduction to statistical modeling and probabilistic programming using PyMC3 and ArviZ*](https://books.google.com/books?id=1Z2BDwAAQBAJ). Packt Publishing Ltd. [ISBN](https://en.wikipedia.org/wiki/ISBN_\(identifier\) "ISBN (identifier)") [9781789341652](https://en.wikipedia.org/wiki/Special:BookSources/9781789341652 "Special:BookSources/9781789341652").
- [Bernardo, José M.](https://en.wikipedia.org/wiki/Jos%C3%A9-Miguel_Bernardo "José-Miguel Bernardo"); [Smith, Adrian F. M.](https://en.wikipedia.org/wiki/Adrian_Smith_\(statistician\) "Adrian Smith (statistician)") (2000). *Bayesian Theory*. New York: Wiley. [ISBN](https://en.wikipedia.org/wiki/ISBN_\(identifier\) "ISBN (identifier)") [0-471-92416-4](https://en.wikipedia.org/wiki/Special:BookSources/0-471-92416-4 "Special:BookSources/0-471-92416-4").
- Bolstad, William M.; Curran, James M. (2016). *Introduction to Bayesian Statistics* (3rd ed.). Wiley. [ISBN](https://en.wikipedia.org/wiki/ISBN_\(identifier\) "ISBN (identifier)") [978-1-118-09156-2](https://en.wikipedia.org/wiki/Special:BookSources/978-1-118-09156-2 "Special:BookSources/978-1-118-09156-2").
- [Downey, Allen B.](https://en.wikipedia.org/wiki/Allen_B._Downey "Allen B. Downey") (2021). [*Think Bayes: Bayesian Statistics in Python*](https://greenteapress.com/wp/think-bayes/) (2nd ed.). O'Reilly. [ISBN](https://en.wikipedia.org/wiki/ISBN_\(identifier\) "ISBN (identifier)") [978-1-4920-8946-9](https://en.wikipedia.org/wiki/Special:BookSources/978-1-4920-8946-9 "Special:BookSources/978-1-4920-8946-9").
- Hoff, Peter D. (2009). *A First Course in Bayesian Statistical Methods* (2nd ed.). New York: Springer. [ISBN](https://en.wikipedia.org/wiki/ISBN_\(identifier\) "ISBN (identifier)") [978-1-4419-2828-3](https://en.wikipedia.org/wiki/Special:BookSources/978-1-4419-2828-3 "Special:BookSources/978-1-4419-2828-3").
- Lee, Peter M. (2012). *Bayesian Statistics: An Introduction* (4th ed.). Wiley. [ISBN](https://en.wikipedia.org/wiki/ISBN_\(identifier\) "ISBN (identifier)") [978-1-118-33257-3](https://en.wikipedia.org/wiki/Special:BookSources/978-1-118-33257-3 "Special:BookSources/978-1-118-33257-3").
- [Robert, Christian P.](https://en.wikipedia.org/wiki/Christian_Robert "Christian Robert") (2007). *The Bayesian Choice: From Decision-Theoretic Foundations to Computational Implementation* (2nd ed.). New York: Springer.
[ISBN](https://en.wikipedia.org/wiki/ISBN_\(identifier\) "ISBN (identifier)") [978-0-387-71598-8](https://en.wikipedia.org/wiki/Special:BookSources/978-0-387-71598-8 "Special:BookSources/978-0-387-71598-8").
- Johnson, Alicia A.; Ott, Mies Q.; Dogucu, Mine (2022). [*Bayes Rules! An Introduction to Applied Bayesian Modeling*](https://www.bayesrulesbook.com/). Boca Raton: Chapman & Hall/CRC Texts in Statistical Science. [ISBN](https://en.wikipedia.org/wiki/ISBN_\(identifier\) "ISBN (identifier)") [978-0-367-25539-8](https://en.wikipedia.org/wiki/Special:BookSources/978-0-367-25539-8 "Special:BookSources/978-0-367-25539-8").
- Theo Kypraios. ["A Gentle Tutorial in Bayesian Statistics"](https://kupdf.com/download/a-gentle-tutorial-in-bayesian-statisticspdf_59b0ed86dc0d602e3b568edc_pdf) (PDF). Retrieved 2013-11-03.
- Jordi Vallverdu. [*Bayesians Versus Frequentists: A Philosophical Debate on Statistical Reasoning*](https://www.springer.com/gp/book/9783662486368).
- [Bayesian statistics](http://www.scholarpedia.org/article/Bayesian_statistics), [David Spiegelhalter](https://en.wikipedia.org/wiki/David_Spiegelhalter "David Spiegelhalter") and Kenneth Rice, [Scholarpedia](https://en.wikipedia.org/wiki/Scholarpedia "Scholarpedia") 4(8):5230. [doi:10.4249/scholarpedia.5230](https://doi.org/10.4249/scholarpedia.5230 "doi:10.4249/scholarpedia.5230")
- [Bayesian modeling book](http://bayesmodels.com/) and examples available for downloading.
- Rens van de Schoot. ["A Gentle Introduction to Bayesian Analysis"](https://www.statmodel.com/download/introBayes.pdf) (PDF).
- [Bayesian A/B Testing Calculator](https://marketing.dynamicyield.com/bayesian-calculator/), Dynamic Yield