{"id":440,"date":"2024-07-23T08:02:07","date_gmt":"2024-07-22T23:02:07","guid":{"rendered":"https:\/\/mp-superkler.com\/?p=440"},"modified":"2024-07-23T08:12:48","modified_gmt":"2024-07-22T23:12:48","slug":"bayesian-estimation","status":"publish","type":"post","link":"https:\/\/mp-superkler.com\/?p=440","title":{"rendered":"Bayesian Estimation"},"content":{"rendered":"\n<p>Bayesian estimation is a method in Bayesian statistics for estimating an unknown quantity from prior information and observed data. The goal is to update the prior belief about a parameter \\( \\theta \\) given new data \\( \\mathbf{X} \\). This is achieved through Bayes&#8217; theorem.<\/p>\n\n\n\n<p>Bayes&#8217; Theorem<br>Bayes&#8217; theorem relates the conditional and marginal probabilities of random events. For a parameter \\( \\theta \\) and data \\( \\mathbf{X} \\), it is given by:<\/p>\n\n\n\n<p>$$ P(\\theta \\mid \\mathbf{X}) = \\frac{P(\\mathbf{X} \\mid \\theta) P(\\theta)}{P(\\mathbf{X})} $$<\/p>\n\n\n\n<p>Where:<br>\\( P(\\theta \\mid \\mathbf{X}) \\) is the posterior probability of \\( \\theta \\) given data \\( \\mathbf{X} \\)<br>\\( P(\\mathbf{X} \\mid \\theta) \\) is the likelihood of data \\( \\mathbf{X} \\) given parameter \\( \\theta \\)<br>\\( P(\\theta) \\) is the prior probability of \\( \\theta \\)<br>\\( P(\\mathbf{X}) \\) is the marginal likelihood or evidence<\/p>\n\n\n\n<p>Posterior Distribution<br>The posterior distribution \\( P(\\theta \\mid \\mathbf{X}) \\) represents the updated belief about the parameter \\( \\theta \\) after observing the data \\( \\mathbf{X} \\). It is proportional to the product of the likelihood and the prior:<\/p>\n\n\n\n<p>$$ P(\\theta \\mid \\mathbf{X}) \\propto P(\\mathbf{X} \\mid \\theta) P(\\theta) $$<\/p>\n\n\n\n<p>Prior Distribution<br>The prior distribution \\( P(\\theta) \\) reflects the initial belief about the parameter \\( \\theta \\) before observing any data. 
It is based on previous knowledge or assumptions.<\/p>\n\n\n\n<p>Likelihood<br>The likelihood function \\( P(\\mathbf{X} \\mid \\theta) \\) describes the probability of the observed data \\( \\mathbf{X} \\) given the parameter \\( \\theta \\). It is a function of \\( \\theta \\) with \\( \\mathbf{X} \\) fixed.<\/p>\n\n\n\n<p>Example<br>Suppose we have observed data \\( \\mathbf{X} = \\{x_1, x_2, \\ldots, x_n\\} \\) and we assume a Gaussian likelihood and a Gaussian prior. The data \\( x_1, x_2, \\ldots, x_n \\) are assumed to be normally distributed with mean \\( \\mu \\) and known variance \\( \\sigma^2 \\):<\/p>\n\n\n\n<p>$$ P(x_i \\mid \\mu) = \\frac{1}{\\sqrt{2 \\pi \\sigma^2}} \\exp \\left( -\\frac{(x_i - \\mu)^2}{2 \\sigma^2} \\right) $$<\/p>\n\n\n\n<p>If the prior distribution for \\( \\mu \\) is also Gaussian with mean \\( \\mu_0 \\) and variance \\( \\tau^2 \\):<\/p>\n\n\n\n<p>$$ P(\\mu) = \\frac{1}{\\sqrt{2 \\pi \\tau^2}} \\exp \\left( -\\frac{(\\mu - \\mu_0)^2}{2 \\tau^2} \\right) $$<\/p>\n\n\n\n<p>The posterior distribution can be derived as follows:<\/p>\n\n\n\n<p>$$ P(\\mu \\mid \\mathbf{X}) \\propto P(\\mathbf{X} \\mid \\mu) P(\\mu) $$<\/p>\n\n\n\n<p>Substituting the expressions for the likelihood and the prior:<\/p>\n\n\n\n<p>$$ P(\\mu \\mid \\mathbf{X}) \\propto \\left( \\prod_{i=1}^{n} \\frac{1}{\\sqrt{2 \\pi \\sigma^2}} \\exp \\left( -\\frac{(x_i - \\mu)^2}{2 \\sigma^2} \\right) \\right) \\cdot \\frac{1}{\\sqrt{2 \\pi \\tau^2}} \\exp \\left( -\\frac{(\\mu - \\mu_0)^2}{2 \\tau^2} \\right) $$<\/p>\n\n\n\n<p>Taking the logarithm and dropping additive constants:<\/p>\n\n\n\n<p>$$ \\log P(\\mu \\mid \\mathbf{X}) \\propto -\\frac{1}{2 \\sigma^2} \\sum_{i=1}^{n} (x_i - \\mu)^2 - \\frac{1}{2 \\tau^2} (\\mu - \\mu_0)^2 $$<\/p>\n\n\n\n<p>Grouping the quadratic terms in preparation for completing the square:<\/p>\n\n\n\n<p>$$ \\log P(\\mu \\mid \\mathbf{X}) \\propto -\\frac{1}{2} \\left( \\frac{1}{\\sigma^2} \\sum_{i=1}^{n} (x_i - \\mu)^2 + 
\\frac{1}{\\tau^2} (\\mu - \\mu_0)^2 \\right) $$<\/p>\n\n\n\n<p>Completing the square in \\( \\mu \\), we find the posterior mean:<\/p>\n\n\n\n<p>$$ \\hat{\\mu} = \\frac{\\frac{1}{\\sigma^2} \\sum_{i=1}^{n} x_i + \\frac{\\mu_0}{\\tau^2}}{\\frac{n}{\\sigma^2} + \\frac{1}{\\tau^2}} $$<\/p>\n\n\n\n<p>This result shows that the posterior mean is a weighted average of the sample mean and the prior mean, with weights inversely proportional to their respective variances \\( \\frac{\\sigma^2}{n} \\) and \\( \\tau^2 \\).<\/p>\n\n\n\n<p>The posterior variance is given by:<\/p>\n\n\n\n<p>$$ \\sigma^2_{\\text{posterior}} = \\left( \\frac{n}{\\sigma^2} + \\frac{1}{\\tau^2} \\right)^{-1} $$<\/p>\n\n\n\n<p>This completes the Bayesian estimation process for the Gaussian case.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Bayesian estimation is a method in Bayesian statistics to estimate an unknown quantity based on prior informat<\/p>\n","protected":false},"author":1,"featured_media":442,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[],"class_list":["post-440","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-differential-equation"],"_links":{"self":[{"href":"https:\/\/mp-superkler.com\/index.php?rest_route=\/wp\/v2\/posts\/440","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/mp-superkler.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/mp-superkler.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/mp-superkler.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/mp-superkler.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=440"}],"version-history":[{"count":3,"href":"https:\/\/mp-superkler.com\/index.php?rest_route=\/wp\/v2\/posts\/440\/revisions"}],"predecessor-version":[{"id":446,"href":"https:\/\/mp-superkler.com\/index.php?rest_route=\/wp\/v2\/posts\/440\/revisio
ns\/446"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/mp-superkler.com\/index.php?rest_route=\/wp\/v2\/media\/442"}],"wp:attachment":[{"href":"https:\/\/mp-superkler.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=440"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/mp-superkler.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=440"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/mp-superkler.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=440"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}
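As a supplement to the post's Gaussian example, the closed-form posterior update can be checked numerically. The sketch below is not part of the original post; the function name `gaussian_posterior` and the sample data are illustrative assumptions. It implements exactly the two formulas derived above: the precision-weighted posterior mean and the posterior variance \( (n/\sigma^2 + 1/\tau^2)^{-1} \).

```python
def gaussian_posterior(x, sigma2, mu0, tau2):
    """Posterior mean and variance of mu for i.i.d. Gaussian data with
    known variance sigma2 and a Gaussian prior N(mu0, tau2) on mu."""
    n = len(x)
    # Posterior precision: n/sigma^2 + 1/tau^2
    precision = n / sigma2 + 1.0 / tau2
    # Posterior mean: precision-weighted average of data and prior mean
    mu_hat = (sum(x) / sigma2 + mu0 / tau2) / precision
    return mu_hat, 1.0 / precision  # (posterior mean, posterior variance)

# Illustrative data with sample mean 5.025. A nearly flat prior (large tau2)
# leaves the estimate close to the sample mean; a tight prior (small tau2)
# would pull it toward mu0, as the weighted-average interpretation predicts.
data = [4.8, 5.1, 5.3, 4.9]
mu_hat, var_post = gaussian_posterior(data, sigma2=0.25, mu0=0.0, tau2=100.0)
```

With `tau2=100.0` the prior term contributes almost nothing to the precision, so `mu_hat` stays close to the sample mean, while shrinking `tau2` shifts the estimate toward `mu0`.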