Monday, December 2, 2013

Historically, the first result on monotonicity of entropy in the CLT was that $h(S_{2n}) \ge h(S_n)$ for all $n$. This follows directly from an important inequality for entropy, the entropy power inequality (EPI): if $X$ and $Y$ are independent real-valued random variables with densities, then

$$e^{2h(X+Y)} \ge e^{2h(X)} + e^{2h(Y)},$$

with equality if and only if $X$ and $Y$ are Gaussian. The rest of this lecture and part of the next lecture will be devoted to proving the EPI. While the EPI does not suffice to establish the full entropic CLT, the same tools will prove to be crucial later on.
Let us first show how the EPI implies the monotonicity claim. Applying the EPI to two independent copies $X, X'$ of a random variable with density gives

$$e^{2h(X+X')} \ge e^{2h(X)} + e^{2h(X')} = 2\,e^{2h(X)},$$

which implies $h\big(\tfrac{X+X'}{\sqrt{2}}\big) \ge h(X)$. Here we have used the easy-to-check equality $h(aX) = h(X) + \log|a|$, which of course implies $h\big(\tfrac{X+X'}{\sqrt{2}}\big) = h(X+X') - \tfrac{1}{2}\log 2$. From this observation, the proof of the claim that $h(S_{2n}) \ge h(S_n)$ is immediate: simply note that $\sqrt{2}\,S_{2n}$ is the sum of two independent copies of $S_n$.
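As a quick sanity check (an illustrative sketch, not from the lecture), the EPI can be verified numerically in a case where all entropies are tractable: for independent $X, Y \sim \mathrm{Uniform}[0,1]$ we have $h(X) = h(Y) = 0$, while $X+Y$ has the triangular density on $[0,2]$.

```python
# Numerical sanity check of the EPI for X, Y ~ Uniform[0,1], independent.
# X + Y has the triangular density f(s) = s on [0,1] and 2 - s on [1,2].
import numpy as np
from scipy.integrate import quad

f = lambda s: s if s <= 1 else 2 - s   # density of X + Y on [0, 2]

# h(X + Y) = -int f log f  (stay off the endpoints, where f log f -> 0 anyway)
h_sum = -quad(lambda s: f(s) * np.log(f(s)), 1e-9, 2 - 1e-9, points=[1.0])[0]

lhs = np.exp(2 * h_sum)                # e^{2 h(X+Y)}, approximately e^1
rhs = 2 * np.exp(2 * 0.0)              # e^{2 h(X)} + e^{2 h(Y)} = 2
print(lhs, rhs, lhs >= rhs)            # EPI holds strictly here
```

Here $h(X+Y) = \tfrac{1}{2}$ is also available in closed form, so the check is exact: $e^{1} \approx 2.72 \ge 2$.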
Remark. It is easy to check that $h(X+Y) \ge \max\{h(X), h(Y)\}$ for independent $X$ and $Y$: conditioning reduces entropy, so $h(X+Y) \ge h(X+Y \mid Y) = h(X \mid Y) = h(X)$. In fact, this is true in much more general settings (e.g. on locally compact groups, with entropy defined relative to Haar measure). The EPI is a much stronger statement particular to real-valued random variables.
A classical geometric counterpart is the Brunn-Minkowski inequality: for nonempty compact sets $A, B \subseteq \mathbb{R}^n$,

$$|A+B|^{1/n} \ge |A|^{1/n} + |B|^{1/n},$$

where $A + B = \{a + b : a \in A,\ b \in B\}$ is the Minkowski sum and $|\cdot|$ denotes volume (Lebesgue measure). In particular, note that $|A|^{1/n}$ is proportional, up to an absolute constant, to the radius of the $n$-dimensional Euclidean ball whose volume matches that of $A$. The Brunn-Minkowski inequality expresses superadditivity of this functional (and we clearly have equality for balls). The Brunn-Minkowski inequality is of fundamental importance in various areas of mathematics: for example, it implies the isoperimetric inequality in $\mathbb{R}^n$, which states that Euclidean balls with volume $v$ have the minimal surface area among all subsets of $\mathbb{R}^n$ with volume $v$.
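To make the inequality concrete, here is a small pixel-grid illustration (a sketch; the shapes and resolution are arbitrary choices, not from the lecture). The Minkowski sum of a disk and a rectangle is computed as a binary dilation, and the two sides of Brunn-Minkowski are compared.

```python
# Pixel-grid sanity check of Brunn-Minkowski in R^2.
import numpy as np
from scipy.ndimage import binary_dilation

yy, xx = np.mgrid[-100:101, -100:101]
A = xx**2 + yy**2 <= 25**2            # disk of radius 25
B = np.ones((21, 31), dtype=bool)     # 31 x 21 rectangle, centered at its origin

# Dilating A by the (symmetric) structuring element B gives the Minkowski sum A + B.
AB = binary_dilation(A, structure=B)

area = lambda S: float(S.sum())       # pixel count approximates area
lhs = np.sqrt(area(AB))
rhs = np.sqrt(area(A)) + np.sqrt(area(B))
print(lhs, rhs, lhs >= rhs)           # |A+B|^{1/2} >= |A|^{1/2} + |B|^{1/2}
```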
In a sense, the EPI is to random variables as the Brunn-Minkowski inequality is to sets. The Gaussians play the role of the balls, and variance corresponds to radius. In one dimension, for example, the entropy power $e^{2h(X)}$ of a Gaussian $X$ is proportional to its variance, since

$$h(N(\mu, \sigma^2)) = \tfrac{1}{2}\log(2\pi e \sigma^2), \qquad \text{so that} \qquad e^{2h(X)} = 2\pi e\,\mathrm{Var}(X).$$
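The Gaussian entropy formula is easy to confirm numerically (an illustrative sketch; the value $\sigma = 1.7$ is an arbitrary choice):

```python
# Verify e^{2 h(X)} = 2*pi*e*sigma^2 for X ~ N(0, sigma^2) by numerically
# integrating -f log f over a range wide enough to capture the tails.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

sigma = 1.7
f = lambda x: norm.pdf(x, scale=sigma)
h = -quad(lambda x: f(x) * np.log(f(x)), -12 * sigma, 12 * sigma)[0]

print(np.exp(2 * h))                 # entropy power e^{2h}
print(2 * np.pi * np.e * sigma**2)   # matches 2*pi*e*Var(X)
```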
There are many proofs of the EPI. It was stated by Shannon (1948) but first fully proven by Stam (1959); different proofs were later provided by Blachman (1965), Lieb (1978), and many others. We will follow a simplified version of Stam’s proof. We work from now on in the one-dimensional case for simplicity.
Remark. Let $\{f_\theta : \theta \in \Theta\}$ be a parametric statistical model. In statistics, the score function is usually defined by $\rho_\theta(x) = \frac{\partial}{\partial \theta} \log f_\theta(x)$, and the Fisher information by $I(\theta) = \mathbf{E}_\theta[\rho_\theta(X)^2]$. This reduces to our definition in the special case of location families, where $f_\theta(x) = f(x - \theta)$ for some probability density $f$: in this case, $\rho_\theta(x) = -\frac{f'(x-\theta)}{f(x-\theta)}$, and we have

$$I(\theta) = \int \frac{f'(x-\theta)^2}{f(x-\theta)}\,dx = \int \frac{f'(x)^2}{f(x)}\,dx.$$
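As a worked example (a sketch, not from the lecture; the Gaussian location family is an assumed illustration), the score and Fisher information can be computed symbolically:

```python
# Score and Fisher information of the Gaussian location family N(theta, sigma^2),
# checked with sympy.
import sympy as sp

x, th = sp.symbols('x theta', real=True)
s = sp.Symbol('sigma', positive=True)

f = sp.exp(-(x - th)**2 / (2 * s**2)) / sp.sqrt(2 * sp.pi * s**2)
score = sp.simplify(sp.diff(sp.log(f), th))        # (x - theta)/sigma**2
I = sp.integrate(score**2 * f, (x, -sp.oo, sp.oo))  # Fisher information

print(score, sp.simplify(I))   # I = 1/sigma**2, independent of theta
```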
Thus for location families $I(\theta)$ does not depend on $\theta$ and coincides precisely with the Fisher information $I(X)$ as we defined it above for a random variable $X$ with density $f$. The statistical interpretation allows us to derive a useful inequality. Suppose for simplicity that $\mathbf{E}X = 0$. Then $\mathbf{E}_\theta[X] = \theta$ for every $\theta$, so $X$ is an unbiased estimator of $\theta$. The Cramér-Rao bound therefore implies the inequality

$$\mathrm{Var}(X) \ge \frac{1}{I(X)}.$$
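As a numerical illustration (a sketch with an arbitrarily chosen non-Gaussian law, not from the lecture), the bound $\mathrm{Var}(X)\,I(X) \ge 1$ can be checked for the standard logistic distribution, where both quantities are computable:

```python
# Cramer-Rao sanity check: Var(X) * I(X) >= 1 for the standard logistic law.
import numpy as np
from scipy.integrate import quad
from scipy.stats import logistic

f, F = logistic.pdf, logistic.cdf
# For the logistic, f'(x) = f(x) (1 - 2 F(x)), so f'^2 / f = f (1 - 2F)^2.
I = quad(lambda x: f(x) * (1 - 2 * F(x))**2, -40, 40)[0]   # = 1/3
var = logistic.var()                                        # = pi^2 / 3

print(I * var)   # ~ pi^2/9 ~ 1.097 >= 1, as Cramer-Rao predicts
```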
Remark. There is a Fisher information analogue of the entropic CLT: in the setup of the entropic CLT, subject to an additional variance constraint, we have $I(S_n) \to \frac{1}{\sigma^2} = I(N(0,\sigma^2))$. Moreover, Fisher information is minimized by Gaussians. This is often stated in terms of normalized Fisher information $J(X)$, defined as $J(X) = \mathrm{Var}(X)\,I(X) - 1$. Note that $J$ is both translation and scale invariant: $J(X+c) = J(X)$ and $J(aX) = J(X)$ for $a \ne 0$. We have $J(X) \ge 0$, with equality if and only if $X$ is Gaussian, by the previous remark. The Fisher information analogue of the entropic CLT can now be restated as $J(S_n) \to 0$.
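Both properties of $J$ are easy to see numerically (an illustrative sketch; the helper `fisher_info` and the scale $a = 3$ are ad hoc choices, not from the lecture):

```python
# Normalized Fisher information J(X) = Var(X) I(X) - 1: scale invariance,
# and J = 0 for the Gaussian.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm, logistic

def fisher_info(pdf, dpdf, lim):
    # I(X) = integral of f'(x)^2 / f(x)
    return quad(lambda x: dpdf(x)**2 / pdf(x), -lim, lim)[0]

# Logistic with scale a: f'(x) = f(x) (1 - 2 F(x)) / a. Scaling by a divides
# I by a^2 and multiplies Var by a^2, so J is unchanged.
for a in (1.0, 3.0):
    f = lambda x, a=a: logistic.pdf(x, scale=a)
    df = lambda x, a=a: logistic.pdf(x, scale=a) * (1 - 2 * logistic.cdf(x, scale=a)) / a
    J = logistic.var(scale=a) * fisher_info(f, df, lim=40 * a) - 1
    print(a, J)   # same value (~ pi^2/9 - 1) for both scales

# Gaussian: f'(x) = -x f(x) for N(0,1), so Var(X) I(X) - 1 = 0.
J_gauss = norm.var() * fisher_info(norm.pdf, lambda x: -x * norm.pdf(x), 12) - 1
print(J_gauss)    # ~ 0
```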

