
Extreme physical information: Difference between revisions


Revision as of 16:10, 13 August 2004

Extreme physical information (EPI) is a principle for deriving laws of science. The laws take the form of differential equations or distribution functions; examples are the Schrödinger wave equation and the Maxwell-Boltzmann distribution law. The EPI principle builds on the well-known idea that the observation of a "source" phenomenon is never completely accurate: information is inevitably lost in transit from source to observation. Furthermore, the random errors that creep in are presumed to define the distribution function of the source phenomenon; that is, "the physics lies in the fluctuations." Finally, the information loss can be shown to take an extreme value. Thus, if the observed level of Fisher information in the data has value I, and the level of Fisher information that existed at the source has value J, the EPI principle states that I - J = extremum. The extremum is a minimum in most problems, meaning that there is a tendency for any observation to faithfully describe its source (which is comforting).
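As an illustrative sketch (the notation below is assumed for this summary rather than taken from the article): for a single coordinate x with probability density p(x), the standard shift-invariant Fisher information of the data and the EPI extremum condition can be written as

    I = \int \frac{1}{p(x)} \left( \frac{dp(x)}{dx} \right)^{2} dx, \qquad K \equiv I - J = \mathrm{extremum},

where J is the level of Fisher information at the source and K is the net information lost in passing from source to observation.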

EPI has been used to derive most of the fundamental laws of physics, as well as laws of biology, cancer growth, chemistry, and economics, including some new laws and concepts. In this way Fisher information, and in particular its loss I - J during observation, provides a bridge for deriving laws of nature in general.

The main reference for this work is B. Roy Frieden, Science from Fisher Information, 2nd edition (Cambridge University Press, 2004).
