Both Sides Now
I've looked at life
from both sides now
From win and lose and still somehow
It's life's illusions I recall
I really don't know life at all
Choice is
seeing both sides now.
A real number can be expressed as the sum of a series of n real numbers x₁, x₂, … xₖ, that is ∑fᵢxᵢ summed from i = 1 to k, where the index i denotes the iᵗʰ grouped observation and fᵢ is the frequency of that observation. E.g. if there are 3 observations of the value 2, then 3 is the frequency of the group of observations whose value is 2. Here k is the number of groups, and n is the number of observations, n = ∑fᵢ summed from i = 1 to k.
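As a quick illustration of that grouped-sum notation, here is a minimal Python sketch; the group values and frequencies are made up for the example:

```python
# Grouped observations: each value x_i paired with its frequency f_i (made-up numbers).
groups = [(2, 3), (5, 1), (7, 2)]   # e.g. the value 2 was observed 3 times

n = sum(f for _, f in groups)           # n = sum of f_i, the total number of observations
total = sum(f * x for x, f in groups)   # sum of f_i * x_i, the grouped sum

print(n)          # 6
print(total)      # 2*3 + 5*1 + 7*2 = 25
print(total / n)  # the grouped mean, about 4.17
```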
If k is a set of sequential integers, then k also counts the choices. For a 6-sided die whose sides are numbered 1 to 6, the choices are thus 1 through 6, and the same holds for any n-sided die. A coin can be thought of as a two-sided die. Instead of numbering the sides 1 and 2, it is customary in a coin flip, used to simulate a one-sided die, to designate one side of the coin as a win, 1, and the other side as a loss, 0. However, this is a relative scaling, not an absolute scaling. Topologically, a one-sided die is possible, i.e. a Möbius strip. The average outcome of such a win/loss flip is 50% of 1, or 0.5, although each individual outcome can only be 100% of 1 or 100% of 0. The median is also 0.5. But the mode, the most common value, is split equally between 1 and 0. Therefore a single event is not “normal”.
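A minimal check of that single-flip claim, treating the two equally likely outcomes 0 and 1 as the data:

```python
import statistics

outcomes = [0, 1]  # one flip: loss = 0, win = 1, equally likely

print(statistics.mean(outcomes))       # 0.5 -> the average outcome
print(statistics.median(outcomes))     # 0.5 -> the median
print(statistics.multimode(outcomes))  # [0, 1] -> the mode is split between both values
```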
The mode can only ever take the same values as the outcomes, but the mean and median do not have to. However, if a one-sided die is flipped an infinite number of times, the same event is repeated an infinite number of times. Infinity is part of every case, and at infinity, even if there is only one choice, or group, the mean is therefore equal to the median, which is equal to the mode.
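A sketch of the repeated-flip idea: as the number of simulated flips grows, the sample mean of the 0/1 outcomes settles toward 0.5. The flip counts below are arbitrary, and the seed is fixed only so the sketch is reproducible.

```python
import random
import statistics

random.seed(1)  # fixed seed so the example is repeatable

for n_flips in (10, 1_000, 100_000):
    flips = [random.randint(0, 1) for _ in range(n_flips)]
    print(n_flips, statistics.mean(flips))  # drifts toward 0.5 as n_flips grows
```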
An analogy can be made between the logistic distribution, the hyperbolic secant squared distribution, i.e. a normal-like (bell-shaped) distribution,
PDF(x:μ,s) = (1/(4s))·sech²((x−μ)/(2s))
and momentum. The integral of the logistic distribution is
CDF(x:μ,s) = ½·tanh((x−μ)/(2s)) + ½
which is, by analogy, energy.
The derivative, or rate of change, of the logistic distribution is then
−(1/(16s²))·sech⁴((x−μ)/(2s))
or
−PDF²(x:μ,s)
which is, by analogy, force. It is also the negative of the square of the randomness, the PDF.
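The three quantities in the analogy can be evaluated directly. A minimal Python sketch; the function names and the sample point x = 1, μ = 0, s = 0.5 are chosen only for illustration:

```python
import math

def pdf(x, mu, s):
    """Logistic PDF: (1/(4s)) * sech^2((x - mu)/(2s)) -- the 'momentum', or randomness."""
    return (1 / (4 * s)) * (1 / math.cosh((x - mu) / (2 * s))) ** 2

def cdf(x, mu, s):
    """Logistic CDF: 0.5*tanh((x - mu)/(2s)) + 0.5 -- the 'energy'."""
    return 0.5 * math.tanh((x - mu) / (2 * s)) + 0.5

def force(x, mu, s):
    """The post's 'force' term: the negative square of the PDF."""
    return -pdf(x, mu, s) ** 2

print(pdf(1, 0, 0.5), cdf(1, 0, 0.5), force(1, 0, 0.5))
```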
Then Newton’s Three Laws can be rewritten as
Law 1. A random normal event will stay random unless
acted upon by a force.
Law 2. When a random normal event is acted upon by a force,
the time rate of change of its randomness equals the force.
Law 3. If two random normal events exert forces on each other,
these forces have the same magnitude but opposite directions.
This has some implications. The variance of a logistic distribution is s²π²/3, which means that if s > 0 then the variance is also greater than zero. If there is a single choice, then s = 0.5, and if the mean, μ, is set equal to zero, then the Probability Density Function, PDF(x:0,0.5), is
½·sech²(x)
The energy, which is also the Cumulative Distribution Function, CDF(x:0,0.5), of this distribution is
½·tanh(x) + ½
And the force that changes randomness is
−¼·sech⁴(x)
A random event produces energy. If the total energy must be zero, then it must be balanced by a force that causes a loss of energy (e.g. entropy).
If there is no choice, s = 0, then the PDF is not normal, it is undefined; its energy is undefined; and its entropy is also undefined. It is common to talk of a standard normal distribution as having a variance of 1. If that were true here, then s = √3/π = 0.551329, which means the choice would have to be 2 times that, about 1.103. But choice must be an integer. If there is one choice, then s must be 0.5 and the variance must be 0.822467, not zero or 1.
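The arithmetic in that paragraph can be checked directly; a small sketch using the variance formula s²π²/3 from above:

```python
import math

def logistic_variance(s):
    """Variance of a logistic distribution with scale s: s^2 * pi^2 / 3."""
    return s ** 2 * math.pi ** 2 / 3

s_unit = math.sqrt(3) / math.pi     # the scale that gives variance 1
print(s_unit)                       # 0.5513...
print(logistic_variance(s_unit))    # 1.0
print(logistic_variance(0.5))       # 0.8224... -> the single-choice variance
```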