reperiendi

Imaginary Time 2

Posted in Chemistry, General physics, Math by Mike Stay on 2010 July 26

Another part of the analogy I started here, but this time using inverse temperature instead of imaginary time. It describes a thermometer where a mass changes position with temperature. I’m guessing this stuff only applies when the temperature is changing adiabatically.

Thermometer (unitless temperature):
\displaystyle \beta [1] inverse temperature (unitless)
\displaystyle y(\beta) [m] y coordinate
\displaystyle r [kg/s^2 K] spring constant * temp unit conversion
\displaystyle v(\beta) = \frac{dy(\beta)}{d\beta} [m] how position changes with (inverse) temperature
\displaystyle F(\beta) = r \; v(\beta) [kg m/s^2 K] force per Kelvin
\displaystyle T(\beta) = \frac{r}{2}v(\beta)^2 [kg m^2/s^2 K] stretching energy per Kelvin
\displaystyle V(\beta) [kg m^2/s^2 K] potential energy per Kelvin
\displaystyle S = \int (T + V)(\beta) \; d\beta [kg m^2/s^2 K] entropy
Thermometer:
\displaystyle \beta [1/K] inverse temperature
\displaystyle y(\beta) [m] y coordinate
\displaystyle r [kg/s^2 K^2 = bits/m^2 K] how information density changes with temp
\displaystyle v(\beta) = \frac{dy(\beta)}{d\beta} [m K] how position changes with (inverse) temperature
\displaystyle F(\beta) = r \; v(\beta) [kg m/s^2 K = bits/m] force per Kelvin
\displaystyle T(\beta) = \frac{r}{2}v(\beta)^2 [kg m^2/s^2 = bits K] stretching energy = change in stretching information with invtemp
\displaystyle V(\beta) [kg m^2/s^2 = bits K] potential energy = change in potential information with invtemp
\displaystyle S = \int (T + V)(\beta) \; d\beta [bits] entropy

I assume that the dynamics of such a system would follow a path where \displaystyle \delta S=0; is that a minimum-entropy path or a maximum?
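To make the stationarity condition concrete: treating the integrand L = T + V as a Euclidean-style Lagrangian in \beta (my gloss, not from the post), \delta S = 0 gives the usual Euler–Lagrange equation:

```latex
% Stationarity of S = \int L \, d\beta with
% L(y, v) = \tfrac{r}{2} v^2 + V(y), \quad v = dy/d\beta:
\frac{d}{d\beta}\frac{\partial L}{\partial v} - \frac{\partial L}{\partial y} = 0
\qquad\Longrightarrow\qquad
r\,\frac{d^2 y}{d\beta^2} = \frac{\partial V}{\partial y}
```

Because the integrand is T + V rather than T − V, the sign of the potential is flipped relative to Newton's law, as one expects for motion in imaginary time.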


An analogy

Posted in Math, Quantum by Mike Stay on 2010 July 24
Each row below pairs a stat mech notion with its quant mech analogue.

Column vector:
    Stat mech: distribution \displaystyle |p\rangle = \sum_j p_j|j\rangle
    Quant mech: amplitude distribution (wave function) \displaystyle |\psi\rangle = \sum_j \psi_j |j\rangle

Row vector:
    Stat mech: population \displaystyle \langle p| = \sum_j p^*_j \langle j|, where p_j^* are the coefficients that, when normalized, maximize the inner product with |p\rangle
    Quant mech: \displaystyle \langle \psi| = \sum_j \psi_j^* \langle j|, where \psi_j^* is the complex conjugate of \psi_j

Normalization:
    Stat mech: \displaystyle \sum_j p_j = 1
    Quant mech: \displaystyle \sum_j |\psi_j|^2 = 1

Transitions:
    Stat mech: stochastic
    Quant mech: unitary

Harmonic oscillator:
    Stat mech: many HOs at temperature T
    Quant mech: one QHO evolving for time t

Uniform distribution over n states:
    Stat mech: \displaystyle |U\rangle = \frac{1}{n}\sum_j |j\rangle
    Quant mech: \displaystyle |U\rangle = \frac{1}{\sqrt{n}}\sum_j |j\rangle

Special distribution:
    Stat mech: Gibbs distribution \displaystyle p(T) = \sum_j e^{-E_j / k_B T} |j\rangle
    Quant mech: free evolution \displaystyle \psi(t) = e^{-itH/\hbar} \sum_j |j\rangle = \sum_j e^{-i E_j t/\hbar} |j\rangle

Partition function = inner product with \langle U|:
    Stat mech: \displaystyle Z(T) = \langle U|p(T)\rangle = \frac{1}{n}\sum_j e^{-E_j / k_B T}
    Quant mech: \displaystyle Z(t) = \langle U|\psi(t)\rangle = \frac{1}{\sqrt{n}}\sum_j e^{-i E_j t/\hbar}

Expectation of an observable Q:
    Stat mech: \displaystyle \langle Q\rangle = \frac{1}{Z(T)}\langle U|Q|p(T)\rangle
    Quant mech: \displaystyle \langle Q\rangle = \frac{1}{Z(t)}\langle U|Q|\psi(t)\rangle

Path integrals:
    Stat mech: E/T = maximum entropy?
    Quant mech: Et = least action?
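The two partition functions can be compared numerically. A minimal sketch, assuming n equally spaced levels E_j = j and units with k_B = \hbar = 1 (both assumptions mine, not the post's):

```python
import cmath
import math

n = 4
E = list(range(n))  # assumed energies E_j = j, in units with k_B = hbar = 1

def Z_thermal(T):
    """Z(T) = <U|p(T)>, with <U| = (1/n) sum_j <j|."""
    return sum(math.exp(-Ej / T) for Ej in E) / n

def Z_quantum(t):
    """Z(t) = <U|psi(t)>, with <U| = (1/sqrt(n)) sum_j <j|."""
    return sum(cmath.exp(-1j * Ej * t) for Ej in E) / math.sqrt(n)

# Substituting it for 1/(k_B T) in the thermal sum reproduces the
# quantum partition function, up to the 1/n vs 1/sqrt(n) normalization:
t = 0.7
wick = sum(cmath.exp(-1j * Ej * t) for Ej in E) / n
assert abs(wick * math.sqrt(n) - Z_quantum(t)) < 1e-12
```

The only difference between the two columns that survives the substitution is the normalization of \langle U|, which is exactly what the table predicts.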

Scimitry

Posted in Uncategorized by Mike Stay on 2010 July 19

Scimitry: the property that something remains the same after part has been cut off. The Sierpinski triangle is scimitric, since you can cut off the bottom two triangles and be left with something isomorphic to the whole.
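As a toy check of that self-similarity, here is Pascal's triangle mod 2, which draws the Sierpinski triangle; by Lucas' theorem, the half-size copy that survives the cut is identical to the whole. The construction is my illustration, not from the post:

```python
def pascal_mod2(n):
    """First n rows of Pascal's triangle mod 2 (the Sierpinski pattern)."""
    rows = [[1]]
    for _ in range(n - 1):
        prev = rows[-1]
        middle = [(prev[j - 1] + prev[j]) % 2 for j in range(1, len(prev))]
        rows.append([1] + middle + [1])
    return rows

rows = pascal_mod2(16)

# Cut off two of the three half-size copies: the remaining copy
# (rows 8..15, first i+1 columns) is identical to the whole top
# triangle (rows 0..7), by Lucas' theorem.
for i in range(8):
    assert rows[8 + i][:i + 1] == rows[i]
```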

Entropic gravity

Posted in Astronomy, General physics, Math by Mike Stay on 2010 July 19

Erik Verlinde has been in the news recently for revisiting Ted Jacobson’s suggestion that gravity is an entropic force rather than a fundamental one. The core of the argument is as follows:

Say we have two boxes, one inside the other:

+---------------+
|               |
| +----------+  |
| |          |  |
| |          |  |
| |          |  |
| +----------+  |
+---------------+

Say the inner box has room for ten bits on its surface and the outer one room for twenty. Each box can use as many “1”s as there are particles inside it:

+---------------+
|      X        |
| +----------+  |
| |          |  |
| |  X       |  |
| |          |  |
| +----------+  |
+---------------+

In this case, the inner box has only one particle inside, so there are 10 choose 1 = 10 ways to choose a labeling of the inner box; the outer box has two particles inside, so there are 20 choose 2 = 190 ways. Thus there are 1900 ways to label the system in all.

If both particles are in the inner box, though, the number of ways increases:

+---------------+
|               |
| +----------+  |
| |          |  |
| |  X  X    |  |
| |          |  |
| +----------+  |
+---------------+

The inner box now has 10 choose 2 = 45 ways, while the outer box still has 190, for 45 × 190 = 8550 labelings in all. So under the standard assumption that all labelings are equally likely, it’s 8550/1900 = 4.5 times as likely to find both particles in the inner box, and we get an entropic force drawing them together.

The best explanation of Verlinde’s paper I’ve seen is Sabine Hossenfelder’s “Comments on and Comments on Comments on Verlinde’s paper ‘On the Origin of Gravity and the Laws of Newton’”.