Saturday, 26 March 2011

What's the Matrix? Relative locality.

Is physics governed by a novel principle, which its authors call the Principle of Relative Locality? It states that physics takes place in phase space and there is no invariant global projection that gives a description of processes in spacetime. From their measurements, local observers can construct descriptions of particles moving and interacting in a spacetime, but different observers construct different spacetimes, which are observer-dependent slices of phase space.

Is this something new? No, perhaps only to the mainstream. Alternative ideas have been around for a long time, talking of subjective causal diamonds or light-cones. The "new" (?) thing is the observer effect introduced at the Planck scale, where consciousness is also born. But Matti Pitkänen has also talked about this for a long time. A 'living' Universe.
Abstract: We propose a deepening of the relativity principle according to which the invariant arena for non-quantum physics is a phase space rather than spacetime. Descriptions of particles propagating and interacting in spacetimes are constructed by observers, but different observers, separated from each other by translations, construct different spacetime projections from the invariant phase space. Nonetheless, all observers agree that interactions are local in the spacetime coordinates constructed by observers local to them.
This framework, in which absolute locality is replaced by relative locality, results from deforming momentum space, just as the passage from absolute to relative simultaneity results from deforming the linear addition of velocities. Different aspects of momentum space geometry, such as its curvature, torsion and non-metricity, are reflected in different kinds of deformations of the energy-momentum conservation laws. These are in principle all measurable by appropriate experiments. We also discuss a natural set of physical hypotheses which singles out the cases of momentum space with a metric compatible connection and constant curvature. The principle of relative locality. Lee Smolin's show.

Non-locality in theories with deformed Lorentz invariance. Deformed symmetry breaking = asymmetry.
"Look around - do you see space?"
- no, just things
- no, spacetime, or in fact momentum space, photons arriving (momenta) and energies (angles), it is inferred, derived from momentum space measurements.

"Do we all infer the same spacetime, - also at different energies?"
- transformations (conservation laws) and translations (total momentum)
- between observers
- translation is independent of momenta and energy, we all construct the same spacetime, local for one is local for everybody
- description of events are different at different energies
- for every interaction observers local to it will infer it as local, and a distant observer will not infer it as local

"We call this the principle of relative locality!", says Smolin. And there is a mathematics for it based on the geometry of momentum space (4-D; 0,1,2,3): the classical Planck-mass (mp) regime, where the constants can be varied. In that way gravity (GNewton) and quantum effects (hbar) can be sent to zero while mp is held constant (with c = 1), and so neglected. New phenomena in the scaling of momenta and energy might show up. lp = Planck length.
See Hossenfelder & Smolin, Phenomenological Quantum Gravity. See also the problem of the Planck mass, for instance Nima Arkani-Hamed's The hierarchy problem and new dimensions at a millimeter from 1998, on arXiv. "There are two seemingly fundamental scales, the electroweak scale (~10^3 GeV) and the Planck scale (~10^18 GeV)", with a desert between them. The Planck scale is where gravity becomes as strong as the gauge interactions, and the ratio mEW/mPl (~10^-17) has been one of the greatest driving forces behind the construction of theories beyond the Standard Model (SM). The physics responsible for making a sensible quantum theory of gravity is revealed only at the Planck scale, yet "the weak scale is extremely interesting, but will never give a direct experimental handle on strong gravitational physics." In that proposal the Planck scale is not a fundamental scale; its enormity is simply a consequence of the large size of the new dimensions.

lp = sqrt(hbar · GNewton) → 0 (quantum spacetime eliminated?)
mp = sqrt(hbar / GNewton) → constant

So we have only an energy scale left: the ratio hbar/GNewton = mp² stays constant. This scale tells us the effect of energy on structure, frequency and amplitude. These are then phenomena governed by two parameters, c and the Planck energy (EPl). The velocity of light is assumed to be constant.
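To put rough numbers on these limits, here is a small sketch of my own (SI units, rounded constants). It evaluates lp, mp and EPl, and then illustrates the limit being described: scale hbar and GNewton down together, so that mp stays put while lp shrinks away.

import math

hbar = 1.055e-34      # J s
G    = 6.674e-11      # m^3 kg^-1 s^-2
c    = 2.998e8        # m/s

lp = math.sqrt(hbar * G / c**3)          # Planck length  ~ 1.6e-35 m
mp = math.sqrt(hbar * c / G)             # Planck mass    ~ 2.2e-8 kg
Ep = mp * c**2                           # Planck energy  ~ 2.0e9 J ~ 1.2e19 GeV

print(lp, mp, Ep / 1.602e-10)            # last value in GeV

# The "relative locality" limit: send hbar -> 0 and G -> 0 while keeping their
# ratio (i.e. mp) fixed; the Planck length then goes to zero but mp survives.
for scale in (1.0, 1e-10, 1e-20):
    h, g = hbar * scale, G * scale
    print(math.sqrt(h * g / c**3), math.sqrt(h * c / g))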

Assumption 1: Momentum space is more fundamental than spacetime.
How can it get deformed so that it can be measured against the scale mp? This is the first measurement that is possible. The dynamics of spacetime will be formed from mp and momentum space.

Assumption 2: The observer is local, and can only do measurements locally. In this way mp is the first observer, measuring energy and momenta. Also time. With these three we can construct the geometry of momentum space. A modified Minkowski space is deformed and curved. As mp goes to infinity, the geometry becomes flat.

Assumption 3: There is a preferred coordinate on momentum space, the ground state, and measurements are only made above this state. Ground state = 0 (homeostasis mechanism). mp can measure only the metric (rest energy and kinetic energy) and connections/relations. Connections can be defined by an algebra that determines how momenta combine when particles interact (p, q variables). The product rule (iterations, commutativity, linearity, associativity...) plus output -> input, a feedback mechanism. In this way we get an oscillation of p (and p x q), torsion and curvature away from the original momentum space.
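Here is a minimal numerical sketch of what such a non-linear combination rule could look like, in the spirit of the paper: momenta combine as (p ⊕ q)_mu = p_mu + q_mu - (1/mp) Gamma_mu^{ab} p_a q_b, where Gamma is a connection on momentum space. The particular Gamma below is an arbitrary toy choice of mine, not taken from the paper; the point is only that an asymmetric Gamma makes the rule non-commutative (torsion) and, at second order, non-associative (curvature).

import numpy as np

mp = 1.0                      # Planck mass in natural units (c = 1)
Gamma = np.zeros((4, 4, 4))   # Gamma[mu, a, b], a hypothetical connection on momentum space
Gamma[1, 0, 1] = 1.0          # asymmetric in (a, b) -> torsion -> non-commutative rule
Gamma[2, 0, 2] = 0.5

def oplus(p, q):
    """Deformed addition of two four-momenta, truncated at first order in 1/mp."""
    return p + q - np.einsum('mab,a,b->m', Gamma, p, q) / mp

p = np.array([0.3, 0.1, 0.0, 0.0])
q = np.array([0.2, 0.1, 0.2, 0.0])
k = np.array([0.1, 0.1, 0.1, 0.0])

print(oplus(p, q) - oplus(q, p))                      # non-zero: the rule is non-commutative
print(oplus(oplus(p, q), k) - oplus(p, oplus(q, k)))  # non-zero at O(1/mp^2): non-associative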

This is primitive consciousness, as in artificial intelligence? The most basic measurement+ feedback+ ground state. This is also homeostasis, self-regulation?

Translations/deformations due to the observer are generated by the laws of conservation of energy and momentum. If you look at things at a very small scale (the Planck scale), you will create a black hole (a collapse) with the energy of your measurement.

Since momentum space is curved, and Kb is non-linear, it follows that the “spacetime coordinates” of a 2-D particle translate in a way that is dependent on the energy and momenta of the particles it interacts with. This is a manifestation of the relativity of locality, i.e., local spacetime coordinates for one observer mix with energy and momenta on translation to the coordinates of a distant observer.
We will see that the meaning of the curvature of momentum space is that it implies a limitation of the usefulness of the notion that processes happen in an invariant spacetime, rather than in phase space. - Non-linear deformation of the conservation laws gives the relative locality. Contraction gives a phase space. Four worldlines emerge from the interaction.
Particles are removed from the origin/center by the observer mp. "The cotangent space based at pI and the cotangent space based at 0 are different spaces in the general curved case. This expresses mathematically the relativity of locality." - This is nothing else than ZEO.
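A small numerical check of that statement, re-using the toy ⊕ from the sketch above (again my own illustration, not the paper's model): at a vertex the conservation law K(p, q) = p ⊕ q is non-linear, and, as I read the paper, a translation by b shifts each particle's worldline endpoint by delta x^mu = b_nu dK^nu/dp_mu. Because K is non-linear, the shift of the p-particle depends on the momentum q of its interaction partner.

import numpy as np

mp = 1.0
Gamma = np.zeros((4, 4, 4))
Gamma[1, 0, 1] = 1.0          # same hypothetical connection as in the sketch above
Gamma[2, 0, 2] = 0.5

def oplus(p, q):
    return p + q - np.einsum('mab,a,b->m', Gamma, p, q) / mp

def shift_of_p(p, q, b, eps=1e-6):
    """delta x^mu = b_nu dK^nu/dp_mu, with K(p, q) = p (+) q, by numerical differentiation."""
    grad = np.zeros((4, 4))                       # grad[nu, mu] = dK^nu/dp_mu
    for mu in range(4):
        dp = np.zeros(4); dp[mu] = eps
        grad[:, mu] = (oplus(p + dp, q) - oplus(p, q)) / eps
    return b @ grad

p = np.array([0.3, 0.1, 0.0, 0.0])
q = np.array([0.2, 0.1, 0.2, 0.0])
b = np.array([0.0, 1.0, 1.0, 0.0])                # a spatial translation, for illustration

print(shift_of_p(p, q, b))        # the shift of x_p for this q
print(shift_of_p(p, 2 * q, b))    # a different q gives a different shift: relative locality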

The geometry of the spacetime.
A Zero Energy Ontology (part II): geometry arises from vacuum extremals, giving discretization.
"In fact, practically all solutions of Einstein's equations have this property very naturally. The explicit formulation emerged with the progress in the formulation of quantum TGD. In zero energy ontology physical states are creatable from vacuum and have vanishing net quantum numbers, in particular energy. Zero energy states can be decomposed to positive and negative energy parts with de nfiite geometro-temporal separation, call it T, and having interpretation in terms of initial and final states of particle reactions. Zero energy ontology is consistent with ordinary positive energy ontology at the limit when the time scale of the perception of observer is much shorter than T. One of the implications is a new view about fermions and bosons allowing to understand Higgs mechanism among other things. S-matrix as a characterizer of time-like entanglement associated with the zero energy state and a generalization of S-matrix to what might be called M-matrix emerges. M-matrix is complex square root of density matrix expressible as a product of real valued "modulus" and unitary matrix representing phase and can be seen as a matrix valued generalization of Schrödinger amplitude. Also thermodynamics becomes an inherent element of quantum theory in this approach."
"The fusion of p-adic physics and real physics to single coherent whole requires generalization of the number concept obtained by gluing reals and various p-adic number fields along common algebraic numbers. This leads to a completely new vision about how cognition and intentionality make themselves visible in real physics via long range correlations realized via the e ffective p-adicity of real physics. The success of p-adic length scale hypothesis and p-adic mass calculations suggest that cognition and intentionality are present already at elementary particle level."
Indeed, very much the same. Which came first, the chicken or the egg? So this cannot be anything new. For some reason Smolin and co. don't refer to Pitkänen. Why?
p-Adic length scale hypothesis follows if one assumes that the temporal distance between the tips of CD comes as an octave of fundamental time scale defined by the size of CP2. The "world of classical worlds" (WCW) is union of sub-WCWs associated with spaces CD x CP2 with different locations in M4 x CP2. ZEO is replaced with a book like structure obtained by gluing together infinite number of singular coverings and factor spaces of CD resp. CP2 together. The copies are glued together along a common "back" M2 x M2 of the book in the case of CD. Color rotations in CP2 produce different choices of this pair.
The same picture as Smolin and co. Superpositions. In TGD:
(hbar/hbar_0)^2 appears as a quantum scaling factor of the M4 covariant metric. Anyonic and charge fractionization effects allow to "measure" hbar(CD) and hbar(CP2) rather than only their ratio. hbar(CD) = hbar(CP2) = hbar_0 corresponds to what might be called standard physics without any anyonic effects, and visible matter is identified as this phase. Quantum TGD is reduced to parton-level (flat) objects in light-like 3-surfaces (Wilson loops?) of arbitrary size (infinite?), giving rise to an infinite-dimensional symplectic algebra. The light-likeness of partonic 3-surfaces is respected by conformal transformations of H made local with respect to the partonic 3-surface.

Units of action (E , t , c), (a)symmetry in ground state.

Planck's constant has units of energy multiplied by time, which are the units of action.
These units may also be written as momentum times distance (N·m·s), which are the units of angular momentum. In classical physics, the groundstate is Lorentz-invariant and the principles of special relativity (SR) are satisfied. So, SR breaks down at Planck scale? Those who suggest this point to the existence of a preferred cosmological rest frame. Experiments (Fermi, Kepler) are currently probing whether the Lorentz symmetry is preserved when effects of the order of the ratio of energies in the experiment to EPl are taken into account.

Kea's blog:
Ulla, no, they don't seem to understand the ZEO, although they are getting closer. At present they are 'fixing scales' and looking at infrared QG effects by letting hbar go to zero. So they might eventually appreciate Louise's law, but they will probably make up a big confusing story about how it was all their idea. As I see it, without carefully reading the paper, but assuming they are making vague sense, they are essentially fixing c (separately) in the local physics at the two locations of the GRB thought experiment. I think this is more or less OK, for the situation they are discussing, but of course the missing explanation needs to be provided eventually. Morally, they should perhaps be taking c→∞ in their scheme, since hbar→0. But Riofrio's law can still be enforced by assuming that t→0 instead, for fixed M and c, which is correct when hbar→0.

TGD: (Only one cone is broken?) - also photons and gluons become massive and eat their Higgs partners to get the longitudinal polarization they need.
WCW is a union over all possible choices of CD and pairs of geodesic spheres, so that at this level no symmetry breaking takes place. The points of M2 and S2 have a physical interpretation in terms of quantum criticality with respect to the phase transition changing Planck constant (leakage to another page of the book through the back of the book). This gives invisibility, non-commutativity and dark matter for the other cone/diamond, p. 7.

Then we enter the arena of another famous physicist, Nima Arkani-Hamed. For the previous discussion, see 'Duality for the S-matrix'.
The integrand has also been beautifully interpreted as a supersymmetric generalization of the null-polygonal Wilson-Loop, making dual superconformal invariance manifest and providing a general proof of the Wilson-Loop/Amplitude duality. Again no ref. to the alternative group.

Own words again for the same thing. Do they not understand each other?

Spacetime is doomed, says Nima Arkani-Hamed. Space-Time, Quantum Mechanics and Scattering Amplitudes.
"The most interesting thing ihave seen in my lifetime" Non-deterministic? Feynmans diagrams are about manifest locality and uncertainty. A field- locality are opposites. Gauge redundancy is all our troubles. For 60 years-where is the emergent spacetime? String theory, twistors, algebraic geometry, integrals have not given the answer. Trees, loops, groups, ? N-4 super Yang Mills simplest gauge theory of all. Unitary theory through spacetime. Emergent spacetime-emergent QM. Tree amplitudes central loops and 6 treads anchoring every second is positive, every second negative. Gravity is cyclicity parity, no spinning poles. Infinitely many hidden symmetries. Massless particles enjoy symmetry (conformal) invariance in spacetime theory. Twistor space has a point only, no line but a momentum space, (one invisible) and generate Yangian algebra. This solved the problem of determining anom dimensions in N-4 SYM, without Feynmans, and gives extensions to amplitudes, that are more physical.
Momentum conservation, parity invariant, k-plane - kn-plane is similar.
This is the ZEO, but symmetric 1-D, identical structures. Look how excited he is to discover the TGD world.
Entangled removal or loop corrections. the superposition of loops is QM. Supersymmetric Wilson Loop with perfect symmetry...
Wilson loops are problematic in TGD; they must be generalized. Knots and TGD.
Time-like braiding induces space-like braiding and one can speak of time-like or dynamical braiding and even of a duality between time-like and space-like braiding. What happens can be understood in terms of the dance metaphor.
The interpretation of string world sheets in terms of Wilson loops in 4-dimensional space-time is very natural. This raises the question whether Witten's original identification of the Jones polynomial as a vacuum expectation for a Wilson loop in 2+1-D space might be replaced with a vacuum expectation for a collection of Wilson loops in 3+1-D space-time and would characterize in the general case a (multi-)braid cobordism rather than a braid.

History: By late 1900, Planck had built his new radiation law from the ground up, having made the extraordinary assumption that energy comes in tiny, indivisible lumps. In the paper he wrote, presented to the German Physical Society on December 14, he talked about energy "as made up of a completely determinate number of finite parts" and introduced a new constant of nature, h, with the fantastically small value of about 6.6 × 10^-27 erg second. This constant, now known as Planck's constant, connects the size of a particular energy element to the frequency of the oscillators associated with that element. Something new and extraordinary had happened in physics, even if nobody immediately caught on to the fact. For the first time, someone had hinted that energy isn't continuous.


Dark matter and invisibility.
TGD is maybe most criticized for its dark matter physics. But dark matter is nothing peculiar at all in this model. Neutrino research, which has shown asymmetry, tells us the same. The majority of leptons are invisible and non-commutative.

In 'Manyfold Universe' (1999), Nima Arkani-Hamed et al. suggest a new DM particle and a new framework for the evolution of structure in our universe. (LHC bounds on large extra dimensions by A. Strumia and collaborators pose very strong constraints on large extra dimensions and on the mass and effective coupling constant parameter of a massive graviton.)
We propose that our world is a brane folded many times inside the sub-millimeter extra dimensions [with massive graviton]. The folding produces many connected parallel branes or folds with identical microphysics - a Manyfold. Nearby matter on other folds can be detected gravitationally as dark matter since the light it emits takes a long time to reach us traveling around the fold. Hence dark matter is microphysically identical to ordinary matter; it can dissipate and clump possibly forming dark replicas of ordinary stars which are good MACHO candidates. Its dissipation may lead to far more frequent occurrence of gravitational collapse and consequently to a significant enhancement in gravitational wave signals detectable by LIGO and LISA. Sterile neutrinos find a natural home on the other folds. Since the folded brane is not a BPS state, it gives a new geometric means for supersymmetry breaking in our world. It may also offer novel approach for the resolution of the cosmological horizon problem, although it still requires additional dynamics to solve the flatness problem.
This is in essence the Big Book of TGD. Interactions happen through the back of the Book. U-matrix; regions where the brane bends and its curvature is very large, in Nima's words. They can probe objects on adjacent folds gravitationally (wormholes in TGD), since they are nearby in the bulk, at sub-millimeter distances. Hence matter on other folds will appear as dark matter to us. A low energy observer on the braneworld can therefore experience two different minimal distances: gravitational, defined by minimizing the distances traveled by bulk particles, and electromagnetic, which corresponds to minimizing the distances along the brane, and around the tips of folds.

This U-matrix is what Matti now has solved in TGD. "I must say that for me the idea of large dimensions is so ugly that the results are not astonishing. Aether hypothesis was a beauty compared with this beast. Should we accept anthropic principle? No, electroweak symmetry breaking is left."
Dark matter is an anyonic phase of matter in TGD. And the redshift or blueshift is the interaction change, showing the light-cone direction. Does this interaction de facto change the speed of light a little??? Is blue or red not only an illusion? In the GRT framework the speed of light is by definition a constant in local Minkowski coordinates. It seems very difficult to make sense of a varying speed of light, since c is a purely locally defined notion. I quote from Matti's paper:
"Particle states belonging to di fferent pages of the book can interact via classical fields and by exchanging particles, such as photons, which leak between the pages of the book. This leakage means a scaling of frequency and wavelength in such a manner that energy and momentum of photon are conserved. Direct interactions in which particles from di fferent pages appear in the same vertex of generalized Feynman diagram are impossible. This seems to be enough to explain what is known about dark matter. This picture diff ers in many respects from more conventional models of dark matter making much stronger assumptions and has far reaching implications for quantum biology, which also provides support for this view about dark matter."

The basic implication of the dark matter hierarchy is a hierarchy of macroscopic quantum coherent systems covering all length scales. The presence of this hierarchy is visible as exact discrete symmetries of field bodies, reflected at the level of visible matter as broken symmetries. In the case of the gravitational interaction these symmetries are highest and also the scale of quantum coherence is astrophysical. Together with the ruler-and-compass hypothesis and the p-adic length scale hypothesis this leads to very powerful predictions, and the p-adic length scale hypothesis might reduce to the ruler-and-compass hypothesis. High-Tc superconductivity, the nuclear string model, and the 5- and 6-fold symmetries of the sugar backbone of DNA suggest that the corresponding cyclic groups, or cyclic groups having these groups as factors, are symmetries of the dark matter part of DNA, presumably consisting of what are called free electron pairs (quasiparticles) assignable to 5- and 6-cycles, giving the vision of living matter as a quantum critical system. The notion of the (magnetic) field body plays a key role in the TGD-inspired model of living matter, serving as an intentional agent controlling the behavior of the biological body. For instance, the models of EEG and of bio-control rely on this notion. The large value of the Planck constant is absolutely essential, since for a given low frequency it allows gauge boson energies above the thermal threshold. A large value of Planck constant is essential for the time mirror mechanism behind the models of metabolism, long term memory, and intentional action. The huge values of the gravitational Planck constant support the vision of Penrose about the special role of quantum gravitation in living matter.
On a blog post from Aug. 09 he writes: The TGD view leads to the explanation of standard model symmetries, elementary particle quantum numbers and the geometrization of classical fields, the dream of Einstein. The presence of the imbedding space M4×CP2 brings in the light-like geodesics of M4, for which c is maximal and by a suitable choice of units could be taken as c=1. The propagation of light from point A to B along a space-time sheet can in the TGD framework occur via several routes (each sheet gives its own value for c), which gives a reduced speed of light for sub-CDs. The light-like geodesics of M4 serve as universal comparison standards when one measures the speed of light - something which GRT does not provide. c measured in this manner increases in cosmological scales, just the opposite of what Louise Riofrio claims. The reason is that strong gravitation makes the space-time surface strongly curved and it takes more time to travel from A to B during early cosmology. More here.

Smolin used c=1 above. But if gravity is 'turned on' the curvature changes. "Momentum space is represented by covector fields and the metric induced on each fiber is dependent on the spacetime point. It can be said that relative locality is dual to gravity." And "Near zero momentum local measurements are important, coming from the uncertainty principle and QM, where a particle of energy p0 can only be localized with precision hbar/p0."

If time isn't absolute, then the velocity of light isn't absolute either, says Smolin. We need an absolute scale (invariant velocity c) to measure the velocity, and the scale of non-linearities; then there also is an absolute time (objective time in TGD). The additivity of momenta and energy implies the existence of an absolute spacetime too.
This then allows us to interchange distances and times, which makes possible the existence of an absolute spacetime, which replaces the notion of absolute space. Space itself remains, but as an observer dependent concept, because of the relativity of the simultaneity of distant events. When we contemplate weakening that to a non-linear combination rule for momenta in physical interactions, we need an invariant momentum scale. We have taken this scale to be mp but of course from a phenomenological point of view it should be taken as having a free value to be constrained by experiment. This, together with hbar makes it possible to interchange distances and momenta, which makes possible the mixing of spacetime coordinates with energy and momenta, so that the only invariant structure is the phase space. We saw above explicitly how non-linearity in conservation of energy and momentum directly forces translations of spacetime coordinates to depend on momenta and energies. Local spacetime remains, but as an observer dependent concept.
Smolin described a 2+1 D world. In 3+1 D the 'no gravity' field is governed by a topological field theory, he guesses. And if time bends, then light too? Both in a U-matrix = time travel IS possible? But that is highly controversial. Can these fields be constructed?

Nor is energy absolute. Does the first law of thermodynamics break down at the Planck scale? And the second law, concerning systems out of balance, too - complicated by gravity, while quantum gravity is unknown? Smolin & Magueijo 2002:
We propose a modification of special relativity in which a physical energy, which may be the Planck energy, joins the speed of light as an invariant, in spite of a complete relativity of inertial frames and agreement with Einstein’s theory at low energies. This is accomplished by a nonlinear modification of the action of the Lorentz group on momentum space, generated by adding a dilatation to each boost in such a way that the Planck energy remains invariant. The associated algebra has unmodified structure constants. We also discuss the resulting modifications of field theory and suggest a modification of the equivalence principle which determines how the new theory is embedded in general relativity.
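As a concrete illustration of what "a nonlinear modification of the action of the Lorentz group on momentum space" can mean, here is a small numerical sketch of the Magueijo-Smolin deformed boost as I recall it from that paper (treat the exact formula as my reconstruction, not a quotation). Units with c = 1, momenta (E, p) in 1+1 dimensions; the deformed boost reduces to an ordinary boost at low energies but leaves E = EPl invariant.

import math

Ep = 1.0   # Planck energy in these units

def ms_boost(E, p, v):
    """Deformed boost of (E, p) by velocity v, designed so that E = Ep is observer-independent."""
    g = 1.0 / math.sqrt(1.0 - v * v)
    denom = 1.0 + (g - 1.0) * E / Ep - g * v * p / Ep
    return g * (E - v * p) / denom, g * (p - v * E) / denom

# An ordinary low-energy photon (E = p = 0.001 Ep) is Doppler shifted as usual:
print(ms_boost(0.001, 0.001, 0.5))
# A Planck-energy photon (E = p = Ep) has the same energy for every observer:
for v in (0.1, 0.5, 0.9, 0.99):
    print(ms_boost(1.0, 1.0, v))

For the low-energy photon the result is just the ordinary redshift; for the Planck-energy photon every observer gets (Ep, Ep) back, which is the invariance the abstract describes.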

Fermi:
We detect for the first time a GRB prompt spectrum with a significant deviation from the Band function. This can be interpreted as two distinct spectral components, which challenge the prevailing gamma-ray emission mechanism: synchrotron - synchrotron self-Compton. The detection of a 31 GeV photon during the first second sets the highest lower limit on a GRB outflow Lorentz factor, of >1200, suggesting that the outflows powering short GRBs are at least as highly relativistic as those powering long GRBs. Even more importantly, this photon sets limits on a possible linear energy dependence of the propagation speed of photons (Lorentz-invariance violation) requiring for the first time a quantum-gravity mass scale significantly above the Planck mass.
A pregeometric phase of matter giving rise to space and time, and a geometric phase giving rise to gravity, says Hossenfelder. "High-Temperature Superconductor Spills Secret: A New Phase of Matter?" (also this). For more details see the article in Science. This phase would be present also in the superconducting phase. In TGD this phase would consist of Cooper pairs of electrons with a large value of Planck constant but associated with magnetic flux tubes of short length, so that no macroscopic supra currents would be possible.

Sanejouand : The empirical evidences in favor of the hypothesis that the speed of light decreases by a few centimeters per second each year are examined. Lunar laser ranging data are found to be consistent with this hypothesis, which also provides a straightforward explanation for the so-called Pioneer anomaly, that is, a time-dependent blue-shift observed...


Variable speed of light? Absolute speed? Absolute time?

Kea: GRB 090510 eliminates Lorentz violating theories that conclude that the speed of light depends on its energy. Observable naive Lorentz violation via frequency dependent photon speeds has been ruled out - we find that the bulk of the photons above 30 MeV arrive 258±34 ms later than those below 1 MeV, indicating some energy dependent time delay.

Riofrio: Supernova redshifts are the only evidence of cosmic acceleration. Low redshifts increase linearly with distance, showing that Space/Time expands. High redshifts increase non-linearly, leading to speculation about repulsive energies. Riofrio is the one arguing for a variable speed. "If constants like α really vary, the cosmos loses its homogeneity, and dark matter and dark energy could be different in various places." This is what is seen too.

Close coincidence in time between the onset of the "dimming of the universe" and the onset of cosmic acceleration. - Like a growing Planck constant?

One can look for a variation in the speed of light proportional to E/EPl, where E is the energy of a photon. That is, one looks for an energy-dependent speed of light (in far-away gamma ray bursts, for instance) of the form v = c(1 ± aE/EPl), where a is a dimensionless parameter to be determined. This has not been seen, so we know that the parameter a must be less than around one (at least for the minus sign). In fact a delay of the high-energy photons has been seen by Fermi; only the reason for the delay is unknown.
Several bursts have now been documented in which the higher energetic photons (>GeV) arrive with a delay of more than 10 seconds after the onset of the burst has been observed in the low energy range (keV-MeV). While it is still unclear whether this delay of the high energetic photons is caused at emission or during propagation, more statistics and a better analysis - in particular about the delay’s dependence on the distance to the source – will eventually allow to narrow down the possible causes and constrain models that give rise to such features.
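A back-of-the-envelope version of that estimate (my own sketch), using the 31 GeV photon mentioned earlier and a hypothetical round-number distance of 3 Gpc, roughly the scale of a burst like GRB 090510:

E_photon = 31.0          # GeV
E_planck = 1.22e19       # GeV
D = 3.0e9 * 3.086e16     # 3 Gpc in metres (illustrative distance, not a measured value)
c = 2.998e8              # m/s

dt = (E_photon / E_planck) * D / c     # seconds, for a = 1
print(dt)                              # ~0.8 s

With a of order one the naive propagation delay comes out near a second, which is why measured arrival-time differences at the level of tens to hundreds of milliseconds translate into bounds on a around or below one.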

Lubos Motl, Fermi kills all Lorentz-violating theories, doesn't want to believe this - he calls Hossenfelder and 'light-varying believers' less pleasant things. He doesn't like Loop quantum gravity either, nor TGD.
Objections to loop quantum gravity
Lorentz violation and deformed special relativity
MAGIC: dispersion of gamma rays?
MAGIC: rational arguments vs propaganda
Aether compactification
Relativistic phobia
Testing E=mc2 for centuries
Lorentz violation makes perpetuum mobile possible
Well, for one thing, we expect the quantized spacetime to manifest itself as minute differences in the speed of light for different colors... Leslie Winkle. A serious thing :) And Hossenfelder's answer.
On Kea's blog:
The May 2009 burst was shorter and the photon(s) had higher energies, which allowed the coefficient of the Lorentz violation to be measured with much better accuracy, and it is zero - a zero 100 times more accurate than needed to prove that Lorentz invariance holds at the Planck scale. The hypothesis that the delay arises on the journey would predict about 100 times bigger delay than the upper bound of the delay recently seen by Fermi.
Every theory that violates Lorentz symmetry implies that the speed of light is not universal. Special relativity, including Lorentz symmetry, was derived just from two postulates - the equivalence of all inertial frames and the constancy of the speed of light. A constant, energy-independent speed of light in a Lorentz-violating theory would require an infinite amount of fine-tuning (made with the assumption of constant speed of light).

Kea: All the photons we look at here are travelling at speed c. But we didn't observe all those GRB photons on their way here. And like in any simple 2 slit experiment, maybe they took many different paths. Maybe the path weighting for the higher energy ones gives a greater probability that these photons are slowed down more (by gravity?).
You see, Lubos talks for the need of an infinite hierarchy :)

In the frame of an absolute speed of light versus a subjective speed of light, maybe what she says makes sense? But the transparency of the Universe indicates there must then be some transformations of the photons (to invisibility?). And biophotons, photoassimilation, electron excitation - all these small events must influence the speed a little. The massless photon itself is also problematic. It must be a quasiparticle for the equations to work. Also in TGD it must have a minute mass, for the massivation from the Higgs mechanism.

In what sense c could be changing in solar system

In General Relativity, the ideas of Lorentz invariance and absolute time are in conflict, Matti points out. He says:
For Lorentz invariance to hold in this scenario, it has to be exactly the same for all observers. Thus, this idea either breaks Lorentz invariance by 1) assuming absolute time for all observers or 2) resulting in the speed of light being different for observers in different frames. Many-sheeted space-time allows variation of light-velocity in this sense without breaking of Lorentz invariance. In cosmic scales the light-velocity measured in this manner increases slowly (the time taken to travel along curved space-time surface decreases since it becomes less curved as it flattens so that c approaches its maximal value) . Solar system space-time sheet is predicted to not participate in expansion except possibly by rapid phase transitions and this is known to be true. This implies apparent reduction of c since the standard to which one compares increases slowly.

The Planck ratio.
The factor by which the Planck mass is bigger than the particle masses is about the same factor by which atoms are bigger than the Planck length (wouldn't it be weird if the Bohr radius were about as small as the Planck length?), and also, if the Planck mass weren't a great deal bigger, then interparticle gravitation would be an issue in the physics of the atom (wouldn't that be weird?). Rough numbers are in the sketch after the quote below. This is where both quantum and classical physics are born? TGD:
Non-commutative physics would be interpreted in terms of a finite measurement resolution rather than something emerging below Planck length scale. An important implication is that a finite measurement sequence can never completely reduce quantum entanglement so that entire universe would necessarily be an organic whole. Topologically condensed space-time sheets could be seen as correlates for sub-factors which correspond to degrees of freedom below measurement resolution.
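Going back to the Planck-ratio comparison above, here are rough numbers (my own sketch; I use the electron mass, since that is the mass that sets the Bohr radius):

alpha         = 1 / 137.036
bohr_radius   = 5.29e-11      # m
planck_len    = 1.62e-35      # m
electron_mass = 0.511e-3      # GeV
planck_mass   = 1.22e19       # GeV

print(bohr_radius / planck_len)        # ~3.3e24 : atoms vs the Planck length
print(planck_mass / electron_mass)     # ~2.4e22 : Planck mass vs a particle mass
# The two ratios agree up to a factor 1/alpha ~ 137, since a_Bohr = hbar/(alpha m_e c):
print((bohr_radius / planck_len) / (planck_mass / electron_mass))

So the two "hierarchies" are indeed the same to within a couple of orders of magnitude, differing only by the fine structure constant.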

Since the gravitational Planck constant is proportional to the product of the gravitational masses of interacting systems, it must be assigned to the field body of the two systems and characterizes the interaction between systems rather than systems themselves. This observation applies quite generally and each field body of the system (em, weak, color, gravitational) is characterized by its own Planck constant. A further fascinating possibility is that the observed indications for Bohr orbit quantization of planetary orbits could have interpretation in terms of gigantic Planck constant for underlying dark matter so that macroscopic and -temporal quantum coherence would be possible in astrophysical length scales manifesting itself in many manners.
The magnetic body too is a field body.

Smolin:
Last but not least, momentum spaces of constant curvature find their natural application as a model of Doubly (or Deformed) Special Relativity (DSR), whose second observer-independent scale is naturally associated with the curvature of the momentum space. These formulations were closely related to non-commutative geometry. A second formulation of DSR expressed the same idea as an energy dependence of spacetime. For a long time it has been suspected that these were different ways of formulating the same theories; the developments of this paper show how an energy-dependent metric and non-commutative spacetime coordinates are different ways of expressing a deeper idea, which is relative locality.

And the quintessence, wormholes, black holes...
The Planck mass divided by the Planck volume (mp/lp³) may vary in reality, and would be the greatest possible meaningful density of mass/energy, which might be the same as the density of the universe one Planck time after the Big Bang. The Planck mass is the largest possible mass that can fit in the smallest meaningful volume of space, which would be about equal to the Planck length cubed (superposition?); the Planck scale is not just about a limit on smallness, but largeness as well.

A Planck-mass black hole is not a tree-level classical particle such as an electron or a quark, but a quantum entity resulting from the Many-Worlds quantum sum over histories at a single point in spacetime. The Planck mass is the mass required to form a black hole with an event horizon of a Planck length - which basically means nothing smaller than this can collapse into a black hole. This is a good thing if you consider the incredible density of an atomic nucleus. Why is there a mass limit in some natural way? If you are imagining gravity as the curvature of spacetime, then the Planck mass is where spacetime gets so warped it gets closed off. But then QM uncertainty/compton wavelength has to be invoked as to why there is not then a complete collapse of spacetime to a singularity. Linked to gravitons?

For any given length, there is only one possible mass that will give a black hole whose radius is exactly that length (see the formula for Schwarzschild radius as a function of mass here). The Planck mass represents the minimum for the size of a black hole, since anything smaller would be a black hole smaller than a Planck length, which probably wouldn't even make sense according to quantum gravity. But the Planck mass is probably also the maximum mass that can be packed into a physically meaningful unit of space - if you try to add mass, you'll just get a black hole with a radius larger than the Planck length, whose density will be lower.
Talk of the Planck mass is misleading here. It is the Planck mass density that is large and maximal. The Planck mass density comes out at about 5 × 10^96 kilograms per cubic metre (roughly 10^93 grams per cubic centimetre).
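A quick check of those two claims with rounded SI constants (my own sketch): the Schwarzschild radius of one Planck mass is of the order of the Planck length, and one Planck mass per Planck volume gives the enormous density quoted above.

import math

hbar = 1.055e-34; G = 6.674e-11; c = 2.998e8

lp = math.sqrt(hbar * G / c**3)        # ~1.6e-35 m
mp = math.sqrt(hbar * c / G)           # ~2.2e-8 kg

r_s = 2 * G * mp / c**2                # Schwarzschild radius of a Planck mass
print(r_s, r_s / lp)                   # ~3.2e-35 m, i.e. about 2 Planck lengths

rho = mp / lp**3                       # "maximum" density: one Planck mass per Planck volume
print(rho)                             # ~5e96 kg/m^3  (~5e93 g/cm^3)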

The value for mp given in the Particle Data Group's 1998 review is 1.221 x 10^19 GeV. mp is an enormously large mass scale, just as lp is an absurdly small length scale.

And the Planck temperature [the highest possible temperature] corresponds to radiation with a wavelength of about a Planck length. Not coincidentally, this happens to be about the same as the temperature of the universe one Planck time after the big bang. Time-temperature ratio?
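A rough check of that statement via Wien's displacement law (my own sketch): the temperature whose thermal radiation peaks at about one Planck length of wavelength indeed comes out at the order of the Planck temperature.

b  = 2.898e-3       # Wien displacement constant, m K
lp = 1.616e-35      # Planck length, m

print(b / lp)       # ~1.8e32 K, the same order as the Planck temperature ~1.4e32 K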

To put this in terms of an ancient dichotomy, the Planck scale is the smallest scale for "form" - it is the smallest coherent unit of spacetime. And also the largest scale for "substance" - the density of energy/mass/temperature that can fit into a unit of spacetime.
Gravity = spacetime = form. Mass/energy/temperature as such do not feature in this explanatory chain. The alternative formulation yields "substance" - mass = wavelength = substance.

Physics Forum: Think of the Planck scale as a complex package that gives you both upper and lower limits. Then as the Universe expands, both the form and the substance "fall" towards their other extreme. So expansion increases spacetime - the form is heading towards maximum largeness. Yet at the same time the substance, the energy density of the Universe, is falling towards its lowest possible level, its lowest possible temperature. The "unpacking" of the Planck scale is thus a move from a hot point to a cold void. What was small at the beginning grows large, and what was large grows small - so a conservation of scale going on.

You get the Planck energy out of lightspeed considerations. The shorter the wavelength, the higher the energy. So if you have a world as small as the Planck scale, there is only room for a single crisp oscillation of about the Planck length. This would be the reason for a crisp upper bound on energy/mass/temperature. If anything "substantial" is happening at the Planck-scale, it would have to start at the shortest possible wavelength, and thus the highest possible energy level. Stress-tension, zero point or vacuum energy? Consciousness?

TGD:
The wormhole contact connects two space-time sheets with induced metric having Minkowski signature. The wormhole contact itself has an Euclidian metric signature, so that there are two wormhole throats which are light-like 3-surfaces and would carry fermion and anti-fermion number. In this case a delicate question is whether the space-time sheets connected by wormhole contacts have opposite time orientations or not. If this is the case the two fermions would correspond to positive and negative energy particles. First only the Higgs was considered as a wormhole contact, but there is no reason why this identification should not apply also to gauge bosons (certainly not to gravitons). p. 16
Graviton-graviton pairs might emerge in higher orders. Gravitons are stable if the throats of wormhole contacts carry non-vanishing gauge fluxes, so that the throats of wormhole contacts are connected by flux tubes carrying the gauge flux. The mechanism producing gravitons would be the splitting of partonic 2-surfaces via the basic vertex. A connection with the string picture emerges, with the counterpart of the string identified as the flux tube connecting the wormhole throats. The gravitational constant would relate directly to the value of the string tension. p. 17

Winterberg writes in Planck Mass Rotons as Cold Dark Matter and Quintessence, 2002 (filed under Analog Models of General Relativity):
According to the Planck aether hypothesis , the vacuum of space is a superfluid made up of Planck mass particles, with the particles of the standard model explained as quasiparticle – excitations of this superfluid. Astrophysical data suggests that ≈70% of the vacuum energy, called quintessence, is a negative pressure medium, with ≈26% cold dark matter and the remaining ≈4% baryonic matter and radiation. This division in parts is about the same as for rotons in superfluid helium, in terms of the Debye energy with a ≈70% energy gap and ≈25% kinetic energy. Having the structure of small vortices, the rotons act like a caviton fluid with a negative pressure. Replacing the Debye energy with the Planck energy, it is conjectured that cold dark matter and quintessence are Planck mass rotons with an energy below the Planck energy.

And an article 'Superconductivity from nowhere' in Physicsworld:
Unlike previously known superconductivity, it would survive at very high temperatures, perhaps billions of degrees. It would also exist alongside strong magnetic fields and, perhaps strangest of all, it wouldn't need a material to exist – just a vacuum. An up quark and a down antiquark can bind to form a positively charged rho meson, but the meson is normally so unstable that it decays. Now they think that in a strong magnetic field the quarks would be forced to move only along the field lines – and this would make the rho mesons far more stable. In addition, the rho meson's own spin would interact with the external magnetic field, lowering the particle's effective mass to zero so that it can move freely, as in a superconductor. The external magnetic field required for this superconductivity must be at least 10^16 T. This is done only in Colliders.
Vacuum superconductivity might not always need particle accelerators, however. The early universe might have had sufficiently strong magnetic fields, and the subsequent super-currents might have seeded the mysterious large-scale magnetic fields seen across the universe today. "It sounds like a crazy idea, but what if it is true?"
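The 10^16 tesla figure quoted above can be recovered with a back-of-the-envelope estimate (my own reading of the claim, not the paper's derivation): the critical field is roughly the one whose magnetic energy scale matches the rho meson mass, eB_c ≈ m_rho² in natural units.

m_rho = 775.5e6 * 1.602e-19    # rho meson rest energy, J
e     = 1.602e-19              # C
hbar  = 1.055e-34              # J s
c     = 2.998e8                # m/s

B_c = m_rho**2 / (e * hbar * c**2)   # eB_c = m_rho^2 (natural units) rewritten in SI
print(B_c)                           # ~1e16 T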
Zero point energy is adiabatic and expanding, according to the SM. Maybe this is an explanation? It would fit nicely into this theory. Frank Wilczek also talks of a 'grid' (Wilczek's grid), a vibrant energy field, in "The Lightness of Being".
"The Grid fills space, and is full of spontaneous activity. In some ways it resembles the old idea of “ether”. But the Grid is highly evolved ether, ether on steroids if you like, with many new features. We live inside a medium and that we're all connected. A low-energy supersymmetry, the color superconducting phases of quark matter. In modern physics, energy and space are much more fundamental than mass. That's part of the message. The traditional idea that there are stable, static bodies that are massive and hard to push around has been replaced by a much more fluid concept. Fields are more basic than particles."
There's much more to the world than what our sensory apparatus has evolved to react to. We have more than five senses?
Interactions in virtually empty space give rise to the substance of subatomic particles and complex molecules, and mass. The Grid is permeated with a not-yet-understood property that "slows down" some of the interactions in the field, just as electrons are slowed down in a superconducting medium. In the medium known as the Grid, we perceive that slowed-down quality as mass. It's usually called the Higgs field. We don't know what it's made of. We know it's not made of any of the known forms of matter; they don't have the right properties. So the simplest possibility, logically, is that it's made out of one new thing, and those would be Higgs particles. But I think you get a nicer theory by embedding it in a larger framework, where it's made out of several things. The whole barrier between light and matter has fallen. The underlying reality is much closer to the traditional concept of light than the traditional concept of matter.
This revelation about matter is not only satisfying, but it also opens new doors, says Wilczek.
Wilczek has helped to reveal and develop axions, anyons, asymptotic freedom, the color superconducting phases of quark matter, and other aspects of quantum field theory. He has worked on an unusually wide range of topics, ranging across condensed matter physics, astrophysics, and particle physics.
Axions are very light, very weakly interacting particles, and should contribute much of the dark matter. Axions have been lurking unrecognized on surfaces of bismuth-tin alloys and other materials. To be more precise: the equations that arise in axion physics are the same as those that describe the electromagnetic behaviour of a recently discovered class of materials known, collectively, as topological insulators. The axion field inside topological insulators is an emergent — and subtle — property of collections of electrons that is connected to their spin–orbit coupling.
And “topologically ordered phases” do not exhibit any symmetry breaking. Ordered phases of matter such as a superfluid or a ferromagnet are usually associated with the breaking of a symmetry, and are sensitive to order parameters. The classic experimental probe of topological quantum numbers is magneto-transport, where measurement of the quantization of the quantum Hall conductivity σxy = ne²/h (where e is the electric charge and h is Planck's constant) reveals the value of the topological number n that characterizes the quantum Hall effect state.
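For orientation, the conductance quantum that sets those Hall plateaus is e²/h, about 3.9 × 10^-5 siemens (a small sketch of my own):

e = 1.602e-19   # C
h = 6.626e-34   # J s

for n in (1, 2, 3):
    print(n, n * e**2 / h)   # sigma_xy = n e^2/h, in siemens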

At last:
To Lubos: remember that I am no physicist. I just cite and recombine. With over 100 theories this is a mess, I suppose not only for me.
Are these people on the right track? At least they cannot be brushed aside.


References.
Nima Arkani-Hamed, 2010: Space-Time, Quantum Mechanics and Scattering Amplitudes.
- 'Duality for the S-matrix'.
Nima Arkani-Hamed et al. 1999: 'Manyfold Universe' JHEP 0012 (2000) 010 DOI 10.1088/1126-6708/2000/12/010
Nima Arkani-Hamed & co. 1998: The hierarchy problem and new dimensions at a millimeter, on arXiv.
Fermi GBM/LAT Collaborations, 2009: Testing Einstein's special relativity with Fermi's short hard gamma-ray burst GRB090510.
Shou-Cheng Zhang and colleagues (X.-L. Qi et al. Phys. Rev. B78, 195424; 2008)
Lee Smolin 2011: The principle of relative locality.
Frank Wilczek 2005: In search of symmetry lost. Nature 433, 239-247, 20 January 2005. Nature Publishing Group.
- Wilczek on anyons and superconductivity, 1991.
Winterberg 2002: Planck Mass Rotons as Cold Dark Matter and Quintessence.
Superconductivity from nowhere. Physics World, Mar 29, 2011.
M.Pitkänen, 2010: General View About Physics in Many-Sheeted Space-Time: Part II. 73 p.
- Macroscopic quantum phenomena and CP2 geometry
- TGD and Astrophysics
- Knots and TGD. 2011
S. Hossenfelder, Lee Smolin, 2009: Phenomenological quantum gravity.
D. Hsieh et al.2009: Observation of topologically protected Dirac spin-textures and pi Berry’s phase in pure Antimony (Sb) and topological insulator BiSb. SCIENCE 323, 919 (2009). http://dx.doi.org/10.1126/science.1167733
LHC bounds on large extra dimensions by A. Strumia and collaborators 2011

and:
Einstein 1905: “Does the Inertia of a Body Depend on Its Energy Content?” m = E/c².
S. Hossenfelder, Bounds on an energy-dependent and observer-independent speed of light from violations of locality, Phys. Rev. Lett. 104 (2010) 140402. [arXiv:1004.0418 [hep-ph]].
S. Majid, Meaning of noncommutative geometry and the Planck-scale quantum group, Lect. Notes Phys. 541 (2000) 227 [arXiv:hep-th/0006166].

Thursday, 24 March 2011

Richard Feinman, the Other

An interesting blog, and name :) Only a 'y' differentiates him from the other Feynman.
Feinman argues for a low-carbohydrate diet.

Dr. Feinman is Professor of Biochemistry at Downstate Medical Center (SUNY) in New York. Dr. Feinman’s original area of research was in protein chemistry and enzyme mechanism. His current interest is Nutrition and Metabolism, specifically in the area of diet composition and energy balance. His work in this area is stimulated by, and continues to influence, his teaching in the Medical School where he has been a pioneer in incorporating nutrition into the biochemistry curriculum.

Nonequilibrium thermodynamics and energy efficiency in weight loss diets

Interesting.

Thermodynamics of weight loss diets
Background

It is commonly held that "a calorie is a calorie", i.e. that diets of equal caloric content will result in identical weight change independent of macronutrient composition, and appeal is frequently made to the laws of thermodynamics. We have previously shown that thermodynamics does not support such a view and that diets of different macronutrient content may be expected to induce different changes in body mass. Low carbohydrate diets in particular have claimed a "metabolic advantage" meaning more weight loss than in isocaloric diets of higher carbohydrate content. In this review, for pedagogic clarity, we reframe the theoretical discussion to directly link thermodynamic inefficiency to weight change. The problem in outline: Is metabolic advantage theoretically possible? If so, what biochemical mechanisms might plausibly explain it? Finally, what experimental evidence exists to determine whether it does or does not occur?

Results

Reduced thermodynamic efficiency will result in increased weight loss. The laws of thermodynamics are silent on the existence of variable thermodynamic efficiency in metabolic processes. Therefore such variability is permitted and can be related to differences in weight lost. The existence of variable efficiency and metabolic advantage is therefore an empiric question rather than a theoretical one, confirmed by many experimental isocaloric studies, pending a properly performed meta-analysis. Mechanisms are as yet unknown, but plausible mechanisms at the metabolic level are proposed.

Conclusions

Variable thermodynamic efficiency due to dietary manipulation is permitted by physical laws, is supported by much experimental data, and may be reasonably explained by plausible mechanisms.

Complete here.
And collaborator - Department of Nuclear Medicine, Jacobi Medical Center.
OOPS? Something New?

Tuesday, 22 March 2011

Life is science’s new focus?

On discovering Life.

Dimitar Sasselov is the Bulgarian director of the Origins of Life Initiative at Harvard University. It studies everything from planet formation and detection to the origin and early evolution of life, and the research forms a natural bridge between the physical and the life sciences. Focuses on radiation and matter interference.

Yet as these cosmic-scale projects, the LHC and NASA's Kepler mission, open the second decade of the new millennium, they are returning science to a frontier that seems oddly 19th century. Science is going back to the scale of life—that middle ground of minute energies and high complexities between the immense galaxies and infinitesimal particles. Two separate quests, one to discover habitable worlds, the other to synthesize artificial organisms, now unite to redefine “life” and its place in the universe.

Quote:
There is an aspect of life sciences that has been largely absent: the confrontation of fundamental questions of biology much as particle accelerators grapple with fundamental questions of physics. The roll call of early pioneers and prospectors is notable, but short. Fortunately, increasing numbers of researchers are now re-entering this fertile frontier.

The open secret of this emerging frontier is that we do not have a fundamental definition or understanding of life. Similarly, we do not understand life’s origins, how life emerges from chemistry. We do know that the chemistry of life on Earth, or “Terran” biochemistry for short, is rather restrictive in its molecular permutations. Unnecessarily so, it seems, given the enormous choice of good options provided by chemistry for building biological bodies and functions. However, we do not know whether nature or nurture is the reason. The bio-chemistry we see (and are!) could be universal, like gravity, where the same basic rules apply anywhere. Or our biochemistry could instead be one of many options, one that just happened to fit Earth’s environmental conditions. /quote.


Alternative biochemistries are needed, says the article. Words, words...

Two simultaneous but distinct approaches have defined the work on the origins and biochemical diversity of life. One approach is from within, "RNA-first", following paths that begin with existing Terran biochemistry and move away from its set of molecules and networks in search for alternatives. The other approach is from outside, "Metabolism-first", following paths from plausible prebiotic initial conditions. Both have scored recent breakthroughs.


RNA WORLD.
John D. Sutherland's lab here (University of Manchester), in a brilliant example of systems chemistry, has performed a synthesis of nucleotides (the building blocks of DNA and RNA) in which two of a nucleotide's crucial parts, the base and the sugar, emerge as a single unit under natural conditions. The RNA-world hypothesis.
RNA is for life: The origin of life on Earth required — at some point — the synthesis of a genetic polymer from simple chemicals. The leading candidate for this role is RNA, but although 'activated' ribonucleotide molecules (the building blocks of RNA) can polymerize without enzymes, no plausible route had been found by which the ribonucleotides could have formed. Now a team from the University of Manchester has found such a route. They also show that a widely held assumption about ribonucleotide synthesis — that the molecules formed from pre-existing sugar molecules and RNA bases — isn't necessary for RNA to have formed on prebiotic Earth.
Did life begin with RNA? An RNA polymer is a string of ribonucleotides, each made up of three distinct parts: a ribose sugar, a phosphate group and a base — either cytosine or uracil, known as pyrimidines, or the purines guanine or adenine. Origins of Life: "A new way of looking at the synthesis of RNA sidesteps a thorny problem in the field." The traditional view is that the ribose sugar and nucleobase components of ribonucleotides formed separately, and then combined. But no plausible reactions have been found in which the two components could have joined together. Powner et al. show that a single 2-aminooxazole intermediate could have contributed atoms to both the sugar and nucleobase portions of pyrimidine ribonucleotides, so that components did not have to form separately.

Powner, M. W., Gerland, B. & Sutherland, J. D. Synthesis of activated pyrimidine ribonucleotides in prebiotically plausible conditions. Nature 459, 239-242 2009.

" Although inorganic phosphate is only incorporated into the nucleotides at a late stage of the sequence, its presence from the start is essential as it controls three reactions in the earlier stages by acting as a general acid/base catalyst, a nucleophilic catalyst, a pH buffer and a chemical buffer. For prebiotic reaction sequences, our results highlight the importance of working with mixed chemical systems in which reactants for a particular reaction step can also control other steps."


Moving in the opposite direction, George Church’s lab at Harvard has achieved the successful synthesis of functioning ribosomes — the molecular machines that read genetic code and make the proteins for cells. Synthetic biology - and the Personal Genome Project, a new era of individualized medicine, in which drug treatments and other therapies can be optimized by custom - matching them with a person's unique genetic makeup. He helped initiate the Human Genome Project in 1984. Engineering Life: Building a FAB for Biology

Today scientists have learned how to write genetic code, and as described by J. Craig Venter, such state-of-the-art biotechnology work is “creating software that makes its own hardware.” Venter said this in 2008 when reporting his team’s successful artificial transformation of one bacterium species into another. The synthesis of ribosomes of your choice is a big deal, because, to borrow computer jargon, it allows you to change the “operating system” when writing new genetic code. The next major step beyond modifying Terran organisms is to create alternative biochemistries and entirely new trees of life. And these new trees are supposed to help us with the energy question. Both Church and Venter are linked to biofuel companies.

Chirality.
One example of an alternative biochemistry that is both intuitive and relatively close to fruition is the case of “mirror” life — that is, life with biochemistry essentially identical to our own, but composed of molecules of the opposite chirality, or “handedness.” Terran biochemistry is based exclusively on proteins built from “left-handed” amino acids; for balance, all Terran sugars are right-handed. Scientists understand why organisms can’t be chirally ambidextrous, with equal parts left- and right-handed proteins, but nobody knows whether the left-versus-right choice is a matter of chance or necessity.

This is the most essential difference from ordinary matter, though. Why is chirality so important for Life?

And boundaries...
Cells were not primary, but a result of function. Maybe the immunological question is also essential.

Mirror life will not lead to an artificial life form without the necessary next step of “compartmentalization” - any self-sustaining biochemistry needs a container to hold it. On Earth, cells are the containers — their semi-permeable membranes encapsulate all the biochemical machinery life needs. Jack Szostak’s lab at Harvard has shown how these membranes can naturally form to create “protocells” and how these protocells can even spontaneously reproduce by splitting into two or more protocells. Szostak’s constructs seem tantalizingly close to real cells.

Speed and change...
The other essential component is the emergence of proteins. They are often catalysts, speeding up reactions. Thomas Cech's 1986 findings explain RNA's function as an enzyme in addition to being an informational molecule (Biological catalysis by RNA, 1986). In 2004 he suggested that the RNA serves a very different function, providing a flexible tether for the protein subunits. Telomerase can switch between inhibition, shelterin and extension, he writes with collaborators in 2006 and 2007; The RNA world: the nature of modern RNA suggests a prebiotic RNA world, a link between RNA and proteins. Remember the prebiotic reducing atmosphere, whose principal components were CH4 and/or CO2 (carbon), NH3 and H2O. Could the so-called hot carbon-nitrogen-oxygen cycle function also on Earth, but slowly? Is that burning process what we call Life? Fractal inflation, says Chris King.
Traditional chemistry, despite its quantum foundations, treats molecules as arbitrary building blocks. This view is incorrect when non-linearity and dynamical feedback are taken into account. The origin of life is dependent on dynamical processes of free interaction, not forced reactions and involves fundamental interactive quantum bifurcations and feedback effects characteristic of non-linear dynamical systems. Major features of metabolism, including the role of nucleotides and polypeptides, light-absorbing chromophores, phosphate dehydration energy, RNA, the major features of the genetic code, Fe-S groups, ion and electron transport, phosphorylation and the citric acid cycle are all described as being generic features of a cosmically general bifurcation tree. The rich diversity of structure in molecular systems is made possible by the profound asymmetries developing at the cosmic origin, between the nuclear forces, gravity and electromagnetism. Chemical bonding is a consequence of the non-linear inverse square law of electromagnetic charge interaction in space-time.
Add to this the topologically quantized classical fields in TGD. DNA nucleotides are stable only inside regions containing ordered or liquid-crystal water forming a macroscopic quantum phase. The ‘protein folding problem’ is genetically coded, set in a potential energy (entropy) environment, with potentially quantum computation by the molecular orbitals; proteins form fractal structures both in the geometry of their primary, secondary, tertiary and quaternary structures and in their active dynamics, as illustrated by the fractal dynamics of myoglobin and ion channels.
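
To make "bifurcation tree" concrete, here is the standard textbook toy (the logistic map), not King's biochemical model: one non-linear feedback rule whose long-run behaviour splits from one state into 2, 4, 8 and finally many states as a single control parameter is tuned.

# Generic illustration only (textbook logistic map, not King's model):
# one non-linear feedback rule, x -> r*x*(1-x); tuning r makes the single
# long-run state split repeatedly - a simple "bifurcation tree".

def long_run_states(r, transients=2000, samples=64):
    x = 0.5
    for _ in range(transients):      # discard transient behaviour
        x = r * x * (1.0 - x)
    states = set()
    for _ in range(samples):         # sample the long-run attractor
        x = r * x * (1.0 - x)
        states.add(round(x, 4))
    return len(states)

for r in (2.9, 3.2, 3.5, 3.56, 3.9):
    print(f"r = {r}: {long_run_states(r)} distinct long-run state(s)")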

Radoslav Bozov has detected a "Carbon Signaling System": "Theory of Carbon Signaling. Negentropy vs Entropy - Emergence of Self Propagated Biological Systems", in the book "Recent Researches in Modern Medicine", ISI, SCI Web of Science and Web of Knowledge, Cambridge, UK, 2011, pp. 98-114; ISBN: 978-960-474-278-3, here.
He wrote to me when I asked for a comment:
The emergence of carbon signaling systems requires a self-propagated mechanism linking mutational forces as a function of asymmetric dicarboxylic acid oxidation and glycine/pyruvate amino-transferring systems. Proteins only lower the activation energy of transitional intermediates of interacting substrates without changing the form of the initial interfering matter, thus speeding up reactions. Simply put, cyanide is a substrate not used by biological systems; it is a toxic compound used for gold refining. What happened back then was a very slow reaction that took much time until protein emerged. It is likely that transcription factor proteins emerged first as carbon-transferring systems, prior to membrane formation and compartmentalization of space.
Proteins are powerful catalysts partly because, as well as having active foci, which can invoke effects such as quantum tunnelling, enzymes bring to bear a global coherence of action, arising from cooperative weak bonding, which makes for both very powerful and responsive active sites. The resonances.

Polymerization/dehydration is linked to lipids (carbon chains), but polymers are not stable in an aqueous environment. Polymerization in modern cells is basically a process involving metabolic control, and it seems that metabolic control must have been present from the beginning in some primitive form (homeostasis?).

Mutations and constraints.
MistakeMakingMachines.
Gerald Joyce at the Scripps Research Institute together with Tracey Lincoln made paired RNA catalysts, each of which could assemble the other when supplied with the right building blocks. Then the scientists mixed the paired molecules with RNA building blocks in test tubes. Because the RNA 'enzymes' were not perfect, and made different forms of each other, the original pairs mutated into new, 'recombinant' forms that out-competed the originals. The 'winning' enzymes changed depending on the conditions in the reaction mixture, such as the concentration of various RNA building blocks. Joyce's group had already made enzymes capable of catalyzing their own replication, but they could only reproduce themselves a limited number of times. The new enzymes can reproduce themselves indefinitely. "This is the first time outside of biology where you have immortalized molecular information," says Joyce. 'Hypercycles' — networks of enzymes that replicate each other — could give rise to self-sustaining populations of early life forms.
Lincoln, T. A. & Joyce, G. F. Science, doi:10.1126/science.1167856 (2009).
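
As a toy illustration of the selection dynamics just described (a sketch of the idea only, not the Lincoln-Joyce chemistry), one can simulate error-prone replicators competing for a fixed supply of building blocks: imperfect copies appear as 'recombinant' variants, and whichever variant replicates fastest under the current conditions takes over, just as the 'winning' enzymes depended on the reaction mixture.

import random

# Toy sketch, not the Lincoln & Joyce system: replicating enzyme pairs with
# variant-specific copying rates compete for a fixed supply of RNA building
# blocks; copying errors create 'recombinant' variants with new rates.
random.seed(0)
CAPACITY = 1000.0     # building blocks available per generation
MUTATION_P = 0.02     # fraction of copies that are imperfect

pairs = {1.0: CAPACITY}   # replication rate -> abundance; only the original pair

def generation(pairs):
    offspring = {}
    for rate, n in pairs.items():
        made = n * rate                               # faster pairs make more copies
        offspring[rate] = offspring.get(rate, 0.0) + made * (1.0 - MUTATION_P)
        new_rate = rate * random.uniform(0.8, 1.5)    # an imperfect, 'recombinant' copy
        offspring[new_rate] = offspring.get(new_rate, 0.0) + made * MUTATION_P
    total = sum(offspring.values())
    scaled = {r: n * CAPACITY / total for r, n in offspring.items()}
    return {r: n for r, n in scaled.items() if n > 1.0}   # prune very rare variants

for _ in range(300):
    pairs = generation(pairs)

best = max(pairs, key=pairs.get)
print(f"dominant replication rate after 300 generations: {best:.2f}")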

What is Life?
We can forget the famous criteria of Life, such as replication. Viruses are commonly seen as living today.
The Steven Benner group, 2011: Setting the Stage: The History, Chemistry, and Geobiology behind RNA, in RNA Worlds: From Life's Origins to Diversity in Gene Regulation. A definition-theory for Life is needed to develop hypotheses relating to the "RNA-first" model: paleogenetics, prebiotic chemistry, exploration independent of Terran life, and synthetic biology. Are we missing something in our models of Reality?

I would say quantum biology is missing. The essence of mind-matter? Or evolution itself? The non-linearity of charge interaction which causes chemical bonding also gives rise to further residual interactions at lower energies, which are resolved by cooperative weak bonding.

While Venter is using a top–down approach — trying to 'boot up' a cell with an entirely synthesized genome — Joyce and Szostak take a bottom–up strategy by attempting to recreate the events that could have led to the existence of genes, cells and life as it is now. The next major step would be to create a system that doesn't just do the same thing over and over, but can evolve the ability to perform new tasks. The evolution.

Szostak won the 2009 Nobel Prize in medicine for work on telomerase, with co-winners and collaborators Elizabeth H. Blackburn and Carol W. Greider. Life on Earth, SciAm 9, 2009.

Other models...
There are also other RNA models, many-sheeted spacetime in TGD, the evolution of life in an intra-terrestrial environment, and models where life came from space. Robert Shapiro: Small molecule interaction..., "metabolism first"; A Simpler Origin for Life, SciAm, 2007. Shapiro's suggested readings and here.

The Origin of Life, according to A. G. Cairns-Smith, started with crystals and clay minerals. Chemistry and the Missing Era of Evolution, Chemistry 2008;14(13):3830-9.
Stuart Kauffman belongs to the same camp, with random networks exhibiting a kind of self-organization that he terms "order for free". Self-organization and selection must be combined (The Third Culture, 1995). Ceaseless creativity: God is the creativity yielding a global ethics of respect for all life. Beyond Reductionism - Reinventing The Sacred. Some works here.

"Metabolism first" approach, where self-reproducing and evolving proto-metabolic networks are assumed to have predated self-replicating genes, has met chritism here.

"Peptide nucleic acid first" is the third variant, invented in 1991 by Nielsen (the various purine and pyrimidine bases are linked to the backbone by methylene carbonyl bonds). Stanley L. Miller, 1997: Peptide nucleic acids and prebiotic chemistry. Peter Nielsen, 2008: A New Molecule of Life? Sci Am, Vol. 299, No. 6, pages 64-71. Miller is famous for the 1953 amino acid creation experiment, the Miller-Urey experiment. A New Game of Life, PNAs and protocells.

Panspermia is the fourth way (Crick).

Intelligent Design is a belief meant only to fight Darwinian evolution. I am not convinced Darwinian evolution is everything; there is more to social life and evolution. But I don't think Intelligent Design is the answer. We need to find the constraining rules.
Stephen Meyer, “Signature in the Cell,” overview - as Meyer brings his discussion about the feasibility of RNA’s role as the early storehouse for cellular information to a conclusion, he recalls a twenty-year-old conversation with a philosophy professor about origin-of-life research: “The field is becoming increasingly populated by cranks. Everyone knows everybody else’s theory doesn’t work, but no one is willing to admit it about his own.” Following this statement, Meyer fast-forwards into the present and writes of his own assessment of the field twenty years later: “I found no reason to amend these assessments.” The work Meyer had been discussing, leading up to that final dismissive statement, was that of Gerald Joyce and Jack Szostak. - Intelligent Design is just noisy?

T. Cavalier-Smith is also a big name in the evolution of Life. From 1991:
The three major classes of intron are clearly of unequal antiquity. Structured (often self-splicing and sometimes mobile) introns are the most ancient, probably dating (at least for group I) from the ancestral (eubacterial) cell 3500 million years ago, and were originally restricted to tRNA. Protein-spliced introns (usually in tRNA) probably evolved from them by a radical change in splicing mechanism in the common ancestor of eukaryotes and archaebacteria, perhaps only about 1700 million years ago. Spliceosomal introns probably evolved from group-II-like self-splicing introns after the origin of the nucleus between 1700 and 1000 million years ago, and were probably mostly inserted into previously unsplit protein-coding genes after the origin of mitochondria 1000 million years ago.
Exploring Life's Origins. Multimedia project at the Museum of Science.

More than ever, over the uncarved terrain of the new biology, Venter and Church are blurring the distinction between the academic and the commercial. Selling Life? George Church, with more than a dozen graduate students and 18 postdoctoral researchers, runs one of the biggest labs in the richest university in the world. Next to Venter's institute, though, his still feels like a scrappy outfit in the corner. Joyce is dean of the faculty at one of our nation’s most prestigious research organizations.