Sunday, 24 February 2013

Problems leading to TGD.

Topological Geometro-Dynamics (TGD) is a unified theory of fundamental interactions. Quantum classical correspondence [by dimensional reduction] has been one of the guiding principles.

Thinking differently about many questions is very characteristic of TGD. Matti himself says that TGD is an old-fashioned quantum hadron model (for instance, as described by Bjorken), and that the dimensions emerge from the microcosmos and gauge/Lorentz invariance, with its roots in the vacuum or zero-point field. See also The model for hadron masses revisited.

The basic differences of TGD in relation to other main theories:
  • Poincare symmetry and Lorentz invariance, so it is more like a parallel to Einstein's General Relativity, yet does not contradict it. TGD is more like a scaled-up 8D variant (GR^2) of GR.
  • hadrons as tripoints (three quarks, or a quark-antiquark pair), instead of strings
  • tripoints are 3-surfaces, non-local points of mostly wave nature, as they follow the Kähler action
  • the Planck scale is not the basic scale, and the (effective) Planck constant can be gigantic
  • gravitational Planck constants, gravitational 'waves' (with respect to dark matter)
  • there is no cosmological constant
  • there is negative energy, and magnetism that is vanishing, which makes possible the Zero Energy Ontology (ZEO); and because electric currents vanish faster than magnetic ones, there are left-over magnetic 'bodies' acting as effectors. This is extremely important in biology
  • fields are replaced with effects of spacetime sheets
  • actions are made by Noether currents, not Ricci tensors
  • time is an active force, creating entanglement and phases, also p-adic time. This is crucial in forming matrices with ZEO.
  • the understanding of Feynman diagrams as generalized matrices in 2D (as partonic 2-surfaces) made of 3-surfaces and their matrices/braids. Light-like 3-surfaces from maxima of the Kähler function define the matrices. This can even describe the black hole inside. Partons and partonic 2-surfaces are generalizations too, as are N-atoms and N-particles?

The basic objection against TGD is, according to Matti, that induced metrics for space-time surfaces in M^4 × CP_2 form an extremely limited set in the space of all space-time metrics appearing in the path integral formulation of General Relativity. Even special metrics like the metric of a rotating black hole fail to be imbeddable as an induced metric. For instance, one can argue that TGD cannot reproduce the post-Newtonian approximation to General Relativity because it involves linear superposition of gravitational fields of massive objects. Holger B. Nielsen made this objection at least two decades ago. Perhaps the strongest objection against TGD is that linear superposition for classical fields is lost.
Linear superposition is however a central starting point of field theories. Many-sheeted space-time circumvents this argument: the linear superposition of fields is replaced with the superposition of their effects, meaning that the sum is replaced with a set-theoretic union of space-time sheets. This simple observation has far-reaching consequences: it becomes possible to replace the dynamics for a multitude of fields with the dynamics of space-time surfaces with only 4 imbedding space coordinates as primary dynamical variables. See also Standing waves in TGD.

Continuity has also been an obstacle in a world where even the quantum fraction is geometric.

Discrete vs continuous controversy in physics - discrete and continuous features coexist in any natural phenomenon, depending on the scales of observation.

I quote from the TGD Intro (2007) about the main differences from the mainstream:

TGD was originally an attempt to construct a Poincare invariant theory of gravitation. Spacetime, rather than being an abstract manifold endowed with a pseudo-Riemannian structure, is regarded as a 4-surface in the 8-dimensional space H = M^4_+ × CP_2, where
  • M^4_+ = the interior of the future light cone of Minkowski space (to be referred to as the light cone)
  • CP_2 = SU(3)/U(2) is the complex projective space of two complex dimensions
The size of CP_2, which is about 10^4 Planck lengths, replaces the Planck length as a fundamental length scale in the TGD Universe.

The identification of spacetimes as submanifolds leads to Poincare invariance broken only in cosmological scales and solves the conceptual difficulties related to the definition of energy-momentum in General Relativity. Even more, sub-manifold geometry, being considerably richer in structure than the abstract manifold geometry behind General Relativity, leads to a geometrization of all basic interactions and elementary particle quantum numbers. In particular, classical electroweak gauge fields are obtained by inducing the spinor curvature of CP_2 to the spacetime surface.
 

 
Fig. 1. a) Future light cone of Minkowski space. b) CP_2 is obtained by identifying all points of C^3, the space of 3 complex dimensions, which differ by a complex scaling \Lambda: z is identified with \Lambda z.
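As a brief aside (standard mathematics, not part of the quoted text): the identification used in Fig. 1b is the usual definition of complex projective space, and its real dimension comes out consistently with the coset description CP_2 = SU(3)/U(2):

CP_2 = \left( \mathbb{C}^3 \setminus \{0\} \right)/\sim, \qquad (z^1, z^2, z^3) \sim (\Lambda z^1, \Lambda z^2, \Lambda z^3), \quad \Lambda \in \mathbb{C} \setminus \{0\},

\dim_{\mathbb{R}} CP_2 = 2\cdot 3 - 2 = 4 = \dim SU(3) - \dim U(2) = 8 - 4 .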

This forces a generalization of the conventional spacetime concept to what might be called manysheeted spacetime or 'topological condensate'. The topologically trivial 3-space of General Relativity is replaced with a 'topological condensate' containing matter as particle like 3-surfaces "glued" to the topologically trivial background spacetime sheet by extremely tiny connected sum (wormhole) contacts having CP_2 size connecting the spacetime sheets. End quote.

The criticality.
One big problem in physics is criticality: how a classical world can arise from quantum uncertainty. This problem does not differ much from that of M-theory, but the solution does, very much.
TGD can be seen as a model giving rise to GR as a simple 'mirror image', and there is also a double mirror. The time dimension also has this 'mirror image', and magnetism, the em 'force', can be vanishing? See How to perform WCW integrations in generalized Feynman diagrams? and "The relationship between TGD and GRT". He writes (from the GRT abstract, and I have filled in other links):
Radically new views about ontology were necessary before it was possible to see what had been there all the time. Zero energy ontology states that all physical states have vanishing net quantum numbers. The hierarchy of dark matter, identified as macroscopic quantum phases labeled by arbitrarily large values of Planck constant, is a second aspect of the new ontology.
  1. Equivalence Principle in TGD Universe
  2. Zero energy ontology
  3. Dark matter hierarchy and hierarchy of Planck constants
  4. The problem of cosmological constant  
  5. The generalized Feynman Diagrams
  6. The families and massivation: the symmetries coming out of the microscopic massivation and time distortion or symmetry breaking. This last point I will not take up here.


1. The energy problem: does the Equivalence Principle hold in TGD?
The source of problems was the attempt to deduce the formulation of the Equivalence Principle in the framework provided by General Relativity rather than in a string-model-like context. The process, shortly summarized:
a) Inertial and gravitational four-momenta are replaced with Super Virasoro generators of two algebras whose differences annihilate physical states = the super-conformal symmetries of quantum TGD.
b)  Number theoretical compactification providing a number theoretical interpretation of space spinors, and thus also of standard model quantum numbers.
c) The identification of the preferred extremals of Kähler action and the formulation of quantum TGD in terms of second-quantized induced spinor fields. This has turned out to be extremely useful for the development of TGD, made possible the understanding of field equations, and led to a detailed understanding of quantum TGD at the fundamental parton level.


Absolute minimization of the so-called Kähler action is the fundamental variational principle of TGD. It assigns to a given 3-surface X^3 a classical spacetime surface X^4(X^3), which is much like a Bohr orbit going through a fixed point in wave mechanics. The Kähler action is characterized by classical non-determinism caused by an enormous vacuum degeneracy, and this forces a generalization of the notion of 3-surface in order to achieve classical determinism in a more general sense: 3-surfaces are in general unions of disjoint 3-surfaces with timelike separations rather than single time = constant snapshots of the spacetime surface. In particular, spacetime sheets with finite time duration, 'mindlike' spacetime sheets, are possible and are identified as geometric correlates of selves in the TGD-inspired theory of consciousness.

2. Zero energy ontology (the S-matrix is replaced with the M-matrix, defined as a "square root" of the density matrix) allows one to avoid the paradox implied in positive energy ontology by the fact that gravitational energy is not conserved while inertial energy, identified as a Noether charge, is. In zero energy ontology energy conservation always holds in some length scale. This principle is satisfied only by the outcomes of state function reduction.
To sum up, the understanding of the Equivalence Principle in the TGD context required quite many discoveries of mostly mathematical character: the understanding of the superconformal symmetries of quantum TGD, the discovery of zero energy ontology, the identification of preferred extremals of Kähler action by requiring number theoretical compactification, and the discovery that dimensional reduction allows one to formulate quantum TGD in terms of a slicing of the space-time surface by stringy world sheets. See Tree like structure for the imbedding space.
And later...
Gravitational four-momentum can be assigned to the curvature scalar as Noether currents and is thus completely well-defined [but non-conserved], unlike in GRT. The Equivalence Principle requires that inertial and gravitational four-momenta are identical. This is satisfied if the curvature scalar defines the fundamental action principle crucial for the definition of quantum TGD. The curvature scalar as a fundamental action is however non-physical and had to be replaced with the so-called Kähler action. The conservation of gravitational four-momentum seems to fail in cosmological scales. Also for vacuum extremals satisfying Einstein's equations gravitational four-momentum fails to be conserved, and the non-conservation becomes large for small values [lengths] of cosmic time. My basic mistake now looks obvious. I tried to deduce the formulation of the Equivalence Principle in the framework provided by General Relativity rather than in a string model context.
But the conservation laws are questioned by many others too. This frame also gave a new interpretation of time.
The basic prediction of TGD is that the sign of energy depends on the time orientation of the spacetime surface.
 
Quantum states of the 3-D theory in zero energy ontology correspond to generalized S-matrices. M-matrix might be a proper term; it is a "complex square root" of the density matrix - a matrix-valued generalization of the Schrodinger amplitude - defining time-like entanglement coefficients. Its "phase" is a unitary matrix. The counterpart of the ordinary S-matrix acts between zero energy states. I call it the U-matrix. It has nothing to do with particle reactions. It is crucial for understanding consciousness via the identification of a moment of consciousness as a quantum jump. See Construction of Quantum Theory: S-matrix.

Wikipedia (Noether's theorem; constant of motion, conservation law, and conserved current) says:
A conservation law states that some quantity X in the mathematical description of a system's evolution remains constant throughout its motion — it is an invariant. Mathematically, the rate of change of X (its derivative with respect to time) vanishes,

\frac{dX}{dt} = 0 ~.
Such quantities are said to be conserved; they are often called constants of motion (although motion per se need not be involved, just evolution in time). The earliest constants of motion discovered were momentum and energy.
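To connect this to the field theory language used elsewhere in this post (a standard textbook formula, added here for orientation rather than taken from the quote): a continuous symmetry \delta\phi of the Lagrangian density \mathcal{L} yields a conserved Noether current and a conserved charge,

j^\mu = \frac{\partial \mathcal{L}}{\partial (\partial_\mu \phi)}\, \delta\phi - K^\mu, \qquad \partial_\mu j^\mu = 0, \qquad Q = \int d^3x \; j^0, \qquad \frac{dQ}{dt} = 0,

where K^\mu compensates a possible change of \mathcal{L} by a total divergence. For time translations (\delta\phi = \partial_0 \phi) the charge Q is the Hamiltonian, so energy conservation is the special case X = H of the statement above.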
Here are some examples of research about Noether currents by other scientists:
  1. Applications of Noether currents. Scale invariance. R. Corrado 1994: To illustrate the use of Noether's theorem and the currents produced, we examine the case of scale transformations. As we shall see, these are not necessarily invariances of the action and we will have to determine what conditions are necessary for scale transformations to be a symmetry. Only the masses break scale invariance. Any operator with a dimensionful coupling constant breaks the scale invariance of the massless theory.
  2. Continuous symmetries and conserved currents: conservation of the Noether current holds in the quantum theory, with the current inside a correlation function, up to contact terms (that depend on the infinitesimal transformation). Conserved charges associated with this current are generators of the Lorentz group.
  3. Symmetries and conservation laws. A Lagrangian density with a symmetry can give: 1. time translations - time translation invariance implies that H is constant; this does not appear to be the case in our Universe, because it is expanding (the cosmological constant); the Hamiltonian generates translations in time. 2. spacetime translations - the Noether currents are the components of the stress-energy tensor; the conserved charges (components of the total four-momentum) generate translations. 3. rotations - specified by a vector pointing in the direction of the axis of rotation, whose magnitude is the angle of rotation; the corresponding charges form the angular momentum of the system, which generates rotations (a 3x3 rotation matrix). 4. Lorentz transformations - in addition to rotations, the group of Lorentz transformations contains boosts; the corresponding charges are the components of a vector and generate boosts for infinitesimal velocities, where Λ is the corresponding 4×4 Lorentz transformation matrix; the vectors M and L transform correctly as vectors in R^3 under rotations.
  4. Noether currents and charges for Maxwell-like Lagrangians, Yakov 2003: Application of the Noether procedure to physical Lagrangians yields, however, meaningful (and measurable) currents. The well-known solution to this 'paradox' is to involve the variation of the metric tensor. The Noether current of the field considered on a variable background coincides with the current treated in a fixed geometry. Consistent description of the canonical energy–momentum current is possible only if the dynamics of the geometry (gravitation) is taken into account.
  5. Nonlocal currents as Noether currents, Dolan & Roos 1980: The first two nonlocal currents in the general two-dimensional chiral models are derived as Noether currents. The associated infinitesimal field transformations are shown to obey a group integrability condition. A subset of the structure constants of the symmetry group responsible for these conserved currents is calculated.
  6. Gauge symmetries and Noether currents in optimal control. Torres, 2003: extends the second Noether theorem to optimal control problems which are invariant under symmetries depending upon k arbitrary functions of the independent variable and their derivatives up to some order m, as far as a semi-invariance notion is considered; the transformation group may also depend on the control variables.
   
3.  Dark matter hierarchy.
The dimensional reduction for preferred extremals of Kähler action - if they have the properties required by number theoretic compactification - leads to a string model with a string tension which is however not proportional to the inverse of Newton's constant but to the p-adic length scale squared and thus gigantic [and dark], see p-Adic Mass Calculations: New Physics. This allowed one to predict the value of the Kähler coupling strength by using as input the electron mass and p-adic mass calculations. In this framework the role of the Planck length as a fundamental length scale is taken by the CP2 size, so that the Planck length scale loses its magic role as a length scale.
The identification of gravitational four-momentum in terms of Einstein tensor makes sense only in long length scales. This resolves the paradoxes associated with objects like cosmic strings.
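A minimal numerical sketch of the scales involved, assuming (my assumptions, following standard TGD references rather than this post) the p-adic length scale hypothesis in the form L_p ≈ sqrt(p) · R, with R of the order of 10^4 Planck lengths and the electron's p-adic prime taken to be the Mersenne prime M_127 = 2^127 - 1:

# Hypothetical sketch: p-adic length scale of the electron, assuming L_p ~ sqrt(p) * R
from math import sqrt

l_planck   = 1.616e-35           # Planck length in meters
R          = 1.0e4 * l_planck    # CP_2 size scale, ~10^4 Planck lengths (from the text)
p_electron = 2**127 - 1          # Mersenne prime M_127, assumed p-adic prime of the electron

L_p     = sqrt(p_electron) * R   # p-adic length scale in meters
compton = 2.426e-12              # electron Compton wavelength h/(m_e c), for comparison

print(f"L_p     = {L_p:.2e} m")      # about 2.1e-12 m
print(f"Compton = {compton:.2e} m")  # same order of magnitude

The only point of the sketch is that with these assumptions the electron's p-adic length scale lands in the ballpark of the Compton length, so the CP2 scale rather than the Planck length sets the fundamental unit.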

Dark matter hierarchy corresponds to a hierarchy of conformal symmetries Z_n of partonic 2-surfaces, and this hierarchy corresponds to a hierarchy of increasingly quantum critical systems in modular degrees of freedom. For a given prime p one has a sub-hierarchy Z_p, Z_p^2 = Z_p × Z_p, etc...
This mapping of integers to quantum critical systems conforms nicely with the general vision that biological evolution corresponds to the increase of quantum criticality as Planck constant increases. The group of conformal symmetries could be also non-commutative discrete group having Zn as a subgroup.
The number theoretically simple ruler-and-compass integers, having as factors only first powers of Fermat primes and powers of 2, would define a physically preferred sub-hierarchy of quantum criticality for which subsequent levels would correspond to powers of 2: a connection with the p-adic length scale hypothesis suggests itself. Updated view here.
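For concreteness, here is a small sketch (my own illustration, not from the post) of what the 'ruler-and-compass integers' look like: products of a power of 2 and distinct Fermat primes, i.e. the classical Gauss-Wantzel integers for constructible polygons.

# Sketch: "ruler-and-compass" integers n = 2^k * (product of distinct Fermat primes)
from itertools import combinations

fermat_primes = [3, 5, 17, 257, 65537]   # the five known Fermat primes

def ruler_and_compass_integers(limit):
    # all products of distinct Fermat primes, including the empty product 1
    products = [1]
    for r in range(1, len(fermat_primes) + 1):
        for combo in combinations(fermat_primes, r):
            prod = 1
            for f in combo:
                prod *= f
            products.append(prod)
    numbers = set()
    for prod in products:
        n = prod
        while n <= limit:                 # multiply each product by powers of 2
            numbers.add(n)
            n *= 2
    return sorted(numbers)

print(ruler_and_compass_integers(100))
# [1, 2, 3, 4, 5, 6, 8, 10, 12, 15, 16, 17, 20, 24, 30, 32, 34, 40, 48, 51, 60, 64, 68, 80, 85, 96]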

Particles of dark matter would reside at the flux tubes but would be delocalized (exist simultaneously at several flux tubes) and belong to irreducible representations of G_a. What looks weird is that one would have an exact macroscopic or even astroscopic symmetry at the level of the generalized imbedding space. Visible matter would reflect this symmetry approximately. This representation would make sense also at the level of biochemistry and predict that magnetic properties of 5- and 6-cycles [pentoses and hexoses] are of special significance for biochemistry. The same should hold true for graphene. Electron pairs are associated with 5- and 6-rings, and the hypothesis would be that these pairs are in a dark phase with n_a = 5 or 6. Graphene, which is a one-atom-thick hexagonal lattice, could also be an example of (conduction) electronic dark matter with n_a = 6.

The idea about dark matter as a large Planck constant phase requires n_a/n_b = GMm/v_0, v_0 = 2^-11, so that the values are gigantic. A possible interpretation is in terms of a dark (gravi)magnetic body assignable to the system, playing a key role in TGD inspired quantum biology. See Construction of Elementary Particle Vacuum Functionals.
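To show why the values are gigantic, here is a rough numerical check under the assumptions (mine: the choice of system and the numerical constants) that hbar_gr = GMm/v_0 with v_0 ≈ 2^-11 expressed as a fraction of the speed of light, applied to the Sun-Earth pair:

# Rough sketch: gravitational Planck constant hbar_gr = G*M*m/v0, with v0 = 2^-11 * c
G       = 6.674e-11       # m^3 kg^-1 s^-2
c       = 2.998e8         # m/s
hbar    = 1.055e-34       # J*s
M_sun   = 1.989e30        # kg
m_earth = 5.972e24        # kg
v0      = c * 2**-11      # velocity parameter from the text

hbar_gr = G * M_sun * m_earth / v0   # comes out in J*s
ratio   = hbar_gr / hbar             # the ratio n_a/n_b in the notation of the text

print(f"hbar_gr/hbar = {ratio:.1e}") # roughly 5e73 - gigantic, as claimed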

The fundamental feature of the configuration space is that it has two kinds of degrees of freedom. The degrees of freedom in which the metric vanishes correspond to what I call zero modes and are a purely TGD-based prediction, basically due to the non-point-like character of particles identified as 3-surfaces. Zero modes are the counterparts of the classical macroscopic variables, and in every quantum jump a localization in zero modes occurs: the state function reduction. This also means that the replacement of the point-like particle with a 3-surface means giving up the locality of physics at the spacetime level: physics is however local at the level of the configuration space containing 3-surfaces as its points. For instance, classical EPR nonlocality is a purely local phenomenon at the level of the configuration space. Besides allowing one to get rid of the standard infinities of interacting local field theories, the non-locality explains topologically the generation of structures, in particular biological structures, which correspond to spacetime sheets behaving as autonomous units.

4. The cosmological constant.
Astrophysical systems correspond to [relativistic] stationary states analogous to atoms and do not participate [much] in cosmic expansion in a continuous manner but via discrete quantum phase transitions in which the gravitational Planck constant increases. This follows from the dark matter hierarchy.
a) By the quantum criticality of these phase transitions, critical cosmologies are excellent candidates for the modeling of these transitions. Imbeddable critical (and also over-critical) cosmologies are unique apart from a parameter determining their duration and represent accelerating cosmic expansion, so that there is no need to introduce a cosmological constant (= quantum phase transition increasing the size). See Could the value of fine structure vary in cosmological scales?
b) A possible mechanism driving the strings to the boundaries of large voids could be repulsive interaction, or  repulsive gravitational acceleration.
c) A cosmological constant like parameter does not characterize the density of dark energy but that of dark matter, identifiable as quantum phases with large Planck constant.
d) The Lambda problem: large voids are quantum systems which follow the cosmic expansion only during the quantum critical phases.
e) p-Adic fractality predicts that cosmological constant is reduced by a power of 2 in phase transitions occurring at times corresponding to p-adic time scales. These phase transitions would naturally correspond to quantum phase transitions increasing the size of the large voids during which critical cosmology predicting accelerated expansion naturally applies.
f) On the average Λ(k) behaves as 1/a^2, where a is the light-cone proper time. This predicts correctly the order of magnitude for the observed value of Λ (a rough numerical check is sketched after this list).
g) What empty space is may be a consequence of the absence of a cosmological constant. Such as stochastic quantization and a holography that reduces everything to the level of 3-metrics and, more generally, to the level of 3-D field configurations. To a given 3-surface the metric of WCW assigns a unique space-time, and this space-time serves as the analog of a Bohr orbit and allows one to realize 4-D general coordinate invariance in the space of 3-surfaces, so that classical theory becomes an exact part of quantum theory. Both the 4-D path integral and stochastic quantization for gravitation fail in this respect due to local divergences (in super-gravity the situation might be different). The preferred 3-surfaces circumvent this difficulty, and give the GR^2. No emergence of space-time, no 'empty' space is there in TGD? In ZEO the S-matrix is replaced with the M-matrix defining a square root of thermodynamics.
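A rough order-of-magnitude check of item f) above, assuming (my assumption) that the light-cone proper time a can be taken to be of the order of the age of the Universe expressed as a length, and comparing with the measured Λ ≈ 1.1 × 10^-52 m^-2:

# Order-of-magnitude sketch: Lambda ~ 1/a^2 with a ~ c * (age of the Universe)
c   = 2.998e8               # m/s
age = 13.8e9 * 3.156e7      # age of the Universe in seconds (~4.4e17 s)
a   = c * age               # light-cone proper time as a length, ~1.3e26 m

lambda_estimate = 1.0 / a**2    # m^-2
lambda_observed = 1.1e-52       # m^-2

print(f"1/a^2    = {lambda_estimate:.1e} m^-2")  # about 5.9e-53
print(f"observed = {lambda_observed:.1e} m^-2")  # same order of magnitude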
Since the space-times allowed by TGD define a subset of those allowed by GR, one can ask whether the quantization of GRT leads to TGD or at least to a sub-theory of TGD.
The arguments represented [in the article] however suggest that this is not the case.
A promising signal is that the generalization of Entropic Gravity (Verlinde's) to all interactions in TGD framework leads to a concrete interpretation of gravitational entropy and temperature, to a more precise view about how the arrow of geometric time emerges, to a more concrete realization of the old idea that matter-antimatter asymmetry could relate to different arrows of geometric time (not however for matter and antimatter but for space-time sheets mediating attractive and repulsive long range interactions), and to the idea that the small value of cosmological constant could correspond to the small fraction of non-Euclidian regions of space-time with cosmological constant characterized by CP2 size scale.

5. The helicity, vertices or spin.
The basic prediction of TGD is that the sign of energy depends on the time orientation of the spacetime surface ('negative energy' possible as a request or demand?), creating tensions and vortices as an S-matrix.

Generalized Feynman diagrams.
Zero energy ontology (ZEO) has provided profound understanding about how generalized Feynman diagrams differ from the ordinary ones. The most dramatic prediction is that loop momenta correspond to on-mass-shell momenta for the two throats of the wormhole contact defining virtual particles: the energies of on-mass-shell throats can have both signs in ZEO. This predicts finiteness of Feynman diagrams in the fermionic sector. Even more: the number of Feynman diagrams for a given process is finite if also massless particles receive a small mass by p-adic thermodynamics. See topological torsion and thermodynamic irreversibility, by Kiehn, and the TGD version. The mass would be due to the IR cutoff provided by the largest CD (causal diamond) involved.
Generalized Feynman diagrams as generalized braids: string world sheets and partonic 2-surfaces provide a beautiful visualization of generalized Feynman diagrams as braids, and also support for the duality of string world sheets and partonic 2-surfaces as a duality of light-like and space-like braids. The dance metaphor.
 
The TGD inspired proposal (TGD as almost topological QFT) is that generalized Feynman diagrams are in some sense also knot or braid diagrams allowing, besides the braiding operation, also two 3-vertices. The first 3-vertex generalizes the standard stringy 3-vertex but with a totally different interpretation having nothing to do with particle decay: rather, the particle travels along two paths simultaneously after a 1→2 decay. The second 3-vertex generalizes the 3-vertex of the ordinary Feynman diagram (three 4-D lines of a generalized Feynman diagram, identified as Euclidian space-time regions, meet at this vertex). I have discussed this vision in detail here. The main idea is that in the TGD framework knotting and braiding emerge at two levels.

  1. At the level of the space-time surface, string world sheets - at which the induced spinor fields (except the right-handed neutrino, see this) are localized due to the conservation of electric charge - can form 2-knots and can intersect at discrete points in the generic case. The boundaries of string world sheets at light-like wormhole throat orbits and at space-like 3-surfaces defining the ends of the space-time at light-like boundaries of causal diamonds can form ordinary 1-knots, and get linked and braided. Elementary particles themselves correspond to closed loops at the ends of the space-time surface and can also get knotted (for possible effects see this).

  2. One can assign to the lines of generalized Feynman diagrams lines in M^2 characterizing a given causal diamond. Therefore the 2-D representation of Feynman diagrams has a concrete physical interpretation in TGD. These lines can intersect, and what suggests itself is a description of non-planar diagrams (having this kind of intersections) in terms of an algebraic knot theory. A natural guess is that it is this knot theoretic operation which allows one to describe also non-planar diagrams by reducing them to planar ones, as one does when one constructs a knot invariant by reducing the knot to a trivial one. Scattering amplitudes would be basically knot invariants.

Black holes.
One outcome is a new view about black holes, replacing the interior of a black hole with a space-time region of Euclidian signature of the induced metric, identifiable as an analog of a line of a generalized Feynman diagram. In fact, black hole interiors are only special cases of Euclidian regions, which can be assigned to any physical system. This means that the description of condensed matter as AdS black holes is replaced in the TGD framework with a description using Euclidian regions of space-time.

The effective superposition of the CP2 parts of the induced metrics gives rise to an effective metric which is not in general imbeddable to M^4 × CP_2. Therefore many-sheeted space-time makes possible a rather wide repertoire of 4-metrics realized as effective metrics, as one might have expected, and the basic objection can be circumvented. In asymptotic regions where one can expect single-sheetedness, only a rather narrow repertoire of "archetypal" field patterns of gauge fields and gravitational fields defined by topological field quanta is possible.
The skeptic can argue that this still need not make possible the imbedding of a rotating black hole metric as an induced metric in any physically natural manner. This might be the case, but it need of course not be a catastrophe. We do not really know whether the rotating black hole metric is realized in Nature. I have indeed proposed that TGD predicts new physics in rotating systems. Unfortunately, Gravity Probe B could not check whether this new physics is there, since it was located at the equator where the new effects vanish.

Fundamental questions leading to TGD.
Ulla said... It seems this Firewall at the edge of black holes, by Polchinski, has gone through the blogosphere. Here Scott Aaronson.

Lubos says the article uncritically promotes the views of Joe Polchinski, Leonard Susskind, Raphael Bousso, and a few others. When it comes to the AMPS thought experiment, it just uncritically parrots the wrong statements by Polchinski et al.:

The interior (A) and the near exterior (B) have to be almost maximally entangled for the space near the horizon to feel empty; the near exterior (B) is almost maximally entangled with some qubits inside the Hawking radiation (C) because of the Hawking radiation's ability to entangle the infalling and outgoing qubits. Because of the monogamy of entanglement (at most one maximal entanglement may incorporate (B) at the same time), some assumptions have to be invalid. The unitarity should be preserved, which means that the A-B entanglement has to be sacrificed and the space near the horizon isn't empty: it contains a firewall that burns the infalling observer.
That may sound good but, as repeatedly explained on this blog, this argument is wrong for a simple reason. The degrees of freedom in (A) and those in (C) aren't independent and non-overlapping. It is the very point of the black hole complementarity that the degrees of freedom in (A) are a scrambled subset of those in (C). The degrees of freedom in (A) are just another way to pick observable, coarse-grained degrees of freedom and "consistent histories" within the same Hilbert space. So the entanglement of (B) with "both" (A) and (C) isn't contradictory in any sense: it's the entanglement with the same degrees of freedom described twice.

It seems clear to me that this imbalanced perspective was incorporated to the article by the main "informers" among the scientists who communicated with Jennifer. This conclusion of mine partly boils down to the amazing self-glorification of Joe Polchinski in particular. So we're learning that if there's a mistake, the mistake is not obvious, AMPS is a "mighty fine paradox" that is "destined to join the ranks of classic thought experiments in physics" and it's the "most exciting thing that happened since [Bousso] entered physics". Holy cow. The mistake is obvious. AMPS simply assume that complementarity can't hold by insisting on separate parts of the wave function that are responsible for observations inside and outside. That's a wrong assumption, so it's not shocking that various corollaries such as the "firewall" at the horizon are wrong, too. This wrong assumed denial of complementarity is as wrong as the assumption that simultaneity has to be absolute – an assumption made by those who "debunk" Einstein's relativity; the error is in step 1 and means that they just didn't understand the original insights.
 
          Matti Pitkanen said...
Black holes represent the singularity of general relativity as a theory. What happens to space-time in the interior of a black hole? This should be the difficulty from which to start. Not the only one.

One could also start from the energy problem of general relativity.

Or from proton instability predicted by GUTs: why quark and lepton numbers seem to be conserved separately?

Or by asking whether it is really true that so many primary fields are needed (superposition of effects of fields replaces superposition of fields in many-sheeted space-time)?

Or what is behind the family replication phenomenon?
Or what is the deeper structure behind standard model symmetries?

I could continue the list: the answer to every question unavoidably leads to TGD.

Superstring theories were claimed to provide a quantum theory of gravitation, but the outcome was the landscape and tinkering with black holes after it had become clear that superstrings themselves do not tell anything about physics and one must make a lot of ad hoc assumptions to get a QFT limit. After producing a huge amount of literature, superstringers are exactly in the same position as before the advent of superstring models.

It would be encouraging if people would gradually realize that we have not made much progress during these four decades. Some really new idea is needed to make genuine progress and we must open our minds for it. Maybe it is here already;-).

          Ulla said...
Thanks, this was exactly the kind of list of problems leading to TGD I have asked for. You are welcome to continue on it :)

About Planck units.
Under our current best-guess of a complete theory of physics, the maximum possible temperature is the Planck temperature, or 1.41679 x 10^32 Kelvins. However, it is common knowledge that our current theories of physics are incomplete.
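For reference, the quoted number is just the Planck temperature built from the fundamental constants; a quick sanity check of the arithmetic:

# Planck temperature T_P = sqrt(hbar * c^5 / G) / k_B
from math import sqrt

hbar = 1.054571817e-34    # J*s
c    = 2.99792458e8       # m/s
G    = 6.67430e-11        # m^3 kg^-1 s^-2
k_B  = 1.380649e-23       # J/K

T_P = sqrt(hbar * c**5 / G) / k_B
print(f"T_P = {T_P:.4e} K")   # about 1.417e32 K, matching the value quoted above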

Gustavo Valdiviesso: The use of the so-called "Planck units" is rather arbitrary, and I will point out why: Every model has its limitations. For instance, Newton's second law breaks down at speeds near c and needs to be replaced by a Lorentz invariant version, so that the concept of relativistic energy rises from it. But, you see, the speed of light was known from the Maxwell equations well before relativity. It was also known that the Maxwell equations and Newton's laws don't always get along (there are some situations where the Lorentz force between a point charge and a magnet does not have an action-reaction partner). Also, and more obviously, the Maxwell equations are not invariant under Galilean transformations, on which Newton's second law is based. So, we have two models for the same Nature, and they disagree... one of them carries a fundamental constant: the speed of light. Years later, we see that this very same speed is the limit of one of the models: the one that did not care about it.

Now, we have several models (quantum mechanics, general relativity, etc) and we can expect all of them to have a limit, to break down at some value of some physical observable.

  
So we must have a model based on Lorentz invariance, which is exactly what TGD is.