# Is the Universe computing something?

Discussion in 'Astronomy, Exobiology, & Cosmology' started by arfa brane, Jan 26, 2016.

1. ### arfa branecall me arfValued Senior Member

Messages:
5,711
Someone mentioned broken symmetry.

This is not quite as esoteric as it sounds. For instance, a coin is symmetric. When a coin is spinning (i.e. being randomized), it has more symmetry than when it lands on one side or the other. The latter state is a broken symmetry.

So a string of coin tosses is a randomized string; each element in the string represents, say, an element of the ring $\mathbb Z_2$ under some action.

When the coin is spinning, the sides are in a random 'superposition' of probabilities: each side has exactly probability 0.5 of showing (if the coin is fair) when the symmetry 'breaks'. A string of results is represented in a regular language as (0|1)*, a concise way of writing down a string in which each digit has equal probability of being 0 or 1, a randomly 'generated' string.
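A fair-coin string like this is trivial to simulate. A minimal sketch in Python (the function name is mine, not anything standard):

```python
import random

def toss_string(n, seed=None):
    """Generate a random binary string: each symbol is 0 or 1 with
    probability 0.5 -- i.e. one element of the language (0|1)*."""
    rng = random.Random(seed)
    return "".join(rng.choice("01") for _ in range(n))

s = toss_string(16, seed=42)
print(s)             # a 16-symbol string over {0, 1}
print(len(s) == 16)  # True
```

The seed is only there to make the example repeatable; a 'fair coin' would of course use fresh entropy each run.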

Last edited: Feb 15, 2016

3. ### PhysBangValued Senior Member

Messages:
2,422
The CC could simply be a constant of gravitational interaction; in that case, it would have no relationship to quantum theory.

There is no good calculation of the energy density of the quantum vacuum, so it is even possible that there is both a CC and an energy density from the vacuum.

5. ### The GodValued Senior Member

Messages:
3,546
.... Banging funny things around.

Messages:
21,703
And gravitational interactions are evident when spacetime is warped/curved/twisted/rippled.
And possibly a QGT may reveal how and why this happens, which is at the quantum level.
Well, that's basically what I have said... these forces, the CC, ZPE, Casimir effect, may all be one and the same, but all are properties of spacetime as I see it, and all or one of them is probably the DE component.

8. ### brucepValued Senior Member

Messages:
4,098
The cosmological constant is a specific scientific term, so it can't be all those different things. If it were all those things we wouldn't need a bunch of different terms to describe the same natural phenomenon. Calling it a different name doesn't change the natural phenomenon. Quintessence is a different model for describing the accelerated expansion of the universe. WMAP concluded that it's the cosmological constant which best describes what's happening in this universe. So science is building a standard model of cosmology, and the dark energy is best described by the cosmological constant. The good thing is that science has a great empirical model for testing predictions derived from cosmological theoretical models.

Messages:
21,703
I can't argue with too much of what you have said, bruce [and PhysBang], but I was always of the opinion that the mystery surrounding the DE component was that we did not actually know the true nature of this force... whether the CC or quintessence, as you mention, or any of the others. We are just not yet sure.
The different names, imo, were just a result of different observations.
Perhaps I put my thoughts rather poorly.
What do you conclude from the following.....
https://briankoberlein.com/2015/03/06/nothing-but-net/

Messages:
21,703

Or this which seems to support your thoughts.....
http://www.sciencedirect.com/science/article/pii/S0370269306010197

Abstract
It has been speculated that the zero-point energy of the vacuum, regularized due to the existence of a suitable ultraviolet cut-off scale, could be the source of the non-vanishing cosmological constant that is driving the present acceleration of the universe. We show that the presence of such a cut-off can significantly alter the results for the Casimir force between parallel conducting plates and even lead to repulsive Casimir force when the plate separation is smaller than the cut-off scale length. Using the current experimental data we rule out the possibility that the observed cosmological constant arises from the zero-point energy which is made finite by a suitable cut-off. Any such cut-off which is consistent with the observed Casimir effect will lead to an energy density which is at least about $10^{12}$ times larger than the observed one, if gravity couples to these modes. The implications are discussed.

Current cosmological observations seem to favor a cosmological constant (Λ) as the leading candidate for dark energy which is thought to be driving the current phase of accelerated cosmic expansion [1]; the energy density contributed by it is constrained to be roughly $\rho_{DE} \approx 10^{-11}\ (\text{eV})^4$, with a corresponding length scale of the order of 0.1 mm. There exists a sizeable body of literature attempting to explain the origin of Λ as a consequence of the coupling of zero-point quantum fluctuations of matter fields, that pervade the vacuum, to gravity [2].
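The two numbers quoted in that paragraph can be checked on the back of an envelope: the length scale is just $\hbar c$ divided by the fourth root of the energy density. A quick sketch (my own arithmetic, in natural units with $\hbar c \approx 197.3$ eV·nm):

```python
# Check that an energy density of ~1e-11 (eV)^4 corresponds to a
# length scale of order 0.1 mm, as the abstract states.
rho_DE = 1e-11          # dark-energy density in (eV)^4
hbar_c = 197.327e-9     # hbar * c in eV * m

E = rho_DE ** 0.25      # characteristic energy scale in eV (~1.8e-3 eV)
L = hbar_c / E          # corresponding length scale in metres

print(f"E ~ {E:.2e} eV, L ~ {L:.2e} m")  # L comes out ~1.1e-4 m, i.e. ~0.1 mm
```

So the "0.1 mm" figure is consistent with the quoted density.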

Write4U likes this.
11. ### PhysBangValued Senior Member

Messages:
2,422
Well, yes and no. A quantum theory of gravity may include a separate energy that acts like a CC, or the CC may just be a mathematical feature of the graviton field, or whatever quantum field creates gravity.

I haven't done too much research on the Casimir effect, but I believe that there are hypothetical explanations that do not rely on vacuum energy.

12. ### PhysBangValued Senior Member

Messages:
2,422
While I agree that there are tests that show that the dynamics favor a constant, I'm not sure that they can yet be said to rule out a vacuum energy of some sort, or even some sort of field. If there is a weird combination of constant and vacuum energy and field, then we might never find out!

13. ### sweetpeaValued Senior Member

Messages:
1,047
"Is the Universe computing something?"
OMG, does this mean it may be possible to hack the universe?

14. ### brucepValued Senior Member

Messages:
4,098
If you respect the results of the WMAP experiment then you can be somewhat 'sure'. I don't really like that term for science. We all know why Einstein derived the cosmological constant: to keep his GR universe from collapsing. In the classical metric it's a pressure term, essentially an anti-gravity term. It's the term that Guth and Linde used for the gravitational interaction, with a soliton in a quantum scalar field, that predicts the inflation event. During the event the pressure of the vacuum was negative and remained constant until the false vacuum 'rolled out' [reached a minimum vacuum energy state] and inflation came to an end. Initially Einstein didn't know the vacuum was composed of fluctuating virtual particles seeking minimum energy states, which must be accounted for when deriving quantum predictions associated with the cosmological constant. I think. So we have a classical model and a quantum model for the natural phenomenon. These models describe the global effect of the cosmological constant. When we use the Casimir machine we alter the local physics of the vacuum, locally between the uncharged plates, resulting in a negative energy density between the plates and a measurable force pushing them towards each other. This is a pretty good discussion on the Casimir effect.
http://physicsworld.com/cws/article/news/2012/jul/18/physicists-solve-casimir-conundrum
So how big must the Casimir machine be to power a superluminal warp? It's the same theoretical principle that was used to hold the wormhole walls open in the theoretical analysis in this paper.
http://arxiv.org/abs/1405.1283
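For a sense of scale, the textbook ideal-plate result, $P = \pi^2 \hbar c / (240 d^4)$, can be evaluated directly. A quick sketch (my numbers, SI units, ideal perfectly conducting plates only):

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s

def casimir_pressure(d):
    """Attractive Casimir pressure (Pa) between ideal parallel plates
    separated by d metres: P = pi^2 * hbar * c / (240 * d^4)."""
    return math.pi**2 * HBAR * C / (240 * d**4)

# The d^-4 scaling means the force is negligible at a micron
# but substantial at tens of nanometres.
print(casimir_pressure(1e-6))   # ~1.3e-3 Pa
print(casimir_pressure(10e-9))  # ~1.3e5 Pa
```

The steep $d^{-4}$ dependence is why the effect only matters at very small separations, and hints at why any "Casimir machine" for exotic propulsion would be so hard to scale up.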

Last edited: Feb 15, 2016
15. ### brucepValued Senior Member

Messages:
4,098
Certainly. Based on what's known, the classical and quantum models give a coherent description of the natural phenomenon. I think. LOL. Thanks for your comments on this very interesting phenomenon.

16. ### Write4UValued Senior Member

Messages:
7,604
Every manmade object is a result of hacking (applying) the observed universal forms, values, and functions.

Last edited: Feb 15, 2016
17. ### arfa branecall me arfValued Senior Member

Messages:
5,711
Ahem. So a coin spinning and then landing is quite a good analogy for a quantum particle with spin plus a measurement 'operator':

This is why Seth Lloyd and quite a few others claim the universe is computational--it's full of particles with spin, and these particles interact and exchange something we call quantum information. Thanks to quantum randomness there are any number of ways to generate a random binary string by measuring spin, entirely analogous to tossing a fair coin.
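The spin-measurement analogy is easy to sketch numerically: prepare an equal superposition and sample outcomes with the Born rule. A toy model (my own, not a simulation of any real device):

```python
import random

def measure_equal_superposition(n, seed=None):
    """Simulate n measurements of a spin-1/2 state (|0> + |1>)/sqrt(2).
    Born rule: P(0) = |1/sqrt(2)|^2 = 0.5, and likewise for 1 --
    statistically indistinguishable from a fair coin."""
    rng = random.Random(seed)
    amp = 2 ** -0.5                  # amplitude of each basis state
    p0 = amp * amp                   # Born probability of outcome 0
    return "".join("0" if rng.random() < p0 else "1" for _ in range(n))

bits = measure_equal_superposition(1000, seed=1)
print(bits.count("0") / len(bits))   # close to 0.5
```

The point of the toy model is exactly the analogy in the post: once the Born probabilities are 0.5/0.5, the measurement record is a fair-coin string.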

On the other side we mostly have Ken Wharton, whose argument appears to be based on our assumption that the universe is Newtonian, a mistake since the universe is Lagrangian. He argues for pursuing the latter "Schema" and laying aside our anthropic bias. I don't know what he's talking about; it doesn't matter which approach you use, in any quantum experiment the results are always classical--the output is the same.

Last edited: Feb 15, 2016
18. ### Write4UValued Senior Member

Messages:
7,604
https://en.wikipedia.org/wiki/Virtual_particle

IMO, this statement implies a very important question regarding the conditions required for a *virtual* particle (such as the Higgs) to become a *real* particle with a life of its own. Apparently one requirement is that the Higgs combines with something else, which then gives it greater mass and the ability to persist as a real particle.

Any thoughts on this (possibly) fundamental process?

20. ### sweetpeaValued Senior Member

Messages:
1,047
Remember, the OP asks whether the universe is computing something.
Man is a part of nature's processes. The thoughts behind, say, the making of clay pots did not come from 'outside' nature or 'outside' the universe. Those thoughts are part and parcel of nature or the universe. Or, to keep in tune with the OP, those thoughts and resulting actions are nature at work.

Last edited by a moderator: Feb 17, 2016
21. ### arfa branecall me arfValued Senior Member

Messages:
5,711
At a lecture on solid state physics, the subject was Heisenberg's Uncertainty Principle, and I recall the lecturer saying that it had cosmological implications, but he wasn't sure about them.

The relation between time and energy in HUP does have computational implications: there must be a connection to the fundamental limits of computation (Bennett's Brownian Computer for instance). That is, $\Delta E \Delta t \ge \hbar / 2$ says there is a fundamental limit on the time for any computation, rather than the time to measure some particle's energy to within $\Delta E$.

But that seems to just replace "time to measure" with "time to compute", somehow . . .

Apparently the correct relation is: $\Delta t = \pi \hbar / (2 \Delta E)$, for a particle with a spread in energy $\Delta E$ to "move from one distinguishable state to another".
--http://arxiv.org/pdf/quant-ph/9908043v3.pdf
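That paper (Lloyd's "Ultimate physical limits to computation") turns the bound around into a maximum rate: a system with energy $E$ can pass through at most $2E / (\pi \hbar)$ distinguishable states per second. A sketch of the arithmetic, using Lloyd's own 1 kg "ultimate laptop" example:

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s

def max_ops_per_second(E):
    """Margolus-Levitin-type bound: a system with energy E (joules)
    can perform at most 2E / (pi * hbar) elementary operations
    (transitions between distinguishable states) per second."""
    return 2 * E / (math.pi * HBAR)

E = 1.0 * C**2                               # rest energy of 1 kg, E = m c^2
print(f"{max_ops_per_second(E):.2e} ops/s")  # ~5.4e50, Lloyd's figure
```

The striking part is that the bound depends only on energy, not on the hardware, which is what makes the "universe as computer" framing quantitative at all.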

Last edited: Feb 19, 2016
22. ### arfa branecall me arfValued Senior Member

Messages:
5,711
Ok, so let's look again at the rock warming in the sun. It's a system, and it's processing information; the processing rate is limited by the amount of energy in each part of the system. If the surface is warmer than the inside, it has more energy to process information with, so the surface will compute faster than the interior of the rock. So it's really just a close look, at say the Heisenberg limit, at particles interacting.

I've intentionally used a rock to make a sort of parody of the whole thing, except this is serious shit, so it isn't a parody after all.

And a correction to post 281. I said the regular expression (0|1)* represents a random string of 1s and 0s; I should have said that an element of the language generated by (0|1)* is such a string.

Last edited: Feb 19, 2016
23. ### Waiter_2001Registered Senior Member

Messages:
459
Hey Arf, how you doing? I see you've posted to sciforums TWICE here. I thought I'd post a reply to you in an effort to bring the thread forwards somewhat, which, I believe, will happen, else I wouldn't have posted. It's all good what you posted though!