
Two luminaries of 20th century astrophysics were Sir James Jeans and Sir Arthur Eddington. Both took seriously the view that there is more to reality than the physical universe and more to consciousness than simply brain activity.


In his Science and the Unseen World (1929), Eddington speculated about a spiritual world, writing that "consciousness is not wholly, nor even primarily a device for receiving sense impressions."

Jeans also speculated on the existence of a universal mind and a non-mechanical reality, writing in The Mysterious Universe (1930): "the universe begins to look more like a great thought than like a great machine."

In his book QED, Feynman discusses photons being partially transmitted and partially reflected by a sheet of glass, with reflection amounting to four percent. In other words, one out of every 25 photons is reflected on average, and this holds true even when photons are sent one at a time. The four percent cannot be explained by statistical differences among the photons (they are identical), nor by random variations in the glass. Something is "telling" every 25th photon, on average, to be reflected back instead of being transmitted.

Other quantum experiments lead to similar paradoxes. To explain how a single photon in the two-slit experiment can "know" whether there is one slit or two, Hawking and Mlodinow write: "In the double-slit experiment Feynman's ideas mean the particles take paths that thread through the first slit, back out through the second slit, and then through the first again; paths that visit the restaurant that serves that great curried shrimp, and then circle Jupiter a few times before heading home; even paths that go across the universe and back. This, in Feynman's view, explains how the particle acquires the information about which slits are open." It is hard to imagine a more absurd physical explanation.

We can think of no way to hardwire the behavior of photons in the glass-reflection or two-slit experiments into a physical law. On the other hand, writing a software algorithm that would yield the desired result is simple. A digital reality whose laws are software is an idea that has started to gain traction, in large part thanks to an influential paper in Philosophical Quarterly by Oxford professor Nick Bostrom.
Writing in the New York Times, John Tierney had this to say: "Until I talked to Nick Bostrom, a philosopher at Oxford University, it never occurred to me that our universe might be somebody else's hobby. But now it seems quite possible. In fact, if you accept a pretty reasonable assumption of Dr. Bostrom's, it is almost a mathematical certainty that we are living in someone else's computer simulation." An alternative, and more optimistic, view is that there exists a great consciousness whose mind is the hardware and whose thoughts are the software, creating a virtual universe in which we, as beings of consciousness, live.
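Returning to Feynman's glass example: the statistics he describes can be sketched as a simple Bernoulli process, where a bare random number generator decides each photon's fate individually. This is a toy illustration only; the 4% figure is taken from the discussion above, and the photon counts are arbitrary.

```python
import random

def simulate_photons(n, p_reflect=0.04, seed=42):
    """Send n photons at the glass one at a time; each is reflected
    with probability p_reflect, independently of all the others."""
    rng = random.Random(seed)
    reflected = sum(1 for _ in range(n) if rng.random() < p_reflect)
    return reflected / n

# With enough photons the reflected fraction converges on 4%,
# i.e. one photon in 25 on average, even though each photon
# is decided individually with no shared state.
rate = simulate_photons(1_000_000)
```

A single line of software reproduces behavior that, as argued above, is hard to express as a hardwired physical law.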

Breakdown of Infinite Resolution in Physics
Infinite resolution causes all sorts of breakdowns in the known laws of physics, including matter imploding into black holes at sub-Planck scales and the inability to reconcile general relativity with quantum mechanics. Currently favored theories of matter and energy (string theory) and of reconciling gravity with quantum mechanics (loop quantum gravity) assume a minimal length scale. Furthermore, evidence supports the notion that quantum states are digital: spin values are quantized, with no intermediate states, which is anomalous in a continuous space-time. As the renowned physicist John Wheeler concluded:
“every physical quantity, every it, derives its ultimate significance from bits, binary yes-or-no indications.”


Breakdown of Infinite Resolution in Information Systems
It takes an infinite amount of resources to create a continuous reality, but only a finite amount to create a quantized one. The computational mechanism of a computer is essentially the same as quantum mechanics — a sequence of states, with nothing existing or happening between the states. The resolution of any program is analogous to the spatial resolution of our reality, just at a different level. In fact, carrying Moore's Law (consistent over the past 40 years) forward, computers will reach the Planck resolution around 2192. However, it is not necessary to model reality all the way to that level for the model to be indistinguishable from our reality. Only the OBSERVED elements of reality need to be modeled, and then only down to a resolution that matches the observational limitations of our measurement devices. A program can do this dynamically. Therefore, given Moore's Law and the limitations of "observational reality", we should be able to create virtual realities that are indistinguishable from our current reality within 20 years or so. The very fact that our reality appears to be quantized may be considered strong evidence that reality is programmed.
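A back-of-envelope check of the Moore's-Law extrapolation above can be sketched as follows. The starting scale (~1 nanometer), the two-year doubling period, and the 2020 starting year are my assumptions for illustration, not figures from the argument itself:

```python
import math

# Back-of-envelope check of the Moore's-law extrapolation.
# Assumptions (illustrative): resolution is at the nanometer
# scale today and halves (i.e. doubles in fineness) every 2 years.
PLANCK_LENGTH = 1.616e-35    # metres
start_scale = 1e-9           # metres (~nanometre-scale features)
years_per_doubling = 2
start_year = 2020

doublings = math.log2(start_scale / PLANCK_LENGTH)   # ~86 halvings of scale
year_reached = start_year + doublings * years_per_doubling
```

Under these assumptions the result lands in the early 2190s, consistent with the 2192 figure cited above.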


The Simulation Argument
Various modern philosophers and scientists, most notably Nick Bostrom, have argued that we are likely living in a simulation. The reasoning runs as follows: a civilization that reaches a trans-human stage will be able to run vast numbers of ancestor simulations, and we appear to be only decades away from that stage ourselves. Because simulated observers would then far outnumber observers in the one base reality, it is more probable that we are in a simulation than that we happen to occupy the original, not-yet-simulating reality.
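The counting step of this argument can be made concrete with a toy calculation. Assuming, purely for illustration, that each simulation hosts roughly the same number of observers as the base reality, a randomly chosen observer is in the base reality with probability 1/(N+1):

```python
# Toy version of the counting argument: if N ancestor simulations
# are run, each with roughly the same population as the one base
# reality, a randomly chosen observer sits in the base reality
# with probability 1/(N+1). The figures are illustrative only.
def p_base_reality(n_simulations):
    return 1 / (n_simulations + 1)

p_small = p_base_reality(10)         # even 10 simulations -> ~9% chance
p_large = p_base_reality(1_000_000)  # a million -> about one in a million
```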


Matter as Data
It was once thought that matter was solid through and through — that none of it was empty space. In the early 20th century, scientists discovered that atoms are actually composed of subatomic particles. If these subatomic particles, such as neutrons, were made of solid mass, like little billiard balls, then 99.9999999999999% of normal matter would still be empty space. That is, of course, unless those particles themselves are not really solid matter, in which case even more of space is truly empty. This was shown to be the case in the 1960s via QCD, or quark theory. Now, string theorists say that even quarks are really just vibrating bits of string, possibly with a width of the Planck length. If so, subatomic particles would be all but 1E-38 empty space, and normal matter all but 1E-52 empty space. It gets kind of ridiculous, doesn't it?

In fact, if particles are composed of strings, why do we even need the idea that there is something "material"? Isn't it enough to define the different types of matter by a single number – the frequency at which the string vibrates? What is matter anyway? It is a number assigned to a type of object that describes how that object behaves in a gravitational field. In other words, it is just a rule. We don't really experience matter. What we experience is electromagnetic radiation influenced by some object that we call matter (visual), and the effect of the electromagnetic force rule, due to the repulsion of charges between the electron shells of the atoms in our fingers and the electron shells of the atoms in the object (tactile). In other words, rules.

In any case, if you extrapolate our scientific progress, it is easy to see that the ratio of "stuff" to "space" is trending toward zero. Which means what? That matter is most likely just data, and the forces that cause us to experience matter the way we do are just rules about how data interacts with itself.
For example, probability wave functions follow patterns accurately described by relatively simple mathematical equations that resemble the probability functions that apply to data. Data and rules – that's all there needs to be. By Occam's Razor, this is a simpler, and therefore more likely, way to describe matter, and it requires nothing but pure data.


Computational Analog and the Observer Effect
Of course, one does not need to model reality all the way to the Planck level for the model to be indistinguishable from our reality. An efficient program would dynamically generate incremental resolutions of various components as needed, such as when an object is put under a microscope. Extending this concept to the quantum level, once a subatomic particle is observed, the program must then establish properties for that particle, effectively resulting in the collapse of the probability wave function.

In 2008, the Institute for Quantum Optics and Quantum Information (IQOQI) in Vienna determined, to a certainty of 80 orders of magnitude (an uncertainty of 1 in 10^80, a ridiculously small number), that objective reality does not exist by itself and only comes into being when consciously observed. This effectively put a nail in the coffin of the last hope for objective realists, the hidden variable theory.

Interestingly, we can establish a very real explanation for this effect by turning the evidence question on its head and asking: "If you were to program a universe-simulation, what kinds of programmatic efficiencies would be needed?" Most important would be to employ dynamic reality generation. In other words, for any space unobserved by a conscious entity, there is no sense in creating the reality for that space in advance; it would consume far too many resources. Instead, macroscopic reality may be modeled with an extremely high degree of compression, which I've estimated to be about 100 trillion. But once you decide to isolate a subatomic particle in that macroscopic object and observe it, the program would then have to establish a definitive position for that particle, effectively resulting in the collapse of the wave function, or decoherence.
Moreover, the complete behavior of the particle might, from that point on, be forever under the control of the program, like a finite state machine. After all, why delete the model once observed, given the likelihood that it will be observed again at some point in the future? Thus, an efficient process of "zooming in" on reality would result in exactly the type of behavior observed by quantum physicists, explaining both the Observer Effect and Quantum Entanglement. In other words, to be efficient with resources, the Program decoheres only the space and matter that it needs to.
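In software terms, the behavior described above is just lazy evaluation with caching. The sketch below is an analogy only — the class, its name, and its uniform "position" are all hypothetical illustrations, not a claim about how such a program would actually work:

```python
import random

class LazyParticle:
    """Analogy only: a particle whose position is not computed until
    it is first observed; afterwards the cached value is reused, like
    the finite-state-machine behavior described above."""
    def __init__(self, seed=None):
        self._rng = random.Random(seed)
        self._position = None           # unresolved: a "probability wave"

    def observe(self):
        if self._position is None:      # first observation "collapses" it
            self._position = self._rng.uniform(0.0, 1.0)
        return self._position           # later observations are consistent

p = LazyParticle(seed=7)
first = p.observe()
second = p.observe()    # identical to first: state persists once resolved
```

Nothing is computed for the particle until `observe()` is called, and once resolved the value is never recomputed — the two efficiencies the section attributes to the Program.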


Equations Creating Reality
To cite one of many examples, it has been shown that the negative frequency solutions to Maxwell’s equations actually reveal themselves in components of light. Were our reality what it appears to be, solutions to equations should only make sense in the context of describing reality. However, it seems to be the other way around. Data and rules don’t manifest from the reality; they create the reality.


Other Computational Similarities
Many researchers have also noted that a simulation model solves the “prime mover” philosophical problem. While the big bang theory implies a universe that arises from nothing, which has no grounding in objective reality, a virtual reality can easily "boot up" from an external context. In Brian Whitworth's paper "The emergence of the physical world from information processing", he outlines several additional examples of circumstantial evidence that our universe is a simulation, including:

  • Randomness and apparent lack of hidden variables — It never made sense to Einstein, nor to many other scientists, that particles would behave randomly (e.g. radioactive decay) rather than following deterministic rules. In a computational model, however, random number generators are a simple concept even in today's systems, and can easily be used to drive apparently random behavior.
  • Probability waves are easy to create — Related to the point above, the probability waves that describe all matter and are responsible for the real effect of quantum tunneling, have no basis in objective reality, yet are easy to construct programmatically.
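The first point can be sketched directly: a bare pseudo-random number generator, deciding each atom's fate independently, reproduces the exact exponential statistics of radioactive decay. The atom count and per-step probability here are illustrative choices of mine:

```python
import random

def decay_step(n_atoms, p_decay, rng):
    """Each surviving atom decays this step with probability p_decay,
    decided by nothing more than a pseudo-random number generator."""
    return sum(1 for _ in range(n_atoms) if rng.random() >= p_decay)

rng = random.Random(0)
remaining = 100_000
p = 0.01                 # per-step decay probability (illustrative)
for _ in range(69):      # ~69 steps is one half-life when p = 0.01
    remaining = decay_step(remaining, p, rng)

# remaining is now close to 50,000: exponential decay from pure RNG,
# with no hidden variable deciding which atom goes when
```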
Explanatory Power

The huge set of well-studied anomalies facing us in fields as varied as physics (entanglement), philosophy (near-death experiences), geology, anthropology (OOPArt), metaphysics (precognition), and psychology can all be explained only by a programmed reality model. No other theory has that explanatory capacity.

Thermodynamics and Information

There is an uncanny similarity between Boltzmann's thermodynamic entropy, S = k·ln(W), and Shannon's information entropy, H = E[−ln p(X)]. In thermodynamics, entropy is proportional to the logarithm of the number of states that matter can be in. In information theory, entropy is the expected value of the negative logarithm of the probability mass function of a random variable. This doesn't necessarily imply that matter is information, just that they behave similarly at a macroscopic level.
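The parallel is easy to see numerically: for a uniform distribution over W equally likely states, Shannon's formula reduces to ln(W), which is exactly Boltzmann's formula with k = 1. A minimal sketch:

```python
import math

def shannon_entropy(probs):
    """H = E[-ln p(X)] = -sum over x of p(x)*ln p(x), in nats."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# For a uniform distribution over W equally likely states,
# H = ln(W) -- the same form as Boltzmann's S = k*ln(W) with k = 1.
W = 16
H_uniform = shannon_entropy([1 / W] * W)   # equals math.log(W)
```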


Near Death Experiences
Hundreds of research papers and dozens of books by reputable scientists and doctors have compiled evidence that near-death experiences cannot be artifacts of a dying brain, but are rather real conscious experiences outside of our normal physical reality.


Past Lives
Researchers and psychiatrists such as Dr. Ian Stevenson have studied thousands of cases of children with past life recollections and adults who have recollections under hypnosis. In many cases, special skills, languages, phobias, corroborating artifacts, and/or knowledge of people and places appear to be inexplicable without invoking reincarnation. Even the American Medical Association admitted that some cases were "difficult to explain on any assumption other than reincarnation."


The Finely Tuned Universe
The universe is incredibly finely tuned for the physical existence of matter, let alone life. For example, universal constants cancel out all of the vacuum energy in the universe to an amazing accuracy of one part in 10 to the 115th power. Also, a deviation in the expansion rate of the early universe of one part in a billion in either direction would have caused the universe to immediately collapse, or to fly apart so fast that stars could never have formed. There are many, many more such examples. The standard scientific explanation for this effect is that a near-infinity of universes is spawned every second, most of them sterile, and that via a philosophical sleight of hand called the anthropic principle, we happen to be in one of the rare hospitable ones. Occam's Razor heavily favors a design-oriented simulation theory here: the consciousness-driven model solves the fine-tuning problem directly.

Article by Demi Powell