
Computer simulation universe


TJPrime

Does this make sense?  

21 members have voted

  1. Does this make sense?

    • Yes
    • Yes, but with changes
    • No, I have no idea what you're talking about/this makes -1 sense


Recommended Posts

For years, as many scientists do, I have wondered how the universe started. I know about the big bang, but what banged? A theory came to me about a year ago, not too long before I started playing KSP:

The universe is just a computer simulation. God exists, as a gamer.

This makes sense to me - humans are very close to doing this with computers. There are other reasons too, some of which scientists haven't thought about. One thing is that, when close to matter, time slows down. Also, when travelling at extreme speeds (i.e. light speed), time slows down. This could be seen as lag - lag found in computers. Things slow down because the computer we live in is processing every atom in a given space and calculating where it should be at a certain speed.

Also, pixels. All energy is associated with a particle: light and the photon, electricity and the electron, etc. Maybe all energy particles are made from the same thing in different arrangements, just like atoms with their neutrons, protons and electrons. There's also the point that matter is condensed energy (y'know, E=mc^2). These 'energy particle' particles (which we'll call 'Energons') are just the pixels of the universe.

Furthermore, there are these black holes everywhere. If the universe is the internet, black holes are equivalent to Danny2462's videos compressed into a nano-second and played on a loop.

Death = universe has run out of memory

Multiverses? Don't get me started!

What's your guys ideas? Does this seem legit?


I've heard the argument that a species that can create computer simulations will create many, many simulations, so it is overwhelmingly likely that we are in some sort of simulation as opposed to the "original universe".

Now, on an intuitive level, I am certain that this must be rubbish, but on an intellectual level, I can't work out a good reason why it should be. It drives me insane!


The quantum properties of nature would exclude a simulation running on some computer.

What do you mean?

Then what about the people who made the simulation? How did their universe come to be?

Two ways:

1) they're also a computer simulation, and it's an endless chain of simulations.

2) they don't 'exist' as such. I was once told that God does not exist, but he created existence.


It could also be a dream of someone in a morning lecture.

Simulating is easy, as long as the rules are consistent - but who the heck is simulating it? There must be an endpoint of the simulation... I can't see several beings doing it, because the simulation could become inconsistent, nor can I see how the simulator could itself be simulated.

Must be God.


What do you mean?

The land of elementary particles is weird. Our rules of the macrocosm don't apply there in their totality. Particles aren't tiny balls, and there are weird interactions between them, some of which seem impossible to us. Heisenberg's uncertainty principle rules there.

It could also be a dream of someone in a morning lecture.

Simulating is easy, as long as the rules are consistent - but who the heck is simulating it? There must be an endpoint of the simulation... I can't see several beings doing it, because the simulation could become inconsistent, nor can I see how the simulator could itself be simulated.

Must be God.

Maybe the simulation is made by a simulation, and that simulation is made by another, ad infinitum.

Edited by lajoswinkler

A theory came to me about a year ago, not too long before I started playing KSP:

The universe is just a computer simulation.

Theories are backed by data. This isn't even a hypothesis, because those are testable. What you have here is an idea, one that's not really new at all. In fact, there was a very popular movie back in 1999 that featured simulated reality as a central plot point. I bet you've seen it.

What's your guys ideas? Does this seem legit?

Here are some of the most obvious issues; this is by no means an exhaustive list.

This makes sense to me - humans are very close to doing this with computers.

No, we aren't. Not even remotely. Not only can we not simulate things with perfect fidelity, a hard requirement for this to be true (we could detect errors, and we have been looking for decades), but computers aren't anywhere close to powerful enough to simulate anything of any real size. Moore's Law isn't magic, nor is it really a law. There are practical limits to what you can do with any technology, and we're getting damn close to the end of the road for silicon computer chips. Most industry experts, including Moore himself, are astonished it's held up as long as it has. It's very likely we'll see the end of Moore's Law at 5nm, which Intel projects will hit the consumer space in 2020. After that, we'll need something new. Graphene is promising, but expecting consumer products in time to pick up the slack from silicon in another ten years, when not one graphene product is yet on the market, is techno-Pollyannaism. The hard truth is that in the very near future, computers are going to stop advancing as quickly as they have in the past.

But forget about the future, what about today? Just how far off are we from doing something like this?

Let's think for a moment about how to store the data to represent the state of the universe. Let's say that we've got some compression algorithm that's so good it's mathematically impossible, storing all information about an atom with a single byte, with no overhead to read or write. Position, momentum, spin, charge, temperature, quantum states, chemical bonds, vibration, all in one byte. Further, we'll say there's nothing in the universe but atoms (laughably false, 96% of the universe isn't even matter, and atoms themselves are quite complex arrangements of even smaller particles, which are themselves made up of even smaller particles). So, for every atom that exists in the universe you want to simulate, you need exactly one byte of storage. I'd use bits instead of bytes to be even more generous, but I don't want to deal with the factor of 8. I'm going to write out each number the long way so the sheer size sinks in.

Okay, so how much storage is that? Well, as it happens there are around this many atoms in the universe (10^80)

100,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000

And the largest hard drives store around this many bytes (4 * 10^12)

4,000,000,000,000

So, just to store the data for the simulation, you need this many of the largest hard drives in the world (10^80 / (4 * 10^12) == 2.5 * 10^67):

25,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000

That number is too big for human minds to comprehend. Literally, it's meaningless to us. Even a number like a million is too much for us, but that's a trillion trillion trillion trillion trillion million. So let's say we can build these hard drives for free, and they weigh a nanogram. Just how much is that? Unsurprisingly, a lot.

10^67 hard drives at 0.000000001 grams each have a mass equal to this many Suns ((2.5 * 10^67 * 10^-12 kg) / (2 * 10^30 kg) == 1.25 * 10^25):

12,500,000,000,000,000,000,000,000

This is many orders of magnitude more massive than the most massive thing ever observed, which is a whole galaxy, APM 08279+5255. So just to store the data needed in this impossibly easy scenario stacked in your favor by a factor of more than a billion, you'd need trillions of galaxies' worth of hard drives. And what about the rest of the computer? How do you stop this thing from collapsing into a black hole? Just storing the state of your simulation will take up much of the mass of the universe. We haven't even begun to talk about execution resources, how to power the thing, how to cool it, how to deal with the second law of thermodynamics in such an enormous, well ordered machine, how to deal with errors in the simulation, how to deal with the inevitable star crashing through your computer the size of trillions of galaxies, how to deal with the propagation delay on a computer whose traces measure in the gigaparsecs, etc. etc.

But all this was stacked in your favor. In a real scenario, you could never simulate a universe as large or complex as your own, because doing so would require more matter than there is. The lower bound on how much matter you'd need to store the state of the universe is actually the universe itself. At best, you could use each particle to represent itself, and run the simulation using the laws of the universe already in place.

So to put it mildly, it's ludicrous to say we're anywhere close to being able to do this.
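If you want to check that arithmetic yourself, here's a minimal back-of-envelope sketch in Python. It simply restates the same generous assumptions used above (one byte per atom, 4 TB drives, one nanogram per drive, roughly 2 * 10^30 kg for the Sun); none of these figures are anything more than the rough estimates already in this post.

```python
# Back-of-envelope check of the storage argument, using the same
# generous assumptions as the post above.
ATOMS_IN_UNIVERSE = 1e80      # rough count of atoms in the observable universe
BYTES_PER_ATOM = 1            # absurdly generous: one byte holds an atom's full state
DRIVE_CAPACITY_BYTES = 4e12   # ~4 TB, a large consumer hard drive
DRIVE_MASS_KG = 1e-12         # one nanogram per drive, again absurdly generous
SUN_MASS_KG = 2e30            # approximate mass of the Sun

drives_needed = ATOMS_IN_UNIVERSE * BYTES_PER_ATOM / DRIVE_CAPACITY_BYTES
total_mass_in_suns = drives_needed * DRIVE_MASS_KG / SUN_MASS_KG

print(f"{drives_needed:.2e} hard drives")        # ~2.5e+67
print(f"{total_mass_in_suns:.2e} solar masses")  # ~1.25e+25
```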

There are other reasons too, some of which scientists haven't thought about.

That's very arrogant, don't you think?

One thing is that, when close to matter, time slows down. Also, when travelling at extreme speeds (i.e. light speed), time slows down. This could be seen as lag - lag found in computers. Things slow down because the computer we live in is processing every atom in a given space and calculating where it should be at a certain speed.

Why do you think determining the position of an object moving in a straight line is so hard? Simulating relativistic motion is only difficult because of the effects of relativity. It's a trivial matter to analyze Newtonian motion. And your simulation has to be perfectly accurate anyway, so it shouldn't matter how fast things are going. If there's a "time step" to the universe, it's the Planck time. Which, for the record, is the amount of time it takes to move one Planck length at the speed of light. The Planck length is the distance below which it's not possible to measure a difference in position between two points. So there's no need to slow things down that are moving quickly, because even if they're at the maximum possible speed, they won't produce an error in measurement.
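For reference, the Planck time really is just the Planck length divided by the speed of light; here's a one-line check using the standard published values (these constants aren't from this thread, they're the usual figures):

```python
PLANCK_LENGTH_M = 1.616e-35      # metres, standard value
SPEED_OF_LIGHT_M_S = 2.998e8     # metres per second

planck_time_s = PLANCK_LENGTH_M / SPEED_OF_LIGHT_M_S
print(f"{planck_time_s:.2e} s")  # ~5.39e-44 s, the accepted Planck time
```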

But anyway, wouldn't it make a lot more sense for the folding of proteins, complex gravitational interactions (n-body problem, black holes are easy because they're singularities) or the motion of an object through a fluid to cause time to slow down? They're far, far harder to simulate. But wouldn't that slow down the whole universe if it were so hard, not just the offending participant? And this doesn't even match up with what lag is, because lag is a delay between initiating an action and it happening, not the slowing down of events. Lag is also not caused by processing load, but latency. This is commonly confused, and is a huge pet peeve of mine. Doubly so when somebody uses the nonsensical qualification "frame lag." There is no latency, just a reduction in frequency.

This metaphor is tortured, and makes no sense.

Also, pixels. All energy is associated with a particle: light and the photon, electricity and the electron, etc. Maybe all energy particles are made from the same thing in different arrangements, just like atoms with their neutrons, protons and electrons. There's also the point that matter is condensed energy (y'know, E=mc^2). These 'energy particle' particles (which we'll call 'Energons') are just the pixels of the universe.

The idea of an "energy particle" is an oxymoron. Particles are not energy (though you can convert a particle into its energy equivalent), they have energy. If energy was made up of particles, then how do things get hot? How do particles vibrate? Wouldn't globbing on more energy, in the form of mass in your energon particles, just increase the momentum and reduce the vibration rate? How does an atomic nucleus capture a photon and later re-emit it? Is the photon breaking down into its base energons, then later recombining? Why doesn't it turn into different particles, or stay there forever?

Really, all of the forces you've been talking about have been the same one, electromagnetism, and it needs no invented "energon" because its force carrier is the photon and science already has a good idea how all this works.

Using pixels as a metaphor is even more tortured than the last one. Pixels are a method of displaying something; what do they have to do with the forces that drive the universe? I seriously can't follow your logic. You hypothesize a new fundamental particle with no reason, then abruptly declare it to be the same as pixels, again for no reason. And if you think the fundamental building blocks of programs are pixels, and not, you know, data, then you have a lot to learn about computers as well as physics. A pixel is an abstract concept related to the particular way in which we tend to display images to the end user using an imprecise grid of finite points and mapping continuous space onto it. There are formats that use continuous curves, have no resolution and can be scaled infinitely with no loss of quality (vector graphics). Pixels are no more fundamental to computer science than printers are. It's just a way to visually represent the underlying data.

I don't want to sound rude, but it's clear you have very little knowledge of physics or computers. Do you really want to be weighing in on a cosmology rooted in a computer simulation when you don't seem to understand either subject?

Two ways:

1) they're also a computer simulation, and it's an endless chain of simulations.

Nope. Because each universe simulation must be less complex than the last, you have an exponential decay of the complexity of nested universes. No matter how complex the first universe is, somebody has to build it out of some amount of material they spend a finite amount of time collecting. Even if the first universe is infinite, the computer they build must be finite. Even if you were able to gather up all the matter and energy in your universe and were able to simulate a universe 99% as complex as your own, in only 1000 generations of universes, you have 0.004% of the complexity left.

You can use some basic algebra to determine how many nest levels you can go with a given starting complexity and efficiency. Assuming a universe of our size and the impossibly high efficiency of 99% (0.01% would be impossibly high, but again, I'm generous), then the complexity at a given generation x is given by the equation:

0.99^x * 10^80 == complexity

When the complexity hits 1, you're done. That comes out to a paltry ~18,000 generations. After that, you're unable to simulate even a single atom with the resources available in your universe. Because of the nature of exponential decay, it doesn't matter how big the first simulation is, you're going to come to the end very rapidly.
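If you want to reproduce that figure, here's a quick sketch using the same two assumptions as above (10^80 for the complexity of our universe, 99% efficiency per nesting):

```python
import math

START_COMPLEXITY = 1e80   # rough complexity of our universe (atom count)
EFFICIENCY = 0.99         # each nested universe keeps 99% of its parent's complexity

# Solve EFFICIENCY**x * START_COMPLEXITY == 1 for x.
generations = math.log(1 / START_COMPLEXITY) / math.log(EFFICIENCY)
print(round(generations))  # ~18,330 -- roughly the 18,000 generations quoted above
```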

2) they don't 'exist' as such. I was once told that God does not exist, but he created existence.

This is known as "special pleading." You've just claimed, with no justification, that reason doesn't apply to your argument. Pleading for a special exemption from logic, as it were. I'm sorry, but the rules of logic apply, and if your idea isn't logical, then that just means it's wrong. Thinking up magical duct tape to patch over the holes in your idea just compounds wrongness with nonsense and more wrongness.

It makes no rational sense to discuss the actions of things you've accepted to not exist. If something doesn't exist then it has no impact on the universe. Deciding that god, or anything else, can simultaneously be imaginary and real is absurd. Things that don't exist don't do things, because they don't exist. If god created the universe, then by definition god exists. If god does not exist, then by definition he cannot have created the universe. This is just a tortured redefinition of the word "exist" to try to escape the grasp of science and logic.

What's your guys ideas? Does this seem legit?

No, and that's the nicest way I can put that.

Edited by LaytheAerospace

Well, anything's possible... especially where you cross into realms outside our universe (but even potentially within). All the best thinkers and scientists will tell you that you are much more likely right if you state that something is possible than if you state that it is not. Obviously no conclusions can be drawn about the truth at this point, nor can any experimentation be done at this point. As a thought experiment however... it's a feasible concept. I will use some of the previous poster's arguments to make a couple of points.

Nope. Because each universe simulation must be less complex than the last, you have an exponential decay of the complexity of nested universes.

This is undoubtedly true. What he fails to consider is that the exponential decay is natural, and as a rule can be viewed both ways. In fact, it would be equally reasonable to describe the situation as there being exponentially increased complexity in each universe further up the chain. Just because we cannot imagine a universe more complex than our own doesn't preclude its existence. It's entirely possible that computers in this universe don't work the same as ours, and that their laws of physics are much more complex than ours. For example, it could be possible that their computing architectures use a system that employs basic switches and processing on signals/data-bits that have 40 states instead of the 2-state binary system we must use. We need to use binary (at the moment) because it's all we've got that works with our technology. In fact, it could be entirely possible to make higher-order computers in our own universe. This means that the universe being simulated would have more simplistic rules, and likely be smaller than the universe in which it is being simulated. However, as all we know is our own universe, it's hard to get a reasonable understanding of the relative sizes or complexities of other universes (or even what the realm of possibility includes).

10^67 hard drives at 0.000000001 grams each have a mass equal to this many Suns ((2.5 * 10^67 * 10^-12 kg) / (2 * 10^30 kg) == 1.25 * 10^25):

12,500,000,000,000,000,000,000,000

Again, we have the fallacy of thinking that everywhere else must be the same as here. This entire line of reasoning is entirely accurate if you were to try to simulate your own universe.... then again, who would do that? Even those interested in simulations for practical purposes are generally only interested in much, much, much, much^10*10 smaller simulations... like weather patterns or political patterns. The only reason one might wish to simulate an entire universe is for a basic understanding of how universes tend to change or develop... or how universes with other types of laws might behave, or even to allow people to play with the universe (like ksp for example). Of course, as in ksp, so too would a game likely use less stringent physics and rules to make play more fun. It doesn't have to be exactly like our universe, it can be whatever we want it to be.

Conclusion: 2-fold

A) It is entirely possible that our universe is simulated. The odds of it obviously cannot be estimated, although if multiple universes exist, the odds go up that at least one is simulated.

B) Given that we have no reference point, and no way to reason or logic the answer into something likely resembling the truth while postulating about existence outside our own dimensions or universe... then speculating about the methodology behind how this simulation was achieved is pointless and baseless. It would be far more interesting and reasonable to speculate the ways in which we might ourselves simulate a smaller universe. (by dumbing down physics? AI? exotic computing methods? etc)


Well, anything's possible... especially where you cross into realms outside our universe (but even potentially within).

It's not a very interesting observation to say that anything is possible when you disregard the rules of possibility.

All the best thinkers and scientists will tell you that you are much more likely right if you state that something is possible than if you state that it is not. Obviously no conclusions can be drawn about the truth at this point, nor can any experimentation be done at this point. As a thought experiment however... it's a feasible concept.

Sure, it's feasible. I just don't think it's very likely or interesting (more on that later), and felt a need to poke holes in some of the most egregious problems with this presentation, like claiming we're nearly able to pull this off ourselves with modern computers and that the OP had scientific insights that had eluded the actual scientists. I find it hard to ignore ignorance paired with arrogance masquerading as science.

This is undoubtedly true. What he fails to consider is that the exponential decay is natural, and as a rule can be viewed both ways. In fact, it would be equally reasonable to describe the situation as there being exponentially increased complexity in each universe further up the chain. Just because we cannot imagine a universe more complex than our own doesn't preclude its existence.

I can imagine a universe more complex than ours just fine. More fundamental forces. More fundamental particles. Hell, just more energy. Non-locality of macroscopic particles, sure thing! A universe where forces operate on larger scales allowing the rise of living beings the size of galaxies or even bigger. The laws of the universe could even vary across space and time (it appears that ours do not, this has been heavily researched), which would be very interesting to anybody living in such a universe.

The problem is that the chain of universes, no matter how complex previous universes are, cannot be infinite. The first universe must build a computer of finite complexity and power because it must be built in a finite amount of time, and thus have a finite number of nested universes it can sustain. Because the decay is exponential, it doesn't matter how complex that first computer is.

It's entirely possible that computers in this universe don't work the same as ours, and that their laws of physics are much more complex than ours.

Any universe which has laws precluding computers of infinite power (like our own) can only nest universes with computers of similarly finite power. If this were not the case, then they could just have their computer simulate a more powerful computer, which could then simulate an even more powerful computer, ad infinitum, which is clearly nonsense. Simulations incur overhead, because you must always do more work than a computer purpose built for the task. Even if we wanted to, we couldn't build a simulated universe that violated this property, because it's running inside a computer that obeys the laws of our universe. If you could get around it, you could just simulate a very small universe that could simulate increasingly complex computers, eventually yielding a computer more complex than the one which started the initial simulation.

Thus, if we are in some sort of chain of universe simulations, all computers in all universes after a universe with finite computer power have finite power. Conservation of energy also requires this to be true, though I'll admit conservation doesn't necessarily hold in universes "above" ours, but any universes after a conserving universe must also conserve, and thus we're right back to the problem of no longer being infinite and rapidly running out of complexity for our simulation.

For example, it could be possible that their computing architectures use a system that employs basic switches and processing on signals/data-bits that have 40 states instead of the 2-state binary system we must use. We need to use binary (at the moment) because it's all we've got that works with our technology.

The number of concurrent signals your chips can handle isn't relevant to the fundamental limits imposed by information theory. You can remove the computer entirely and what I said is still true. It doesn't matter if your chip is built from gates that handle a billion signals at once, you still have to store the data, and you can't store more data than the total data storage capacity of your universe. This is the "counting argument."

It's a fundamental law of compression, and has some fairly major consequences in the realm of computing. Specifically, it means it's not possible to produce a lossless compression algorithm that can operate on arbitrary data and also guarantee any factor of compression, no matter how small. There will be some sets of data for which the algorithm is unable to produce any compression at all. Similarly, you can't store a universe more complex than your own inside your universe, because there isn't enough stuff in your universe to represent all the possible states that the more complex universe can take. Change the rules of the nested universes all you like. In the end it's all just data, including any further nested universes, and at the mercy of the counting argument.

If you want a more thorough explanation of the counting argument, read section 1 of this: http://mattmahoney.net/dc/dce.html. Or google it, I don't care. Just stop arguing against it, because it's true.
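The counting argument itself is just the pigeonhole principle, and you can see it with nothing more than counting: there are more n-bit strings than there are strings shorter than n bits, so no lossless compressor can shrink every possible input. A trivial illustration (the choice of n = 8 is arbitrary):

```python
# Pigeonhole illustration of the counting argument: 2**n distinct n-bit
# inputs, but only 2**n - 1 distinct outputs shorter than n bits, so at
# least one input cannot be losslessly compressed to something shorter.
n = 8
inputs = 2 ** n                                    # 256 possible 8-bit strings
shorter_outputs = sum(2 ** k for k in range(n))    # 255 strings of length 0..7 bits

print(inputs, shorter_outputs)  # 256 255
```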

the universe being simulated would have more simplistic rules, and likely be smaller than the universe in which it is being simulated.

This is exactly my point. Each new universe must be less complex than the last, until you're left with no complexity at all, and no universe. You could change the rules in order to make certain aspects of the universe more or less complex, but the overall complexity monotonically decreases. Because the decay is exponential, this happens very fast regardless of the chosen size of your initial universe. Doubly so when you consider that it's objectively unreasonable to expect any universe to be converted in any appreciable fraction to any one thing (especially if the second law of thermodynamics holds, which precludes perfect, unchanging megastructures), much less a giant computer that never makes mistakes or suffers damage. As a result, it's all but certain that you'd lose the overwhelming majority of the complexity of the previous universe at each generation, reaching a minimum in extremely short order. As in, you could count the generations on your fingers.

Even those interested in simulations for practical purposes are generally only interested in much, much, much, much^10*10 smaller simulations... like weather patterns or political patterns. The only reason one might wish to simulate an entire universe is for a basic understanding of how universes tend to change or develop... or how universes with other types of laws might behave, or even to allow people to play with the universe (like ksp for example). Of course, as in ksp, so too would a game likely use less stringent physics and rules to make play more fun. It doesn't have to be exactly like our universe, it can be whatever we want it to be.

I'm not sure exactly what point you're driving at here. Yes simulations are small scale. That's because of the incredible difficulty simulating even very small things! This is not making a strong case for perfect simulations of an entire universe, but rather highlighting just how ludicrously generous I've been in not pointing out the effectively impossible to satisfy processing requirements to simulate said universe. Even if you gathered up all the matter from the universe, built a computer and assumed data storage were infinite, you'd only be able to simulate a computer with a tiny fraction of the complexity of its host universe. Look at it any way you want and it still falls apart.

A) It is entirely possible that our universe is simulated. The odds of it obviously cannot be estimated, although if multiple universes exist, the odds go up that at least one is simulated.

What do you get when you double statistically insignificant odds? Statistically insignificant odds, of course. Just because we can't demonstrate something to be false doesn't mean it's worth seriously considering. An infinite number of things cannot be disproved, and I choose to not seriously consider any of these things, because they are by their very nature non-scientific. I'll need some evidence, or at the very least a testable hypothesis, if you want me to do anything but ignore or mock it. Why? Because of Occam's Razor, everybody's favorite misunderstood scientific principle.

Put simply, it says that given two otherwise equally valid competing hypotheses, the one which makes the fewest assumptions is most likely to be correct.

My view is that we live in a universe that came about as universes do, however that may be.

Your view also starts here, but then you say that maybe extradimensional aliens built a mega-computer (with the help of different laws of physics) that was so inconceivably powerful that our entire universe is just a simulation inside it. In fact this has happened a number of times, perhaps an infinite number of times.

I'll leave it as an exercise to the reader to pick out which of us is making fewer assumptions.

B) Given that we have no reference point, and no way to reason or logic the answer into something likely resembling the truth while postulating about existence outside our own dimensions or universe... then speculating about the methodology behind how this simulation was achieved is pointless and baseless. It would be far more interesting and reasonable to speculate the ways in which we might ourselves simulate a smaller universe. (by dumbing down physics? AI? exotic computing methods? etc)

More interesting to you, maybe. Personally, I find handy wavy "anything is possible" arguments immensely tedious. It's the intellectual equivalent of school children playing superheroes and giving themselves new superpowers every few minutes.

"Nuh uh, you didn't get me because I can fly!"

"What, this doesn't make sense? Well, it doesn't have to, because multiverse!"

Intellectual white noise, and I'll have none of it.

Edited by LaytheAerospace

This is basically the computer simulation variant of the head in a jar idea.

How do you know you are not just a head in a jar hooked up to a computer simulating all sensory input?

There is no way of proving this is or is not the case.

If you want to simulate every single particle in the universe you need some big computer. However, just as with KSP, you only need to simulate in detail what you are looking at. (No need to simulate the sound of a falling tree if there is nobody to hear it.) We know that observing something does impact what we are observing. (http://en.wikipedia.org/wiki/Schr%C3%B6dinger's_cat)

We know simulated universes exist. KSP is one of them. We can have a simulated universe inside a simulated universe. In fact, we have several games that incorporate mini games the character inside the game can play. Playing cards inside GTA, for example.

If our universe is a simulation, that means that all our laws of physics are made up. "Reality" could be something completely different our simulated brains can't even begin to comprehend. So trying to apply any of our logic to it would be cute/hilarious to the observer.

And mind you, the simulated universe we are in could be full of errors. (They left out the Brojinacs in this simulation, for example - can you imagine that, an entire universe without a single one of them?) A lot of stuff isn't implemented yet (that's why so much tastes like chicken), and sometimes the whole thing crashes and has to be restarted from the last save. (Which is called Déjà vu in French.) ;)

Edited by running

Theories are backed by data. This isn't even a hypothesis, because those are testable. What you have here is an idea, one that's not really new at all. In fact, there was a very popular movie back in 1999 that featured simulated reality as a central plot point. I bet you've seen it.

In the same year:

http://www.imdb.com/title/tt0139809/

Nope. Because each universe simulation must be less complex than the last, you have an exponential decay of the complexity of nested universes. No matter how complex the first universe is, somebody has to build it out of some amount of material they spend a finite amount of time collecting. Even if the first universe is infinite, the computer they build must be finite. Even if you were able to gather up all the matter and energy in your universe and were able to simulate a universe 99% as complex as your own, in only 1000 generations of universes, you have 0.004% of the complexity left.

What if ... this simulation/universe is already dumbed down like KSP is relative to our world? (Only one sun, always only one SOI, floating points, etc.) :wink:

OK, you basically answer this one here. :)

You can use some basic algebra to determine how many nest levels you can go with a given starting complexity and efficiency. Assuming a universe of our size and the impossibly high efficiency of 99% (0.01% would be impossibly high, but again, I'm generous), then the complexity at a given generation x is given by the equation:

0.99^x * 10^80 == complexity

When the complexity hits 1, you're done. That comes out to a paltry ~18,000 generations. After that, you're unable to simulate even a single atom with the resources available in your universe. Because of the nature of exponential decay, it doesn't matter how big the first simulation is, you're going to come to the end very rapidly.

But this assumes that the simulating universe is similar to the universe being simulated.

What if ... our simulation/universe is as similar to the simulating universe as our world is similar to the cartoon world with all its wonky laws? :wink:

No, and that's the nicest way I can put that.

You did very well. :D


@LaytheAerospace

lol. That's awesome. However "handy-wavey" you may think it is, my point was not that it is probable or sensible... my point, in counter to the previous post, was that it may be possible... and that is far from saying the idea is rubbish. As for... "then you're left with no complexity" that argument doesn't even stand up against simple algebra. Decreases in complexity on a fractional or even exponential curve will never hit zero... and my point was that regardless of the *relative* complexity of one universe compared to another, we can only ever consider the idea of complexity as it relates to our own experience and knowledge... so there's no way to know whether someone in a universe with 1 billionth the complexity of ours would consider their universe complex by their standards. As a simple thought experiment, all you have to do to see that this is possible is simply forget what you think you know, because it may or may not apply to higher-order universes. As for you pointing out that doubling statistically insignificant odds still leaves them rather insignificant... so? Again, I wasn't suggesting that it was significant or even probable... just that it's a possibility that exists... and when you have a whole universe to deal with (let alone possibly countless universes) then "statistically insignificant" is just a super fancy way of saying uncommon... not nonexistent. I could continue here, but I get the impression I've already wasted enough of my time.


I know about the big bang, but what banged?

The universe. Can you explain what it is about that you're struggling with? If you're trying to think of what might have caused the Big Bang, that's probably either unknowable or a meaningless question. Either way, we don't have any idea.


But this assumes that the simulating universe is similar to the universe being simulated.

What if ... our simulation/universe is as similar to the simulating universe as our world is similar to the cartoon world with all its wonky laws? :wink:

What if ... our simulation/universe is as similar to the simulating universe as our world is similar to a game of Poker? (As in Poker being a universe with laws etc.) You don't even have the means to begin to describe the "real" universe from within the poker universe. Hence it is impossible for those inside the simulation to make any claims about the "real" universe. Perpetuum mobile, for instance, could be an everyday, common thing in the "real" universe even though our logic says it can't be, based on the rules inside our simulation. Even what we call magic could be a real thing with mana as its underlying resource, etc. In that case, the extremely high power computer needed might be quite easy to construct.

Some questions:

- What constitutes a universe... is KSP a universe? is GTA a universe? is Poker a universe? What defines what is and isn't a universe?

- Does simulation imply it resembles the real world? or can you simulate something that is completely made up (or with a very limited amount of reality) (Is world of warcraft a simulation? or does it have to be something like Flightsim?)

and the big question....

What if I was able to prove that we are all inside a simulation and we ourselves are in fact simulated?

What does that change?

Are you going to go on strike, demand the "user" installs mechjeb to make your life easier? Or perhaps change certain parameters?

Do you think you can Neo yourself outside of the simulation and make the changes yourself?

Maybe we should setup a simulation, let some sentient being evolve and then show them it's a simulation and see how they react?


However "handy-wavey" you may think it is, my point was not that it is probable or sensible... my point, in counter to the previous post, was that it may be possible... and that is far from saying the idea is rubbish.

And my point is that this is not only extremely unlikely, it cannot be falsified and it's a useless exercise to spend time considering each and every thing you can't demonstrate to be false. It also may be possible that this is the dream of a space turtle. Want to discuss that now?

As for... "then you're left with no complexity" that argument doesn't even stand up against simple algebra. Decreases in complexity on a fractional or even exponential curve will never hit zero...

Congratulations on pointing out that exponential decay is asymptotic? You should probably go back to my first post where I actually do this math. You don't need to hit zero in order to no longer be able to continue the simulation. My first post's stop point was when you could only simulate a single atom, though obviously you run out of complexity long before that. Computers are made out of many trillions and trillions of atoms, and as I've repeatedly pointed out, any universe we simulate is restricted by the laws of our computers. Here's that equation again. I'll leave it in the abstract form instead of substituting numbers. Feel free to plug in whatever numbers you like and solve for generations using some simple algebra.

efficiency^generation_limit * start_complexity = 1

So if you gather up 1% of the energy in the universe, have a starting complexity of one googol (10^20 times more complex than our own universe), then you get a whopping fifty generations. And that's still being hopelessly generous to you. 0.01% (still generous) gets you 25 generations. Multiply the complexity of your universe by a thousand and you get...26. Another factor of a thousand...27. We just added six zeroes to the end of the complexity of the universe, and got two additional generations. Is it sinking in yet that starting size doesn't matter?
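Plugging this paragraph's numbers into that formula, as a quick sketch (same algebra as above, nothing new added):

```python
import math

def generation_limit(efficiency, start_complexity):
    # Solve efficiency**x * start_complexity == 1 for x.
    return math.log(1 / start_complexity) / math.log(efficiency)

print(generation_limit(0.01,   1e100))   # ~50   (1% efficiency, one googol of complexity)
print(generation_limit(0.0001, 1e100))   # ~25   (0.01% efficiency)
print(generation_limit(0.0001, 1e103))   # ~25.8 (the "26" above)
print(generation_limit(0.0001, 1e106))   # ~26.5 (the "27" above)
```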

Since we're talking about computers, we can actually think in terms of the first computer's execution resources. It must execute all the universes below it (because it's running the computers that run the universes). You cannot continually spawn more processes and just wave your hands saying they're getting less complex. Each universe requires resources, and there's a minimum complexity below which there's no point to run a simulation because nothing interesting can be simulated (or the complexity is not great enough for life to form and build the computer). Fork your universes as often as you like. You don't get infinite universes, you get a fork bomb.

How about an experiment. Write a batch script that opens itself twice and run it. Tell me what your computer does. If it can keep spawning these ludicrously simple processes forever, then I'll admit you're right.

and my point was that regardless of the *relative* complexity of one universe compared to another, we can only ever consider the idea of complexity as it relates to our own experience and knowledge... so there's no way to know whether someone in a universe with 1 billionth the complexity of ours would consider their universe complex by their standards.

No, I'm speaking in absolutes. An absolute minimum complexity is needed to accomplish certain things, like build a computer. You can't do that with one atom. You basically can't do anything with one atom. At a certain point the simulation can simply not continue because the overhead from the simulations above it use up all the resources.

So no, it doesn't matter what the denizens of a universe think about it. They don't even factor in to my analysis. There are fundamental laws which must be obeyed. You can't wave your hands, shout "Multiverse!" and get around that.

As a simple thought experiment, all you have to do to see that this is possible is simply forget what you think you know

I've explained repeatedly why this is not interesting to me. The ladder paradox is an interesting thought experiment. The ring paradox is an interesting thought experiment. The Monty Hall problem is interesting. This is just nonsense. Pointing out that it cannot be falsified changes nothing; it just highlights the non-scientific nature of the idea. It's a bad thing for your idea to not be falsifiable. How many times do I need to say that?

and when you have a whole universe to deal with (let alone possibly countless universes) then "statistically insignificant" is just a super fancy way of saying uncommon... not nonexistent.

No, statistically insignificant means something particular. It means that the hypothesis (again, I'm generous) should be rejected, and the bar is set based on the problem domain. Doesn't matter how big the universe is, statistically insignificant retains its meaning. It most certainly is not "a super fancy way of saying uncommon." You don't get to supply your own definitions of terms and then debate me based on your incorrect definitions.

I could continue here, but I get the impression I've already wasted enough of my time.

Yes, yes you have. "Multiverse!" isn't a debate trump card, it just makes you look silly when you tell people they aren't allowed to use reason, math and logic when they discuss your ideas. If we cannot analyze ideas logically, then what's the point of discussing them? So everyone can come and marvel at how interesting and deep your idea is?


Ok, I admittedly have not done a fully exhaustive view of the thread (at work currently) and I want to get this in before the possible threadlock (these discussions usually result in everybody agreeing to stop before a threadlock, so threaddeath. Or a threadlock because people kept at it.)

First: One of the big complaints about all this is the massive computing power it would take for a system above ours to run our universe.

Computer scientists who believe the simulation theory have been working on ideas for quite a while on how to go against this argument and one of the most compelling ones is that generally speaking you really don't NEED to simulate the universe as a whole down to the subatomic level. You only need to simulate in depth that which is experienced. Anything else can be done in a simpler way. Radiation for example, something treated as an inherently random thing with predictable curves (you can guess how radioactive a given block of material will be over time, but not accurately predict where any given particle of radiation will fly). For the purposes of modeling the radiation field you start by taking the "radiation value" of that brick of material, just a measure of how strong it is, and calculate a fading sphere (the whole inverse square law with the distance) centered on the brick. Now put in the surrounding objects (walls, floor, etc) and then take their densities and volumes, calculate appropriate 'shadows' for the blockage. You have a crude simulation of the radiation now (of course you can somewhat trivially make this better, but let's just continue from here). Toss a person inside, you can easily calculate how much radiation their body is absorbing and roughly 'where' based on their position and orientation inside the field. Assign a random probability that they will get cancer, another to what kind (weighted probabilities here), and to where (weighted by individual body part exposure). Your number comes up, bam! Cancer. But since there is no detector that cares about which particle hit which cell's DNA, they don't bother modeling it. However, if for some reason we built a sensor that can be used to determine exactly this information, then the instant the 'sense!' button is pushed, the universe pauses and calculates down to the atom which cell was hit, how it was affected, etc, and propagates it through, so that when the sensor looks, it sees exactly what is expected as if the calculation had always been done in the first place. Why not always calculate this? Because not everybody is going to get this test done, so why waste the computations on it?

In short, you can reduce the calculations you need to run about the universe down to simple rules (macro object collisions like video games do instead of modeling out every atom in your foot touching every atom in the floor), and you only go hyper-detailed when something in the universe would otherwise check or care (IE: When a person looks.). But you only need to do Just In Time calculations, because you can just pause the universe if you need in order to make sure a given calculation was done 'as though it had always been done'.
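To make that lazy-evaluation idea concrete, here's a purely illustrative toy in Python. The names, numbers, and the "ten sample cells" are all made up; the only point is the pattern described above: a cheap bulk model runs all the time, and the expensive detail is computed only when something "observes" it, then cached so every later observation agrees with the first.

```python
import random
from functools import lru_cache

def field_strength(distance_m, source_strength=100.0):
    # Cheap, always-on bulk model: inverse-square falloff, no particle tracking.
    return source_strength / distance_m ** 2

@lru_cache(maxsize=None)
def detailed_cell_damage(person_id, distance_m):
    # Expensive "atomic-level" detail, computed only when a sensor asks,
    # then cached so the universe stays consistent with that first answer.
    rng = random.Random(f"{person_id}:{distance_m}")  # deterministic per observer
    dose = field_strength(distance_m)
    return tuple(rng.random() < dose / 1000 for _ in range(10))  # ten sample cells

print(field_strength(2.0))            # 25.0 -- the crude field value, no detail
print(detailed_cell_damage(42, 2.0))  # detail generated lazily, on "observation"
print(detailed_cell_damage(42, 2.0))  # identical: cached, as if always computed
```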

Think of it this way. We have a video game called The Sims. Look how similar that is to the real world. People (in general) don't suddenly fall through the floor, fire spreads, objects can break, etc. And we certainly don't model out the atoms within. Now, if there was a capacity for the Sims to reason and to arbitrarily inspect objects and things super closely, they would quickly figure out something wasn't necessarily right unless we provided them the data they would expect to see. But a desktop computer can run this 'simulation', now if we tried to up that simulation to go atomic scale, we don't have a computer powerful enough on the planet to do that in anything approaching real time from our perspective (though of course to the sims everything is real time).

So as far as the inhabitants of the simulation are concerned, their simulation is a perfect recreation of the over-world.

I've read articles that declare that taking this thinking into account certainly brings the calculations required for a universe simulation down from impossible to simply astoundingly large, yet possible within a couple hundred years if not sooner.

Second) There are plenty of real scientists that take this idea very seriously and are attempting to provide experiments to help prove it. One example is a scientist who is trying to make ever faster atomic clocks under the assumption that we will eventually make a clock that operates faster than the refresh rate of the universe. The result would be a clock that SHOULD operate at X ticks/second but it does not actually (for unknown and undiscernible reasons) have that precision. Another example is that astronomers assume the universe would be divided into "chunks" much the way Minecraft does (Minecraft worlds CAN be infinite, but to do this, only the area that the player occupies and the surrounding 8 areas are actually loaded into memory). As a result, it is predicted that certain super small/lightweight particles or even light would be able to show this chunking effect due to super small position errors that accumulate as they pass through millions of chunks. Both of these scientists receive a great deal of funding for their work, primarily because regardless of their attempts at the universe sim system, their research is valuable in many ways (faster clocks, better GPS. Better images of the universe, better understanding of the universe).
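For what it's worth, the "chunks" being borrowed from Minecraft here are just the standard trick open-world games use to keep memory bounded. A toy sketch of the 3x3 neighbourhood behaviour described above (purely illustrative, not how Minecraft is actually written):

```python
def loaded_chunks(player_chunk):
    # Keep only the chunk the player occupies plus its 8 neighbours in memory.
    px, py = player_chunk
    return {(px + dx, py + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)}

print(sorted(loaded_chunks((0, 0))))  # the 9 chunks around the origin
```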

Finally) There is actually no reason to assume that a universe MUST make a simulation less complex in every way. Let's say that FTL is impossible, just flat out is, period. We could run a simulation of the universe that provides its inhabitants a method for moving FTL. We do this because we are curious how a species with this ability would actually develop. Now we make the universe as accurate as we can, and to the limitations that I mentioned above in point one, we do. But the important point is that we now have additional complexity to the universe and a logical reason for it to be added. So really the argument that every universe down the chain quickly becomes useless through lack of complexity is a bit silly.


Computer scientists who believe the simulation theory have been working on ideas for quite a while on how to go against this argument and one of the most compelling ones is that generally speaking you really don't NEED to simulate the universe as a whole down to the subatomic level. You only need to simulate in depth that which is experienced. Anything else can be done in a simpler way.

If you want life to arise and evolve, you need to simulate the entire space of the universe at a pretty low level. If you abstract away the lowest levels to approximations, you lose the value of the simulation. It seems likely that if this is the case, that our universe is a simulation fudging the numbers at the lowest level to make computation possible, then that level is probably the Planck units. This has the unfortunate consequence of meaning we could never measure an error if we were, in fact, in a simulated universe. I'm very surprised you haven't taken the position that these Planck units can be seen as an expression of more limited physics imposed by the simulation. Within the framework of simulated reality, it's a valid interpretation.

However, if for some reason we built a sensor that can be used to determine exactly this information, then the instant the 'sense!' button is pushed, the universe pauses and calculates down to the atom which cell was hit, how it was affected, etc, and propagates it through, so that when the sensor looks, it sees exactly what is expected as if the calculation had always been done in the first place. Why not always calculate this? Because not everybody is going to get this test done, so why waste the computations on it?

Turning down the complexity of the simulation when detectors are not present would appear to be a major problem. How do you determine if an event will be detected? What is a detector, really? Is a scientist a detector if he's daydreaming and won't remember the observation? Do we only run a high resolution simulation in areas intelligent beings are currently looking? Can inanimate objects be detectors? You can't have life evolve otherwise, so I suppose they must be. So why isn't every particle in the universe considered a detector? What about effects of single particles that can be checked later? Like, say, a high precision camera dealing with dark counts on its CCD? Single particles must interact with it as single particles should, because it produces an immediate effect, but then again so does every interaction in the universe. So do we just wait until the result of that interaction would matter to calculate the result? Does it only matter if something intelligent would notice? Why?

In short, you can reduce the calculations you need to run about the universe down to simple rules (macro object collisions like video games do instead of modeling out every atom in your foot touching every atom in the floor), and you only go hyper-detailed when something in the universe would otherwise check or care (IE: When a person looks.).

What's so special about people? Doesn't the simulation need to maintain consistency regardless of whether or not sentient beings inside it notice? Abstracting the details away in order to simplify the simulation will inevitably cause errors, and the more you abstract away, the easier they'd be to detect. The details matter regardless of whether or not an intelligent being is watching at that moment. It seems a bit arrogant to think the universe takes special notice of life and that things don't matter until we look at them, and they only matter as much as we care to investigate.

Think of it this way. We have a video game called The Sims. Look how similar that is to the real world.

Seriously? It's not remotely similar to the real world. Can I perform scientific experiments in it? Look through a telescope at the stars? Watch life evolve? Enjoy the original creative works of the simulated people? Build new devices for the other Sims to use? Establish new political offices? Explore space? Of course not. It's absurd to claim that the Sims, or any other game, is anything but a pale imitation of reality that falls apart if you look more than an inch deep.

We do have software that is similar to the real world. We use it to simulate reality and check our predictions, or to make predictions about things we cannot solve analytically. It runs on supercomputers, and runs slowly. Apples to apples.

People (in general) don't suddenly fall through the floor, fire spreads, objects can break, etc. And we certainly don't model out the atoms within. Now, if there was a capacity for the Sims to reason and to arbitrarily inspect objects and things super closely, they would quickly figure out something wasn't necessarily right unless we provided them the data they would expect to see. But a desktop computer can run this 'simulation', now if we tried to up that simulation to go atomic scale, we don't have a computer powerful enough on the planet to do that in anything approaching real time from our perspective (though of course to the sims everything is real time).

I think you may be badly underestimating the problem of complexity. Not all problems are created equal, and the Sims is certainly not making use of algorithms with exponential complexity. Making a rough facsimile of a tiny fraction of a society has a much lower complexity than, say, simulating turbulence, or the halting problem. Let's compare apples to apples, please.

I've read articles that declare that taking this thinking into account certainly brings the calculations required for a universe simulation down from impossible to simply astoundingly large, yet possible within a couple hundred years if not sooner.

I'd stop reading articles at whatever site told you that, because it's foolish on its face. Computers are only a few decades old, integrated circuits younger still, and those articles claimed to be able to predict the state of computing in centuries? There's a serious problem with intellectual honesty there. Without any more information, I'm going to guess that the articles abused Moore's Law, which I mentioned in my first post is expected to not hold very much longer. Using Moore's Law to predict the state of computing more than a decade or two out (and even that's a stretch) is a mistake. Apologies if those articles used a scientifically valid method to accurately predict what computers would be capable of in the year 2200. Regardless of their validity, I'd be interested in reading them, if you can dig up links.

Second) There are plenty of real scientists that take this idea very seriously and are attempting to provide experiments to help prove it.

This is a fallacious argument. The argument from authority, specifically. There are people seriously investigating, and arguing in favor of, all sorts of things. This doesn't mean they're any more likely to be true. I've seen plenty of people with Ph.Ds support obviously incorrect positions, like creationism, or 9/11 controlled demolition. Education is not infallibility.

One example is a scientist who is trying to make ever faster atomic clocks under the assumption that we will eventually make a clock that operates faster than the refresh rate of the universe. The result would be a clock that SHOULD operate at X ticks/second but it does not actually (for unknown and undiscernible reasons) have that precision.

Finally, a falsifiable prediction! It's science! But the possibility of an experiment to produce evidence is not itself evidence, and my position is unchanged. Occam's Razor says this hypothesis is to be rejected.

Another example is that astronomers assume the universe would be divided into "chunks" much the way Minecraft does (Minecraft worlds CAN be infinite, but to do this, only the area that the player occupies and the surrounding 8 areas are actually loaded into memory).

Please don't say computers can simulate infinity, because they can't. Eventually, even Minecraft won't be able to generate any more areas, or the areas you've already been to must be lost, because that data has to go somewhere, and data storage is finite. Just because you don't need it loaded into memory doesn't mean the data doesn't exist.

Both of these scientists receive a great deal of funding for their work, primarily because regardless of their attempts at the universe sim system, their research is valuable in many ways (faster clocks, better GPS. Better images of the universe, better understanding of the universe).

Another argument from authority. Terrible ideas get funded regularly. See the solar roadways project for an example.

As far as the other benefits, sure. I love pure science. Fund those projects all day long. As a former defense contractor, I could point you directly at places to cut the defense budget to get us there, too. Once saw a room full of year old, unopened, $5k workstation laptops. They (US Army) had a few hundred thousand dollars left in the budget at the end of the fiscal year, and wanted to avoid a budget reduction the following year. So they bought several dozen computers they didn't need and left them to collect dust.

Finally) There is actually no reason to assume that a universe MUST make a simulation less complex in every way.

Never said it did. In fact, I've gone to great pains to speak generally in terms of overall complexity. Overall complexity is monotonically decreasing, which precludes an infinite chain of simulated universes. I do not mean this to say a universe can't be simulated, only that this idea of infinitely nesting universes is invalid due to the counting argument.

So really the argument that every universe down the chain quickly becomes useless through lack of complexity is a bit silly.

Only when you misconstrue my argument badly. Change the physics of your nested universe all you like, the capabilities of the simulation are still bounded by the laws of the universe which runs the simulation, not the universe being simulated. The easiest way to visualize this is simulating our own universe, but with less stuff in it. You can simulate a totally different universe, of course, trading complexity from one place to the other, but you can't change the overall budget. Otherwise you can just simulate a universe with different rules that then simulates a computer better than your starting computer, an obvious impossibility. Each universe is more restrictive than the last, even if individual areas are relaxed.

Another way to put it is that the overhead increases with each successive generation. The universe running the simulation must host the computer, which incurs overhead (not all of the universe can be the computer, at the absolute least you'd want somebody alive to see the result at the end), and the computer must perform the simulation, which also has overhead. You'll never reclaim that overhead in nested universes, for obvious reasons.

