
Von Neumann Machines


KAL 9000


1 hour ago, Jouni said:

*snip*

In this case, the context is the galaxy. We're talking about self-contained systems capable of moving from one solar system into another and self-replicating upon reaching the destination. I've essentially claimed that the galactic "ecosystem" doesn't look complex enough to support such self-replication.

And I've repeatedly argued that the notion that self-replication requires a complex ecosystem is wrong. On Earth at least, self-replicating systems arose, by chance, in an abiotic environment. We have no idea exactly what environment was required to start that process off, but we can say that no complex ecosystem was required, because until self-replicating systems arose there were, by definition, no living organisms around to create that ecosystem. (You can argue about what constitutes 'life', but the ability to replicate is common to every definition.) Like I said, it all had to start from somewhere.

Given that self-replicating systems can and have arisen by chance, without a supporting ecosystem, I don't see how you can argue that there are any fundamental physical laws that would prevent the construction of a von Neumann machine, that is, a system capable of moving from one suitable abiotic environment to another and establishing a self-sustaining, self-replicating colony within each of those environments. I say 'suitable abiotic environment' because, as previously stated, I believe that a practical von Neumann machine will need some chemical diversity to work with - a self-replicating system constructed entirely from one element, including carbon, seems vanishingly unlikely.

In short, I think von Neumann machines are an engineering problem, and I will readily concede that they are an extremely difficult engineering problem that we're unlikely to solve any time soon. However, I don't believe that they are an unphysical problem; in other words, there are no physical laws that rule them out entirely.


5 hours ago, KSK said:

And I've repeatedly argued that the notion that self-replication requires a complex ecosystem is wrong. On Earth at least, self-replicating systems arose, by chance, in an abiotic environment. We have no idea exactly what environment was required to start that process off, but we can say that no complex ecosystem was required, because until self-replicating systems arose there were, by definition, no living organisms around to create that ecosystem.

That argument depends on a technically correct but rather useless definition of "ecosystem", at least for this context.

I've seen a number of speculations on how life originally arose. In all of them, there was a complex self-sustaining system of chemical reactions (with an external source of energy), from which the first living organisms evolved. Technically it was not an ecosystem, but it was still a complex system necessary for supporting the primitive organisms.

5 hours ago, KSK said:

Given that self-replicating systems can and have arisen by chance, without a supporting ecosystem, I don't see how you can argue that there are any fundamental physical laws that would prevent the construction of a von Neumann machine, that is, a system capable of moving from one suitable abiotic environment to another and establishing a self-sustaining, self-replicating colony within each of those environments.

There is an important qualitative difference between those two cases. Evolution may eventually create something complex, but the results are essentially random. A von Neumann machine, on the other hand, must be able to produce a specific kind of complex system reliably. In particular, each machine must create at least one replica in the expected case. Just because something complex may evolve randomly doesn't mean that it's possible to build a self-replicating complex system that can move from solar system to solar system.

5 hours ago, KSK said:

In short, I think von Neumann machines are an engineering problem, and I will readily concede that they are an extremely difficult engineering problem that we're unlikely to solve any time soon. However, I don't believe that they are an unphysical problem; in other words, there are no physical laws that rule them out entirely.

That depends on your definition of "physical". By a simple counting argument, there is an unlimited number of things that are technically possible but extremely unlikely to develop, because the universe is too small and too short-lived. That may include many classes of objects that are conceptually simple but technically complex.


9 hours ago, KSK said:

Yes - but there are other organisms (e.g. the cyanobacteria that I keep banging on about) that don't need the resources released by the decomposers.

Despite that, almost everything depends on them. Even cyanobacteria, when part of a lichen, depend upon a fungus which breaks down the resources nearby. Granted, it's technically not a decomposer.


8 hours ago, Jouni said:

That argument depends on a technically correct but rather useless definition of "ecosystem", at least for this context.

I've seen a number of speculations on how life originally arose. In all of them, there was a complex self-sustaining system of chemical reactions (with an external source of energy), from which the first living organisms evolved. Technically it was not an ecosystem, but it was still a complex system necessary for supporting the primitive organisms.

Well, frankly, it's better to base an argument on a technically correct definition than on a bizarrely wrong one, as you did.

With regard to your second point - what do you think a living organism is, if not 'a complex self-sustaining system of chemical reactions with an external source of energy'? You're not talking about an external system that the earliest organisms needed to evolve - you're talking about the organisms themselves. And besides - you're still missing (or wilfully ignoring) the point that that first complex system - call it an ecosystem, call it an organism, call it 'a complex self-sustaining system of chemical reactions (with an external source of energy)' - arose from an environment where no such thing previously existed.

8 hours ago, Jouni said:

There is an important qualitative difference between those two cases. Evolution may eventually create something complex, but the results are essentially random. A von Neumann machine, on the other hand, must be able to produce a specific kind of complex systems reliably. In particular, each machine must create at least one replicate in the expected case. Just because something complex may evolve randomly doesn't mean that it's possible to build a self-replicating complex system that can move from solar system to solar system.

Absolute nonsense. A randomly evolved organism needs to be just as capable of reliably producing specific complex systems as a deliberately engineered von Neumann machine. It wouldn't last long if it couldn't.

8 hours ago, Jouni said:

That depends on your definition of "physical". By a simple counting argument, there is an unlimited number of things that are technically possible but extremely unlikely to develop, because the universe is too small and too short-lived. That may include many classes of objects that are conceptually simple but technically complex.

I already defined physical as 'something not forbidden by the laws of physics'. I've already conceded that von Neumann machines may turn out to be impractical - but there's a significant difference between technically impractical and physically impossible. Anyway - I suggest we agree to disagree on this one. I'm clearly not convincing you, you are most definitely not convincing me, so I'm going to quit before this gets acrimonious and the thread gets locked. Thanks for the debate.

Edit. Apologies for the mixed up quote formatting. 

Edited by KSK

On 16.12.2015, KSK said:

And besides - you're still missing (or wilfully ignoring) the point that that first complex system - call it an ecosystem, call it an organism, call it 'a complex self-sustaining system of chemical reactions (with an external source of energy)' arose from an environment where no such thing previously existed.

I fail to see the relevance of that point. Something complex may randomly evolve in an environment where such complexity did not exist before. The important word is "randomly". There is a categorical difference between a random process producing a complex result and a random process producing a particular complex result.

On 16.12.2015, KSK said:

Absolute nonsense. A randomly evolved organism needs to be just as capable of reliably producing specific complex systems as a deliberately engineered von Neumann machine. It wouldn't last long if it couldn't.

Please think again.

Do you really think that the following two scenarios are equivalent:

  1. A von Neumann machine arrives in a solar system and sets up a primitive ecosystem. Eventually, something complex (e.g. dinosaurs) evolves in the ecosystem and is able to reproduce reliably within the ecosystem.
  2. A von Neumann machine arrives in a solar system and sets up a primitive ecosystem. Eventually, the ecosystem is able to produce new von Neumann machines, which can be launched to other solar systems, where they can continue the work reliably.

In the first scenario, evolution produces something complex that can reproduce within the ecosystem. In the second scenario, evolution produces a particular kind of complex system that can reproduce outside the ecosystem.

On 16.12.2015, KSK said:

I already defined physical as 'something not forbidden by the laws of physics'. I've already conceded that von Neumann machines may turn out to be impractical - but there's a significant difference between technically impractical and physically impossible.

Then define the laws of physics. In particular, is computational complexity a law of physics?

There is a difference between something being logically impossible, improbable (statistically impossible), and infeasible (computationally impossible). Something that is said to be physically impossible may be logically possible but extremely improbable. For example, there is nothing that makes perpetual motion logically impossible within the laws of physics – the laws that forbid it are statistical in nature. Physicists are currently debating whether some ideas from computational complexity should be considered fundamental laws of nature – comparable to the laws of thermodynamics – or whether they are just practical issues.


On 12/16/2015, 1:18:38, KSK said:

Absolute nonsense. A randomly evolved organism needs to be just as capable of reliably producing specific complex systems as a deliberately engineered von Neumann machine. It wouldn't last long if it couldn't.


Just stop arguing with Jouni.  He's a grad student somewhere who knows only a tiny sliver of math he doesn't understand, and he seems to think he can apply this to the universe, even when it disagrees with centuries of findings by every other scientist.  I think he's being willfully ignorant as well, because he never "updates" his arguments even when we tell him about factual disagreements with his theories.

Edited by SomeGuy123

On 12/12/2015, 7:37:15, Spaceception said:

If we made von Neumann probes, we'd need to program them not to replicate themselves on anything that has, or very likely has, life on it. We'd also need to program them to only replicate themselves a certain number of times so they don't get out of control. And finally, we'd need to put them in a system with a gas giant with likely moons so they could spread out more easily. Once we've figured all of that out, we can have nano-machines exploring the galaxy!

This would not really work - not forever, at least. Not without very good engineering or luck.

Think about a von Neumann probe as being basically an animal, or a plant, or whatever: a life form. It is subject to natural selection like the rest of us. Now, let's take an animal, say, a deer. Deer are not really programmed to climb steep mountains; rather, they like relatively flat ground, hills, and not-so-steep mountains (at least the deer I know).

If I were to take a sufficient population of these deer and put them into an environment where it was distinctly advantageous to climb cliffs, they would likely eventually learn to do it. Generations would pass, and they might slowly learn to hop up rocks to evade predators, or to reach otherwise inaccessible food sources. As time goes on, they would cease to be deer, but their ancestors would have been.

When and if we were to release self-replicating probes into the cosmos, we would be relinquishing control over them. I cannot say that it is advantageous for them to land on planets with life, but I cannot say that it is not, either. And, extrapolating across the practical infinity of space, it seems possible that these spacecraft will have time to decide for themselves whether or not to do so. Over many generations, copying errors accumulate, and eventually those safeguard programs will erode.


6 minutes ago, Newt said:

Over many generations, copying errors accumulate, and eventually those safeguard programs will erode.

That, I can say, is a solvable problem. It's possible to make safeguards that won't lose data until the end of the universe, and to make the replication systems capable of producing only identical children, all the way down to the bonding of individual atoms. There is an issue with this: if the machines can't really adapt themselves, they might fail after a while.

Edited by SomeGuy123

Perhaps I should clarify that. Using things like checksums, you should be able to make a program that will copy itself, check the copy, and make so few mistakes that you can keep doing it for billions of years without errors. I do not know the numbers for this off hand, but with current programs it is probably achievable. A 512-bit checksum has a fantastically small chance of a chance collision, and it is conceivable that you could use an even bigger one just because you can.
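The copy-then-check loop described above can be sketched in a few lines of Python, using the standard library's hashlib for a 512-bit checksum (SHA-512). This is a minimal illustration, not anyone's actual probe firmware; the function names and the trivial copy step are hypothetical stand-ins:

```python
import hashlib

def sha512_checksum(data: bytes) -> str:
    """Return the SHA-512 digest of the data as a hex string."""
    return hashlib.sha512(data).hexdigest()

def copy_and_verify(original: bytes) -> bytes:
    """Copy the data, then accept the copy only if its 512-bit
    checksum matches the original's; otherwise signal corruption."""
    expected = sha512_checksum(original)
    copy = bytes(original)  # stand-in for the real, possibly error-prone copy step
    if sha512_checksum(copy) != expected:
        raise ValueError("copy corrupted - discard and retry")
    return copy

# A flipped bit anywhere in the copy changes the digest, so the odds
# of corruption slipping through are on the order of 2**-512 per check.
program = b"replication instructions"
verified = copy_and_verify(program)
```

Note that a checksum only detects corruption; it doesn't repair it, so in practice you would pair it with retries or an error-correcting code.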

However, it seems that you want to make a thing that evolves. Assuming that is the case, your checksums will need to be not completely restrictive of all accidental changes, and the changes they do allow will probably, given time, be able to override and mess up other parts of the program.

