
Is Cryogenics Possible?



There is a big difference between replacing the brain all at once and slowly replacing it, small piece by small piece.

Our selves are the result of the activity of the whole brain (or perhaps the activity of a network of "observer neurons")... so if you just replace a small part of the brain with synthetic parts, the remaining brain can adapt to this, incorporate the "new" parts of the brain into the self, and maintain continuity in our self.

If we just replace the whole brain all at once, the continuity is disrupted. The self of the new brain might still believe that it is the former self ... but the fact is that the old self was destroyed with the removal of the old brain.

So lajoswinkler's comparison with the teleportation problem is definitely valid.

If we (theoretically) were able to scan a brain completely, down to the molecular level, and were therefore able to create 1:1 copies of the scanned brain ... and then created multiple copies of the same brain, each of the brains would believe itself to be the former self (but only the original scanned brain [if it survives the scanning] would house the original self ... whereas the copies would all be their own individual selves).


There is a big difference between replacing the brain all at once and slowly replacing it, small piece by small piece.

Our selves are the result of the activity of the whole brain (or perhaps the activity of a network of "observer neurons")... so if you just replace a small part of the brain with synthetic parts, the remaining brain can adapt to this, incorporate the "new" parts of the brain into the self, and maintain continuity in our self.

I understand where you're coming from, but these hypothetical synthetic components do not cause "disruption" when they are introduced. I also fail to see how disruption means death. Do you die every time you sleep? Do you die every time you have a seizure? Both are very "disruptive", the latter even more so.

If we (theoretically) were able to scan a brain completely, down to the molecular level, and were therefore able to create 1:1 copies of the scanned brain ... and then created multiple copies of the same brain, each of the brains would believe itself to be the former self (but only the original scanned brain [if it survives the scanning] would house the original self ... whereas the copies would all be their own individual selves).

If the self is actually something that is unique and singular, then this is correct. However, if the "self" is the result of a computation, then it doesn't matter what is doing that computation, and the results are not unique. This means that all the computed "selves" are identical (at least, until they begin to diverge), and that it is meaningless to try to draw any distinctions (such as "original" or "copy") between them.

Example: pretend you have two computers that both function identically. Those computers are both set to run a certain function: in this case, we'll say that the computers are set to calculate 1000 digits of pi, using a mathematical expansion. If the sense of self is the output of neural computation, then this calculation of pi is analogous to the computation of "self" and consciousness by a brain. So to say that the self computed by one brain is any different from the self computed by another brain, given that both brains were exactly equivalent, is like saying that the pi computed to 1000 places by one computer is different from the pi computed to 1000 places by another computer.
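
(To make that concrete, here is a minimal Python sketch - my own illustration, not anything from this thread - computing the first 1000 decimals of pi with Machin's formula. Any machine running a correct implementation produces exactly the same digit string, so asking which machine's output is the "original" pi is meaningless.)

```python
from decimal import Decimal, getcontext

def arctan_inv(n, digits):
    """arctan(1/n) by its Taylor series, in fixed-precision Decimal arithmetic."""
    getcontext().prec = digits + 10                  # keep a few guard digits
    eps = Decimal(10) ** -(digits + 10)
    total = term = Decimal(1) / n
    n2, k, sign = n * n, 3, -1
    while abs(term) > eps:                           # terms shrink geometrically
        term /= n2
        total += sign * term / k
        k += 2
        sign = -sign
    return total

def pi_digits(digits=1000):
    """Machin's formula: pi = 16*arctan(1/5) - 4*arctan(1/239)."""
    pi = 16 * arctan_inv(5, digits) - 4 * arctan_inv(239, digits)
    return str(pi)[:digits + 2]                      # "3." plus the first `digits` decimals

# Two "computers" (here, two separate runs) computing the same function
# produce identical results:
assert pi_digits(1000) == pi_digits(1000)
print(pi_digits(1000)[:52])                          # 3.14159265358979323846...
```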

Another way of looking at it: our feeling of self is a process; it's information interacting with information. We are nothing but information. Specific pieces of matter and energy do not factor into it, other than that matter and energy are used to run this process and store the information, and an infinite number of unique configurations of matter and energy exist that can run this process of "self".

Information is not unique. It is senseless to talk about there being an "original" number 12,043 and "copies" of the number 12,043, since 12,043 always means the same thing. Likewise, our brains can be wholly described by a set of information, and in doing so, we also describe our consciousness as a set of information. Since information is not unique and can be exactly copied, it creates a false distinction to talk about there being an "original" you and a "copied" you. There are simply two instances of you, and there is no distinction between them, at least at first, because both instances contain the same information. Of course, their different experiences will cause them to begin to diverge, but at the moment of instantiation, they are exactly the same.

In summary- if the "self" exists as a unique and singular experience that cannot be reduced down to information, then you are correct. The only way I see that being possible is if a supernatural soul exists that cannot be described by information. However, if the self can be described as information, then talking about there being an "original" self and a "copied" self is as senseless as talking about there being an "original" number 943, and "copies" of the number 943.


I understand where you're coming from, but these hypothetical synthetic components do not cause "disruption" when they are introduced. I also fail to see how disruption means death. Do you die every time you sleep? Do you die every time you have a seizure? Both are very "disruptive", the latter even more so.

The activity of the brain doesn't stop during sleep ... it just follows other (regulated) patterns.

My example only referred to your case C, where you replace the whole brain within a short time (and where the brain activity would therefore be disrupted).

If the self is actually something that is unique and singular, then this is correct. However, if the "self" is the result of a computation, then it doesn't matter what is doing that computation, and the results are not unique. This means that all the computed "selves" are identical (at least, until they begin to diverge), and that it is meaningless to try to draw any distinctions (such as "original" or "copy") between them.

Example: pretend you have two computers that both function identically. Those computers are both set to run a certain function: in this case, we'll say that the computers are set to calculate 1000 digits of pi, using a mathematical expansion. If the sense of self is the output of neural computation, then this calculation of pi is analogous to the computation of "self" and consciousness by a brain. So to say that the self computed by one brain is any different from the self computed by another brain, given that both brains were exactly equivalent, is like saying that the pi computed to 1000 places by one computer is different from the pi computed to 1000 places by another computer.

Another way of looking at it: our feeling of self is a process; it's information interacting with information. We are nothing but information. Specific pieces of matter and energy do not factor into it, other than that matter and energy are used to run this process and store the information, and an infinite number of unique configurations of matter and energy exist that can run this process of "self".

Information is not unique. It is senseless to talk about there being an "original" number 12,043 and "copies" of the number 12,043, since 12,043 always means the same thing. Likewise, our brains can be wholly described by a set of information, and in doing so, we also describe our consciousness as a set of information. Since information is not unique and can be exactly copied, it creates a false distinction to talk about there being an "original" you and a "copied" you. There are simply two instances of you, and there is no distinction between them, at least at first, because both instances contain the same information. Of course, their different experiences will cause them to begin to diverge, but at the moment of instantiation, they are exactly the same.

In summary- if the "self" exists as a unique and singular experience that cannot be reduced down to information, then you are correct. The only way I see that being possible is if a supernatural soul exists that cannot be described by information. However, if the self can be described as information, then talking about there being an "original" self and a "copied" self is as senseless as talking about there being an "original" number 943, and "copies" of the number 943.

No.

I don't know if you know anything about object-oriented programming.

If yes, here is, IMHO, a good example that illustrates it:

Let's say the Self is a class object (with the class object being initialized with the configuration of the brain).

If the selves of all copies of a brain were identical, then we would have just one unique class object (the original self) ... with the selves of the other brains just being pointers that point to this original self.

This view, however, is incorrect.

Rather, each Self is its own class object (but all the class objects have been initialized with the same variables). That means that, although all selves are based on the same set of variables (and would react identically), each self is its own unique object.

If you delete one of these Self-Objects, you would delete a unique entity even though you would have several copies left that would show exactly the same output as the one that you deleted.
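
(Here is a minimal Python sketch of that analogy - my own illustration; the class name Self and its fields are made up, not anything from this thread. Two objects built from the same "brain configuration" compare equal by content, yet each is its own instance, and deleting one leaves the others intact.)

```python
from dataclasses import dataclass

@dataclass
class Self:
    # Hypothetical stand-ins for "the configuration of the brain"
    memories: tuple
    wiring: tuple

config = dict(memories=("first day of school", "learned to ride a bike"),
              wiring=("A->B", "B->C"))

original = Self(**config)   # the scanned brain's self
copy1 = Self(**config)      # a copy initialized with the same variables
copy2 = Self(**config)      # another copy

print(original == copy1)    # True  - same content, same "variables"
print(original is copy1)    # False - not pointers to one shared self
print(copy1 is copy2)       # False - each copy is its own unique object

del copy1                   # deleting one instance...
print(original.memories)    # ...leaves the other instances untouched
```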


On the contrary, it's not frozen, because its cells secrete compound(s) with antifreezing properties. It is stiff, but not frozen. Metabolism still partially works, but very slowly. Cytoplasm in the cells stays closer to gel than sol form.

Do you have a source for that? I have definitely seen it described as a complete stoppage: I believe the book "Winter World" describes this sort of freezing, in frogs and insects, as a reversible death.

http://www.pbs.org/wgbh/nova/nature/costanzo-cryobiology.html

"However, later in the freezing process the heart stops completely. Ice encases the organ and forms inside the muscle and the chambers. There's no need for a working pump now because the blood, too, is frozen. This state of arrested heart function can be tolerated for many days and perhaps months, but upon thawing the contractions spontaneously resume. Imagine our amazement when we witnessed those first few blips on a thawing frog's EKG!"

I agree it is not applicable to humans though.

EDIT: I'll drop the soul discussion as not relevant to this thread/forum. Sorry.

EDITx2: Hmm. It does seem the cells aren't completely shut down, though organ function may be...


I think the distinction between the "original" and the "copy" is a false one.

A) The brain is slowly replacing and repairing itself all the time, and we do not experience death. Also, every time we learn anything, the brain rewires itself and we do not experience death.

B) The point of the example of slowly replacing the brain with equivalent synthetic components is to show that this is effectively the same as what occurs naturally. We would end up with a fully synthetic brain, and not experience death.

C) So speed up the replacement arbitrarily, so that you effectively replace your entire brain all at once; that is no different from doing it slowly, and no different from what happens naturally.

D) So, why should you have to wait till the biological brain wears out? Why not replace a biological brain with a synthetic one, and pitch the biological brain in the trash?

A == B == C == D

By these arguments, you could just wholly replace your brain with a synthetic one, literally pitching your brain in the trash, and not experience death.

Or you could keep the biological brain around. As you said, the two minds would slowly diverge as they would be subjected to different experiences.

What this points to, I think, is that we do not have a continuous, inner self. "Self" is an illusion. If replacing our brain with a synthetic brain and throwing the biological brain in the trash is equal to death, then, likewise, every time our brain changes its wiring or replaces a part of itself - which probably happens thousands or millions of times a second - we would also "die". Are we constantly dying, only to be "reborn" as new beings that falsely believe they have a continuous existence? The simpler explanation is that the sense of "self" as a singular existence is just an illusion.

So the distinction between the "original" and the "copy" is a false one. If the copy is an exact copy, then they are indistinguishable. The sense of self as a singular existence is an illusion; "you" are just the output of a function, and "you" exists wherever that function is run.

All this makes me think the first "brain uploads" will be Alzheimer's patients. With their biological brain slowly decaying, there is plenty of time to adapt to synthetic replacement parts, to shift which parts contain the self - naturally - over years of use, rather than through a sudden, artificial "transfer" process. And when their minds finally fail completely, the inorganic component will be all of them that remains... which is more than they would have had otherwise.


Hmm, if it were me trying to design a form of cryonics for space travel, I would rather use a form of biological stasis mixed with hibernation and "brain uploads" working in tandem. I'm not going into much detail, though, because I have been wanting to put this into a novel of my own for a long while....


Do you have a source for that? I have definitely seen it described as a complete stoppage: I believe the book "Winter World" describes this sort of freezing, in frogs and insects, as a reversible death.

http://www.pbs.org/wgbh/nova/nature/costanzo-cryobiology.html

"However, later in the freezing process the heart stops completely. Ice encases the organ and forms inside the muscle and the chambers. There's no need for a working pump now because the blood, too, is frozen. This state of arrested heart function can be tolerated for many days and perhaps months, but upon thawing the contractions spontaneously resume. Imagine our amazement when we witnessed those first few blips on a thawing frog's EKG!"

I agree it is not applicable to humans though.

EDIT: I'll drop the soul discussion as not relevant to this thread/forum. Sorry.

EDITx2: Hmm. It does seem the cells aren't completely shut down, though organ function may be...

If ice were to form inside the cytoplasm, it would completely wreck the cell from the inside out, snapping organelles and piercing the plasma membrane. Upon thawing, the cell would look like a pierced water balloon. Bacteria love that, because no effort is needed to collect the nutrients, and that's why thawed food spoils fast.

The cell must always remain in the gel state and even then, there's sluggish movement of certain parts of the mechanism.

Think about it. If everything stopped, the frog could be kept in such conditions indefinitely and then revived. That's not the case. You won't be able to wake up a frog frozen for a decade. The longer the frozen period, the lower the chances you'll succeed. That means something is happening inside its cells, which is obvious because we're talking about temperatures only a bit lower than the freezing point of water. That might seem ungodly cold to us when we fall through ice into a sea or lake, but it's perfectly warm enough for very slow biochemical reactions.

It is certainly not death. It's extremely slowed-down metabolism. Higher organ functions, such as contractions of the heart muscle and peristalsis of the colon, will stop.


No.

I don't know if you know anything about object-oriented programming.

If yes, here is, IMHO, a good example that illustrates it:

Let's say the Self is a class object (with the class object being initialized with the configuration of the brain).

If the selves of all copies of a brain were identical, then we would have just one unique class object (the original self) ... with the selves of the other brains just being pointers that point to this original self.

This view, however, is incorrect.

Rather, each Self is its own class object (but all the class objects have been initialized with the same variables). That means that, although all selves are based on the same set of variables (and would react identically), each self is its own unique object.

If you delete one of these Self-Objects, you would delete a unique entity even though you would have several copies left that would show exactly the same output as the one that you deleted.

I totally understand where you're coming from. And yes, I've done a fair bit of OOP (in C++, C#, and some scripting languages). You're viewing it from the viewpoint of physical instantiations. Each instance of oneself has to occupy a different physical space, and is thus unique in at least this way. That is true.

However, I'm viewing it on a more abstract level, the level of experience, I guess. The example of slowly replacing neurons with identically-functioning synthetic ones, and there being no change in your conscious state, shows to me that what is important in terms of one's own identity and conscious state is NOT the physical matter. It is the process. If the process that computes your conscious state does not change, then you don't notice anything (you may be hard pressed to notice some differences even if it does). Your consciousness is a process, meaning that you can reduce your consciousness down to a set of information and relationships between information. Do you follow and agree so far? Yes/No-why?

So your consciousness is a process- it is information (information stored, and information about what to do with that information, and information about how to process new information). Information is not unique, it is just information. So there is only one set of information that describes "you", just as there is only one irrational number pi.

This "you" can be instantiated on multiple different platforms- like a biological brain and a synthetic brain. If that occurs, each "you" will rapidly diverge into distinct beings as you cannot experience the same sensory input, and the brain is certainly a chaotic system at some level, so even if you DID experience the same sensory input (fed directly into your sensory stream Matrix style), you'd diverge anyway. But at the moment of instantiation, they are both "you", as they both are instantiations of the same information, and there is only one set of information that describes your mind.

Do you see where I'm coming from? I'm not saying you're wrong necessarily, I'm saying there's a different way of looking at it. You're looking at it in terms of physical instantiations of a certain set of information. I'm looking at it solely in terms of information, as my thought experiments lead me to believe that information is all that is important.

Viewing this in terms of OOP does not remove the physical. Two objects in OOP can encapsulate the same information, but they are different because they reside at different memory addresses. Thus, looking at consciousness/self from an OOP perspective does not remove the "body" from the "mind". From an OOP perspective, the "mind" is the information contained within the object, while the "body" is the memory address.
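
(A minimal Python sketch of that mapping - my own illustration, with a made-up Mind class - where the "mind" is the information an object carries and the "body" is the particular object and address it happens to live at. Re-instantiating the information yields an equal "mind" in a different "body".)

```python
import pickle

class Mind:
    """Hypothetical container for the information that describes a person."""
    def __init__(self, state):
        self.state = state          # memories, wiring, current thoughts

brain = Mind({"memories": ["learned to walk", "first day of school"],
              "current_thought": "am I still me?"})

# Re-instantiate the same information in a new "body": the data round-trips
# exactly, but the new object lives at a different memory address.
uploaded = pickle.loads(pickle.dumps(brain))

print(uploaded.state == brain.state)   # True  - identical information (the "mind")
print(uploaded is brain)               # False - a different object (the "body")
print(id(brain) != id(uploaded))       # True  - distinct memory addresses
```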

If you describe yourself solely as information, then there isn't any difference between "you" and an identical mind that encapsulates the same information that describes "you". Looking at it solely from a non-physical, information standpoint, it is meaningless to draw any distinction between the two, because both contain the same information. "You" are like the number pi; the number pi to 1000 places is the same no matter where or how you store it.

And again, the reason I look at it from an informational standpoint, as consciousness being a process (as opposed to your more physical standpoint of consciousness being a process run at a location), is because of the thought experiment that seems to show that there is no change in your consciousness if you replace your brain slowly (or even, instantly) with a synthetic equivalent, indicating that consciousness is solely based on information and information processing, which is not tied to any particular physical instantiation or location.

That the location and means of physical instantiation (synthetic/biological/computer simulation) are meaningless to us is even further underlined by the fact that you could theoretically fake location and physical body with a fake sensory input stream. You could have two separate instances of you that both think they are in the same location and have the same body because they are fed the same fake sensory input. Chaos should still cause them to diverge, though.

Now, if there is a supernatural soul, all that gets thrown out the window of course, because it would be impossible to reduce a soul down to information (since it is supernatural). Our conscious experience would thus be tied to our unique and unreproducible soul.


Even still, from the most basic of basic observations, each new "unique number" is still unique. If you kill the one that's been around 25 years and replace it with a sudden reincarnation - it might hold the same information, but the original "line of succession" is dead. Anyone following that "unique batch of information" since day 1 will agree that it died. It's gone. It should be given a funeral and burial service. It doesn't matter if there's a replacement standing by to give the funeral service - it's still dead.

What you're saying is akin to suggesting that if all 7 billion people on Earth were actually total replicas of the same unique brain (all clones with the exact same memory and information), we could commit genocide in the billions and really never kill anyone, because that person is "still alive."

I disagree. A death is still a death. 1 brain, 1 life.


Exactly. If you copy a brain and kill any of those two, original or copy, you're killing a person. That's why teleportation would require murder.

Even still, from the most basic of basic observations, each new "unique number" is still unique. If you kill the one that's been around 25 years and replace it with a sudden reincarnation - it might hold the same information, but the original "line of succession" is dead. Anyone following that "unique batch of information" since day 1 will agree that it died. It's gone. It should be given a funeral and burial service. It doesn't matter if there's a replacement standing by to give the funeral service - it's still dead.

What exactly is unique about that batch of information, other than the body it is in? Nothing.

Ponder these questions-

A) Brains are constantly replacing and repairing and regrowing themselves. Is your brain murdering itself every time it exchanges one piece for another, or adds a new piece or connection?

My answer:

Of course not. I do not die thousands of times a second.

What do you think?

B) If not, at what point (if at all) of exchanging one piece of brain matter for another exactly equivalent piece of brain matter does the original consciousness die?

My answer:

It never dies.

What do you think?

C) Also, again imagine that you can replace one neuron at a time with an exact synthetic equivalent, slowly over time. The brain already replaces itself constantly, this isn't really any different. The person's consciousness wouldn't notice a thing, their sense of self wouldn't notice a thing. Is that murder? If so, then why does the brain get a "free pass" to repair and rebuild itself (murder itself), while we do not?

My answer:

It is not murder if the new one is an exact equivalent. (Probably, even if it is nearly an exact equivalent, that should hold.) A very slight change isn't murder either, as when we learn new things, our brains are changing. Should I not learn anything because learning is killing myself? (Do not answer, that is obviously rhetorical.)

What do you think?

D) So instead of slowly exchanging real neurons for synthetic ones, why not speed up the process? Exchange one neuron at a time, a billion times a second? Replace the whole brain in 10 seconds, or even instantly? Is that murder?

My answer:

Saying "yes" would be drawing an arbitrary line. So no, it is not murder, not as long as the new brain is an exact or nearly exact equivalent.

What do you think?

E) What if you don't discard the real neurons after they are replaced by synthetic ones? What if you reassemble them into the original brain? Does killing this brain become murder? If so, are you obligated to NOT try to reassemble the original brain? Or is discarding any of the neurons murder?

My answer:

Clearly, we begin to run into moral issues by this point; I think we can all agree on that. If you have a separate brain, it will very quickly be thinking its own separate, unique thoughts, and by the unique-information definition, it is now a separate person, as the same set of information cannot be used to describe both persons. So perhaps you are in fact obligated not to reassemble the original brain! But I do not believe discarding individual pieces is murder. Either way, we're starting to draw arbitrary distinctions here...

What do you think?

Anyway, I'm trying to get you guys to see that you're drawing some arbitrary distinctions here, and the answer as to the morality of brain replacement in the limit of fast replacement isn't exactly clear.

I think it's clear that what makes me me is solely the information in my head. If you copy that information, that separate set of information is exactly me until it thinks a different thought from me, because the instant it thinks even a slightly different thought, this other me requires a different set of information to describe than I do. It is the uniqueness of the information that describes our conscious minds and current mental state that makes us different people.

If you exactly copied a brain, the two brains would be the exact same person up until the moment that they started thinking different thoughts. The instant that it took TWO sets of information to describe the two brains, they become different people. And this would happen almost instantly due to different sensory inputs and different rolls of the quantum mechanical dice.

What you're saying is akin to suggesting that if all 7 billion people on Earth were actually total replicas of the same unique brain (all clones with the exact same memory and information), we could commit genocide in the billions and really never kill anyone, because that person is "still alive."

No, I am NOT saying that. You don't quite get it: if you copy a brain seven billion times and then "run the clock" for any significant amount of time (really, all it would take is an instant), it will take seven billion unique sets of information to describe those seven billion minds, even if they were exactly copied from the same mind. The very microsecond after you "cloned" them, they would already be thinking slightly different thoughts, as their brains would be experiencing different sensory stimuli and be subject to different chaotic and quantum mechanical uncertainties and interactions. Each would experience the external world from their own, unique perspective. They would be seven billion separate people, and yes, killing them off would be genocide. They would experience 7 billion unique and different deaths. A terrible crime.

Understand this- the set of information that fully describes your brain also describes all your thoughts. Seven billion instances of the exact same person would, by this definition, all be thinking the exact same thoughts at the exact same time, but of course, that's impossible as mentioned above.

At the risk of sounding like I'm contradicting myself however, I think that there would be some practical limitations to the absolute rules described above-

1) Some amount of time would exist where the two different minds could think different thoughts without you considering them separate persons;

2) Some amount of tolerance should exist when copying a mind for the copy to be considered the same person, because even in some hypothetical future where we CAN effectively copy brains, it is impossible to make exact copies;

3) The replacement brain does not have to function absolutely exactly as the original brain; there is some point where it should be considered "close enough".

