
Terrifying AI possibilities


coyotesfrontier


I was reading up on Blindsight by Peter Watts, in which there exists an extinct subspecies of humanity known as the Vampires. Having evolved to prey on humans, they are able to produce sounds and behave in ways that are completely terrifying to humans, and humanity is completely unable to defend itself once they are revived.

This got me thinking. What's stopping an AI from, say, mimicking the voices and personalities of the dead relatives of those fighting against it, and easily manipulating them into their own demise? Such an attack would completely destroy them mentally and leave them completely unable to fight back. Even more fundamentally, knowing how composers of horror films are able to make simple sounds that strike fear into the audience, an AI could formulate sounds designed to instantly paralyze listeners with fear. Moving away from auditory stimuli, the AI could produce images of the fighters' loved ones being tortured, indistinguishable from the real thing, or simply produce intrinsically horrifying imagery like a typical jumpscare in a horror film. What would be an effective defense against something like this?

Happy early Halloween I guess.


1 hour ago, coyotesfrontier said:

What's stopping an AI from, say, mimicking the voices and personalities of the dead relatives of those fighting against it

I believe most bloggers and news agencies generate text which could be easily emulated: do the corresponding statistical analysis, then apply the coefficient values you find to "translate" actual news or letters as if they were being commented on by that person.

(Actually, there are already a lot of bot news aggregators like this, but I mean a deeper level.)

I also believe such bots will be an important part of the global communication system: since most humans will be learning from the same internet resources, an individual opinion will come to reflect pure statistics.
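The "statistical analysis" part of this is real enough to sketch in a few lines. One crude form of such statistics is a word-level Markov chain: count which word follows which in a person's writing, then sample from those counts. This toy (the corpus is invented, and it is nowhere near modern language models) only illustrates the principle:

```python
import random
from collections import defaultdict

# Invented stand-in for "a person's collected writing".
corpus = ("the launch was delayed again and the weather was bad "
          "and the launch was scrubbed because the weather was bad").split()

# "Statistical analysis": record which words follow each word.
transitions = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    transitions[a].append(b)

def imitate(start: str, n: int, seed: int = 0) -> str:
    """Generate n more words in the 'style' of the corpus."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(n):
        followers = transitions.get(out[-1])
        if not followers:
            break
        out.append(rng.choice(followers))
    return " ".join(out)

print(imitate("the", 8))  # statistically plausible, but meaning-free, mimicry
```

The output is locally plausible and globally meaningless, which is roughly the failure mode (and the danger) of purely statistical imitation.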

So I believe this will make the difference between real and virtual persons disappear, and the current post-monkey offline personalities will be upgraded/replaced with the AI-augmented post-monkey online personalities of future human generations.

I believe this will simply bring about a future type of society where AI (sapient, but having no will and intentions of its own) will be melded together with all human personalities, augmenting and equipping them with internet-in-the-head and being animated by their wills, emotions, and intentions.

So this looks like not a dystopia, but a utopia.

Edited by kerbiloid

We already have the technology to create videos of anyone from scratch. Videos are very hard to rebut cognitively; we all fall for them when we see them. In particular recontextualized videos: videos of an event which did happen, used to claim that another event is happening, like those videos of crocodiles or sharks in submerged locations trotted out during each and every hurricane to make people believe there are actual sharks and crocodiles lurking around the flooded town they live in. [snip]

Videos are hard to cognitively rebut, because they appeal to many of our senses, and also seem to require more work to fake than, say, a picture or text.

And deepfake videos already exist. I mean.

[snip]

So we already have expert systems capable of generating realistic videos of things that did not happen. And our brains aren't equipped to deal with that.

And we already have people exploiting those cognitive biases of ours. Who needs AI when we have people?

This is where I get lost on most "AI taking over the world" stories. What's the narrative interest? We already have people trying to take over the world; who needs AI for that?

Also, if there's an AI taking over the world, I guess some weird stickers might be used as an exploit against its cognitive systems, to gain access and execute arbitrary code on it. That is harder to do with a human brain. There's no perfection in computers, only determinism and discretization (while we are nondeterministic and analog).
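Those "weird stickers" are a real phenomenon: adversarial examples, tiny input perturbations that flip a model's decision. A minimal sketch of the idea (mirroring the fast-gradient-sign method) on a toy linear classifier standing in for a real vision model:

```python
import numpy as np

# Toy linear "classifier": score > 0 means class A, score <= 0 means class B.
rng = np.random.default_rng(0)
w = rng.normal(size=100)          # fixed model weights
x = rng.normal(size=100)          # a "clean image"
if w @ x <= 0:                    # make sure the clean image is class A
    x = -x
clean_score = w @ x

# Adversarial "sticker": step each input component against the gradient sign,
# with eps chosen just large enough to flip the decision.
eps = 1.1 * clean_score / np.abs(w).sum()
x_adv = x - eps * np.sign(w)

print(clean_score > 0)   # True:  clean image classified as A
print(w @ x_adv > 0)     # False: barely-perturbed image classified as B
print(eps)               # a tiny per-component change
```

Against a linear model the flip is guaranteed by construction (the new score is clean_score minus 1.1 times clean_score); against real deep networks the same trick works empirically, which is exactly why physical adversarial patches are studied.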

Edited by Vanamonde

2 hours ago, Okhin said:

Also, if there's an AI taking over the world, I guess some weird stickers might be used as an exploit against its cognitive systems, to gain access and execute arbitrary code on it. That is harder to do with a human brain. There's no perfection in computers, only determinism and discretization (while we are nondeterministic and analog).

This is the very reason you can't make an AI that would truly think independently. Determinism in particular is a huge obstacle to thinking, and especially to imagination, which is by nature nondeterministic. Without a breakthrough in computing, there can never be a thinking AI.

However, deterministic "AIs" are very much capable of being abused by humans, which is already happening: copyright bots screwing over people who upload perfectly legitimate material to YouTube, for instance, or Facebook's automated algorithms locking people out of their accounts. All this from people who aren't even actively malicious.

That said, you don't need AIs to do that. Ultimately, manipulated videos are just another tool for misinformation. The Soviets were airbrushing people out of photos as far back as the 1940s. The techniques are getting more sophisticated, but honestly, this isn't going to be that big of a change. All AI can do is make it cheaper, easier, and more appealing. At most, it'll lead to a complete loss of public trust in any sort of media.


@Okhin Digital fakes are a different thing; they manipulate information. That works on a conscious level: I can decide the information is false and choose not to believe it. What the OP is describing is stimulation of certain neurological pathways that cannot be ignored (or else it is ineffective), essentially a form of medusa weapon.


7 minutes ago, Dragon01 said:

Determinism in particular is a huge obstacle to thinking, and especially to imagination, which is by nature nondeterministic.

Is it? Isn't the human brain just a complex biochemical computer, with its neurons firing according to physical laws? Of course there could be more to it, something that makes us self-aware, but that's religious/philosophical speculation.


7 minutes ago, Elthy said:

Isn't the human brain just a complex biochemical computer, with its neurons firing according to physical laws?

No, it isn't, starting with the fact that there's much more to it than the neurons. We are not even beginning to understand the incredibly complex web of interactions happening in there, but we do know that it involves many kinds of cells and works completely differently from a computer. Quantum mechanics is highly likely to be involved in all this, which would by itself bring a nondeterministic element into those processes, and could also explain how we are able to do things like seeing, walking or dreaming (all firmly out of reach of deterministic computers).

Quantum computing, when it gets explored in depth, might provide some answers, and perhaps a way to make a more brain-like device, or perhaps even a real "cogitator" (that is, a machine that thinks, not just computes). However, no deterministic machine will ever be capable of doing some of the things the human brain does, if only because deterministic algorithms have distinct limitations that result from their being deterministic.


53 minutes ago, Elthy said:

Isn't the human brain just a complex biochemical computer,

Maybe it's just a network adapter, connecting the hardware to the universal net?

Never mind simple memories: how is abstract mathematics stored, and visual memories?

Then we're nettops. Or raspberries.

Edited by kerbiloid

45 minutes ago, Elthy said:

Is it? Isn't the human brain just a complex biochemical computer, with its neurons firing according to physical laws? Of course there could be more to it, something that makes us self-aware, but that's religious/philosophical speculation.

It is. But that doesn't mean all brains work alike. Not even identical twins think and behave in the same way. This is evolution at work: individuals of one species behave in a wide variety of ways and have different reactions to stimuli. Which means such a "weapon" would have limited effectiveness and would be wildly inconsistent.


9 minutes ago, kerbiloid said:

One man's fear is another one's fun.

Well said.

My sister is terrified of spiders; I catch them in my hand, carry them outside and release them in the backyard. On the other hand, I'm afraid of heights, and watch in mute terror as my sister fearlessly hangs drapes, balancing precariously on a wobbly ladder.

Unless someone comes up with a method of simultaneously projecting images of a spider and a ladder, one of us will always be able to tell that something wrong is happening to the other.


2 hours ago, radonek said:

@Okhin Digital fakes are a different thing; they manipulate information. That works on a conscious level: I can decide the information is false and choose not to believe it. What the OP is describing is stimulation of certain neurological pathways that cannot be ignored (or else it is ineffective), essentially a form of medusa weapon.

No, it does not work on a conscious level. But then, given the complexity of the brain (not necessarily even the human one), nothing really works on a conscious level. No one is immune to propaganda. You can develop some critical thinking, but that requires energy and the ability to sit back and think about a specific piece of information; you cannot do that for every bit of data your brain is stimulated by.

And I'm not sure the OP talked about the medusa weapon. Or the basilisk hack (they're the same, if I understand what a medusa attack is), which means affecting someone's brain with specific stimuli in order to trigger a specific bug in it, such as inducing a seizure in an epileptic person with strobes of light. Which has been happening, by the way, without the need for any AI: https://newatlas.com/computers/epilepsy-foundation-twitter-strobe-seizure-gifs-law/

Or if you're talking more along the lines of subliminal message delivery, I'm not sure it can work? Or not on everyone?

OK, the OP is explicitly talking about medusa attacks and basilisk hacks, my bad :p

Edited by Okhin

There is nothing more important and few things more interesting than current politics. But there is also nothing more certain to make forum members hate and attack each other than current politics, which is why we've had to rule the whole subject off-limits for our friendly little space game forum. Some comments here have been removed and we're asking folks to remember to save that kind of discussion for other places on the internet, please. 


14 hours ago, Dragon01 said:

Quantum mechanics is highly likely to be involved in all this...

No kidding, Sherlock. Obviously atoms obey quantum mechanics. :lol:

Joking. I guess you mean things like entanglement?


It wasn't a very clever joke in any case, because it shows you missed the point. I was talking about how the brain works, not what it consists of. A computer doesn't make use of QM phenomena in its logic: transistors and diodes all work based on QM phenomena, but the logic is 100% classical and deterministic. You could literally build the same thing with water pipes; this is sometimes used to explain electronics to kids.
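The water-pipes point can be made concrete: classical logic is substrate-independent and fully deterministic. A toy sketch, building XOR and a half-adder out of nothing but NAND (the standard universal gate), which would behave identically whether realized in transistors, pipes, or code:

```python
def nand(a: bool, b: bool) -> bool:
    """The one primitive; everything below is wired from it."""
    return not (a and b)

def xor(a: bool, b: bool) -> bool:
    # Classic 4-NAND construction of XOR.
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

def half_adder(a: bool, b: bool) -> tuple:
    # Adds two bits: returns (sum, carry).
    return xor(a, b), a and b

print(half_adder(True, True))   # (False, True): 1 + 1 = binary 10
```

Same inputs, same outputs, every time; that determinism is exactly the property under discussion.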

Quantum computing is something different. In a nutshell, it uses qubits, which can take not just the values 1 and 0, but also a superposition of those states. Roughly, there is a certain probability that a qubit will be measured as 1, and otherwise it will be 0, which introduces a stochastic element into the computation. It does need to give a 1 or 0 at the end of the computation, so the end result is still a computer, but it now has access to some algorithms which would take practically infinite time on a classical computer.
https://en.wikipedia.org/wiki/Quantum_computing
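The measurement behavior described above can be illustrated with a classical simulation (which, to be clear, forfeits any quantum speedup; it only shows the probabilities involved):

```python
import numpy as np

# A single qubit as two complex amplitudes for |0> and |1>.
# Equal amplitudes = equal superposition: 50/50 measurement outcomes.
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)
assert np.isclose(abs(alpha) ** 2 + abs(beta) ** 2, 1.0)  # normalization

rng = np.random.default_rng(0)
shots = 100_000
# Each "measurement" collapses the superposition to 0 or 1,
# with probabilities |alpha|^2 and |beta|^2 respectively.
outcomes = rng.random(shots) < abs(beta) ** 2

print(outcomes.mean())  # close to 0.5
```

The stochastic element is only in the measurement; the amplitudes themselves evolve deterministically, which is why "quantum = nondeterministic thinking" is at best a loose analogy.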

Now, we don't know much about the human brain, but it's likely a step beyond that. Most notably, it has the capacity to create concepts and images ex vacuo (also known as imagination), a problem which is outright impossible for a deterministic machine.


Back to the frightening future of AI -

I think one thing that a lot of futurists miss is that there's never only one of anything. If there's an AI doing deep fakes and hurting humanity, there will invariably be an AI that lives to counteract it. Much like an ecosystem: once one organism exists, another will appear to smite it. There are natural checks and balances at play with this sort of thing.

And besides: once we've become completely inundated with fake news and deep fakes, and everything that can be real can also be fake, the only thing that will change is our willingness to act on new information. Physical needs and infrastructure won't lose value: We'll still need homes, roads, food, and language. All the fluff - like history and politics - will undergo a metamorphosis that I can't even begin to imagine.


1 minute ago, WestAir said:

And besides: once we've become completely inundated with fake news and deep fakes, and everything that can be real can also be fake, the only thing that will change is our willingness to act on new information. Physical needs and infrastructure won't lose value: We'll still need homes, roads, food, and language. All the fluff - like history and politics - will undergo a metamorphosis that I can't even begin to imagine.

And yet, SF authors (generally in the cyberpunk genre) have tried to imagine it :p

For instance, take Transmetropolitan. There's no uber-AI that took over the world, but there's basically no history; people don't refer to the past because they're drenched in information (to the point that you can apparently receive and broadcast channels on your skin) (or inhale information from the air). They still have journalism and elections, though; they just don't reflect on the past. The culture is a mess, an aggregate of a lot of weird stuff that no one even tries to prove or disprove, but people seem OK with it.

There's also the RPG Cyberpunk 203X, in which there is basically an AI that randomly swaps bits of data in anything online, meaning that the longer data stays online, the higher the chance of it being corrupted. And there's a nanovirus which destroyed paper. So, again, you have no real access to reliable information; all you have is a culture war (and an all-oral culture), leading to the emergence of Alt-Cults.

The issue, when you have an AI that can create deep fakes, is not necessarily the immediate effect. You can probably disprove a fake because you still have access to facts. But twenty or forty years later, how can you disprove a video of events if all you have is footage, altered or not, and not enough context? Especially since the media (like everyone else) can be fooled. Finding what makes sense (if things make sense) is easier to do while you still have first-hand witnesses, multiple sources, and the like. But as time goes by, you lose sources (try to find video of things happening in 2007, for instance), which is normal: we did not keep every piece of paper we wrote throughout history, we kept some. If the share of deep fakes among what we keep is high enough, then, given time, they become indistinguishable from facts.

Meaning that an AI planting deep fakes around is doing damage to the next generation's past and history more than it is doing damage to us. Which opens a whole new way of messing with people on a cultural and generational level. I kind of like this idea (not that I'm a fan of the outcome), because it could plausibly be the result of multiple uncoordinated AIs working toward this kind of goal (cultural dominance) and exploiting each other's feedback loops to pump more and more nonsensical deep fakes into our information stream.
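That time argument can be made quantitative with a toy model: let genuine records decay through ordinary attrition while fakes are added at a steady rate, and watch the surviving archive tilt toward fakes. Every number here is invented purely for illustration:

```python
# Toy archive model: genuine records lost to attrition each year,
# a steady trickle of fakes added (and subject to the same attrition).
genuine, fake = 10_000.0, 0.0
loss_rate, fakes_per_year = 0.05, 100.0

shares = []
for year in range(40):
    genuine *= 1 - loss_rate
    fake = fake * (1 - loss_rate) + fakes_per_year
    shares.append(fake / (genuine + fake))

print(f"fake share after 1 year: {shares[0]:.3f}, after 40 years: {shares[-1]:.3f}")
```

With these made-up rates the fakes go from a rounding error to the majority of the archive within a few decades, even though nothing about any individual fake got more convincing.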

But then, in both settings I described earlier, people don't really care much. That's probably because they're cyberpunk settings, though, which require apathy from the masses to work as such.


2 minutes ago, WestAir said:

I think one thing that a lot of futurists miss is that there's never only one of anything.

In nature. In business, if an AI is created by a corporation to spew deep fakes in order to drum up business, how can you be sure one would be created to debunk it, as opposed to others joining in the fun with lie-spewing AIs of their own?

AIs do not spring out of the ground. They are designed and created by people. Seeing as they can't really think, they are ultimately bound to serve someone's interests. 

4 minutes ago, WestAir said:

And besides: once we've become completely inundated with fake news and deep fakes, and everything that can be real can also be fake, the only thing that will change is our willingness to act on new information. Physical needs and infrastructure won't lose value: We'll still need homes, roads, food, and language. All the fluff - like history and politics - will undergo a metamorphosis that I can't even begin to imagine.

You don't need AIs for that; this is already happening. As in, for this metamorphosis, check out how politics looks right now. This is it, basically. We're so inundated in lies, fakery and deception that many people stick to what they feel is correct and dismiss all opposing evidence as fabricated. Indeed, hyperpolarization such as we are seeing now might well lead to the breakdown of democratic systems altogether, destabilizing systems long vaunted for their stability (all while established monarchies and dictatorships sit back and laugh :)). Hopefully whoever comes out on top in the resulting mess will be sensible...

As for history, I wouldn't worry about that. Some will disagree, but I'd draw the line between the present and history at the point where politicians lose interest in politicizing the events in question. Yes, that means WWII is still "the present" for some people (WWI, however, is pretty firmly history). History will be worked out by historians, meticulously untangled and, long after it has ceased to be of any but, well, historical interest, presented in an objective manner in a scientific journal, to be spun and misinterpreted by popular history sites.


On 10/1/2020 at 11:03 AM, Dragon01 said:

In nature. In business, if an AI is created by a corporation to spew deep fakes in order to drum up business, how can you be sure one would be created to debunk it, as opposed to others joining in the fun with lie-spewing AIs of their own? [...]

Basically, you've just described humanity through the ages :) Always, always there was fake news: from a fisherman gloating about the size of the fish he caught, to village gossip, to a chronicler embellishing the deeds of a king. I think we will manage :D


4 hours ago, Scotius said:

I think we will manage :D

Oh, we will manage. I just think we'll go back to the kings and princes. :) Who, bear in mind, ruled through most of those ages, with a more or less decent record on average. Those, at least, do not need to be concerned about fake news directly impacting the way they run their countries. An autocrat can't completely disregard public opinion, but doesn't have to bend to its every whim. They also don't benefit from hyperpolarization; quite the contrary, in fact.

Democracy requires rational people (already a tall order) acting on unbiased information in order not to devolve into tyranny of the majority and/or outright ochlocracy. I'm not saying there weren't biases all over the place before the internet, but the internet gave them far more visibility. Because of that, the aforementioned requirement is fulfilled more and more poorly.


5 hours ago, Dragon01 said:

Oh, we will manage. I just think we'll go back to the kings and princes. :) [...]

I imagine the last state of human society will be one where Big Brother works in tandem with ultra-transparency. I imagine that, while cameras and Siri will remove any real privacy from the people, our future leaders will also have a 24/7 dash-cam, and everything they do will be scrutinized as harshly and unfairly as the rest of us.

I imagine Star Wars would have ended a lot differently if Palpatine had a camera on him at all times.


Yeah, he'd have a huge crowd turn out on #DeathStar II, with #Emperor and #Savethegalaxy feeds on Space Twitter and Space Instagram overflowing with vicious arguments about what's happening (and presumably a whole lot of people in support). :) This could be a future, but I don't think it'll be a democratic one. Rather, it'll be one of celebrity exhibitionism and persistent noise: either an ochlocracy, a dictatorship, or in the best case, a monarchy. Nobody will have any privacy, which would actually make creating a vapid personality cult easier. And this is just par for the course for politicians and royals alike; just look at how HM The Queen and her entire family have a personality cult far more widespread and persistent than any dictator in history (it helps that she's genuinely a really cool person). All politicians and most royals are celebrities to some degree, and for politicians in particular, the skills needed to become an internet celebrity overlap with those needed to be any good at politics. Basically, you're manipulating people into giving you their likes, be it on Facebook or in the polling booth. Palpatine was really good at this kind of thing, BTW; he'd probably be a social media darling if the Empire had social media.

Absolute transparency for absolute power is a good idea, but you need to realize just how much noise this would produce. Any talk about serious issues could be overshadowed by the next mealtime, as long as something "notable" was on the table. :) There are people doing that sort of thing right now, not 24/7, but almost. In fact, if you're famous enough, gossip media will do that to you whether you like it or not. Our only hope is that whoever ends up in charge knows what he or she is doing, because social media aren't going to help. When everyone has a voice, it's hard for anyone to actually be heard.

OTOH, one thing that social media make easier is communication and analytics. If we get someone interested in actually ruling, as opposed to someone who just wants to be in the limelight 24/7, he or she could use that to give people what they need, as opposed to what they want. Generally, people talk about the latter, and the former has to be guessed, which was for a long time a major challenge of effective governing. That's why real governments have it much harder than economic-simulation players: the former often have to operate on limited and inaccurate information.

Edited by Guest

On 10/3/2020 at 8:33 AM, Dragon01 said:

I just think we'll go back to the kings and princes

Yeah, republics are definitely unstable. *glares pointedly at Athens and Rome*

***

I honestly don't imagine AI will have much more effect on politics and the course of nations than the tech we have today. What can it do besides spout more lies, or truths no one will listen to?

On 10/1/2020 at 2:58 AM, Dragon01 said:

I was talking about how the brain works, not what it consists of.

One word: sarcasm.

On 10/3/2020 at 2:06 PM, WestAir said:

I imagine Star Wars would have ended a lot differently if Palpatine had a camera on him at all times.

The scary thing is, some humans are willing and able to convince people that anything they're doing is for the greater good. Since it is for the greater good, and since the ends surely justify the means, we the populace can ignore what is happening to this or that group of people. After all, we're going to benefit from their unfortunate circumstances. It's just Machiavelli, Mill, and Hobbes come back to haunt us. And really, they just wrote what they saw. "The ends justify the means" is an integral part of human nature; it can justify any action. It is also something I would like not to see in an AI.

Maybe making hypothetical superintelligent beings in our own image isn't such a good idea...

Edited by SOXBLOX

2 hours ago, SOXBLOX said:

Yeah, republics are definitely unstable. *glares pointedly at Athens and Rome*

Do take a closer look at them, then. Athens was what we would now call an oligarchy, with women, immigrants and the poor excluded. The former was par for the course until the 20th century, but the latter two basically ensured that the native aristocracy (a Greek term!) ran things. Admittedly, it worked pretty well for the citizens. Rome operated on a similar basis, and BTW, it had its best years as the Roman Empire. The Roman Republic lasted 500 years, which is a record for a democracy, but the Roman Empire lasted 1500 years. It had its ups and downs, but it was still around much longer than any democracy you can name.

2 hours ago, SOXBLOX said:

The scary thing is, some humans are willing and able to convince people that anything they're doing is for the greater good. [...]

Any system that doesn't account for human nature is destined to collapse; anyone who tries to deny it is a fool. The thing with the philosophies of those guys is that they work. Generally, a ruler following Machiavelli (especially one who read as far as the "don't be hated" part) will win against one who does not. This is why sociopaths generally replace idealists in positions of power: the latter don't have what it takes to hold onto it.

So, what can be done? Set up a system that ensures power ends up with competent sociopaths. It could involve AI, but it doesn't have to. Oppression breeds hatred, so such a ruler would not resort to it more than absolutely necessary; instead, bread and circuses. For most people, freedom is an illusion even in a democracy, and indeed, those too used to thinking they have many freedoms can become a liability in a crisis, because they may refuse to surrender some of them for the common good. People aren't good at figuring out what's good for them, and even worse at figuring out what's good for others (if they even care).

Edited by Guest

2 minutes ago, Dragon01 said:

Athens was what we would now call an oligarchy, with women, immigrants and the poor excluded.

This doesn't make them a monarchy. Still a pure republic.
The immigrants' lack of voting rights was obviously a deliberate choice.
The women didn't have them just because they were not financially independent, so they would have voted at their husband's command anyway.
The poor didn't have them for the same reason: unlike now, they would have voted for whoever offered a piece of bread for the day.
So, their electoral system was adjusted to their economic abilities and reality.

The same reasons remained relevant until the industrial revolution of the 19th-20th centuries and the financial emancipation of the above groups that followed from it.

I would also point to the Italian republics (Venice, etc.), which existed for about a millennium.

10 minutes ago, Dragon01 said:

This is why sociopaths generally replace idealists in positions of power.

The sociopaths are just more visible. Generally, the opportunists win: those who can give others what they expect.

