
Why I Do Not Fear AI...



On 6/8/2023 at 5:53 PM, darthgently said:

Ask the anesthesiologists how they feel about their 70+ yo patient spending even one unnecessary minute knocked out. And the entire time the whole team is getting more weary and error prone

The upside of robotic surgery is the small incisions, which reduce recovery time a lot. 
But I don't think it works well if there are more issues than they assumed. 


40 minutes ago, magnemoe said:

The upside of robotic surgery is the small incisions, which reduce recovery time a lot. 
But I don't think it works well if there are more issues than they assumed. 

Robotic surgery can be great; the real point was that as more people train almost exclusively with robots, they lack the experience to deal with situations where they have to convert the cases to open. That's the AI analogy—as people use LLMs to help their writing, for example, they might become less skilled at writing extemporaneously... Of course, it might be the exact opposite in that example. I can imagine someone who is a poor writer having GPT rewrite everything. Assuming they bother to read what GPT is doing, I suppose it would actually be teaching them to write in real time. But there will be certain subjects where people might not exercise their brains as much, and default to outsourcing that cognitive task to a machine. In most cases this will actually be fine, but it's potentially a problem in cases where humans need to be the final backup.

I suppose we've been seeing this in cockpits for a while, as pilots manage their aircraft, and only actively fly the plane a few minutes of any long-haul flight.

 


1 hour ago, tater said:

Robotic surgery can be great; the real point was that as more people train almost exclusively with robots, they lack the experience to deal with situations where they have to convert the cases to open.

Practice matters.

Spoiler

 

(Btw, what is the panel made of, and why is there no scratch on it?)


9 hours ago, darthgently said:

We are spiraling into a black hole of centralized narrative weaving and political arbitration of "truth"

https://spacenews.com/intelligence-analysts-confront-the-reality-of-deepfakes/

In Russia, a deepfake of a certain political figure whose name starts with P was aired claiming a number of different emergency legal actions had been put into place.

Someone on Twitter made the perfect statement about this:

Quote

The future is here: the end of shared reality and truth is coming. [political statement], but it will target you - us - tomorrow. We're in no way ready for this.

 

Edited by SunlitZelkova

The end of the Western folk illusion of having exclusive "free information". Eyes can lie, too, not only ears.

Just think yourself and avoid emotional injections.

"Always remember, that if your soul is flying up to the skies, that's probably somebody is trying to possess your body on ground." (c)

As we can see, the whole modern agenda is built on hysterical emotional storms, from eco-friendly millionaires whose childhood was stolen, to amateur actresses who hit a jackpot after sleeping in the proper bed and later realised that they had been abused.

Edited by kerbiloid

On 6/16/2023 at 6:53 PM, darthgently said:

We are spiraling into a black hole of centralized narrative weaving and political arbitration of "truth"

https://spacenews.com/intelligence-analysts-confront-the-reality-of-deepfakes/

That was actually the worst possible example to base this off of. The photo quite literally looked like a random plume of smoke; AI was unnecessary to create it, and... how do we even know it was?

The recent Trump-Fauci debacle (which had telltale signs of AI, like bad lettering on the White House seal) or @SunlitZelkova's example of a speech by our Darkest Overlord (Наитемнейший) are far more poignant.

On 6/17/2023 at 4:09 AM, SunlitZelkova said:

the end of shared reality and truth is coming

It's been a long time coming owing to the Memory Wars. Heck, the use of neural networks as an excuse to dismiss reality is almost outpacing their use to fabricate reality.

Spoiler


"A neural network can draw you anything"

As someone who tried to get AI to draw a tank, I can tell you this is way beyond their capability.

 

Edited by DDE

Further to the above (a literal lawn-mowing thought), the ability to produce fake content will have two-pronged consequences. It will allow members of existing opinion groups to engage in greater self-delusion; they already treat outside content with bad faith ("inoculation"). The real battle will be over the few remaining neutrals—a group that will quickly become jaded and vanishingly small.

To again use Sunlit's example, these sorts of hacked broadcasts have been a fixture since the beginning of, ahem, known events. The only new thing was the video format and the associated deepfake; indeed, the way the content matched prior psyops tripped many people's alarm bells (except, maybe, in Tyumen, which isn't close enough to the action to be inoculated).


2 hours ago, DDE said:

how do we even know it was?

+1
How do I even know that the Pentagon even exists, when I have never seen it myself, and only know about it from Fallout: Brotherhood of Steel, TV, and the internet?


On 6/16/2023 at 5:53 PM, darthgently said:

We are spiraling into a black hole of centralized narrative weaving and political arbitration of "truth"

https://spacenews.com/intelligence-analysts-confront-the-reality-of-deepfakes/

Stuff like this has been around for a long time. Staged events and photo manipulation are the most common; examples are the Bush images of him holding a book upside down or looking through binoculars with the lens caps on.
Done by amateur activists and debunked by Snopes because the original images existed. 
In short, do not trust stuff posted on FB or Twitter. 
Yes, you can now do it easier and better.


2 hours ago, magnemoe said:

Stuff like this has been around for a long time. Staged events and photo manipulation are the most common; examples are the Bush images of him holding a book upside down or looking through binoculars with the lens caps on.
Done by amateur activists and debunked by Snopes because the original images existed. 
In short, do not trust stuff posted on FB or Twitter. 
Yes, you can now do it easier and better.

This goes well beyond Photoshop.  Imagine your parent or spouse receiving a phone call.  Everything about the call tells them it is you they are talking to.  The AI deepfake on the other end knows endless details about you and your life.  It can mimic your voice well enough for a phone call.  From data mining it knows your likes, dislikes, annoyances, doubts, hopes, and where you told others you'd be, and can answer questions about your flight, the local weather, etc. accurately, using the phrasing and style you use.  The AI is spoofing your phone number on caller ID.  But in reality the AI is working for a company in competition with yours, a nation hostile to your nation, or a drug cartel that doesn't like your high-profile activism against them since you lost a family member to their activities.  And it is all automated.  No need to devote a team of people.  You don't have to be incredibly important, as the tech is cheap.  They could find out even more detailed business or state information from the call.  They could end your marriage in one conversation if they chose to.  They could leave an already depressed parent in a suicidal state if they chose to.

This isn't faking a photo.  This is faking you, or faking people influential in your life, to the point that people can trust nothing not experienced in person.

Trust is the cornerstone of civilization

 

Edited by darthgently

59 minutes ago, magnemoe said:

Stuff like this has been around for a long time.

 

On 6/16/2023 at 6:53 PM, darthgently said:

We are spiraling into a black hole of centralized narrative weaving and political arbitration of "truth"

No. I think we're actually slowly coming to grips with just the opposite. The news media has always been the delegation of fact-finding and thinking to a different person; the idealized careful consumer who read the news critically and weighed it for plausibility never really existed. The first form of such media was the town crier: a person you trusted due to cohabitation and common self-interest, and who you could, at worst, punch sense into. This didn't change much with the advent of the printing press.

It did, however, change with the rise of the telegraph, which concentrated information in the hands of the newswires. Today's two-tier system emerged, with less than half a dozen companies providing all the headlines, and a second tier of media echoing them. It's also the system that enabled the rise of modernist nation-states with the above-mentioned common political reality and news agenda. But note how this didn't set off the collapse of local newspapers—only TV did. It did so because it provided a form of parasocial relationship with the new form of street crier.

However, I think the mass media guys drew the wrong conclusions. They thought this new industrialized mass media was the product of their highfalutin standards of objectivity, et cetera, et cetera. They thought the secret sauce of Walter Cronkite was the CBS backstage, and not Walter Cronkite. They were completely taken aback when the new social media-enabled street criers turned out to garner legitimacy more easily than some faceless cubicle rat at Reuters. Indeed, consider that their preferred response to "misinformation" is deeply bureaucratic in fashion, trying to rein the situation in and put social media under censorship by Old Media.

The problem is that people still go to street criers of their choice, and they gain utility when the street crier's thought process matches their own, causing them to cover issues of concern and follow similar patterns of analysis. Otherwise you end up arguing with the TV presenter, which is entertaining, but not too pragmatic. And so people have, do, and always will look for news sources that share their worldview. It may be Contrapoints. Or it may be Maryanna "Wagner Group Bunny" Bat'kova.

Therefore, the old method of imposing a shared reality through an institutional hegemony is dead. That means a shared reality can only be based around the opinions of widely accepted authority figures. But the allure of discounting massive swathes of society as not worthy of being included in your consensus is too great; in some cases, consensus may be an unachievable task. But first and foremost, you need to recognize the problem and start making an effort at it, even if it would take a saint.

5 minutes ago, darthgently said:

This goes well beyond Photoshop.  Imagine your parent or spouse receiving a phone call.  Everything about the call tells them it is you they are talking to.  The AI deepfake on the other end knows endless details about you and your life.  It can mimic your voice well enough for a phone call.  From data mining it knows your likes, dislikes, annoyances, doubts, hopes, and where you told others you'd be, and can answer questions about your flight, the local weather, etc. accurately, using the phrasing and style you use.  The AI is spoofing your phone number on caller ID.  But in reality the AI is working for a company in competition with yours, a nation hostile to your nation, or a drug cartel that doesn't like your high-profile activism against them since you lost a family member to their activities.  And it is all automated.  No need to devote a team of people.  You don't have to be incredibly important, as the tech is cheap.  They could find out even more detailed business or state information from the call.  They could end your marriage in one conversation if they chose to.  They could leave an already depressed parent in a suicidal state if they chose to.

This isn't faking a photo.  This is faking you, or faking people influential in your life, to the point that people can trust nothing not experienced in person.

In the past, falsified letters presented a similar problem with similar outcomes. Again, not a qualitative change.


13 minutes ago, darthgently said:

This goes well beyond Photoshop.  Imagine your parent or spouse receiving a phone call.  Everything about the call tells them it is you they are talking to.  The AI deepfake on the other end knows endless details about you and your life.  It can mimic your voice well enough for a phone call.  From data mining it knows your likes, dislikes, annoyances, doubts, hopes, and where you told others you'd be, and can answer questions about your flight, the local weather, etc. accurately, using the phrasing and style you use.  The AI is spoofing your phone number on caller ID.  But in reality the AI is working for a company in competition with yours, a nation hostile to your nation, or a drug cartel that doesn't like your high-profile activism against them since you lost a family member to their activities.  And it is all automated.  No need to devote a team of people.  You don't have to be incredibly important, as the tech is cheap.  They could find out even more detailed business or state information from the call.  They could end your marriage in one conversation if they chose to.  They could leave an already depressed parent in a suicidal state if they chose to.

This isn't faking a photo.  This is faking you, or faking people influential in your life, to the point that people can trust nothing not experienced in person.

Trust is the cornerstone of civilization

Yes, that one is bad, but it could be worse: you tell your wife or parent you've been robbed and are hurt, you're at a private hospital, and they need their credit card information to treat you. No problem, it will be covered by the travel insurance—except that was in your wallet, so you don't have it. As an example, find a list of people on holiday in a place where this is semi-plausible and run the scam. 
Meanwhile people still fall for "Microsoft" calling them. I actually got a call from Microsoft about my company's Windows Azure subscription; my first reaction was that it was probably a scam, but it was real.
And since you're calling from a hospital, it wouldn't be your own phone, which was "stolen".
Now, your scenario requires that they have a decent amount of my voice and info about me in general. Which is more fun if most of what you post online is fake in the first place :) 
Mine is simpler, as it's more urgent and you are hurt. Telecoms can do a lot to prevent spoofing of phone numbers, and they will have to.
Otherwise phone calls become as questionable as social media. Hacking them is old, and it can be done much better with AI. 
 


12 minutes ago, steve9728 said:

-Hey NOMI, show me your war face!

-Sorry I don't know how to do this at the moment

It's an undocumented feature.

Spoiler


 


On 6/18/2023 at 4:34 PM, darthgently said:

This goes well beyond Photoshop.  Imagine your parent or spouse receiving a phone call.  Everything about the call tells them it is you they are talking to.  The AI deepfake on the other end knows endless details about you and your life.  It can mimic your voice well enough for a phone call.  From data mining it knows your likes, dislikes, annoyances, doubts, hopes, and where you told others you'd be, and can answer questions about your flight, the local weather, etc. accurately, using the phrasing and style you use.  The AI is spoofing your phone number on caller ID.  But in reality the AI is working for a company in competition with yours, a nation hostile to your nation, or a drug cartel that doesn't like your high-profile activism against them since you lost a family member to their activities.  And it is all automated.  No need to devote a team of people.  You don't have to be incredibly important, as the tech is cheap.  They could find out even more detailed business or state information from the call.  They could end your marriage in one conversation if they chose to.  They could leave an already depressed parent in a suicidal state if they chose to.

This isn't faking a photo.  This is faking you, or faking people influential in your life, to the point that people can trust nothing not experienced in person.

Trust is the cornerstone of civilization

 

This has already happened. Some woman in Arizona got a call from a dude saying he kidnapped her daughter. She was in tears because she clearly heard her daughter pleading for help in the background. It was only after she called her daughter that she realized everything was ok.

The more gullible are doomed. In Japan, we already have a problem known as the Ore Ore ("It's me, it's me") scam, where people pretend to be distant relatives in need of money. Old people in particular are highly vulnerable to such scams. Now imagine if they actually hear their loved ones' voices!


On 6/19/2023 at 6:56 AM, kerbiloid said:

At the time the idea made some sense; it was before archaeology. Today we have many documents from back then, like the Egyptian sources and the Dead Sea Scrolls.


4 hours ago, SunlitZelkova said:

This has already happened. Some woman in Arizona got a call from a dude saying he kidnapped her daughter. She was in tears because she clearly heard her daughter pleading for help in the background. It was only after she called her daughter that she realized everything was ok.

The more gullible are doomed. In Japan, we already have a problem known as the Ore Ore ("It's me, it's me") scam, where people pretend to be distant relatives in need of money. Old people in particular are highly vulnerable to such scams. Now imagine if they actually hear their loved ones' voices!

When I was a kid, my parents drove to pick me up from school one day to go shopping at a big mall in our city. Then, while we were all in the car, my mum got a call: a child-like voice crying out, "Mum, help me!" Then it was "Your son has been kidnapped, blah blah blah." My mum said, "Guess why I know it's a scam?" and hung up the phone. A few seconds later, the same guy called my dad and it was the same thing. My dad: "My son is in my car right now, idiot!"

My parents say that the "child's voice" they heard was very different from mine. But yes, imagine how different things could have been if the kid's voiceprint had been leaked and synthesized by AI.


34 minutes ago, tater said:

The finger-fiddling has quite a bit of uncanny valley to it. In human body language, that excessive finger-fiddling would communicate the negative impression of a pickpocket, or of an overly "touchy-feely" kind of person with boundary issues that you'd instinctively avoid leaving your children alone with. Quite the uncanny valley hurdle there. Maybe we shouldn't trust anything or anyone too far with so much excessive "nervous" energy?


This topic is now closed to further replies.