
The future of AI


boriz


8 hours ago, farmerben said:

I'm curious about the power demand of large AI systems. Everyone is a bit over-excited about nuclear-powered AI, which raises the question: why are they choosing a baseload power source over one that easily ramps up and down, like natural gas?

Economics usually dictates running your most expensive capital equipment at full power all the time. But we don't know how much new power will be demanded in 2026 or 2027, and you can already smell the next financial bubble from a mile away. If I were running an AI company and just didn't know whether it was better to expand by 20% or by 40% next year, I'd choose gas, because it can be purchased ready to go at any scale you want. I'd build in North Dakota, where as we speak they are flaring gas because there is simply no way to move it all.

Because AI datacenters will be running full tilt around the clock, storage won't cover it, so baseload is the only way. Tech companies also want to be green: if we haven't damaged the climate enough already, then a carbon-powered, ever-growing array of data centers will. I think what happened is the tech companies did the math and realized that nuclear was the only option. The sun shines during peak demand, so solar when the sun is out and gas when it's cloudy.

What you need storage for is mostly wind power, since it doesn't slot itself into the natural power demand cycle as well as solar does; wind blows when it wants to. Naturally you build wind farms in areas with reliable, persistent wind (though since climate change is the problem, those conditions may not persist). Wind turbine pylons seem to be large enough now that you could put gravity storage in each one, either weights on a pulley system or by pumping water to tanks near the top.
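A quick back-of-envelope check of that pylon gravity storage idea, using E = m·g·h. Every number below is an assumption for illustration, not a figure from any real turbine:

```python
# Hypothetical figures: a 50-tonne weight with 100 m of usable travel
# inside the tower, set against a 5 MW turbine's full output.
g = 9.81           # m/s^2, gravitational acceleration
mass = 50_000      # kg (assumed weight)
height = 100.0     # m (assumed usable lift)

energy_j = mass * g * height      # E = m*g*h
energy_kwh = energy_j / 3.6e6     # 1 kWh = 3.6 MJ

turbine_w = 5e6                   # assumed 5 MW turbine at full output
seconds_covered = energy_j / turbine_w

print(f"stored: {energy_kwh:.1f} kWh")                    # ~13.6 kWh
print(f"covers: {seconds_covered:.1f} s of full output")  # ~9.8 s
```

Even with generous assumptions, a single tower weight holds on the order of tens of kilowatt-hours, i.e. seconds of full output for a multi-megawatt machine; under these assumptions the pumped-water variant or external storage would have to carry any real load-shifting.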


You don't need storage if you pair renewables with gas, because you can throttle the gas in seconds to adjust to changes in the weather. These are the plants we have been building for at least the past 20 years. Are we implicitly admitting they are a bad investment compared to nuclear? Is it just a greenwashing campaign, where they really are using fossil fuel and just don't want anybody to notice?

We still have the forecasting problem. A nuclear plant takes several years to build, and its capacity is basically fixed for the life of the plant. But we have no idea how much energy AIs will need a few years from now. Past financial bubbles should be our guide: I predict the next five years will see massive overinvestment, followed by a correction with many bankruptcies. At the moment there is massive capital investment going in with no idea what future revenues will be.


2 hours ago, farmerben said:

Is it just a greenwashing campaign, where they really are using fossil fuel and just don't want anybody to notice?

It has been in many other cases. AFAIK the fast-response kind of gas power plants that get paired with renewable grids are somehow several times more expensive.

2 hours ago, farmerben said:

We still have the forecasting problem. A nuclear plant takes several years to build, and its capacity is basically fixed for the life of the plant. But we have no idea how much energy AIs will need a few years from now. Past financial bubbles should be our guide: I predict the next five years will see massive overinvestment, followed by a correction with many bankruptcies.

You're preaching to the choir. DeepSeek already caused a mild stock market panic by suggesting the leaders in AI are rather bloated (even if it's just a distillation). There is a clear assumption that later AI developments will be based on transformer neural nets, will be compatible with all the hardware that's being acquired, and will keep demanding more and more power and compute.

It's madness when you take a step back away from it.


8 hours ago, farmerben said:

You don't need storage if you pair renewables with gas, because you can throttle the gas in seconds to adjust to changes in the weather. These are the plants we have been building for at least the past 20 years. Are we implicitly admitting they are a bad investment compared to nuclear? Is it just a greenwashing campaign, where they really are using fossil fuel and just don't want anybody to notice?

We still have the forecasting problem. A nuclear plant takes several years to build, and its capacity is basically fixed for the life of the plant. But we have no idea how much energy AIs will need a few years from now. Past financial bubbles should be our guide: I predict the next five years will see massive overinvestment, followed by a correction with many bankruptcies. At the moment there is massive capital investment going in with no idea what future revenues will be.

Storage is nice to have; it has the same fast-throttle capability that gas has. I don't think it scales well, though, especially using current battery storage. It serves the same purpose as a smoothing capacitor in a switch-mode power supply (scaled up to grid size), and you always want smooth power. If it can buy you enough time to adjust your slow-throttling plants, then it is a useful management tool.
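The smoothing-capacitor analogy can be sketched as a toy grid simulation. The plant sizes, ramp limit, and storage capacity here are all made-up numbers, purely to illustrate the idea:

```python
import random

random.seed(42)
demand = 100.0        # MW, held constant for simplicity
slow_plant = 70.0     # MW; assume it can only ramp 1 MW per minute
battery = 50.0        # MWh of storage remaining (assumed)

for minute in range(200):
    wind = 30.0 + random.uniform(-10.0, 10.0)   # noisy renewable output
    shortfall = demand - (slow_plant + wind)    # MW the grid is missing
    battery -= shortfall / 60.0                 # storage covers the gap
    # the slow plant ramps (at most 1 MW/min) toward closing the gap
    slow_plant += max(-1.0, min(1.0, shortfall))

print(f"battery level after 200 min: {battery:.1f} MWh")
```

The storage only ever absorbs the minute-scale noise, so its level barely drifts, while the slow plant tracks the average demand; that is the "buying time for slow-throttling plants" role described above.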

As for nuclear, it can be done: China has doubled its nuclear generation capacity in about 8 years, roughly the time it takes us to commission a single plant. Current baseload is below 100% green because you are burning gas to make up where renewables are weak, and you want to get that percentage as close to 100% as possible. Renewables backed by gas is probably not 100%; I'd even doubt that it's 80%. Nuclear is as close to 100% as you get, and it uses a lot less land than renewables and storage do.


For a start, no matter what they say, what we have now is not Artificial Intelligence; it is a Complex Algorithm, a C.A. that requires a Hoover Dam and a nuclear power station to run. It is not smarter than a 5th grader, it is just faster. Take away its library repositories and it is useless.

I am not a programmer so I come at this from a social point of view.

The companies that are heavily pushing for this are doing so because human beings are obsessed with having slaves and making huge profits.

So-called A.I should not be allowed on the open web, only in closed research systems where it can speed up the modeling of experiments. Take a look at all the things the companies promote as uses for the average person: most involve faking something or discouraging people from working things out for themselves.

I can just see a future generation, so used to asking their A.I how to do everything, that if you took the internet away they would be the Eloi from H. G. Wells's book "The Time Machine".

Sadly, humans are quick to let the Genie out but useless at putting it back in the bottle, so I think we are stuck with things getting even less empathetic.


7 hours ago, ColdJ said:

Take away its library repositories and it is useless.

Depending on what you mean, it's either impossible or wrong. The neural net itself is the sum total of the training dataset subjected to an aggressive form of compression. The GPTs simply rebuild the relevant bits much like decompression algorithms do. More elaborate ones, like phind.com, use an Internet search to augment the original net.
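That compression framing can be illustrated with the smallest possible "language model": a character bigram table. This is obviously a crude stand-in for a transformer, but it shows the principle being claimed: training stores only statistics of the corpus, and generation reconstructs plausible text from those statistics alone.

```python
import random
from collections import defaultdict

corpus = "the model is the sum total of the text it was trained on"

# "Training": compress the corpus down to a table of which character
# follows which. This table is the model's only memory of the data.
followers = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    followers[a].append(b)

# "Generation": walk the table, rebuilding corpus-like text.
random.seed(0)
ch = "t"
out = ch
for _ in range(20):
    ch = random.choice(followers.get(ch, [" "]))
    out += ch
print(out)
```

The output is gibberish that nonetheless has the letter statistics of the training text, which is the sense in which a (vastly larger) GPT "rebuilds the relevant bits" rather than looking anything up in a stored library.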

7 hours ago, ColdJ said:

I can just see a future generation, so used to asking their A.I how to do everything, that if you took the internet away they would be the Eloi from H.G Wells's book "The Time Machine".

I find this particular thesis to heavily resemble arguments against... all sorts of automation, mechanization, and even division of labor. Seriously, why do we let people go to supermarkets when they should know how to hunt on their own with only the tools they're able to make for themselves?

The problem, as you would likely put forward, is that AI automated cognition and thought instead of labor. But we already do that on a whole multitude of other levels! How is an AI worse than adopting a received wisdom from a book of questionable quality (that is, the majority of them)? If anything, AI seems to be an attempt to respond to the maladies of current society: an information aggregator to skim over impossibly many web search returns, a writing assistant to help a generation that was already quite hopeless at writing, a virtual companion for the age of hyperconnected loneliness.

I know what you're getting at. But when children outsource homework to ChatGPT, the AI is not the problem, and those who put it out into the wild aren't at fault.

Edited by DDE
Accidental double-post

11 hours ago, ColdJ said:

The companies that are heavily pushing for this are doing so because human beings are obsessed with having slaves and making huge profits.

So-called A.I should not be allowed on the open web, only in closed research systems where it can speed up the modeling of experiments. Take a look at all the things the companies promote as uses for the average person: most involve faking something or discouraging people from working things out for themselves.

I can just see a future generation, so used to asking their A.I how to do everything, that if you took the internet away they would be the Eloi from H. G. Wells's book "The Time Machine".

To be fair though, this is already happening. No one really does anything on their own. We are told we have to do this at the office, we are told we have to go to the office, we are told we have to do such and such here and there. The structure of society was actually "done" a very long time ago by the people who first did it, and everyone has just been "copying" what they did ever since rather than doing their own thing.

Today's AI is just a natural development of that behavior. People are leaping from "I'm gonna do these activities for the rest of my life that someone else came up with" to "I'm gonna let someone (something) else do the activities in my life that someone else came up with."

The Eloi are already here, and they are the humans of today. The number of things in the world that people say "are natural" or "have always been that way" despite us now knowing that we in fact don't know about the majority of human history is evidence of that to me.

On the contrary, if the Internet or whatever other technology were taken away, I don't think it would make them helpless. It would make them helpless if you hold them to the standard of late 20th century human behavior in the developed world, but as humans, with no assumptions or expectations... they'll be fine. People don't literally need the Home Depot to function and survive, so to speak.

Nor do they need prior knowledge accumulated by past generations to succeed per se. A lot of evidence is being found that the Neolithic Revolution was not really a revolution at all, but people deciding to put knowledge they had already had, both culturally and individually, for a long time prior into action (people had already "discovered" agriculture long before, but chose not to practice it). Office buildings, cars, books, and spaceships are not what make us human: they're conscious activities individual people choose to do, and just as one can choose to sort of "play" with all of those things, so too can people choose not to. But that's fine, in the way that it can't be said that 9-to-5 at the office is somehow a more "correct" way of life than that of the !Kung San, who spend much of their day more or less lounging around.

Of course if we are talking about hard physical problems... like, without the food production and distribution system of 2010, local environments can't support the massive populations that exist today and they will all die... There's gonna be a lot of hurt there. When it comes to such a conundrum, I'm often reminded of a sort of parable from my 6th grade science teacher: "If the foxes die, then there will be too many rabbits and they will eat all of the berries, and when the berries are all gone they will all die."

But the issue with posing that as a problem is that death is not an inherently negative thing. People choose to think it is negative, just as they choose between either the 9-to-5 or doing nothing at their parents' house. But this choice is a personal choice, not one that the entire species as a whole must somehow make.

Some say this is because humans are "evil," and in the past I toyed with the idea that humans aren't really conscious. They say they are, much like how any of today's AI might somehow be programmed to say they are self-aware, but in reality they aren't.

Now, I believe this is more so because cultural conceptions about lofty subjects like life and death have not evolved to take into account the discovery of distant species of other animals, and the realization that they went extinct. Quite literally, people who found dinosaur fossils in China thought they were dragon bones; dragons, of course, being very real parts of the finite world. I often imagine a dramatized version of the discovery of the first dinosaur skull taking place in the 1600s, in which crusty miners unearth an astonishing "dragon skull," oblivious that the Earth is in fact much older than 6,000 years and that different animals were once at the apex of the world.

Virtually everything humans do, from philosophy to law to morality to subsistence, is based on the underlying assumption that the world will always be the way it is and can never "run out," and that even if it does, we (also commonly: a deity) will make a new home for everyone in the sky.

Even the most self-proclaimed adherents to secular thought, who proclaim to reject superstitions of the past, rely on conceptions about reality and society birthed long before modern science came about. No one really believes humanity can go extinct, because apart from seeing the fossils and talking about them, we still use thought processes developed based on the assumption that nothing can go extinct.

As I said, there's choice in all of this: it's up to the individual to decide whether all this is really bad or okay (or even good). Me personally, I just enjoy the pictures on the Golden Record like living paleoart pieces, and like observing all of the animals around me before they become impressions in the dirt. What I'd give to be able to see J. Monesi; might as well take the opportunity with Holocene species while I have it.

On another note, all of this is why I have toyed with another theory: that the so-called "advanced" state of humanity over all other forms of life is actually a defect. Alien paleontologists will treat the past 10,000 years of human history like the apparent drop in dinosaur biodiversity observed prior to the K-Pg extinction event. In fact, that's just what it is: for the few million years prior, humans (and I'm using the term human with the same generality as gorilla, not specifically sapiens) varied in shape and size (relatively). But the competitors are gone, and now the rabbits feast on the berries unopposed. I still think it will be thousands of years before the rabbits and berries both disappear, though, rather than the dramatic 2100 deadline Hollywood has used in recent years.


@DDE@SunlitZelkova

My takeaway from both of you is "we have already been stuffing up the general population for so long, what does it matter if we accelerate and stuff it up even more."

I will try to make this brief since if both sides of the discussion can't be swayed then what I write next is probably just me saying it to the universe.

8 hours ago, DDE said:

I find this particular thesis to heavily resemble arguments against... all sorts of automation, mechanization

At some level this is true: human beings need an outlet and need to feel they are achieving something in order to have a sense of happiness. We play video games in order to feel like we can achieve something in a situation where the rules don't keep changing. Many of us can rule a galactic empire in a game because the rules are fixed. In life, every time you try to play by the rules society claims are fixed, they change them or add little loopholes.

If we hadn't started replacing human jobs with automation, then we wouldn't need all these other technologies to keep us occupied. Inventions were meant to make our jobs easier, not replace us.

8 hours ago, DDE said:

How is an AI worse than adopting a received wisdom from a book of questionable quality (that is, the majority of them)?

Because with a book you think out what it says and see if it fits with what you have learned about the world through personal experience. You can even attempt what has been written to prove if it is true.

When newer generations rely on A.I to answer everything, they make the assumption that it is correct. If you or I look up something, we read many different articles and then use our intelligence to find a pattern that makes the most sense.

When an A.I shows you pictures of the old British Empire from 300 years ago and everybody in the pics has dark skin, you realise that it is not correct, because you learned about history in your schooling.

If future generations rely on A.I for all their answers, then it would be very easy to change history and they would never know.

@SunlitZelkova

I read all you wrote and I am on the same page for most of it. I have always enjoyed your posts on philosophical matters, in a number of threads.

Sadly, future generations will probably not be able to muse and make the sort of connections you do, thanks to A.I

 

Sorry all. That was as brief as I could get.

 


52 minutes ago, ColdJ said:

Because with a book you think out what it says and see if it fits with what you have learned about the world through personal experience. You can even attempt what has been written to prove if it is true.

There are whole genres of books that tell you your thinking is wrong, your experiences are defective, misleading, perhaps even engineered, and must not be trusted, and that we the smart and enlightened ones will tell you exactly what to do, and whose face to crush with your jackboot, forever. Once again, you're creating a false dichotomy.

People have been rewriting history quite swimmingly without AI.


4 hours ago, ColdJ said:

My takeaway from both of you is "we have already been stuffing up the general population for so long, what does it matter if we accelerate and stuff it up even more."

I will try to make this brief since if both sides of the discussion can't be swayed then what I write next is probably just me saying it to the universe.

One thing I’d just say is that I am not really saying “we” have been doing such things.

Individuals are just doing individual things. Some agree here and some disagree here. I wrote over in a thread in the Lounge that I think “humanity” is a pretty loony concept, and that anyone can only make truly accurate statements about our immediate surroundings.

People are free to do as they wish, as much as one subset of quantum theorists and philosophers might say otherwise.

I think trying to control people and prevent them from doing things is just as messy as “letting” them do whatever they want. But given such a hard choice, I prefer letting them run wild. Even if it does lead to a bad ending.

4 hours ago, ColdJ said:

Sadly, future generations will probably not be able to muse and make the sort of connections you do, thanks to A.I

I’d also like to note that I think you’re right about that. There will be no digital Nag Hammadi, IMO.

More unfortunately, we can already see the effect you predict for the future in the present day.

Not only are there all sorts of cool ideas that indigenous populations most certainly had but which were lost due to their preference for purely oral transmission, but it is almost certain that distant ancestors of all humans had their own unique conceptions about reality, morality, etc., with not only cultural differences but individual ones too.

There is no way we will ever know about these things. Future anthropologists will probably look at the rotten masses of discarded gaming PCs and think about all those who lived just prior to the “Digital Age Collapse” the same way I think about the Paleolithic.


3 hours ago, DDE said:

There are whole genres of books that tell you your thinking is wrong, your experiences are defective, misleading, perhaps even engineered, and must not be trusted, and that we the smart and enlightened ones will tell you exactly what to do, and whose face to crush with your jackboot, forever. Once again, you're creating a false dichotomy.

And we can read them all and come to our own conclusions. I personally don't believe something just because I read it somewhere; I look to as many sources as possible and then form my opinion by what seems to make sense, based on my experience. I certainly won't attack or oppress people because somebody tells me to. Race, colour, gender or country are meaningless to me; I only care how an individual treats their fellow human beings. My country has an epidemic of teens with no empathy, breaking into houses, stealing cars and causing massive damage, just because they feel like it, because they have become disconnected from humanity through modern social media. The type of A.I that is being promoted just makes it worse.

I am not presenting multiple opposing views, I am expressing an encompassing one.

The fact you can think for yourself and present your opinions is because you were brought up before A.I

The way we are going, there will be generations without that ability.

 

10 minutes ago, SunlitZelkova said:

People are free to do as they wish, as much as one subset of quantum theorists and philosophers might say otherwise.

I think trying to control people and prevent them from doing things is just as messy as “letting” them do whatever they want. But given such a hard choice, I prefer letting them run wild. Even if it does lead to a bad ending.

My country didn't have a Vape problem and cigarette smoking was on the decline.

Vapes were introduced to my country with basically no regulation.

My country now has a massive teen vaping problem.

Young people will take things up with very little thought for long term consequences.

Nanny state is bad, but checking the risks thoroughly before making stuff available to the whole world is just common sense.


39 minutes ago, ColdJ said:

 

The fact you can think for yourself and present your opinions is because you were brought up before A.I

The way we are going, there will be generations without that ability.

 

I've seen older people who could only read a teleprompter and couldn't manage to present real opinions in real time. You probably know who I'm talking about.


10 hours ago, farmerben said:

I've seen older people who could only read a teleprompter and couldn't manage to present real opinions in real time. You probably know who I'm talking about.

You wouldn't expect me of all people to disagree, but when it came to foreign policy and arms control, this guy was very consistent in his positions for over half a century.

I'd like to put forward my own grandfather, whose entire career in education was him rattling off from notes with other people's ideas. Same age, polar opposite politics, of the hammer and sickle and Mauser variety.

(Whereas his mother actually took kulaks behind the barn)

