
For Questions That Don't Merit Their Own Thread


Skyler4856


10 minutes ago, Entropian said:

Thanks to this, I've discovered I'm not American.  (I hardly ever smile)

Anything learned can be unlearned - and vice-versa.

When I came out of Boot Camp I was stone faced.  Tell me the funniest story, and all you would get is "Ooh-rah.  Good to go."

Freaked

                    my

                             friends

                                                out.

Edited by JoeSchmuckatelli

How does software optimization actually work? I played a game back in 2019 on my laptop and it was laggy as hell, with stutters and a choppy framerate that made it nearly unplayable even at the lowest settings. But two years later a definitive edition came out, and I tested it on the same laptop. Same game, same laptop, but it's far smoother than the 2019 version, even at medium settings. How does the optimization work?


1 hour ago, ARS said:

How does software optimization actually work? I played a game back in 2019 on my laptop and it was laggy as hell, with stutters and a choppy framerate that made it nearly unplayable even at the lowest settings. But two years later a definitive edition came out, and I tested it on the same laptop. Same game, same laptop, but it's far smoother than the 2019 version, even at medium settings. How does the optimization work?

Oh, man, layers upon layers. Optimization is a huge topic. Very broadly speaking, if we limit it to games, you are looking at one of the following categories.

1. Algorithm improvements. Sometimes you can just rearrange how operations are performed and make the code faster. This can mean literally reducing the number of instructions that have to be executed, reducing time lost to memory latency, or reducing time spent interacting with the OS or hardware. Usually, you don't see big wins like this after the game is released, but every once in a while there's a stupid mistake that somebody missed that makes a huge difference.
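A toy sketch of that first category, in Python for brevity (the function and data names here are invented, not from any real game): the same answer computed two ways, where swapping a list scan for a set lookup changes the asymptotic cost without changing the result.

```python
# Hypothetical example: find which entity IDs are currently visible.
# Both functions return the same answer; only the data structure differs.

def visible_entities_slow(entities, visible_list):
    # 'e in visible_list' scans the whole list each time: O(n*m)
    return [e for e in entities if e in visible_list]

def visible_entities_fast(entities, visible_set):
    # 'e in visible_set' is a hash lookup: O(n) overall
    return [e for e in entities if e in visible_set]

entities = list(range(2000))
visible = set(range(0, 2000, 2))

# Same result, very different cost as the counts grow.
assert visible_entities_fast(entities, visible) == \
       visible_entities_slow(entities, list(visible))
```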

2. Threading optimizations. Sort of related to the above, but when you are working with multiple threads running in parallel, you sometimes lose time to threads having to wait for each other. So by simply rearranging when operations are performed, you can sometimes get huge performance wins. Again, you usually get that out of the way before the game is released, but sometimes improvements like that can come in after release. A particular case is code that was originally optimized for a very specific core count (*cough*consoles*cough*) but later re-optimized to cover a broader range of possible CPUs.
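One minimal illustration of "threads waiting on each other" (again a made-up sketch, not code from any game): shrinking a critical section so each thread does its independent work lock-free and only takes the lock once to merge results.

```python
import threading

# Shared tally guarded by a lock. Both versions produce the same
# total; the second holds the lock far less often.

total = 0
lock = threading.Lock()

def tally_contended(items):
    global total
    for x in items:
        with lock:          # lock taken once per item: heavy contention
            total += x

def tally_batched(items):
    global total
    partial = sum(items)    # independent work done without the lock
    with lock:              # lock taken once per thread: brief merge
        total += partial

threads = [threading.Thread(target=tally_batched, args=(range(1000),))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

assert total == 4 * sum(range(1000))
```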

3. Removing unnecessary code. Things like writing logs can really slow performance down, and sometimes that's accidentally left in the final game. Finding and removing that stuff helps and it's more common than you'd think.
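A hypothetical example of that leftover-logging cost: with debug output disabled, an eager f-string is still formatted on every call, while the lazy %-style form defers formatting until the level check passes. Removing the line entirely is cheaper still.

```python
import io
import logging

# "Release" config: debug output suppressed, warnings and up captured.
stream = io.StringIO()
log = logging.getLogger("game")
log.addHandler(logging.StreamHandler(stream))
log.setLevel(logging.WARNING)

def update_frame(positions):
    for i, p in enumerate(positions):
        log.debug(f"entity {i} at {p}")      # string built, then thrown away
        log.debug("entity %d at %s", i, p)   # formatting skipped entirely

update_frame([(0, 0), (3, 4)])
assert stream.getvalue() == ""   # nothing printed, but the f-strings cost time
```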

4. Engine/Library/Driver improvements. Especially if you're using a 3rd-party engine like Unreal or Unity: just because you're done working on the game doesn't mean they're done improving the engine. Sometimes it makes sense to switch to a new version of the engine, and sometimes it runs a lot better. (Also sometimes worse, but that's what you get for relying on 3rd-party software.) Likewise, an update to something like a graphics driver might fix something your game has been relying on, in which case it's a welcome surprise of better performance. It's rare, but it happens.

5. Hardware improvements. Just because your hardware didn't change, doesn't mean the code wasn't updated to make better use of hardware improvements you already have. This could be done with an explicit change to the game code or be picked up as part of the engine or library updates as in the previous section. In either case, you end up with your hardware better utilized giving you better performance.

6. Code optimization. If the computer executed code exactly the way the programmer wrote it, things would run at least ten times slower than they do. With a modern compiler, code is first converted into some sort of internal representation, with the compiler removing anything that's found to be unnecessary and simplifying some loops and function calls. Then the representation is converted into machine code for the particular target architecture, and the compiler removes redundancies, shifts things around to make better use of registers, and may even rearrange the order of instructions to make them fit better into the pipeline. When the CPU executes instructions, it will also convert them into micro-ops and potentially re-arrange them to improve execution. Now, the programmers have very little control over any of that, if any. But updates to the compiler and associated libraries can result in better code from simply recompiling the project. Likewise, the way your CPU converts instructions into microcode is subject to firmware updates. Some of the optimizations also have to be enabled, and again, you'd be surprised how often games ship with some of the code unoptimized. Obviously, if it was a global disable, somebody would notice, but a few unoptimized functions in a core loop can really slow the game down.
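You can see a sliver of this even in an interpreted language. CPython's bytecode compiler does constant folding: the multiplication below is computed once at compile time, not on every call, and the `dis` module lets you confirm it.

```python
import dis

def seconds_per_day():
    return 24 * 60 * 60   # folded to 86400 when the function is compiled

instructions = list(dis.get_instructions(seconds_per_day))
# Depending on the Python version the folded value lands in a
# LOAD_CONST or RETURN_CONST instruction; either way, no MULTIPLY runs.
folded = [i.argval for i in instructions
          if i.opname in ("LOAD_CONST", "RETURN_CONST")]
assert 86400 in folded
```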

There are tools that let you examine what's going on with the code. We can look at time spent on specific function calls, how busy individual cores on the CPU are, when various calls to OS and hardware are made, how much time has been spent waiting for other resources, and so on. But it's still a lot all at once, so learning how to improve your own code and how to look for problems in code handled by someone else is a huge part of being a games programmer.
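As a minimal sketch of that workflow, Python ships a profiler that reports time spent per function call. The `physics`/`render` names below are invented stand-ins for real game systems; real engines use native tools like VTune or platform profilers, but the idea is the same.

```python
import cProfile
import io
import pstats

def physics(n):
    return sum(i * i for i in range(n))

def render(n):
    return [i % 255 for i in range(n)]

def frame():
    physics(20_000)
    render(20_000)

# Profile one simulated frame and print the hottest functions.
buf = io.StringIO()
prof = cProfile.Profile()
prof.enable()
frame()
prof.disable()
pstats.Stats(prof, stream=buf).sort_stats("cumulative").print_stats(10)

report = buf.getvalue()
assert "physics" in report and "render" in report   # per-function timings
```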


4 hours ago, K^2 said:

Oh, man, layers upon layers. Optimization is a huge topic. Very broadly speaking, if we limit it to games, you are looking at one of the following categories...

I know that setting a system to high performance (the usual setting for playing high-end games) demands more power. Does that mean supplying more power can compensate for low-end hardware? (Within certain limits, of course.)


19 minutes ago, ARS said:

I know that setting a system to high performance (the usual setting for playing high-end games) demands more power. Does that mean supplying more power can compensate for low-end hardware? (Within certain limits, of course.)

Not power in the sense of electricity so much as computational processing power... So CPU / RAM / GPU is the usual trifecta. Better system (especially NOT a laptop) = better gaming experience.

Net code improvements have greatly increased gaming enjoyment for many, as have improved ISP internet 'speeds' - but just as those went up, people started trying to play games over Wi-Fi, which can be problematic. A wired connection is often a must for competitive gaming.

'Lag' is rarely a product of the game's underlying code (i.e. texture and rendering improvements won't stop lag) - but net code, ISP choice, use of a VPN, the backbone network, and the quality, location, and number of servers are all big players in the lag issue.

Edited by JoeSchmuckatelli

@ARS - let me add something... you are asking questions that threaten to drag you down into the exciting and frustrating and addictive hobby of system building.

Should you be brave / foolhardy enough to venture there - allow me to recommend [H]ardforum.  Good place with really knowledgeable people who've been building computers since the floppy drive days.

I'm one of them: a guy who really enjoys building my own rig... but I'm also not one of the extreme over-clocking types.  If you want to get into that: they're there in spades.

...

So... allow me to toss out a prejudicial thought: laptops are not gaming computers.

(I write this as a guy who played DOOM - the first one, which came on 7 floppy disks - on a borrowed laptop before I could get a real computer.) Even Alienware / fill-in-the-blank 'gaming' laptops are gimped compared to what you can get for the same money building a desktop. I know that sometimes a person's financial state isn't such that they can dive into an expensive hobby... but just DON'T think of any laptop as a gaming machine. The first desktop you build is always the most expensive - because after that you salvage parts to use in new builds.

I should warn you that you need to have the ability to turn a Phillips Head Screwdriver to the right before even thinking about building your own system.  You should also have some familiarity with reading directions and understanding technical terms and jargon.  Once you've developed these abilities, you can build your own computer.

Another caution: the Covid logistics crisis makes building at this time very expensive - way more so than normal. But in normal times you can usually build a system for $800 that will do everything better than a $2,000 laptop.

 

Edited by JoeSchmuckatelli

49 minutes ago, JoeSchmuckatelli said:

@ARS - let me add something... you are asking questions that threaten to drag you down into the exciting and frustrating and addictive hobby of system building.

Should you be brave / foolhardy enough to venture there - allow me to recommend [H]ardforum.  Good place with really knowledgeable people who've been building computers since the floppy drive days.

[lots of wisdom deleted]

Another caution: the Covid logistics crisis makes building at this time very expensive - way moreso than normal.  But in normal times you can usually build a system for $800 that will do everything better than a $2,000 laptop.

Covid logistics are mostly behind us, but the GPU will still cost an arm and a leg.  I'd even recommend pricing out pre-builts compared to a part list that includes a GPU.  You could still probably build an $800 PC  with an AMD 5600G (graphics included) and beat said $2k laptop at *everything*.

One thing I would add is to know your goals before you make any purchases. Expect to be told that you need something like 1440p resolution and a 144Hz refresh rate. That refresh rate is higher than what half the people playing can actually distinguish in a blind test, so if you're even planning on such a thing, I'd strongly recommend first checking how you like 60Hz, 120Hz, and so on. Doubling or tripling the number of times the computer is asked to draw the screen is *expensive* (more so now), and often can't even be seen.
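The refresh-rate tradeoff above, reduced to arithmetic: the frame budget is the time the whole game (CPU and GPU together) has to produce each frame, and it shrinks fast as the refresh rate climbs.

```python
# Frame budget in milliseconds for a given refresh rate.
def frame_budget_ms(refresh_hz):
    return 1000.0 / refresh_hz

# 60 Hz leaves ~16.7 ms per frame; 144 Hz leaves only ~6.9 ms -
# less than half the time to do the same work.
assert round(frame_budget_ms(60), 1) == 16.7
assert round(frame_budget_ms(120), 1) == 8.3
assert round(frame_budget_ms(144), 1) == 6.9
```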

Another thing that gets asked over and over at pcpartpicker is which CPU goes with which GPU.  IT DOESN'T MATTER.  If you buy a GPU that does 2560x1440@144Hz, then obviously you will want a monitor that can do 2560x1440@144Hz.  And as far as the CPU is concerned, the only important bit is the @144Hz bit.  That means you'll probably want a much higher CPU clock than usual (although the number of cores depends entirely on the game involved.  Adding more cores to most games won't get you the 144Hz), but the resolution is completely irrelevant to the CPU.

And finally, you can throw out pretty much everything you hear about "gaming" when it comes to KSP.  KSP depends entirely on the CPU (pretty sure even Intel integrated graphics will keep up, until you add fancy shading mods) and even then probably only a few threads.  Don't expect a fancy  GPU to help at all in KSP.


4 minutes ago, Admiral Fluffy said:

What are some of the problems associated with mining on the moon?

1. It's far away and requires a lot of delta-V.

2. Sticky and abrasive regolith.

3. Lack of water for washing, cooling, and use as a solvent.

4. Lack of life-support supplies.

5. Lack of easily available ores, as most lunar minerals are refractory oxides.


9 hours ago, ARS said:

I know that setting a system to high performance (the usual setting for playing high-end games) demands more power. Does that mean supplying more power can compensate for low-end hardware? (Within certain limits, of course.)

Yes, a little. The actual semiconductor physics of it is complex - I spent a year on a condensed matter track before switching to particle theory, and most of it is still bizarre to me - but in terms of practical consequences, what you have control over is clock frequency, voltage, and temperature of the die. The thing you actually care about for performance is the clock frequency. The number of instructions performed by the CPU, outside any time wasted waiting on memory or whatever, is a multiple of the CPU clock frequency. (You can also overclock memory and the GPU, with their own limitations and caveats.) The problem is that there is a limit to how fast transistor states can switch, so if you push the clock speed too high, you end up with errors. You can compensate for that by increasing the voltage applied - kind of the equivalent of throwing switches harder to get them to move to a new position faster - but there's a limit to that before you risk physical damage to the circuits. (Again, not unlike physical switches.) Finally, cooling the die helps to reduce thermal bounce, which both allows you to go a bit higher on the voltage and helps a given voltage settle the transistors faster. So if you can cool the die more, you technically can get away with even higher clock speeds, which is why you see all the records set with liquid nitrogen setups. That really only helps you squeeze out the last drops of performance, though, so just a good cooling setup is adequate for most uses.

But where it comes back to power consumption is that both increasing the voltage and increasing the clock speed increase the amount of power consumed by the chip. And 100% of the power consumed becomes heat, so even if you aren't trying to drop the temperature down, overclocking usually requires a good heat-management setup just to prevent overheating. And laptops are disadvantaged on both counts. You have a limited amount of energy available to power the CPU - certainly on batteries, but even when plugged into a wall, that portable power supply can only do so much - and you are limited on thermals as well: a laptop just doesn't have room for a good cooling setup. Because of that, laptop CPUs are usually designed to be more power-efficient and less performant, but they do usually come with multiple power modes. You need very little CPU power to browse the internet, and being able to throttle down is important for conserving battery power. What usually happens is that the clock speeds go down, voltage might adjust down as well, and some cores might become disabled to make the batteries last as long as possible, only kicking into high gear when you're playing a game or something.
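As a back-of-envelope version of that tradeoff: dynamic CMOS power scales roughly as P ~ C·V²·f. The capacitance constant and voltage figures below are illustrative, not real chip numbers, but they show why a modest clock bump costs disproportionate heat.

```python
# Rough dynamic-power model: P ~ C * V^2 * f.
def dynamic_power(cap, volts, freq_ghz):
    return cap * volts ** 2 * freq_ghz

stock = dynamic_power(1.0, 1.20, 4.0)        # baseline clock and voltage
overclocked = dynamic_power(1.0, 1.30, 4.5)  # +12.5% clock needs a voltage bump

ratio = overclocked / stock
assert 1.30 < ratio < 1.35   # ~32% more heat for ~12% more clock
```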

Even with a laptop, the manufacturers will usually stay on the safe side of keeping the thing reliable, so you can usually squeeze a bit more performance out of the system, especially if you're plugged into the wall. Whether it's easy to configure will depend on the motherboard and CPU used. I have been able to overclock some laptops a little bit to make games run better, but there's usually not a lot of room to work with before you start running into thermal throttling - that is, the CPU detecting that it's overheating and starting to drop the clock speeds. (If it continues, it may shut down entirely.) So if you want to tinker with performance, you really ought to be doing that on a custom-built desktop actually designed to take the punishment.


17 hours ago, wumpus said:

Covid logistics are mostly behind us

I agree with everything else, but not this. Shelves are empty. We haven't enough truck drivers. Modern warehouse jobs suck, and warehouses can't find employees to fill vacancies - or keep them around long enough to become skilled. Longshoremen can't empty the ships in the harbors, much less those parked outside the harbors - and that's just in the US. Upstream problems continue, and second-wave spot covid-based lockdowns are only just now easing.

Total accordion effect. 

It's going to take a while for this to settle out - which is causing real scarcity and is increasing prices where computer goods can even be found (vs. the false scarcity we have become accustomed to in the last decade). It's been a year since the RTX 30xx series came out* - and that tier of card should be pretty much ubiquitous by now, but people are scrambling to find cards today. Hell, you can't even buy a car b/c of the chip shortage.

 

 

 

* (I know about the BS corporate decisions, mining fiasco, soft launch and yield issues that are not covid related) 


4 hours ago, JoeSchmuckatelli said:

I agree with everything else, but not this. Shelves are empty. We haven't enough truck drivers. Modern warehouse jobs suck, and warehouses can't find employees to fill vacancies - or keep them around long enough to become skilled. Longshoremen can't empty the ships in the harbors, much less those parked outside the harbors - and that's just in the US. Upstream problems continue, and second-wave spot covid-based lockdowns are only just now easing.

Total accordion effect. 

It's going to take a while for this to settle out - which is causing real scarcity and is increasing prices where computer goods can even be found (vs. the false scarcity we have become accustomed to in the last decade). It's been a year since the RTX 30xx series came out* - and that tier of card should be pretty much ubiquitous by now, but people are scrambling to find cards today. Hell, you can't even buy a car b/c of the chip shortage.

* (I know about the BS corporate decisions, mining fiasco, soft launch and yield issues that are not covid related) 

Well, true.  But the point is that even power supplies (the  heaviest and lowest margin parts in the PC) are available, so the problem has moved elsewhere except for the GPU.  And I wouldn't call it so much a "covid shortage", but a "lean manufacturing" dogma and "cut out any robustness in the supply chain in the name of cost-cutting" shortage.  Covid just lit the match to a disaster waiting to happen.  But most of the PC parts are available again.

And as far as I can tell, you might as well claim the reason you can't get an RTX 30xx board is more a lack of surplus. AMD sells the 5600G for $250 (for a chip with everything working, i.e. the 5800G, expect to pay $350), and it uses up 180 mm**2 per chip on each wafer. Unfortunately, it doesn't look like AMD will even try to supply the 6600[XT] at list price ($330/$380), which uses 237 mm**2 of the same class of wafer (TSMC 7nm, although one is "7N" and one is "FinFET"; presumably they mostly use the same machines), and the GPU requires 8GB of extra-expensive GDDR6 RAM, an entire circuit board, and a cooling system. You can see why such a chip would be one of their less profitable lines and last in line for production.

Nvidia is a bit more curious. The 30xx line is made by Samsung on its bleeding-edge line, so perhaps there's a shortage there (it also uses massive chips considering the tiny transistors: a 3060 uses 200 mm**2, the 3080 and up use 625 mm**2, and everything else (including scavenged 3060s) uses 400 mm**2). On the other hand, the 16xx line uses 284 mm**2 of TSMC's older 12nm line, but perhaps a spat with TSMC (and other companies needing 12nm for their bread and butter) prevents them from shipping these.

One thing to remember is that while nvidia may have started as a graphics company, they moved on to become a HPC (high performance computing, i.e. supercomputer) company.  And now they are even more an AI (machine learning) company as well.  They have (well had at one point, perhaps Intel is recovering) a higher market cap than Intel, and you don't get there by making boards used in a small percentage of Intel's machines (Intel itself makes its profit on servers, laptops, and desktops.  In that order).  AMD might have been a graphics company during the dark days of phenom and bulldozer, but is trying to position itself as a server company (those ryzens make great ads and test vehicles for EPYC processors).  GPUs are an afterthought, although I'd assume they want to compete with nvidia in the datacenter in AI (presumably the reason they bought Xilinx).  GPUs just aren't a priority, and they use a ton of somewhat constrained silicon that could be used more profitably and strategically elsewhere.

The other issue is that as long as you can mine Ethereum economically at these prices, expect the miners to drive the price up to that point simply by competing with one another. Perhaps AMD is already making "enough" 6600XTs, and the miners are all driving up the price. Supposedly Ethereum was going to move from "proof of work" (spend as much GPU power as possible) to "proof of stake" (own as much Ethereum as possible), which would kill the miners' chance to mine the most profitable GPU-based cryptocoin, but that appears to be more or less indefinitely postponed.


On 10/26/2021 at 10:19 AM, wumpus said:

Well, true.  But the point is that even power supplies (the  heaviest and lowest margin parts in the PC) are available, so the problem has moved elsewhere except for the GPU.  And I wouldn't call it so much a "covid shortage", but a "lean manufacturing" dogma and "cut out any robustness in the supply chain in the name of cost-cutting" shortage.  Covid just lit the match to a disaster waiting to happen.  But most of the PC parts are available again.

The underlying problem is not transportation, but the die shortage. There are only so many factories in the world capable of processing silicon wafers into chips of various quality. The process is energy, resource, labor, and equipment intensive, which means that coming up short on even one of these can cause problems. COVID has acted as a catalyst for a perfect storm of shortages across the sector, and now the demand has backed up. The problem is that absolutely everyone with a need for a high-end chip is competing over the same manufacturing capacity: PC and console components (RAM, CPU, GPU, SSDs...), cell phones, car computers, components for servers, routers, and gateways, and a bunch of other tech you might not even think of day-to-day. Some of the lower-end stuff can be shifted to older processes, but a lot of it can't, simply because it was designed with more modern components in mind. The recent releases of new generations of consoles, GPUs, and CPUs happened to line up and bring the problem to the attention of consumers, but just because demand for these died down a bit and supply had a chance to catch up a little doesn't mean the problem went away. On the contrary, we expect the situation with some of the existing manufacturing to get worse, as some areas are now also impacted by drought and labor shortages, which also make it difficult to replace some of the equipment used on production lines.

There are new facilities being built, and supply will eventually catch up. But some people expect another year or two of shortages and outrageous component prices.


In many sci-fi settings, there are a lot of plot points about mining asteroids or planets to satisfy the demand for precious metals, and just now I saw an article about 2 high-density-ore asteroids that supposedly could satisfy the global demand for precious metals. Applying real-life logic to this situation, is it worth setting up a space mining operation (orbital operations, spaceflight, the mining itself, and logistics transport) for ores with jacked-up market prices just to satisfy economic demand? (Assuming that space travel is still limited, like today.)


The technical possibility will appear in such a far future that nobody can know what will be called "economics" by that date, comrade @ARS.

P.S.
Usually they are dreaming about https://en.wikipedia.org/wiki/16_Psyche

But judging by its density, it's not like platinum ingots cover its surface.
More likely, somewhere at a 20 km depth there are deposits of rocks containing twice as much platinum as here.


11 minutes ago, ARS said:

In many sci-fi settings, there are a lot of plot points about mining asteroids or planets to satisfy the demand for precious metals, and just now I saw an article about 2 high-density-ore asteroids that supposedly could satisfy the global demand for precious metals. Applying real-life logic to this situation, is it worth setting up a space mining operation (orbital operations, spaceflight, the mining itself, and logistics transport) for ores with jacked-up market prices just to satisfy economic demand? (Assuming that space travel is still limited, like today.)

There have been several discussions about this in the past year - the consensus is that space mining is for the space economy, and just bringing raw ores down to the surface isn't worth it.


3 minutes ago, kerbiloid said:

The technical possibility will appear in such a far future that nobody can know what will be called "economics" by that date, comrade @ARS...

Suddenly Armageddon's quote makes much more sense now: 'being a driller is more difficult than being an astronaut' (as long as it's in space).

