Integrated Graphics


Mmmmyum

Why do you hate them so much? Sure, they used to be bad, but they really aren't that bad nowadays. Back in the days of the Intel GMA (paired with the Core 2) they were useless, yes, but the current top-of-the-range Intel IGP (the HD 4000) has worked quite well for me, and the Haswell generation is looking to compete with Nvidia and AMD in the midrange market.

My laptop has an Intel HD 4000 alongside its i7; it uses 1.6GB of system RAM as VRAM, and that's managed to pull 30fps in BF3; Source Engine games at max; Skyrim at medium-high with a few beautification mods and the draw distance upped; Fallout: New Vegas at medium-high (it's old but still rather tough); and ARMA 2 at medium-low (full draw distance). Basically, it will make any cross-platform game run like it's on a console.

I just don't get why people think they're that terrible. Most games need a few tweaks, but it handles them rather well.

It depends on what you want out of it. If you're happy with games running on low or medium settings, then fantastic: you can save a lot of money. The newest integrated chips are admittedly pretty impressive when you look at what they can do and what they do it with.

But if you want to run all your games on maximum settings with a high frame rate, you need way more computational power than integrated graphics can provide. And the graphical difference between playing a game at medium and playing it at max is often quite substantial, particularly when it comes to things like shadows, water and water reflections, and ambient occlusion, all of which are notorious for being extremely demanding to do well.

It all depends on what you want to do with your computer. If all you need to do is watch a film, surf the web, and occasionally play a game, they'll do just fine. But even the best integrated chip can't come close to what a dedicated graphics card can do, and most people simply don't care for playing a game on medium or lower settings and still not getting above something as low as 30fps.

To start with, most laptop video systems (especially ones designed for gaming) don't technically qualify as "integrated". They usually have their own GPU and video memory separate from the main processor. They qualify as a variant of dedicated cards, just soldered directly onto the motherboard. Those don't perform at desktop level, not because they're integrated (they're not), but because they're limited to reduce heat.

Properly speaking, an integrated GPU is one that appropriates resources from other areas, such as system RAM. These either draw so much that other parts of the system are affected, or are intentionally limited to prevent that, and the result is far inferior performance compared to a dedicated GPU. Multi-core CPUs and much larger RAM pools have mitigated this to a degree, but it's still a case of asking those resources to do something they're not conventionally designed for, so they continue to suffer.
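
If you're curious where your own adapter's memory comes from, here's a minimal sketch (assuming a Windows box with the DirectX headers available, compiled with MSVC) that asks DXGI how much memory each adapter has reserved for itself versus how much system RAM the driver is allowed to borrow. An IGP like the HD 4000 typically reports a tiny dedicated pool and a large shared one:

#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        // DedicatedVideoMemory: memory reserved for the GPU alone.
        // SharedSystemMemory: system RAM the driver may borrow, which is
        // where an IGP gets most of its "VRAM".
        wprintf(L"%s\n  dedicated: %Iu MB, shared: %Iu MB\n",
                desc.Description,
                desc.DedicatedVideoMemory / (1024 * 1024),
                desc.SharedSystemMemory / (1024 * 1024));
        adapter->Release();
    }
    factory->Release();
    return 0;
}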

Properly speaking, an integrated GPU is one that appropriates resources from other areas, such as system RAM. These either draw so much that other parts of the system are affected, or are intentionally limited to prevent that, and the result is far inferior performance compared to a dedicated GPU. Multi-core CPUs and much larger RAM pools have mitigated this to a degree, but it's still a case of asking those resources to do something they're not conventionally designed for, so they continue to suffer.

Well, no. "Integrated" these days means that your video chip resides inside the CPU package. Integrated chips almost always share resources, but that's not what defines them as integrated, since what they share, and in what magnitude, differs per implementation and generation.
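
For what it's worth, the distinction is formal enough that drivers report it directly. Here's a minimal sketch, assuming you have a Vulkan-capable driver and the Vulkan headers, that simply asks each device what it is rather than guessing from memory figures:

#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkApplicationInfo app{};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.apiVersion = VK_API_VERSION_1_0;

    VkInstanceCreateInfo info{};
    info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    info.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&info, nullptr, &instance) != VK_SUCCESS)
        return 1;

    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, nullptr);
    std::vector<VkPhysicalDevice> gpus(count);
    vkEnumeratePhysicalDevices(instance, &count, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);
        // The driver reports the device type itself: "integrated" means the
        // GPU lives in the CPU package, regardless of how much it shares.
        const char* type =
            props.deviceType == VK_PHYSICAL_DEVICE_TYPE_INTEGRATED_GPU ? "integrated" :
            props.deviceType == VK_PHYSICAL_DEVICE_TYPE_DISCRETE_GPU   ? "discrete" :
                                                                         "other";
        printf("%s: %s\n", props.deviceName, type);
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}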

To start with, most laptop video systems (especially ones designed for gaming) don't technically qualify as "integrated". They usually have their own GPU and video memory separate from the main processor. They qualify as a variant of dedicated cards, just soldered directly onto the motherboard. Those don't perform at desktop level, not because they're integrated (they're not), but because they're limited to reduce heat.

Properly speaking, an integrated GPU is one that appropriates resources from other areas, such as system RAM. These either draw so much that other parts of the system are affected, or are intentionally limited to prevent that, and the result is far inferior performance compared to a dedicated GPU. Multi-core CPUs and much larger RAM pools have mitigated this to a degree, but it's still a case of asking those resources to do something they're not conventionally designed for, so they continue to suffer.

Maybe read the OP... An HD 4000 does use system RAM and other system resources...
