JoeSchmuckatelli Posted February 2

If you have been following the Nvidia 5000-series drama... it's a mess. Extremely high-priced cards with lackluster uplift over the 4000 generation (really, the only reason to buy one is if you SKIPPED the 4000 cards).

Months ago we learned that AMD wasn't going to compete with Nvidia for the enterprise / extreme-gamer tier of cards, but was instead coming in at the consumer-friendly mid-range. They even changed their naming scheme to align with Nvidia's, so the plan appears to be to compete with the 5070 cards. The 5090 and 5080 are released, and sold out, so the high end is gone, leaving the mainstream market available. AMD's strategy seems sound.

Except: AMD is releasing their cards a full month AFTER Nvidia drops the 5070 cards. If AMD had a product that could compete with Nvidia on raster performance... why wait?
Nuke Posted February 2

i guess it's amd's turn to be the greed monger. this is why my next gpu is going to be intel. but i figure i've got some time before my 7900xt is long in the tooth. plenty of time for intel to get gud. i personally don't see how making a gpu a once-in-5-or-6-years purchase is good for profit margins. when they were in the < $300 range, i upgraded every 2 years.
JoeSchmuckatelli Posted February 2

32 minutes ago, Nuke said: greed monger

This is the part I don't get. The PC market has two players in the mid-range. Nvidia owns the high end and Intel is coming for the low end, leaving only the middle contested. AMD can compete on raster. But given that Nvidia enjoys mindshare due to its high-end performance, I would think AMD would want to be first out of the gate with an 'affordable' card. Those waiting for a "70"-series card are likely ready to buy. So what happens to AMD's profits if they cede all early adoption to the other team?
Nuke Posted February 2

2 hours ago, JoeSchmuckatelli said: This is the part I don't get. [...] So what happens to AMD's profits if they cede all early adoption to the other team?

lack of a better descriptor. i'm all about free markets, "greed is good" and everything, but i'm not the only one disturbed by the behavior of some of the companies in this market sector. the best way to combat that is to vote with your wallet. nvidia started it, now amd is doing the same thing to keep up. and it's been some time since i bought a team blue product.

besides, the whole gpu market is going to be in a completely different state by the time i'm hurting for a new gpu. that's the consequence of making a gpu purchase a once-in-5-or-6-years affair by driving prices so high. all gpu makers go through phases where they are hands down the best option, then get complacent and proceed to lose ground to a competitor. having a third player will certainly force better competition, at least assuming intel survives the next 5 or so years. i also have a feeling intel will have better linux support, since i want to get away from windows when 10 is eol. but again, what's that going to look like in 5 years? amd and nvidia have flip-flopped on that many times in the past. it's also getting to where integrated graphics are an effective alternative, so we may not even need discrete gpus in 5 years. barely need them now given the games i play.
JoeSchmuckatelli Posted February 2

30 minutes ago, Nuke said: barely need them now given the games i play.

You know... that's something I, and as I read a lot of us older gamers, are coming to grips with. I've been tinkering with and building my own computers since 1995. It's been a fun hobby, and the gaming was generally good. But the domination of consoles (which means PC ports, which means games dumbed down), the expense of production, and the general "been there, done that" feel of reskinned games with mechanics I just don't enjoy are making Sudoku a much more common game for me these days. Don't get me wrong - I loved RDR2, but since then I can't think of a mainstream title that's grabbed me. Satisfactory plays just fine on my 4K monitor with a 3070. Maybe it's just my "every other generation" thinking that has me itching for something new. I certainly don't need a 5070 for Sudoku.

... If you'll pardon me, I'm gonna go wander off and mutter into my beer for a while.
Nuke Posted February 2

8 minutes ago, JoeSchmuckatelli said: [...] Certainly don't need a 5070 for Sudoku.

well i'm not about to throw in the towel and play sudoku as my primary game, but i'm getting there. i can still sink hours into minecraft or my favorite emulated console games from the 16-bit era. i still question why i need >120hz or 4k out of my screens (as my vision grows fuzzy, bigger is really the only thing i want out of a new display). even the newer games i've bought aren't making the gpu blow heat like they used to. i ask myself "can it run doom", but these days everything runs doom.
magnemoe Posted February 3

21 hours ago, Nuke said: [...] it's also getting to where integrated graphics are an effective alternative, so we may not even need discrete gpus in 5 years.

Well, the PS5 Pro uses an integrated GPU. Consoles do get the benefit that you can write specifically for the GPU, but how many developers do that now? I think most games are initially developed on PC, since it's much more convenient than a devkit: you have all the tools you want on the desktop with multiple screens, and you're making a PC version anyway unless you're Nintendo.

I upgraded my very old 980 Ti to a 4070 Ti for KSP 2. Granted, it was very long in the tooth, and the new card helps in other games too. Still, do you need 4K and 120 fps? And I don't get the AI upscaling and frame generation: apparently it works, but it increases lag, as you only have 33 ms to render and show the frame at 30 Hz, and 16.7 ms at 60 Hz.
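For concreteness, here is the arithmetic behind that lag point. The frame budget is just 1000 ms divided by the refresh rate, and interpolation-based frame generation has to hold a finished frame back until the next one exists before it can blend between them, so it plausibly adds about one native frame time of latency on top of the render budget. This is a back-of-the-envelope model of a generic interpolation pipeline, not a measurement of any specific vendor's implementation; the one-buffered-frame assumption is illustrative.

#include <cstdio>

// Frame budget per refresh rate, and the rough extra latency an
// interpolation-based frame generator adds by buffering one native
// frame. Assumption: ~one native frame time of added lag; real
// pipelines differ.
int main() {
    const double rates_hz[] = {30.0, 60.0, 120.0};
    for (double hz : rates_hz) {
        double frame_ms = 1000.0 / hz;   // time to render and show one frame
        double added_lag_ms = frame_ms;  // one buffered native frame (assumed)
        std::printf("%6.1f Hz: %5.1f ms/frame, ~%5.1f ms extra lag with 2x frame gen\n",
                    hz, frame_ms, added_lag_ms);
    }
    return 0;
}

By this model, generated frames raise the displayed frame rate without shortening the native frame time, which is why the picture gets smoother while the input lag does not improve.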
JoeSchmuckatelli Posted February 3

2 hours ago, magnemoe said: 4K and 120 fps

That is what I wish my 3070 provided, and all the wizardry-plus-lag isn't a solution for me. I'm a raster guy, at least until ray tracing stops looking like a clown show and runs on midrange gear, which is all I'm willing to pay for.
JoeSchmuckatelli Posted February 3

I will say I'm tempted to get Kingdom Come: Deliverance 2 around spring break. It comes out tomorrow, and I cannot afford to like it until much later in the semester! The bonus is that most of the early-game bugs should be fixed by then (?)
Nuke Posted February 4

i'm not even sure i need 4k at 120hz. refresh beyond 300hz doesn't make any sense to me either: that just means the physics is running at a much lower frequency than the display, and practically all of the rendered frames you see are interpolated from the physics states of objects. the biggest complaint from players is bad collision detection, and no wonder, because they are confusing interpolation artifacts with actual bugs. and that's before you employ ai magic to generate frames.

if you want to put ai accelerators on gpus, why not use them for, you know, ai opponents? if we had better fp64 performance (which is intentionally gimped on gaming cards vs workstation cards), we could offload high-quality physics to the gpu. 5 years ago, if you had asked me whether i'd rather have that or ray tracing, i probably wouldn't have said ray tracing. super-sampling was really the only way to make rt work at all. advances in monitor technology are definitely continuing to drive supersampling and frame gen, since no gpu can fill the frame buffer that fast.
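The physics-vs-refresh decoupling being described is the classic fixed-timestep loop with render interpolation: the simulation ticks at a constant rate, and the renderer blends between the two most recent physics states, so every frame above the tick rate is interpolated rather than simulated. A minimal sketch of the idea; the 60 Hz tick, the toy 1-D body, and all names here are illustrative assumptions, not taken from any particular engine:

#include <chrono>

// Fixed-timestep physics with render interpolation. Physics ticks at a
// constant 60 Hz; the renderer draws as fast as the display allows and
// blends between the previous and current physics states. Every frame
// above 60 fps is interpolation, not simulation.
struct State { double x = 0.0, v = 100.0; };    // toy 1-D body

State integrate(State s, double dt) {
    s.x += s.v * dt;                            // advance one physics tick
    return s;
}

double lerp(double a, double b, double t) { return a + (b - a) * t; }

int main() {
    using clock = std::chrono::steady_clock;
    const double dt = 1.0 / 60.0;               // fixed physics step (assumed)
    State prev, curr;
    double accumulator = 0.0;
    auto last = clock::now();

    for (int frame = 0; frame < 1000; ++frame) { // stand-in for the game loop
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - last).count();
        last = now;

        while (accumulator >= dt) {             // consume whole physics ticks
            prev = curr;
            curr = integrate(curr, dt);
            accumulator -= dt;
        }

        double alpha = accumulator / dt;        // fraction between ticks, [0, 1)
        double render_x = lerp(prev.x, curr.x, alpha);
        (void)render_x;                         // a real renderer would draw here
    }
    return 0;
}

At very high refresh rates, many consecutive frames fall between the same two physics ticks, so what looks like a collision glitch on screen can be the interpolation showing positions the simulation never actually computed.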