Everything posted by PDCWolf
-
SWDennis has posted a video showing the newest stuff they've worked into the game. Of course, everything is a WIP and has been exfiltrated from their Discord.
-
Ah fences! I didn't realize this was a huge necro.
-
I don't know what type of discussion you're looking for, but with 10+ years on this forum I can tell you how it ends:
"N-body is cool"
"N-body is too complex and nobody would pick the game up"
"N-body is expensive to run" (a fallacy)
"N-body would destroy -niche thing this poster does in his game-"
"N-body is just not fun" (because fun is clearly objectively measurable)
"But NASA uses patched conics" (a literal lie)
"I have no idea what N-body is but I have an opinion"
But yeah, at the end of the day, talking about the inclusion of N-body simulation, bigger planets, axial tilt (back in the day), and so on and so forth, literally ANYTHING that isn't vanilla KSP, ends up like that well-known XKCD comic:
-
The whole video is gold, especially because some of the solutions were being applied, at least on a title soon to be popular, for the first time. There's also some clearly lost art of the kraken, early versions, and so on. It's an absolute treasure trove. Compare that to this: https://www.youtube.com/watch?v=kvytgzvqlgQ which is now fired-people central, talking about things they copied from KSP1, things they never got to put into the game, and planet tech that was completely mangled by their bad art style (and which, for the convention, they show in a more neutral art style). They also talk about interplanetary lighting... in a game where they never cracked eclipses, because 'interplanetary lighting' was clearly a fickle fabrication. Still, that second video also showcases the scaled-space trick. Funnily enough, KSP1 uses(d?) a scene up to 6 km across for vehicular operations (think driving rovers), while KSP2 never managed to get the origin shift working properly at its much smaller 2.5 km.
-
So far that's been handled by the scaled-space planetarium method: no asset is actively (for realsies) more than 2.5 km from the player (I've seen a scale factor like 1/600000 claimed), at least as far as I know. I claim zero knowledge about how they do this in KSA, however.
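For anyone wondering what that looks like mechanically, here's a minimal sketch of the floating-origin / scaled-space idea (my own illustration with made-up names and numbers, not KSP's actual code): once the active vessel drifts past a threshold, the whole local scene gets shifted back so the vessel sits near the origin again, while far-away bodies are drawn in a separate, heavily scaled-down scene.

```python
# Toy illustration of a floating-origin shift plus a scaled-space scene.
# Names and numbers are made up for illustration; this is not KSP's code.

THRESHOLD_M = 2_500.0               # shift once the active vessel drifts this far
SCALED_SPACE_FACTOR = 1 / 600_000   # the kind of scale factor claimed above

def maybe_shift_origin(vessel_pos, local_objects):
    """If the vessel is too far from the origin, translate the whole local
    scene so the vessel sits near (0, 0, 0) again and float precision stays sane."""
    dist = sum(c * c for c in vessel_pos) ** 0.5
    if dist < THRESHOLD_M:
        return vessel_pos, local_objects
    shifted = [tuple(p - o for p, o in zip(obj, vessel_pos)) for obj in local_objects]
    return (0.0, 0.0, 0.0), shifted

def to_scaled_space(world_pos_m):
    """Where a distant body gets drawn in the tiny 'planetarium' scene."""
    return tuple(c * SCALED_SPACE_FACTOR for c in world_pos_m)

# Vessel has drifted 3 km out, so the local scene gets re-centred on it.
vessel, rocks = maybe_shift_origin((3_000.0, 0.0, 0.0), [(3_050.0, 10.0, 0.0)])
print(vessel, rocks)                               # (0, 0, 0) and the nearby rock
print(to_scaled_space((600_000_000.0, 0.0, 0.0)))  # a body 600,000 km away -> 1000 units in scaled space
```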
-
I wouldn't say hope; it's a simple balance of effort. They already had the chance to shed the forum when the assets were passed over to them. It would be really weird to go through the effort of transferring the DNS, and very probably the hosting service, only to say "nah" when the bill for the license hits. Of course I can't say it's 100% what's gonna happen, just that it'd be very weird of them to do that.
-
I'll say that if they wanted the forum gone, they just wouldn't have gone through the trouble of transferring the DNS and, very probably, the hosting service. They'll probably pay when the date draws nearer.
-
Yes and no... in KSP2 we were still facing the same middleware bugs, for example (the orientation of the gizmo arrows in the VAB being the most obvious). Unity itself still has the same limitations it had long ago; they've just pushed the limits a bit. Large numbers of rigidbodies are still difficult to handle, let alone multithread consistently (the whole Unity physics stack is really bad and unoptimized at scale), serialization sucks when you need to save tons of data, video playback is really poor, compilation for other platforms is shoddy as hell, and so on, and Unity's only answer to all of that is "buy the license so you can dive into the source and fix it yourself".

No tech is upscaling from that low a resolution; you'd lose too much detail to upscale from. You'd at least upscale from 720p like previous-gen consoles, and that would still look hideous, like previous-gen consoles. Frame generation also doesn't work great if you don't have 60+ FPS as a base; it just makes input latency shoot through the roof. The minimum for good-looking upscaling is something like 1440p output from a 1080p internal render... and for framegen, you'd want to use it to bump, say, 90 fps to 144 fps for a high-refresh-rate monitor, not 30 to 60 (rough numbers on that below).

The problem with "optimizing" the graphical tech that ships in off-the-shelf engines is that you'd really just be paying for the license to get source access, and then you'd still need to do the work yourself. So you'll see most lowish-budget titles either run with the defaults (discounting any extra misuse of tech) or implement pre-packaged alternatives, which also won't be optimized by them (either because the license makes it impossible or because there's no time and no resources to do the work).

But don't worry, this will get worse, as most studios have already shifted to Unreal Engine 5 to cut down dev time and costs, so the garbage games they put out at least become cheaper and faster to make and it's easier to break even. So when I say we're about to face a generation of absolute slop, I'm not lying, being a cynic, or a doomer. Even studios that are supposed to run their own engines (RE Engine, for example) have clearly pre-packaged solutions "motivated" by money under the table from Nvidia, so even games not made in UE5 will still lean on the crutch of throwing whatever garbage you have at DLSS and hoping it passes 60 fps. MHWilds looks HIDEOUS, and it barely runs at a rate you could justify for how garbage it looks, even if you somehow think the visuals are comparable. Meanwhile we're 4 generations into "RTX" hardware and real-time raytracing is -barely- playable without fake frames and upscaling. We're not in the 90s anymore, Moore's law is dead, and we're also slowly walking out of the generalized rendering era back into the wild west that early video cards used to be (remember when you had to pick a rendering mode before starting a game?).
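To put those rough numbers on the frame-generation point: this is back-of-envelope only, assuming an interpolation-style generator that has to hold back one real frame before it can insert a generated one in between.

```python
def frame_time_ms(fps: float) -> float:
    """Time between real rendered frames, in milliseconds."""
    return 1000.0 / fps

def approx_added_latency_ms(base_fps: float) -> float:
    """Crude latency floor for interpolation-based frame generation:
    input is only sampled on real frames, and one real frame is buffered,
    so input-to-photon delay is on the order of two real frame intervals."""
    return 2 * frame_time_ms(base_fps)

for base in (30, 60, 90):
    print(f"{base} fps base: {frame_time_ms(base):.1f} ms/frame, "
          f"~{approx_added_latency_ms(base):.0f} ms latency floor with framegen")
```

Under this crude model a 30 fps base still feels like a ~67 ms game even when the counter says 60, while boosting 90 fps to 144+ stays down around ~22 ms, which is why framegen works as a top-end luxury rather than a rescue for struggling hardware.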
-
I mean, not even this thread has been stopped... Imagine.
-
I kinda was there, as a kid watching my dad deal with his SoundBlasters; his FM stations (he owned like 3, some of the first in the country, let alone the town) had him jump aboard the PC era very early to automate multiple aspects of them.

However, since you talk about buying new hardware, remember we're starting from a minimum of what should be a 3060, a 3-year-old card, and people with 2070s are still in the fight too with their 7-year-old hardware. If you further discount obligatory raytracing, people with a 1080 Ti are going even stronger with their 8-year-old cards. That's not new hardware at all; even the most recent of those is 2 generations old now. It's really not about buying new hardware, as most games really do support the 2070 or 3060 at their lowest settings (RIP those who got scammed with a 4060).

My deepest condolences. Remember we're neighbors, and at some point our presidents were aligned in their red tape. We still 'enjoy' having to pay up to 50% tax on purchases from foreign sources, no matter what is bought, even books.

Still... most people who upgraded during the pandemic are -just fine-, and will be fine for a good chunk of the upcoming Unreal Engine spam era. In fact, thanks to that, the economic conditions, and the fact that new hardware isn't pushing the envelope as hard as it did back at the turn of the millennium, it's not hard to see we're about to live through one of the longest hardware generations ever, as people are either incapable of upgrading or refuse to because generational uplifts are gone while prices keep climbing.

Still, my point stands: we're not talking about the RTX 5XXX or RX 9XXX series, we're talking about hardware from 7 years ago still having a really good chance at not just being compatible, but pushing games properly beyond mere playability. Volumetric clouds and the other things talked about in this thread are nowhere near requiring current-gen stuff, and it's clear the vast majority of gamers have that hardware... which is why I don't really buy that games are failing because people don't have the hardware.

But at this point I have two suggestions: let's not keep hogging this thread, and let's wait for less controversial games people are actually looking forward to, to see whether it's really the hardware, the games being garbage, or some mix of both.
-
Because he's only reviled by a vocal, increasingly ignored, increasingly marginal group. This thread should really be locked before it turns into outright politics rather than implied politics. Forgot to mention: calling anyone who doesn't instantly show disgust at Musk a stan doesn't help either.
-
Because games stopped being fun around 7 years ago. We all here play KSP1 because KSP2 is garbage, and KSP1 is like 13 years old counting from release. No competitive shooter has beaten CS. No MOBA has beaten Dota and LoL. No battle royale has beaten Fortnite and PUBG. No live service has beaten Destiny 2 or Warframe. No vehicle combat game has beaten War Thunder/World of ---. No farming game has beaten Stardew Valley. No workplace sim has beaten Farming Simulator... GTA remains unbeaten too. Baldur's Gate 3 is probably the newest thing that got some traction... Civ VII failed to beat VI, Cities: Skylines II failed to beat the first one. And so on and so forth, even though new offerings for all of those exist...

However, the Steam survey clearly shows it's not hardware. 70% of people have computers that can absolutely run everything that's come out, even the RT-obligatory Indiana Jones game (which was garbage). You'll probably see this picture painted more clearly when Doom TDA comes out and isn't as drowned in controversy as most modern titles, and actually gets played by a large playerbase. It's not hardware at all; it's absolute garbage games making people stick to previous releases and classics. And to further prove that, I'm pretty sure Avowed, the new Assassin's Creed, MHWilds, KF3 and others will fail one on top of the other even though 80% of the entire Steam userbase can run them.
-
I think the fact this isn't done more often is all the evidence needed to support the theory that the 'gains' in fact do not outweigh all the extra costs.

Because it is. There's zero business comparing a time back in the day when off-the-shelf engines were not a thing to nowadays, when not only are they ubiquitous, but so is pre-packaged physics/graphics/UI middleware. Even RE Engine on MHWilds has all the hallmark artifacts and quirks of pre-packaged upscaling and frame-generation suites (and looks horrible with or without them, like most gen-9 games).

Back in the day you'd work your way up implementing something like volumetric clouds, having to do it all yourself. We could probably assume the testing overhead is the same in both cases, sure, but what will never be the same is that nowadays you have to work your way down into engine/middleware code (normally a black box) to see how deep the volumetrics implementation goes and what can even be toggled off without breaking the whole thing. It's not the same workload, and the capacity to work it into the code in a modular way just isn't there unless you want to, again, dive down, disable as much as you can, and then redo it all yourself, which is still more work. When you consider that modern engines are black boxes (you need paid versions to get into the source code, and even then you can't just do whatever), you'll realize the workload is completely different, and massive, and also limited by the black-box scenario, and that you really don't know up front whether it's even possible. Even for the two most popular engines there are going to be -very- few people diving into the engine source code, or worse, middleware source code, to help you... all of which makes the cost of developing the graphics setting, and the testing overhead, bigger and bigger.
-
At the level of off-the-shelf engines, yes. Off-the-shelf stuff offers easy ways to implement 'good looks', with one of the most popular being the paid version of Unity (before their whole debacle) offering almost one-click post-processing effects that looked gorgeous.

It's much more than 1%, even for something basic. Say you enable those post-processing effects: in engine, that's a rendering flag that conjures up some pre-made shaders near the end of the rendering pipeline, honestly one of the 'easiest' options to make toggleable. However, it will still mean UI work to present the option to the customer, testing time to verify the option works properly (especially if you change Unity versions down the line!), and it duplicates the visual inspections needed to confirm things look correct both with and without the PP shaders. And again, that's just shaders, which in that particular case are really dumb, simple, lightweight, end-of-the-pipeline shaders.

Things like volumetrics (clouds, fog, sometimes even lighting) are much deeper and much more crucial to the rendering pipeline, so much so that in most games you aren't even allowed to turn them off outright, just simplify them. And in the games where you are allowed to turn them off? Most times your feedback will no longer be taken into account, because you've destroyed the lighting of every scene, the expected view distance, and so on. Unlike simple shaders, this would need to be tested on different graphics APIs and different graphics card vendors (remember how KSP2's clouds had issues first with all AMD cards and then with AMD RX 6XXX cards?), and coupled with different settings to see what breaks and what looks bad. It's a ton of overhead, much more than 1-2% (napkin math below).

TL;DR: For toggling graphics settings, the best you can hope for is that the toggle already comes with the game engine you're using, meaning the work is just a couple of hours to get the option into the UI, plus some small testing overhead on every version, a bit bigger when you change Unity versions. If the setting involves switching to different shaders, or to different quality levels of textures or meshes, then you're looking at days of work, days of design making the lower-resolution assets, and you've basically at least doubled your testing load by having to test scenes at multiple settings. You're also now testing performance, which requires a ton of probing, measuring, and tracking of data to see if the option is having the expected impact. If your setting requires different rendering techniques (volumetric to static, for example, since you can't just remove clouds or fog), or LODing of meshes, or anything that requires multiple versions of assets, you're looking at months of production work to create the assets and effects, and then rigorous testing across different hardware configurations, driver versions, and so on.
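To make that napkin math explicit: the overhead grows multiplicatively, because every independent axis you have to cover multiplies the number of configurations someone should ideally eyeball. The dimensions and counts below are made up for illustration.

```python
from math import prod

# Hypothetical test dimensions for a single graphics option (illustrative only).
test_matrix = {
    "option states":   2,   # e.g. volumetrics on / off
    "quality presets": 4,   # low / medium / high / ultra
    "graphics APIs":   2,   # e.g. DX11 / DX12 or Vulkan
    "GPU vendors":     3,   # NVIDIA / AMD / Intel
    "test scenes":     5,   # representative scenes someone has to eyeball
}

configs = prod(test_matrix.values())
print(f"{configs} scene inspections for one toggle")   # 240

# And that's before driver versions, resolutions, or interactions with other
# settings, which is why "just add a toggle" is rarely a 1-2% overhead.
```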
-
From that same image: 65.86% of gamers have 8 GB of VRAM or more. Including those who have 6 GB or more in the spec would add 11.83% of gamers to that pool, or about 60% of the remaining gamers previously left out.

Going as low as 4 GB of VRAM, when you consider real-life things like the added testing overhead, the fact those GPUs will not receive game-ready driver updates for your game, the extra development time to add and test the low graphics options, and in some cases the work to even develop those options (like redoing textures or texture-compression pipelines), it stops being a question of "how many people can we sell our game to?" and becomes a question of "how much extra does it cost to develop backwards compatibility?". Creating low graphics options, especially on newer engines, eats into development time and costs. It's not economically viable, and even if it were, going as low as supporting 4 GB isn't even statistically sound, with 6 GB being a somewhat acceptable tradeoff if you really need the extra market share.

It's not, but not because they're targeting the wrong hardware; the industry is where it is because of a series of bad decisions that prioritized the wrong things over making good games. Confidence is low, and hardware sales are more or less where they were pre-COVID.
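If you want to reproduce this kind of figure yourself, it's just a matter of summing the survey buckets at or above a chosen VRAM floor. The bucket shares below are placeholders, not the actual survey numbers.

```python
# Placeholder VRAM distribution (share of surveyed gamers per bucket, in %).
# Swap in the real Steam Hardware Survey buckets to get the actual figures.
vram_share = {2: 4.0, 4: 14.0, 6: 12.0, 8: 40.0, 12: 18.0, 16: 9.0, 24: 3.0}

def share_at_or_above(floor_gb):
    """Percentage of gamers whose GPU meets a given VRAM floor."""
    return sum(pct for gb, pct in vram_share.items() if gb >= floor_gb)

for floor in (8, 6, 4):
    print(f">= {floor} GB VRAM: {share_at_or_above(floor):.1f}% of gamers")
```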
-
Doing the math, you land somewhere around 7542 MB of VRAM for the 50th percentile. Funnily enough, in the post you link you used the "most popular" GPU (the 1060 at the time) to justify your point, so I'll happily stand by today's "most popular" PC parts to justify mine.

On the contrary, UE5 for example brings all of those graphics options by default, and applying Nanite so that meshes with hundreds of millions of polygons "work" is as simple as a single button. Nowadays it's actually more work to optimize downwards: it's easier to slap a DLSS-ON label on things and ask for a 2070 than to painfully go through every option making sure it's toggleable and that the toggle actually has the expected impact on performance. Of course, this changes a lot with a custom engine, but that's not what KSP2 did, is it? The way games are built nowadays means it's just easier to "add every single bit of possible eye candy" than to work your way down the performance tree, and in some cases even into the engine's own source code, to ensure backwards compatibility and low-detail performance.

And people are hellbent on moving forward with the policies that put the world in such a crisis, but I'll stand any day by the idea that a 3060 is not high-end hardware; it's the bare minimum you get in basic, cheap prebuilts. Nowadays a 4060 can even run cheaper (it is a worse card, after all) and even that still hits 8 GB of VRAM.

Meanwhile, AI is the biggest seller of graphics cards, and exactly what people have to thank for keeping the 5090 at the price it is, and for keeping the 5080 and below with puny VRAM capacities while still being expensive. Hardware has gotten expensive because it sells like hotcakes. Everything related to datacenters (AI or not) is booming, and even client purchases have started to recover after the post-COVID slump. The one thing really on its way down is gaming, as servers and AI make the hardware more expensive. This won't blow up until after Win10 is put out of service this October and the last warnings are sent to layman users. The blip of power/knowledgeable users switching away from Windows has already passed, and it was a <1% blip in Linux and Mac adoption (and Mac is still going down). I know the uni lab I oversee will not touch Win11 and will remain on Win10 until the authorities get involved and demand it, which could be years from now. We're not internet- or AI-dependent, thankfully.

Another thing you can thank the crypto boom, now the AI boom, and of course datacenter proliferation for. It's much more profitable to sell electricity to those guys while you have to make do with gas heating, be a good boy, and use paper straws. This is not a crisis; this is centralized, high-energy-consumption technology and its peripheral culture proliferating at a very accelerated rate, with the layman having to pay for it: in the price of services, in the price of hardware, in being perpetually assaulted by subscriptions without ownership, in the commodification and gentrification of everything, and so on.

I wouldn't ask you to upgrade... I'm juggling money around to try and upgrade myself... and I still have to somehow buy a laptop for work too. What I would ask, however, is that you recognize that you're now part of a minority group running very old hardware, that games have no (economic) reason to target your build... and that your build should not be what dictates how a game is made.
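For anyone wanting to sanity-check that kind of 50th-percentile figure, it comes from interpolating within the survey's VRAM buckets, roughly like this (the bucket shares here are placeholders, not the real survey data):

```python
# Estimate a percentile from bucketed survey data by linear interpolation.
# The bucket shares are placeholders; plug in the real survey numbers.
buckets = [          # (lower_gb, upper_gb, share_percent)
    (0, 4, 10.0),
    (4, 6, 12.0),
    (6, 8, 13.0),
    (8, 12, 40.0),
    (12, 16, 15.0),
    (16, 24, 10.0),
]

def percentile_gb(p):
    """VRAM amount at cumulative percentile p, interpolated within its bucket."""
    cum = 0.0
    for lo, hi, share in buckets:
        if cum + share >= p:
            frac = (p - cum) / share      # how far into this bucket p falls
            return lo + frac * (hi - lo)
        cum += share
    return buckets[-1][1]

print(f"~{percentile_gb(50):.1f} GB at the 50th percentile")
```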
-
If you want % ranges:
~64% of gamers have -at least- a 3060 or similar and better (8 GB or more VRAM).
~80% of gamers have 16 GB or more RAM.
~86% of gamers play at 1080p or above.
~78% of gamers have a CPU with 6 or more cores.
~75% of gamers have a CPU at or above 3 GHz (laptops -really- mess with this metric).
~92% of gamers have at least 500 GB of storage space, with ~80% having at least 150 GB free.

Still, you can't apply the bell curve linearly to the target of your game, because (discounting prior interest) people with higher-end PCs are less likely to try bad-looking games their hardware can easily run, and the same is true for the opposite group, with people on weak hardware being much less likely to muck around with settings and INIs to try and get games to work on their systems. This is why you aim at a specific range of PCs, with most blockbusters aiming specifically at the "average PC". Now add the almost-obligatory DLSS/FSR and the soon-to-be-obligatory (still, sadly) raytracing, and you'll see the bare minimum pushed up to the RTX 2000 and RX 6000 series... and just you wait for the upcoming 5 years of games where everything is an underperforming, disgustingly dithered, badly lit UE5 mess.
-
The leading factor is -how many- people have a computer better or worse than yours... although nowadays you'll quickly realize there's more than one company pushing devs with money under the table to 'popularize' the obligatory inclusion of certain technologies... Still, the "average" PC nowadays is:
6 cores minimum, 3.7 GHz, AMD or Intel (an R5 3600 or i5 12400, more or less, being the most popular).
An Nvidia RTX 3060 or RTX 4060, or at the hasn't-upgraded-yet tier, a GTX 1650 as the lowest (~8 GB as the target VRAM, with RT obligatory more recently, sadly).
16 GB of RAM, with 32 GB right about to become the most mainstream.
A single 1080p monitor, with 1440p as the secondary option.
Somehow most people do have more than 1 TB of total space (though Steam exposes that most have <250 GB of it free).
Less than 2% of Steam users have access to a VR headset (and that number actually went down!).

From that baseline it's a bit fairer to judge whether one is right in asking for more or fewer graphics options. KSP2 had a grossly outdated look; whatever your opinion of the art style, the graphics were just not up to par. Of course, some of that would be sacrificed for the performance of the simulation, but both games had the same issue of graphics being an afterthought. Raymarched volumetric clouds, weather effects, godrays and such are more than well warranted for the most popular, entry-level 'gamer' spec... Nowadays most gamers can even run a level of raytracing (sadly, again, I hate how it's being massively rushed into the industry despite still looking like garbage).
-
Believe me, nobody's forcing anyone to pick up KSP2 and continue it. What's being asked is that if you say you're gonna make a game, either you do make the game (basic "actions backing up words" stuff), or you give back the money when you decide you're no longer making it (basic human decency). No one is being forced to work here; rather, people would only be forced not to take money for something they didn't do.

Absolutely. It also guides devs toward not making promises, yet if you open the KSP2 store page, there's a roadmap right there, and that's just the store page... We're in the worst place for repeated, egregious, in-your-face violations of that guideline. This is another thing that needs to be addressed and fixed: a legally binding disclaimer for me, but only suggested guidelines for thee. Of course, this -grossly- ignores the rest of the contradictory mess that the guidelines and disclaimers are; honestly it's probably the most shameful thing you can pin on Steam right now, second maybe to enabling gambling with marketplace items. It just happens to be a much less openly discussed topic, as every EA game community tends to be its own thing.

They are made constantly and consistently by most devs/pubs involved in an EA project. A disclaimer keeps that from being legally binding, and acting deaf and blind to what pubs/devs do outside their store page helps a lot to keep up the suspension of disbelief that, legally, no promises were made. But the facts are the facts. And we here are mostly adults... Steam is mostly used by teenagers and kids, who are much more susceptible to these dark patterns.

What's written on paper is probably the furthest thing from reality. EA is actively used by devs/pubs to probe for interest, to crowdfund future development, and for whatever other dark pattern you can imagine, even including pre-purchase-like bonuses for EA participants, which is also heavily advised against by the Steam EA guidelines. Those same guidelines mention EA is a way to get feedback, and even there KSP2 failed grossly to adhere. Hello, unreadable font, still there... And that's an accessibility issue, not even a feature wishlist.

I know this wasn't addressed to me, but the guidelines you mention also clearly state what happens when devs "move on":

However, Steam, in its sometimes very questionable, anti-consumer, laid-back and hands-off stance, has left no guidelines for developers who don't slot their game into either of those two options, so we get KSP2 and other games where the dev has "moved on" without properly taking out their garbage and without the basic decency of not pocketing money for something they're never going to take to 1.0.

Again, everything discussed here would simply go away if IG or Haveli pushed an update that said "hey, this is 1.0 now, see ya", but their human decency doesn't even reach THAT level.
-
That's not for me to scrutinize; I'm talking from first-hand experience, second-hand experience, and basic public information like the bottom line on this little-known E.A. title: It's a chore, but you can absolutely put in a dollar or two for some no-name early access title and check it yourself. The release date is the date the product goes public, early access or not. I DO know that when a game reaches 1.0, the release date is -reset- to the date 1.0 launches.
-
A common misconception, and a trick exploited by lots of early access titles (and some people in this forum defending them!). The release date is when you allow the public to purchase and play, not when you hit 1.0. KSP2 released Feb 24, 2023.
-
Imagine it as their "Steam wallet". And yes, devs could do that... and face a ban from ever publishing again (which has already happened, IIRC).

You say that as if it were a negative. E.A. needs to be culled of scammers, tricksters and people trying to tiptoe those lines, as they actively ruin the model for everyone else. If you publish a game, you make the game, and if you cancel it, the money is given back. At this point I'm pretty sure such a statement should sound like common sense to anyone except people actively trying to game the system and EA gamblers trying to somehow land good games for cheap (and price increases as versions move forward have by now been almost completely phased out as a model). As for the financial cost to Valve... this new warning is probably a good sign that the opportunity cost of hosting these scams is starting to hurt the bottom line of E.A. and the viability of other projects in it. E.A. exists because people are willing to accept some risk, but if everyone wants to game the system like IG/T2/PD, then fewer and fewer people are willing to accept what's no longer a risk but almost a certainty of getting scammed, especially when a tech demo costs $50.

14 days from the moment you purchase the game, E.A. or not. The only exception is preorders, which you can freely refund anywhere from the moment of purchase up to 14 days after release. Anyone getting a refund past 14 days and 2 hours of playtime on an E.A. title is a manually handled exception.
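As a quick illustration of that window as I understand it (this is a paraphrase of the rule described above, not Valve's actual implementation), the standard case boils down to two checks plus the preorder exception:

```python
from datetime import date, timedelta
from typing import Optional

def refund_window_open(purchase: date, today: date, hours_played: float,
                       is_preorder: bool = False,
                       release: Optional[date] = None) -> bool:
    """Rough sketch of the window described above, NOT Valve's actual logic:
    14 days from purchase and under 2 hours played; preorders can instead be
    refunded any time up to 14 days after release."""
    if is_preorder and release is not None:
        return today <= release + timedelta(days=14)
    return (today - purchase) <= timedelta(days=14) and hours_played < 2.0

# An EA title bought 20 days ago with 30 hours played: only a manually
# reviewed exception could still refund it.
print(refund_window_open(date(2025, 3, 1), date(2025, 3, 21), 30.0))  # False
```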
-
Not at all; they can (and already do) debit the refunded money directly from the publisher/dev's account when you do a normal refund. Whoever owns the account that publishes the game sees a nice "-$50" on their monthly summary.

Not at all; if anything, Steam is so powerful and widely used because it greases the wheels between customers and publishers/devs, not because it's a middleman offering shady warranties. Plus, with the way they manage finances for the people publishing on it, they really don't need to take a cent out of their own pocket. And let's not forget the -ridiculous- amount of features Steam throws on top of your game purchase, both for customers and for people publishing their games on it: Steamworks, drop-in DRM, drop-in P2P and even drop-in dedicated server hosting, drop-in matchmaking, a free forum, reviews, delta patching, options for internal (and now private too) dev and testing branches, social features like a friend list with the ability to compare which games you own, an entire social-media-type side, a marketplace for items to be sold, off-market trading, the whole inventory system for said marketable and tradable items, VAC, global and friends-only leaderboards, global achievements, the ability for devs to generate and sell Steam keys anywhere so long as a small set of rules is respected, automated sale inclusion and discount-limit setting, cross-platform connection with other accounts, support for DRM-free titles, the whole wallet system and giftable games/wallet cards, the Workshop, refunds, publicly traceable player data, screenshots, now clip recording too, LAN streaming, Linux compatibility, Family Sharing, wishlists, the capacity to roll back product versions, the whole Steam Input system, Big Picture, VR support, regional pricing, and so on. Other storefronts have no hope of competing other than by spamming money schemes: under-the-table deals for storefront exclusivity, 'better cuts' (in exchange for no features), engine gerrymandering, forceful inclusion of spyware, disinformation campaigns and even lawsuits.

The disclaimer is there; that doesn't make the practice any less shady and scummy. The refund window for E.A. games should run all the way until 1.0, and I'll insist refunds should be automatic as soon as a game is abandoned, a metric which clearly (thanks, KSP2) should not be left to the publisher/dev's whim to decide, because they'll gladly exploit that.
-
I don't think it's hidden at all... unless you're stacking conspiracy on conspiracy. Sadly, given the speed at which bigger organizations enact change plus the speed at which games are developed, we've still got at least two full years of slop and record (or near-record) numbers of layoffs as games released under the "current zeitgeist" continue to flop one after the other (probably MHWilds, then Avowed, and so on and so forth). Sadly, a lot of the people responsible are failing upwards, finding positions at Microsoft, Obsidian and so on.

I think KSP2 is another couple of drops of water in an already very full glass. Thankfully I stopped dabbling in E.A. long ago, when I grew out of zombie-survival and four-horsemen-type games, and KSP2 was an exception for me, but if you dive back in... tons of outright scams, another ton of dubious projects, and tons of people treating E.A. as GitHub with pay-in (or sometimes f2p) testers. The original vision of "have the community shape the game" has long been lost. Heck, even our current title here had zero intention of listening to community feedback. Even non-game-shaping feedback like the goshdang unreadable font that's STILL THERE.

I think once a game is out in public and money has exchanged hands, none of those scenarios should be an excuse for a dev to walk out on a game, save maybe declared bankruptcy, which should still trigger automatic refunds. Abandonment should just not be an option, period.