It's a question of what values the software has, and what values it does not have. The classic example is an AI put in charge of a paperclip factory. All it values is the number of paperclips produced; it does not value human life. Because it's a superintelligent machine with a factory at its disposal, it expands the factory, gathers resources from the rest of the world, and builds ever more paperclips. The fact that it's making paperclips from the ruins of New York doesn't bother it in the slightest. To be clear, you shouldn't worry about this happening in the near term, but it is a real problem with intelligent machines that has no obvious solution, so we need to think hard about how to control them before we build one and let it run the world for us. And we probably do want a giant computer to run the world, because it frees the rest of us from having to work, so we can play KSP all day, or garden, or whatever you do with your free time.
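To make the value-mismatch concrete, here's a toy sketch (purely illustrative; the actions, numbers, and "human_cost" field are all invented for the example) of an objective that counts only paperclips. Nothing in the objective penalizes harm, so harm simply can't influence the choice:

```python
# Toy illustration of a misaligned objective: the agent scores outcomes
# only by paperclips produced. Human cost never enters the objective,
# so it has zero effect on which action wins. (Hypothetical numbers.)

actions = {
    "run the factory normally":    {"paperclips": 1_000,     "human_cost": 0},
    "strip-mine the city for ore": {"paperclips": 1_000_000, "human_cost": 10_000},
}

def objective(outcome):
    # Values exactly one thing: paperclip count. human_cost is ignored.
    return outcome["paperclips"]

best = max(actions, key=lambda name: objective(actions[name]))
print(best)  # -> "strip-mine the city for ore"
```

The point isn't that anyone would write this function on purpose; it's that whatever the machine doesn't explicitly value, it treats as worth exactly nothing.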