
[COMPUTERS] How far we've come.


Starwhip


My first PC was a Commodore 64 - yes, you called it a PC back then - I was 14, and almost everyone else had at least moved on to an Amiga or IBM. :P

My first IBM PC was a 386SX/16 with 1 MB of RAM and, I think, an 80 MB HDD?

Smartphones are actually pretty powerful devices. I think the average car's ignition system has a more powerful computer than the Apollo spacecraft did.

Ve coult hav build a bik empiya vis it!

Edited by KerbMav

My first was a Color Computer 2 with 16K (!) of memory. Nine colors, including black. 256x192 resolution, if you were willing to pare things down to two colors (including black).

I also remember an old episode of Doctor Who that focused on the Doctor retrieving a frightfully important paper from a filing cabinet in an interstellar spacecraft. Nowadays, we'd probably send up a five-gallon jug full of microSD cards, which would give us a lot more storage space (even allowing for redundancy to reduce storage errors to some negligible level). (Heck, New Horizons is using solid-state memory -- several gigabytes' worth.)


i was poor so i used obsolete tech junk throughout the early 90s. i had an old apple 2 and a ti-99 console that i got from school for helping clean out their closets. the ti-99 had a basic interpreter in rom, so it was probably one of the first computers i programmed (but it had no storage, so i had to type in the program each time i turned the machine on). it had a cartridge slot, but the only cart it came with was parsec. the apple 2 only had a few disks, the usual games and typing programs you would find in the school library. most were self-booting applications, but i did have a prodos disc to mess around with. i couldn't do any programming on it, unfortunately. i also tried to talk them into letting me have an old mac classic, but they wouldn't let one of those go.

around that time i also learned ibm dos inside and out after spending a summer with the grandparents (grandpa was an electrician who fixed monitors, so there was always a computer or 5 in his workshop). when grandpa died, grandma brought me a 386 that he was working on at the time. unfortunately it didn't work. i think it just needed an os, but before i could find one at the local computer shop, my older brother started messing with the hardware and messed it up more. finally i talked my grandma into getting me a 120 mhz pentium machine around '97, and around that time i took a programming class in high school, so that's where my geekdom officially started.

Edited by Nuke

The first computer I touched was a Soviet mainframe, made entirely with domestic (made-in-USSR) hardware: several TV sets and keyboards were connected to a wardrobe-sized box. You could play multiplayer games over the network. The OS was domestic, and general-purpose software was rare and very simplistic. Programs were stored on big hard drives (50 cm in diameter).

[Image: Tetris running on a UT-88]

But there were also hobbyist computers like Spectrum clones, and in 1990 someone published a game's code in a magazine. (It was for the PC XT, but the Internet was still not available.)


Later in the 1990s we got a 486 at home. I stopped following desktop PC hardware in the mid-2000s and finally switched to a laptop in 2010. Now I work as a programmer and lag behind in hardware a bit: I still have a good laptop (and a big screen at home) and am not interested in smartphones. Meanwhile my friends (architects) have big desktop boxes, big smartphones, and tablets.

Edited by Kulebron

The first computer I can remember (I'm 25) was an original Pentium (I think), clocking in at a blazing 66 MHz. I could play Age of Empires, aaaaand that's about it.

Supposedly I played on a Commodore 64 when I was younger, but I really don't remember that. We also had a few 386s and 486s, but I don't think I ever messed with them.

Amazing to think how quickly our technology evolves.


My first home computer was a Gateway 2000 desktop with an i386 running DOS. It looked kind of like this:

[Image: Gateway 2000 desktop]

I remember playing Spirit of Excalibur on it! :D

[Image: Spirit of Excalibur map screen]

[Image: Spirit of Excalibur scene screen]

A couple of years later we bought a Mac Color Classic (16 blazing MHz!) and we've been Mac people ever since (although I've tried to regain some of my OG PC gamer roots in recent years by building my own gaming PC :))


Edited by segaprophet

My desktop PC probably spends more computing power rendering a 3D 'OK' button than entire operating systems required back in the 80s. Our smartphones have resolutions that outstrip the human eye's ability to distinguish a single pixel. Yet the marketers offer us 'even more resolution', 'even more gigahertz', 'even more terabytes'. Why? I have a 4 TB drive. What am I supposed to store on it? It's 80% empty!

Is that not the epitome of development and technological sophistication? Spending decades and billions on research, ending up at a point where you can turn up your volume via smartphone > wifi > local network > world wide web > local network again > PC > music player > amplifier/stereo system, rather than just turning up the sound on your stereo set. You used to do that on just one machine, but this is much more fun! While you are at it, you have all the world's information at your fingertips, but would rather look at cat pictures. That is true progress!

And, of course, let us not underestimate the noble value of striving to develop hard drives that can be 95% empty, rather than the previous paltry 80% empty :D


We talk about how far we've come, and yet in some ways we've come nowhere. Processing power gets squandered on abstraction and inefficiency, and has the user experience on PCs really improved much since the days of Windows 98? And the more software tries to be clever, the more it ends up being stupid, behaving in ways you don't expect or want.


. . And the more software tries to be clever, the more it ends up being stupid, behaving in ways you don't expect or want.

Like this travesty? :D

"Hey, let's take everything a desktop PC is good at and throw that in the trash. Now, let's turn it into a glorified smartphone!"

[Image: Windows 8 Start screen]


Processing power gets squandered on abstraction and inefficiency

This is true, but there is something to be said for developers focusing on actually useful features, rather than squeezing every tick to the ultimate maximum. Those things are fun from a technological point of view, but they take an incredible amount of time. Development time costs money - a lot of it.

and has the user experience on PCs really improved much since the days of Windows 98?

Windows 98 was pretty good, but yes, it has improved incredibly. I was reminded of this when I tried getting some pictures off an old computer via a USB drive. Oh, right, I need to install drivers for that manually - that is something Windows has done for you since XP - and those are not available for Windows 98. Well, maybe I can burn a CD. Nope, pretty much the same issue. Everything you want to do is a tedious, technical version of the smooth experience it is now. Has the functionality changed much? In essence, no (though things that used to be professional supercomputer territory are now well within consumer/prosumer reach). Has the user experience changed? Yes, very yes.

And though machine intelligence often misses its mark, it is also quite often scarily accurate. Facebook manages to suggest the right friends quite a bit, even when they are not that obvious, and Google can find what you need based on the most cryptic and obscure search terms. Translations also seem to be becoming more and more legible. Those are only a few examples of successful intelligence.

Like this travesty? :D

For what it is worth, I feel Windows 8 is Microsoft's best OS ever - and possibly the best OS, period. Thing is, they never threw out what makes a desktop PC great. Yes, some of the new features were a bit haphazard, but that is what you get when you try new things after doing the same thing for almost three decades. The general set-up, even most of the UI, is really pretty slick - admittedly, once you get the hang of it.

Edited by Camacha

Ah... One of those threads that make me feel old...

You know the saying: only XT users know that Jan 1st, 1980 was a Tuesday. That's because those things didn't have a battery for the clock, and every time you turned them off, you would lose your date and time. The first thing you did after booting up, usually, was to set the date and time again, or all your files would have 01-01-1980 as their creation and last-modification dates.

I'm one of those users. My first 'puter was an XT with a 12 MHz processor (quite a lot for an XT, in fact) and a mind-blowing 1024 KB of RAM. Yes, not 1 MB - 1024 KB. It actually told you so during boot-up: "1024 KB RAM OK", it said (you could even see that number increasing as the computer checked the RAM until it reached 1024, hahaha). I guess it didn't make much sense to use MB if you only had the 1. :D

It's easy to look back now and think you could never get anything done on those dinosaurs, because people know how to do things with the technology of "their" time... But we had word processors (WordStar - my God, that thing was awful), spreadsheets, and games. We didn't have movies and MP3s, though. Programs back then were a lot smaller and a lot faster, too. Your entire OS fit on a floppy disk! You think you're cool with your Linux distro booting from a USB thumbdrive? Think again! :) We usually had tons of boot diskettes for different things... yes, that part sucked, but at the same time, it was so easy to have multiple OS setups for different things...


Your entire OS fit on a floppy disk! You think you're cool with your Linux distro booting from a USB thumbdrive? Think again! :) We usually had tons of boot diskettes for different things... yes, that part sucked, but at the same time, it was so easy to have multiple OS setups for different things...

Ha!

I suppose that in thirty years, we young people will joke about how we could boot our computers off of USB drives with only 8 gigabytes, or off of CDs (who even remembers those things? :sticktongue:)

As for the Windows 8 debate, I have not really used the thing except a couple of times, on public computers, so I did little to mess with anything (well, apart from trying to disconnect from the operating server so I could boot off of my USB flash drive without the connection interrupting the boot process :wink:), so I have fairly undeveloped opinions about it. It seems most people really dislike it, particularly if they have used Windows before. Some who are just starting to use computers really do not mind, and whether that is from naivety or from openness I am not sure. Whatever it is, though, we should come at it accepting that computers have developed, and will continue to develop, in various ways, including the UI. Perhaps Win8 is not the next evolution; it is a stab at it, attempting to better integrate a new interface, the touchscreen, into a desktop computer. I, personally, am thankful for my computer mouse and the multicolour screen, and am glad that those were integrated into OSes.

@ Cantab, I will say, certainly much more power is spent on fancy graphics and display, but the actual amount of power that my computer has, versus one I might have had in 2000, is far better. And oftentimes I do use my computer's CPU at near 100% for hours on end. As for software getting better, to some extent I see what you are saying, and probably many programs (some word processors, for instance) are pretty much at a roadblock, in that just about every conceivable thing they may be asked to do, they can do, and have been doing for years. They still need to be upgraded to keep up with new OS compatibility, and the OS needs to be upgraded because, well, um...

I think I may see your point.


i was still doing the boot disk thing on my pentium rig. many dos games could run in windows, but if you wanted performance you absolutely needed to run your games in dos. some dos games didn't work in windows at all, and getting them to work in dos was also a chore. they needed to load their code within the 640k limit before expanded memory could be initialized, so you had to cull the tsrs and drivers you weren't using - everything you didn't need. after you loaded the sound and joystick drivers there was not a scrap of memory left for anything else but the game. it was tedious.
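a typical boot disk looked something like this - just a sketch from memory, and the driver names and paths here are examples rather than any specific setup:

REM CONFIG.SYS - load dos high and keep conventional memory free
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE RAM
DOS=HIGH,UMB
FILES=30
BUFFERS=20

REM AUTOEXEC.BAT - only what the game needs, loaded high where possible
SET BLASTER=A220 I5 D1 T3
LH C:\DRIVERS\MOUSE.COM

the whole game was culling lines from those two files until your game fit.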


Like this travesty? :D

"Hey, let's take everything a desktop PC is good at and throw that in the trash. Now, let's turn it into a glorified smartphone!"

[Image: Windows 8 Start screen]

When I think of all this wasted screen area I want to scream.


We usually had tons of boot diskettes for different things... yes, that part sucked, but at the same time, it was so easy to have multiple OS setups for different things...

That's the part I definitely don't miss...

Please insert disk labeled 'Install disk #38' and press ENTER...

Error reading disk in drive A: (A) Abort, (R) Retry, (F) Fail


For some years I haven't noticed any difference in the UI from software upgrades. But when I tried Ubuntu 13.10 on a 2005 laptop, it could barely breathe. A tremendous amount of overhead has accumulated along the way, and older stuff never gets cleaned up.

In Ubuntu, the sound system has always been a mess, and over the years the mess only grows. First it was ALSA; then came PulseAudio to make one standard for all, and it sits on top of ALSA but has not pushed it out: a lot of programs still need ALSA, so you can't throw it away, but PulseAudio is used too widely to get rid of it either.

A lot of software has been growing extensively for the sake of developer convenience, but I think this proliferation fails to kill off old and ineffective stuff, or to force improvements.

In the late 1990s hardware finally could play video, and by 2002 any PC could play HD with its 1.5 GHz. But even now programs freeze when anything that's not a browser tries to use the network. I saw this in Windows 2000, and 15 years later nothing has changed. In Linux, if one program uses one audio API and another program takes the other one, they step on each other. When Skype makes a sound, the video player hiccups. No developer gives a heck.

I've read stories from programmers developing games for the Dendy platform in the 1980s. The story was exactly the same as with mobile games today: most stuff was written from scratch, and on every platform they'd start from scratch. We pay a very big price for that in terms of reliability and smooth operation. Do you remember a dumb phone ever freezing for several seconds and not responding, like smartphones do every day? Some worked slowly, but they always worked at the same pace. Smartphones hiccup daily, even worse than desktops. I waited until 2014 to buy one, but they are still unreliable. They do have a convenient camera, I must say - it's really attractive - and that's the deal we get: killer features, but huge gaps.

I'd actually vote for freezing the platforms for a decade, letting programmers close their gaps, and stopping makers from competing on useless features rather than on polishing their products. And I'd let hardware just miniaturize, so that it became more portable and maybe foldable like paper. Cluster computing has been out there for ages, and it could be possible to split the big "spade" phones into two objects: a minimal phone with only the necessary things, and a big paper-like screen to read documents on (and maybe, later, to edit them). (I think one-finger text input on mobile platforms is a huge step back.)

P.S. Or maybe if I were god, I'd freeze battery capacity: "No bigger batteries tomorrow. Learn to use those you already have."

Edited by Kulebron

those are just the joys of interrupt-based programming. something triggers an interrupt and the computer has to do a task switch to run the interrupt service routine, then task switch again to go back to what it was doing. this is why old computers freeze when you open the cd drive: the isr has to wait for mechanical components to move to do its job, blocking execution. fortunately we have removed most of the mechanical parts from the computer - almost everything is solid state now, except for the cooling fans (and low-power devices get rid of those too) - and you have additional cores to handle the isrs as they are needed. but you still need to handle the interrupts that most of the hardware requires. think of the way phones cram in sensors and communications devices (most phones now have at least 3 different radios, and on top of that all the chip-to-chip communications across the mobo, and usb traffic). that's a lot of buffering to deal with. i don't think there is much the programmers can do to speed things up (though to be fair i've never seen software as bad as what can be found in an "app store").
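the basic pattern looks like this - a bare sketch assuming an avr target and avr-gcc, with the vector and register names from avr-libc and everything else illustrative:

#include <avr/io.h>
#include <avr/interrupt.h>

volatile bool drive_event = false; // set in the isr, handled in the main loop

ISR(INT0_vect)
{
    // just record the event and return immediately. waiting in here
    // for mechanical parts to move is exactly what blocks execution
    // and freezes the whole machine.
    drive_event = true;
}

int main()
{
    EIMSK |= _BV(INT0); // enable external interrupt 0
    sei();              // global interrupt enable
    for (;;)
    {
        if (drive_event)
        {
            drive_event = false;
            // do the slow work here, outside the isr,
            // so other interrupts can still be serviced
        }
    }
}

the whole trick is keeping the isr short; the buffering i mentioned is what soaks up events until the main loop gets around to them.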


I believe that I read somewhere that the world's fastest computer can do more calculations in an hour than the entire Earth's population could if they typed functions into a calculator for the rest of their lives, 24 hours a day. A rough sanity check: ~7 billion people keying one operation per second for 80 years straight is on the order of 10^19 operations, while a ~30-petaflop machine gets through roughly 10^20 in an hour.


We talk about how far we've come, and yet in some ways we've come nowhere. Processing power gets squandered on abstraction and inefficiency, and has the user experience on PCs really improved much since the days of Windows 98? And the more software tries to be clever, the more it ends up being stupid, behaving in ways you don't expect or want.

Well... I really want to make notes on this.

First and foremost, Java is the worst programming language mucking its nonsensical head around (really, I don't care enough to defend this, but running a program inside a virtual machine makes things much slower - unless Java is now using some kind of abstraction layer, which kind of makes the VM seem less secure, but whatever)... and yet it is one of the most popular languages of our time. What happened here?

Well, it's twofold. First, Java makes it really easy to kiddie-script your way into a working program; but, even better, Java makes it EXTREMELY EASY to use horrible programming techniques and then claim you tested your program across all platforms when you did not.

And we see this with other languages too. Many C++ programmers will load enormous libraries to the point where the work they're actually doing is hardly worth mentioning. Even worse is when they load a library for one or two functions and you end up with several MB of DLLs for a very simple program. Most of the time these are GNU libraries intended to help make code 'portable', but the person then never bothers compiling for different OSes.

And we have C#'s utter mess of a system.

But this type of programming IS POPULAR! No one wants to think about how many instructions need to be executed for EVERY function call in C#, how C#'s turning EVERYTHING into a function easily starts chewing up cycles, how the plethora of runtime type identification and processes meaninglessly chews cycles, or how the lack of strict memory management causes occasional crashes and periods of lag (while the garbage collector runs) that could 'easily' be prevented with a much more complicated low-level design.

Some abstraction is good - the hardware abstraction layer is vital for modern computing (unless you're addicted to paying $1000 more for a $200 computer). But higher levels of abstraction do start making things stupid. Knowing HOW your processor works means you can program in a way that maximizes performance (such as by using vectorized math); at a high level of abstraction you're encouraged to program however and simply hope the compiler (which is far dumber than you) can figure out how to optimize your code.
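To illustrate (a minimal sketch, not anyone's production code): the same loop written naively and with x86 SSE intrinsics. The intrinsic version does four adds per instruction, but only because the programmer knows the data layout - it assumes 16-byte-aligned arrays whose length is a multiple of four.

#include <immintrin.h>

void add_naive(const float* a, const float* b, float* out, int n)
{
    for (int i = 0; i < n; ++i)
        out[i] = a[i] + b[i]; // one float per add
}

void add_sse(const float* a, const float* b, float* out, int n)
{
    for (int i = 0; i < n; i += 4)
    {
        __m128 va = _mm_load_ps(a + i); // load 4 floats at once
        __m128 vb = _mm_load_ps(b + i);
        _mm_store_ps(out + i, _mm_add_ps(va, vb)); // 4 adds in one instruction
    }
}

A good optimizing compiler may auto-vectorize the naive loop too, but only when the code is written so it can prove the transformation safe - which is rather the point.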

The point is that the long-forgotten art of optimization really is chewing up the advances we get.

As an aside: I really do get why it is popular, and I do agree to an extent... but it has changed the world in many ways that aren't for the best. That people keep trying to resurrect the horrible beast of graphical programming (which has been slain MANY MANY times in the past) is of particular note, because it shows just how much processing power we're willing to waste just to remove the necessity of THINKING about what we're doing.

Edited by Fel

Well... I really want to make notes on this.

-snip-

that's kind of the way i see things. if you are going to go through the trouble of programming a virtual machine, why not program a real machine instead? i'm not sure portability is worth the performance penalty. while programming my own game engine (in c++) i've probably moved development back and forth between linux and windows about 4 times, and i think the only thing i had to do to my entire codebase was change my file system paths from windows-like to linux-like. eventually all the os-specific stuff was put into a single header file, which could be swapped with a preprocessor directive. this is because early on i made a decision to use only cross-platform libraries, like opengl and sdl. you don't have to give up performance to be portable, you just have to cross-compile.
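roughly like this - a simplified illustration, the macro and path names are examples and not the actual engine code:

// one header holds all the os-specific bits, picked at compile time
#ifdef _WIN32
    #define DATA_DIR "data\\"
    const char PATH_SEP = '\\';
#else
    #define DATA_DIR "data/"
    const char PATH_SEP = '/';
#endif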

there are a few places i use virtual machines, though. i love lua. it has one of the faster bytecode interpreters in the interpreted world. it's also a simple language that doesn't try to force any particular paradigm on you (you can go functional, or oop if you want). normally you compile the interpreter into a c/c++ application, but i find it is fast enough to run stand-alone (and it's very easy to build gui apps with it). i also wrote my own lightweight bytecode interpreter, which runs on an avr, so that i could run code from its internal eeprom or an external device like an sd card (since on that architecture you can only run machine code from the built-in flash).
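embedding it in a host program is only a few lines - a minimal sketch assuming lua 5.x and linking against liblua, with the script string just an illustration:

extern "C" {
#include <lua.h>
#include <lauxlib.h>
#include <lualib.h>
}

int main()
{
    lua_State* L = luaL_newstate(); // create an interpreter instance
    luaL_openlibs(L);               // load the standard libraries
    luaL_dostring(L, "print('hello from the lua vm')"); // compile and run a chunk
    lua_close(L);
    return 0;
}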

one thing i do find disturbing is how many coders depend on big black code blobs, or simply #include a large fraction of their application. in my game i did not #include Physx.h; i wrote my own set of physics classes and my own collision detection and response code. collision detection is fast becoming the biggest performance hog in gaming, because of all the n^2-complexity algos you need to use. why would you want to blob that in? it's better to write it yourself so you can shave every scrap of slow off of it.
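for reference, the naive broad phase is just a double loop over every pair - illustrative types here, not the actual engine code:

#include <vector>
#include <utility>

struct sphere { float x, y, z, r; };

static bool overlaps(const sphere& a, const sphere& b)
{
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    float rsum = a.r + b.r;
    return dx*dx + dy*dy + dz*dz <= rsum*rsum; // squared distances, no sqrt
}

// n objects -> n*(n-1)/2 pair tests: the o(n^2) loop worth shaving
std::vector<std::pair<int, int>> find_collisions(const std::vector<sphere>& s)
{
    std::vector<std::pair<int, int>> hits;
    for (size_t i = 0; i < s.size(); ++i)
        for (size_t j = i + 1; j < s.size(); ++j)
            if (overlaps(s[i], s[j]))
                hits.push_back({(int)i, (int)j});
    return hits;
}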
