How does software "work"?



@5thHorseman That's not a computer, though. What was shown in the video is more akin to re-arranging transistors on a breadboard than programming a computer, since the machine's hardware is almost completely changed between runs. Now, sufficient transistors on sufficient breadboards can build a computer, and I'm pretty sure the same could be said of this thing (although I have neither the time nor the inclination to actually prove it), but the distinction is important. The thing is, simple systems that could be used to build a computer are all over the place (for the canonical example, I recommend looking into Conway's Game of Life). Making something that could be used to build a computer is very different from actually building a computer.


  On 2/24/2019 at 8:03 PM, IncongruousGoat said:

@5thHorseman That's not a computer, though. What was shown in the video is more akin to re-arranging transistors on a breadboard than programming a computer, since the machine's hardware is almost completely changed between runs. Now, sufficient transistors on sufficient breadboards can build a computer, and I'm pretty sure the same could be said of this thing (although I have neither the time nor the inclination to actually prove it), but the distinction is important. The thing is, simple systems that could be used to build a computer are all over the place (for the canonical example, I recommend looking into Conway's Game of Life). Making something that could be used to build a computer is very different from actually building a computer.


The OP was not asking how to write C#. Or even assembly. The OP wanted to know how computers know to run code the way they run code. Understanding that it is all - at the core - little switches flipping based on how they were set up before you start running power through them is critical to the whole thing. Also, it's the only one offered so far that doesn't involve using an already-made computer to explain how to create a computer.
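As a toy sketch of that "little switches" idea (my own, not anything from the video): model a NAND gate as two switches in series, build the other gates out of NAND, and you already have a half adder in a few lines of C.

/* Toy sketch: "little switches" in software. nand_g() stands in for two
 * transistors in series; every other gate is built only from it, and the
 * main loop prints the truth table of a half adder (sum + carry). */
#include <stdio.h>

static int nand_g(int a, int b) { return !(a && b); }
static int not_g(int a)         { return nand_g(a, a); }
static int and_g(int a, int b)  { return not_g(nand_g(a, b)); }
static int or_g(int a, int b)   { return nand_g(not_g(a), not_g(b)); }
static int xor_g(int a, int b)  { return or_g(and_g(a, not_g(b)), and_g(not_g(a), b)); }

int main(void)
{
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++)
            printf("%d + %d -> sum %d, carry %d\n", a, b, xor_g(a, b), and_g(a, b));
    return 0;
}

Of course the point of the video is that the "program" lives in how the switches are wired up, not in a C file, but the logic is the same.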


  On 2/24/2019 at 9:21 PM, 5thHorseman said:

The OP was not asking how to write c#. Or even assembly. The OP wanted to know how computers know to run code the way they run code. Understanding that it is all - at the core - little switches flipping based on how they were set up before you start running power through them - is critical to the whole thing. Also, it's the only one offered so far that doesn't involve using an already made computer to explain how to create a computer.


I found that video intriguing as to how it could process very simple things. When I get a chance I'm going to look into shift registers more.


  On 2/24/2019 at 9:21 PM, 5thHorseman said:

The OP wanted to know how computers know to run code the way they run code. Understanding that it is all - at the core - little switches flipping based on how they were set up before you start running power through them - is critical to the whole thing.


Sort of, but from the way the discussion is going, I get the impression that the OP is looking for a comprehensive answer to a question whose comprehensive answer is very complicated. The video you provided is a good illustration of how physical systems can perform computations, but it says very little about actual computers, and it's easy to get bogged down trying to reconcile something like a simple digital adder or a shift register with the complexity of what a computer does if one doesn't realize this. I'm just pointing this out, for their sake.


  On 2/24/2019 at 9:38 PM, IncongruousGoat said:

Sort of, but from the way the discussion is going, I get the impression that the OP is looking for a comprehensive answer to a question whose comprehensive answer is very complicated. The video you provided is a good illustration of how physical systems can perform computations, but it says very little about actual computers, and it's easy to get bogged down trying to reconcile something like a simple digital adder or a shift register with the complexity of what a computer does if one doesn't realize this. I'm just pointing this out, for their sake.


That's fair. I think it's fair to say the modern computer is one of the most complicated things that we as humans actually are capable of understanding in its entirety.


  On 2/24/2019 at 10:18 PM, 5thHorseman said:

That's fair. I think it's fair to say the modern computer is one of the most complicated things that we as humans actually are capable of understanding in its entirety.


True, but it's mind-boggling to comprehend how a modern system does what it does. It's a long, tortuous path from abaci to Babbage's Difference Engine, through ENIAC to a modern smartphone. Although I'd hazard the only real difference between an Apple II from the 80's and a smartphone is size and computing power; the basic architecture of memory, CPU, I/O, network, and storage is basically the same.


  On 2/24/2019 at 10:35 PM, StrandedonEarth said:

True, but it's mind-boggling to comprehend how a modern system does what it does. It's a long, tortuous path from abaci to Babbage's Difference Engine, through ENIAC to a modern smartphone. Although I'd hazard the only real difference between an Apple II from the 80's and a smartphone is size and computing power; the basic architecture of memory, CPU, I/O, network, and storage is basically the same.


I think so too, though I am not a specialist. We have come a long way since the Apple ][. Frequency, miniaturisation, floating-point arithmetic, heavy parallel and superscalar processing, pipelining, etc. haven't made the entry level any easier. Back in the day a single programming language was all one had; now there's networking, SQL databases, object-oriented programming, scripting, a whole stack of APIs with their specialties, graphics programming languages, ...

Phew ... :-)


  On 2/24/2019 at 10:23 PM, Cheif Operations Director said:

Tell me about it


It's some 100 billion transistors, and 99% of them are there to speed things up.
Large memory pools and lots of cache, lots of parallel processing; a huge part of it is about moving data into cache before it's needed, and things like that.

You can make a very simple computer with 5,000 transistors, though it's probably better to go up to 20K, as the simplest designs use loads of microcode, which is just confusing.
At university we assembled a computer from parts on a breadboard: an adder, registers, a stack, and memory.
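If it helps, here's a rough software sketch of what that breadboard adder does, assuming a plain ripple-carry design (one full adder per bit, carry chained into the next stage):

/* Sketch of a ripple-carry adder, assuming the classic textbook design:
 * a 1-bit full adder per bit position, with the carry fed to the next stage. */
#include <stdio.h>

static void full_adder(int a, int b, int cin, int *sum, int *cout)
{
    *sum  = a ^ b ^ cin;                    /* sum bit of this stage   */
    *cout = (a & b) | (cin & (a ^ b));      /* carry out of this stage */
}

static unsigned ripple_add8(unsigned a, unsigned b)
{
    unsigned result = 0;
    int carry = 0;
    for (int i = 0; i < 8; i++) {           /* eight chained full adders */
        int s, c;
        full_adder((a >> i) & 1, (b >> i) & 1, carry, &s, &c);
        result |= (unsigned)s << i;
        carry = c;
    }
    return result & 0xFFu;
}

int main(void)
{
    printf("200 + 55 = %u\n", ripple_add8(200, 55));   /* 255 */
    return 0;
}

On the breadboard each of those lines is a handful of gates and the carry chain is literal wire, which is why the transistor count climbs so fast once you put registers, a stack, and memory around it.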

 


  On 2/21/2019 at 5:14 AM, LordFerret said:

The binary system has been a thing for literally thousands of years.
https://en.wikipedia.org/wiki/Binary_number

If you search the web for first computer you'll get a lot of different answers.  I'll tell you to consider the Abacus.


I believe Claude Shannon wrote his master's thesis on binary being the ideal base for building a computer.  While I highly recommend a deeper dive into Claude Shannon's work (especially if you really want to know what a bit really means), note that in much of his work "coding" meant "a means of transmitting data in such a way that the receiver can recover it with minimal or zero error or loss."  Since he worked at Bell Labs, this makes sense: he basically created the whole theoretical basis for moving communications from analog to digital, and he did it in the 1940s (he had to; Bell Labs gear was expected to last 20 years, so he had to start early so the engineers could start building things a few decades later, all to have a digital transition in the 1970s-1980s).
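(Side note, my paraphrase rather than Shannon's wording: the information in an outcome of probability $p$ is $-\log_2 p$ bits, and the average over a source is the entropy

H(X) = -\sum_i p_i \log_2 p_i

so a fair coin flip carries exactly one bit while a heavily biased one carries less; that's the sense in which a "bit" is a unit of information rather than just a 0/1 digit.)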

  On 2/21/2019 at 4:54 PM, Nuke said:

so here is your 3-bit instruction set.

INC    --increment tape by one cell
DEC   --decrement tape by one cell
LD      --load the tape value to the register
ST      --store the register value on the tape
STR   --set the register to one
CLR   --set the register to zero
CMP  --compare/and the register value with the tape value and write the output to the register
OR     --bitwise or the values on the tape and register and write the output to the register

with that you can determine the bitwise not of the value in on the tape:


PDP-7 instruction set?  OK, it had a memory and not a tape.  But it looked like that.  I heard Woz mention that DEC was selling those computers in the early 1970s for "the lowest price ever for a computer".  Unfortunately for the young Woz (probably not in college yet), it still cost more than his father's house.

Once you understand how digital logic works and what an Instruction Set (assembler code) is, make sure you take some time to learn how microcode worked.  Microcode has been more or less obsolete since about 1990 (not sure they still teach it in electrical engineering school: at least one professor insisted that one computer designer swore "he'd never make a computer without microcode again" after learning it, but before I graduated it was essentially obsolete), but it's an amazing concept.  Basically it is a bunch of software that turns a pile of digital logic (and not much logic at that) into a computer.

If $7.00 for a computer game significantly more niche than KSP isn't a problem for you, I suspect that TIS-100 is probably the "CPU emulator" that is easily the most fun to play around in.  You might not want to play all that long, but it is meant to be an engaging CPU and set of problems.  If not, I'm sure there are ARM, 6502, and various types of emulators out there.  I'd probably recommend starting with the AVR 8-bit "RISC" (microcontroller chip) if you want to play with "the real thing", but I don't know of any emulators (especially free ones) off the top of my head (I wrote mostly 8086 and 8085 assembler, with a smattering of 6502.  I'd avoid x86 assembly like the plague).


  On 2/25/2019 at 12:31 AM, wumpus said:

PDP-7 instruction set?  Ok, it had a memory and not a tape.  But it looked like that.  I heard Woz mention that Dec was selling those computers in the early 1970s for "the lowest price ever for a computer".  Unfortunately for the young Woz (probably not in college yet), it still cost more than his father's house.

Once you understand how digital logic work and what an Instruction Set (assembler code) is, make sure you take some time to learn how microcode worked.  Microcode has been more or less obsolete since about 1990 (not sure they still teach it in electrical engineering school: at least one professor insisted that one computer designer swore "he'd never make a computer without microcode again" after learning it, but before I graduated it was essentially obsolete), but an amazing concept.  Basically it is a bunch of software that turns a pile of digital logic (and not much logic at that) into a computer.

If $7.00 for a computer game significantly more niche than KSP isn't a problem for you, I suspect that TIS-100 is probably the "CPU emulator" that is easily the most fun to play around in.  You might not want to play all that long, but it is meant to be an engaging CPU and set of problems.  If not, I'm sure there are ARM, 6502, and various types of emulators out there.  I'd probably recommend starting with the AVR 8-bit "RISC" (microcontroller chip) if you want to play with "the real thing", but I don't know of any emulators (especially free ones) off the top of my head (I wrote mostly 8086 and 8085 assembler, with a smattering of 6502.  I'd avoid x86 assembly like the plague).


Nah, just something I made up with what little I know about asm. A lot of those are common mnemonics used in a lot of flavors of asm. There's no data field, as none of the instructions in the very minimalistic computer I described can use one. You also aren't going to find a lot of architectures out there with 3-bit instructions and a one-bit memory bus.


All high-level programming languages are just macroassemblers.

The assembler is just a mnemonic notation of machine codes.

Machine codes represent the CPU registers literally.

CPU registers (logically) are the sets of output pins of corresponding transistors.


  On 2/25/2019 at 5:54 AM, kerbiloid said:

All high-level programming languages are just macroassemblers.

The assembler is just a mnemonic notation of machine codes.

Machine codes represent the CPU registers literally.

CPU registers (logically) are the sets of output pins of corresponding transistors.


Many high level programming languages are interpreters, making it quite a stretch to call them macroassemblers.  C looks a lot more like a macroassembler, especially if you are familiar with PDP-11 addressing modes (the computer K&R were using when they designed it).

"The assembler is just a mnemonic notation of machine codes." Ok, that is correct.

"Machine codes represent the CPU registers literally." Generally true for all computers programmed in assembler (currently and historically).  Don't expect modern computers (especially Intel and AMD beasts, but Out-of-Order ARM chips can get pretty weird themselves).

"CPU registers (logically) are the sets of output pins of corresponding transistors." Latches, not transistors (a latch needs at least two gates, and gates need a few transistors).  And again, in anything out of order (i.e. remotely high performance), expect all registers to be renamed from some group of available registers.  What you get in a multimillion transistor core looks nothing like a textbook CPU.

If you want to know how a computer *can* work, look at AVR microcontrollers (textbooks may use ancient systems).  To learn how computers work *now*, you need the basics plus a long trip down the rabbit hole of Moore's law and finding new uses for all those transistors.  Also don't expect that all the good tricks are published.

https://www.amazon.com/Computer-Architecture-Quantitative-John-Hennessy/dp/012383872X

This was the gold standard a decade or two ago, and likely still holds up.  Of course it is for senior level engineering undergrads and/or grad students (and priced accordingly, although earlier editions are unlikely to be all that obsolete).

 

 


  On 2/25/2019 at 2:49 PM, wumpus said:

Many high level programming languages are interpreters, making it quite a stretch to call them macroassemblers


Interpreters, together with their runtimes.
A runtime still converts high-level definitions into machine codes, and an interpreter is a program for that runtime which parses another program.
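A minimal sketch of that in C (my own toy, not any real language): the "program" is just a string of data, and the interpreter decides what each byte means.

/* Minimal toy interpreter: the program being run is ordinary data, and
 * this loop gives each byte its meaning. Commands: '+' add 1, '-' subtract 1,
 * '*' double the accumulator, 'p' print it. Everything else is ignored. */
#include <stdio.h>

static void interpret(const char *program)
{
    long acc = 0;
    for (const char *pc = program; *pc != '\0'; pc++) {
        switch (*pc) {
        case '+': acc += 1; break;
        case '-': acc -= 1; break;
        case '*': acc *= 2; break;
        case 'p': printf("%ld\n", acc); break;
        default:  break;
        }
    }
}

int main(void)
{
    interpret("+++*p");   /* (0 + 1 + 1 + 1) * 2 = 6 */
    return 0;
}

In this toy the only machine code involved is the compiled interpreter itself; the "+++*p" program is executed by it rather than translated.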

  On 2/25/2019 at 2:49 PM, wumpus said:

C looks like a lot more like a macroassembler,


C is almost a bare one.


  On 2/25/2019 at 2:49 PM, wumpus said:

Many high level programming languages are interpreters, making it quite a stretch to call them macroassemblers.  C looks like a lot more like a macroassembler, especially if you are familiar with PDP-11 addressing modes (the computer K&R were using when they designed it).

"The assembler is just a mnemonic notation of machine codes." Ok, that is correct.

"Machine codes represent the CPU registers literally." Generally true for all computers programmed in assembler (currently and historically).  Don't expect modern computers (especially Intel and AMD beasts, but Out-of-Order ARM chips can get pretty weird themselves).

"CPU registers (logically) are the sets of output pins of corresponding transistors." Latches, not transistors (a latch needs at least two gates, and gates need a few transistors).  And again, in anything out of order (i.e. remotely high performance), expect all registers to be renamed from some group of available registers.  What you get in a multimillion transistor core looks nothing like a textbook CPU.

If you want to know "how a computer *can* work, look at AVR microcontrollers (textbooks may use ancient systems).  To learn how computers work *now*, you need the basics plus a long trip down the rabbit hole of Moore's law and finding new uses for all those transistors.  Also don't expect that all the good tricks are published.

https://www.amazon.com/Computer-Architecture-Quantitative-John-Hennessy/dp/012383872X

This was the gold standard a decade or two ago, and likely still holds up.  Of course it is for senior level engineering undergrads and/or grad students (and priced accordingly, although earlier editions are unlikely to be all that obsolete).

 

 


I've used some AVR asm in some embedded projects; it's useful when you need to prune a tight loop or to fix timing bugs. The ATmegas only have like 130 or so instructions, so it's a pretty lightweight instruction set to learn. x86 is a beast in comparison.

 

Lately I've been thinking about what kind of hardware I'd need to make my theory-crafted CPU. A Turing machine is just a memory with relative addressing. I've got some 8-bit SRAMs somewhere; I stick a mux/demux on the data bus, the select inputs become part of the memory address, turning it from a 32K×8 RAM into a 256K×1 RAM. Since all addressing is relative, you need a bi-directional binary counter with enough output pins for the address; since it's a 32K SRAM I would need 15+3 address pins total. Simply setting the direction and cycling the counter covers the INC/DEC commands. The LD/ST commands would control the memory read/write logic and the register (which I think could just be a flip-flop). LD would require sending a read command to the chip and writing a bit to the flip-flop; ST would take the output of the flip-flop and feed it back into the memory with a write command. STR/CLR just requires putting a high/low on the flip-flop input. CMP and OR can be handled with simple gates.

Putting the program on the SRAM would be somewhat problematic and would probably double the complexity of the computer. Doing what I suggested in my theory-crafted CPU would also be really slow, and it might be better to use a separate memory for the program, with its own counter (this would be the program counter). A 4-bit flash or EEPROM would work great here. You also need some kind of timing: each cycle, get the 3-bit instruction, decode, and execute. Decoding is probably the hard part; it would probably require a bunch of muxes and buffers to control all the signals between chips. It probably wouldn't be worth designing in hardware; I'd do it in software or with an FPGA if I was so inclined.

Incidentally, I do kind of need a simple virtual machine of some sort for a project I was working on, but it's going to be a little more robust and somewhat specialized.
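For what it's worth, here's a rough sketch of that machine as a plain software VM, using the 3-bit instruction set quoted earlier in the thread; the details the posts leave open (tape size, starting head position, program format) are just guesses here.

/* Software sketch of the theory-crafted machine: a bit tape, a single 1-bit
 * register, and a head moved by INC/DEC (the counter-driven relative
 * addressing described above). Tape size and program format are made up. */
#include <stdio.h>

enum op { INC, DEC, LD, ST, STR, CLR, CMP, OR };   /* 8 opcodes = 3 bits */

#define TAPE_BITS 32

struct vm {
    unsigned char tape[TAPE_BITS];   /* one bit per cell */
    int head;                        /* current tape cell */
    int reg;                         /* the single 1-bit register (the flip-flop) */
};

static void run(struct vm *m, const enum op *prog, int len)
{
    for (int pc = 0; pc < len; pc++) {
        switch (prog[pc]) {
        case INC: m->head = (m->head + 1) % TAPE_BITS;             break;
        case DEC: m->head = (m->head + TAPE_BITS - 1) % TAPE_BITS; break;
        case LD:  m->reg = m->tape[m->head];                       break;
        case ST:  m->tape[m->head] = (unsigned char)m->reg;        break;
        case STR: m->reg = 1;                                      break;
        case CLR: m->reg = 0;                                      break;
        case CMP: m->reg &= m->tape[m->head];                      break;
        case OR:  m->reg |= m->tape[m->head];                      break;
        }
    }
}

int main(void)
{
    struct vm m = { .tape = {1, 0}, .head = 0, .reg = 0 };

    /* OR the bits in cells 0 and 1 and store the result in cell 2. */
    enum op prog[] = { LD, INC, OR, INC, ST };
    run(&m, prog, (int)(sizeof prog / sizeof prog[0]));

    printf("cell 2 = %d\n", m.tape[2]);   /* 1 | 0 = 1 */
    return 0;
}

The separate prog[] array is basically the separate program memory with its own counter (pc here is the program counter), and the decode step the hardware would need is just the switch.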


  On 2/25/2019 at 8:47 PM, radonek said:
  On 2/25/2019 at 3:05 PM, kerbiloid said:

In notation. It follows the assembler abstractions very closely.


... which is why it is still relevant and useful.


... as well as C++ does.

And every time they use C instead of C++, they just write a bunch of the same methods and a structure with the same members, but instead of the compact

C++:
 

#include <string>
using std::string;

// data and the member functions that work on it are declared together
class MyClass
{
public:
    int    m_nNumber;
    string m_sText;

    string fnA(int a, double b);
    void   fnB(string const & s, int const x);
};

C:

/* the same thing by hand: the "this" pointer is passed explicitly everywhere */
struct MyClass
{
    int    nNumber;
    char * sText;    /* C has no string type, so back to char pointers */
};

int  MyClass_fnA(struct MyClass * p, int a, double b, char * pc);

void MyClass_fnB(struct MyClass * p, char * const pc, int const x);

 

So C is appropriate for very short programs (like programs were in 1969), but for a big program you don't need to reinvent a DIY C++ when you already have C++.


  On 2/26/2019 at 11:46 AM, Green Baron said:

Seriously, C is the second most popular language today.


The first is Java. So what, Java is better than C?

  On 2/26/2019 at 11:46 AM, Green Baron said:

Yeah, just 20 million lines or so ... like in OS kernels, various compilers, libraries ... peanuts :-)


1. Many of them were started in the '90s, before a C++ standard appeared.
2. MFC is definitely a C framework, yeah.
 


  On 2/26/2019 at 5:09 AM, kerbiloid said:

C is appropriate for very short programs


I'm not so sure about that.  Back in the day, one of the major issues I/we faced with people developing in C was that they'd use the 'large model' during compilation... this resulted in a huge exe, which often called in equally huge overlays as well.  Nothing like pulling in an entire runtime library when you only need 2 or 3 modules from it. :/

I'm sure this has all changed these days. Yes? (Or am I dreaming, lol)

