How does computer coding work?

I know it’s just a bunch of 1s and 0s, but how do computers understand it? There is no obvious rule for what is what, so what determines what these numbers mean?

Each computer has a Processor.

That processor has a limited set of “instructions” it understands, and they are very simple. Think of it as a calculator, an old video recorder, or an alarm clock.

The instructions are very simple things like “Sum A,B”, and in practice they are used in little sequences like

Set A = 5

Set B = 3

Sum A,B

Inside the machine, those are actually a bunch of 0s and 1s, which we can write as numbers like

“32 5 33 3 48”

Where 32 means “Set A”

33 means “Set B”

and 48 means “Calculate A + B and store the result in C”

That is machine code: the processor executes “32 5 33 3 48” by letting electricity flow through its circuits.

That is the First Step
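
To make that concrete, here is a minimal Python sketch of a pretend processor that executes that string of numbers. It is only an illustration: the opcodes 32, 33 and 48 are the made-up ones from this post, not a real instruction set, and a real processor does this fetch-and-decode loop in hardware, not in software.

```python
# A toy "processor" for the made-up machine code above.
# Opcodes 32, 33 and 48 are invented for this post, not a real instruction set.

def run(machine_code):
    registers = {"A": 0, "B": 0, "C": 0}
    pc = 0  # program counter: which number we are currently looking at
    while pc < len(machine_code):
        opcode = machine_code[pc]
        if opcode == 32:        # 32 means "Set A = <next number>"
            registers["A"] = machine_code[pc + 1]
            pc += 2
        elif opcode == 33:      # 33 means "Set B = <next number>"
            registers["B"] = machine_code[pc + 1]
            pc += 2
        elif opcode == 48:      # 48 means "C = A + B"
            registers["C"] = registers["A"] + registers["B"]
            pc += 1
        else:
            raise ValueError(f"Unknown opcode: {opcode}")
    return registers

print(run([32, 5, 33, 3, 48]))  # {'A': 5, 'B': 3, 'C': 8}
```

The loop does roughly what the circuits do: look at a number, decide what it means, act on it, and move on to the next one.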

In addition to that, some instructions talk to other processors attached to other devices, like printers, networks, screens, or elevators… whatever you want.

2nd. The nearest language to that is “Assembler”… each Assembler statement has a direct relationship to one processor instruction. The relation is one to one.

So a program written in Assembler looks something like

Set A=5;

Set B=3;

Add A,B;

Then if you write something in Assembler, you can use a translator (an “assembler” program, which works like a compiler) that turns it into “32 5 33 3 48”, and that bunch of numbers can be executed by the processor.
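
Here is a minimal Python sketch of that translation step, assuming the same made-up mnemonics and opcode numbers as above. A real assembler handles many more instructions, labels and memory addresses, but the one-to-one mapping is the same idea.

```python
# A toy "assembler": each statement maps one-to-one to an opcode.
# Mnemonics and opcode numbers are the invented ones from this post.

OPCODES = {"SET A": 32, "SET B": 33, "ADD A,B": 48}

def assemble(source):
    machine_code = []
    for line in source.strip().splitlines():
        statement = line.strip().rstrip(";")
        if "=" in statement:
            # e.g. "Set A=5" -> the opcode for "Set A", then the value 5
            mnemonic, value = statement.split("=")
            machine_code.append(OPCODES[mnemonic.strip().upper()])
            machine_code.append(int(value))
        else:
            # e.g. "Add A,B" -> a single opcode with no operand
            machine_code.append(OPCODES[statement.strip().upper()])
    return machine_code

program = """
Set A=5;
Set B=3;
Add A,B;
"""
print(assemble(program))  # [32, 5, 33, 3, 48]
```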

Any other language needs another program that translates the bunch of characters you write in a text file into a set of numbers the processor can execute. That translation can be done all at once, before anything runs (a compiler), or one statement at a time, executing each statement before translating the next (an interpreter).
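
The toy assemble() above is the compiler style: translate the whole program first, run the result later. For contrast, here is a minimal sketch of the interpreter style, which translates each made-up statement and executes it on the spot, never producing machine code at all.

```python
# A toy "interpreter": translate one statement at a time and execute it
# immediately. The statement syntax is the invented one from this post.

def interpret(source):
    registers = {"A": 0, "B": 0, "C": 0}
    for line in source.strip().splitlines():
        statement = line.strip().rstrip(";")
        if "=" in statement:                            # e.g. "Set A=5"
            mnemonic, value = statement.split("=")
            register = mnemonic.split()[-1].upper()
            registers[register] = int(value)            # executed right away
        elif statement.replace(" ", "").upper() == "ADDA,B":
            registers["C"] = registers["A"] + registers["B"]
        else:
            raise ValueError(f"Unknown statement: {statement!r}")
    return registers

program = """
Set A=5;
Set B=3;
Add A,B;
"""
print(interpret(program))  # {'A': 5, 'B': 3, 'C': 8}
```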
