How are the writing patterns on the computer decided?

I know that binary is "written" by the hardware itself, with transistors and everything else (correct me if I'm wrong).

But how is a value like 0110 0001 defined in the ASCII standard? Through the operating system or through the hardware itself? How is this letter written in binary and defined?

1 answer

It is not simple to say whether this happens in hardware or in the operating system, because it happens in both. It is defined in several ways; don't assume it exists in only one place with everything running there. For some things the hardware does this "interpretation", for others it is the operating system, in different ways, and it may also happen elsewhere, such as in specific applications.

There are also wrong premises in the question; one of them is that everything uses ASCII to deal with letters, which is not the case. That link shows much of what you should know, but the subject is far more complex than that.

Here I show a little more about the subject and how mistaken the idea is that ASCII is so special.

Strictly speaking, the definition that a binary number is ASCII exists outside the computer. For the computer there is only the number. It doesn't know there are letters there.
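A minimal sketch in Python (my illustration, not something from the question) of that idea: what the computer stores is only a number, and reading it as a letter is a convention applied on top of it:

    # The byte 0110 0001 (0x61, decimal 97) is just a number to the machine.
    value = 0b01100001

    print(value)       # 97  -- the number itself
    print(hex(value))  # 0x61
    print(chr(value))  # 'a' -- only when we choose to read it through the ASCII/Unicode table
    print(ord("a"))    # 97  -- the reverse lookup gives the number back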

At some point these numbers will be shown as a letter, or something equivalent to a letter will enter the computer. It can be a text screen, a graphics screen, a sound, a network, a disk, a keyboard, a microphone, a camera lens that captures images, etc.

There are (most of the time) complex algorithms to deal with this data and make what is a letter for the human being become a number for the computer, and vice versa; this is analog-digital conversion. It can be processed by the hardware and delivered ready to the operating system (directly or through a driver), or it may be a function of the operating system, or even of a specific application.
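A very rough sketch of that path, with made-up scancode values (the real codes depend on the keyboard and the driver), just to show the kind of translation a driver performs:

    # Illustrative only: the scancodes and this tiny table are invented, but a
    # keyboard driver does something of this kind -- it receives a hardware code
    # for the pressed key and translates it into a character code for the system.
    SCANCODE_TO_CHAR = {
        0x1E: ord("a"),  # hypothetical scancode for the 'A' key
        0x30: ord("b"),
        0x2E: ord("c"),
    }

    def key_pressed(scancode: int) -> int:
        """Return the character code the rest of the system will see."""
        return SCANCODE_TO_CHAR[scancode]

    print(chr(key_pressed(0x1E)))  # 'a'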

It must be clear that the letter itself is never written anywhere on the computer; there will only be the number, or some object that represents the letter in a specific way.

To give an example: you can have a drawing formed by several graphic dots that draw the letter to show to the user on a screen, on paper, or on another visual medium. This drawing is produced by joining a binary number with a specification, stored somewhere, of how the dots should be drawn when that number is found, the number that means a letter in ASCII, Unicode, or another table. This specification can be found in a file we know as a font. That is why each font can draw the same letter in a different way.
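A toy illustration of this (the bitmap below is invented; real font files are far more sophisticated): a "font" is essentially a table that maps a character code to instructions for drawing it.

    # Made-up 5x5 bitmap "font": the number 97 ('a') points to a drawing recipe.
    TOY_FONT = {
        0x61: [
            " ### ",
            "    #",
            " ####",
            "#   #",
            " ####",
        ],
    }

    def draw(code: int) -> None:
        for row in TOY_FONT[code]:
            print(row)

    draw(ord("a"))  # prints the dots that a human reads as the letter 'a'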

The secret here is the number that is stored, not the letter. The ASCII table, and the others that do the same thing, is exactly that: a table that associates a number with something a human understands as a letter, and in each context that letter will be interpreted in some way; it can be a drawing on screen, a sound, or another means of expression. It is just a way to connect the number to something representative.
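In Python this is easy to see: the same stored numbers produce different letters depending on which table you ask it to use (the encodings below are just examples):

    data = bytes([0x61, 0xE7])  # two stored numbers: 97 and 231

    print(data.decode("latin-1"))  # 'aç' -- in this table, 231 means 'ç'
    print(data.decode("cp1251"))   # 'a' plus a Cyrillic letter -- same numbers, another table
    # data.decode("ascii")         # would raise an error: ASCII has no entry for 231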

Effectively the letter does not exist; there are only bytes. Even when you see a letter, it is a belief of your own: you have learned that it is a letter, so it has become a truth for you, but this concept was created by human beings; it does not exist in nature, so it does not exist in the computer. Most of what we deal with in life and put into the computer are attempts to simulate, in the best possible way, something a human invented. Some things are simple, others are not. The way to put it into the computer, or to expose it, can be simpler or more complicated. It all depends on the context.

Perhaps this is more enlightening.

We can say that the table was chosen by someone because they thought it made sense, just as someone created the Roman alphabet (or not) and the Arabic numerals (or not). Someone placed the symbols that were established as letters and numbers in a certain order, and this order established the binary number each one would get. Of course it had a logic, but it could have been different if someone else had done it. In fact, other people did do it differently, but this is the one that "caught on". Until the day it no longer served well enough, and then other, better tables were created (all based on ASCII, maintaining compatibility in that part).
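A small check of that compatibility (assuming Python's standard codecs): for characters that already existed in ASCII, the newer tables kept the same numbers, and they only diverge outside that range.

    # 'a' is the same byte (0x61) in ASCII, Latin-1 and UTF-8.
    for enc in ("ascii", "latin-1", "utf-8"):
        print(enc, "a".encode(enc))  # b'a' in all three

    # Outside the ASCII range the tables diverge.
    print("ç".encode("latin-1"))  # b'\xe7'      -- one byte, 231
    print("ç".encode("utf-8"))    # b'\xc3\xa7'  -- two bytes in UTF-8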

The subject is complex and could fit more specific questions; in general terms, this is it.
