How to convert ASCII to binary?

7

I’m trying to implement the conversion of text to binary, and I found this on the internet:

// Note: this needs "using System.Text;" at the top of the file for Encoding.
static string ASCII_binary(string texto)
{
    string converted = string.Empty;
    byte[] byteArray = Encoding.ASCII.GetBytes(texto);

    for (int i = 0; i < byteArray.Length; i++)
    {
        for (int j = 0; j < 8; j++)
        {
            converted += (byteArray[i] & 0x80) > 0 ? "1" : "0";
            byteArray[i] <<= 1;
        }
    }

    return converted;
}
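For context, calling it with a single letter seems to give the bits of that character:

// Assumes the ASCII_binary method above is in scope, with
// using System; and using System.Text; at the top of the file.
// 't' is 116 in ASCII, i.e. 01110100 in binary.
Console.WriteLine(ASCII_binary("t")); // prints 01110100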

But I don’t get it. What is the 0x80 for? Is there some other way to do this conversion?

2 answers

5


Take a look at the code showing each step on .NET Fiddle, which helps you understand it a little better. See it working on ideone. I’ve also put it on GitHub for future reference.

0x80 is the hexadecimal form of the decimal number 128, which in binary is 10000000. When the & (and) operator is applied, each bit of one number is compared with the corresponding bit of the other, and the result is 1 only if both are 1; it is 0 if both are 0 or if only one of them is 1. That’s how this operator works. So, in the example of the letter t, it starts with:

01110100
&
10000000
--------
00000000

The result is zero.

In the next step the << operator is applied, which is the shift: it moves all the bits one position to the left, so it looks like this:

11101000
&
10000000
--------
10000000

That gives 128, which is greater than zero, so it knows to use the string "1".

And it does the same with all the other bits.
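Since the linked Fiddle isn’t reproduced here, a minimal sketch of that step-by-step walk for the letter t could look like this (the names and the output format are just for illustration):

using System;

class StepByStep
{
    static void Main()
    {
        byte b = (byte)'t'; // 116, i.e. 01110100 in binary

        for (int j = 0; j < 8; j++)
        {
            // Compare only the most significant bit against 10000000 (0x80).
            int masked = b & 0x80;
            char bit = masked > 0 ? '1' : '0';
            Console.WriteLine($"{Convert.ToString(b, 2).PadLeft(8, '0')} & 10000000 = {Convert.ToString(masked, 2).PadLeft(8, '0')} -> {bit}");

            // Shift everything one position to the left so the next bit
            // becomes the most significant one in the next iteration.
            b <<= 1;
        }
    }
}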

Note that by using the number 128 (0x80) we are always comparing only the first bit; the others will always give 0 in any situation, since in the number 128 all bits are 0 except the first. The and operator, in this case, will only vary according to the first bit of the other number; the rest will result in 0.

This is the way to walk through the bits step by step and look only at the first one. There are other ways, but they are less efficient.

See the inefficient way on .NET Fiddle. It may seem more readable to those who don’t understand bit operators. See it working on ideone. Also on GitHub for future reference.
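The linked code isn’t shown here, but a version that uses arithmetic (division and remainder) instead of bit operators could be sketched roughly like this (the method name and details are mine, not necessarily what the Fiddle contains):

using System;
using System.Text;

static class BinaryConversion
{
    // More readable for people unused to bit operators, but slower:
    // extracts each bit with division and remainder instead of & and <<.
    static string ASCII_binary_arithmetic(string texto)
    {
        var converted = new StringBuilder();

        foreach (byte b in Encoding.ASCII.GetBytes(texto))
        {
            int value = b;
            for (int divisor = 128; divisor >= 1; divisor /= 2)
            {
                converted.Append(value / divisor > 0 ? '1' : '0');
                value %= divisor;
            }
        }

        return converted.ToString();
    }

    static void Main()
    {
        Console.WriteLine(ASCII_binary_arithmetic("t")); // 01110100
    }
}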

See the performance test on .NET Fiddle. See it working on ideone. Also on GitHub for future reference.

Even if you consider that the other operations of the algorithm have a fairly high cost, simply using the bit operator in place of the arithmetic operator probably gains, in the isolated operation, more than an order of magnitude in speed.

The test should be done on your own computer. .NET Fiddle is unreliable because it has several processes running at the same time, but it gives an initial baseline.
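The linked test isn’t reproduced here either, but a minimal Stopwatch-based comparison you could run on your own machine might look like this (the iteration count and test string are arbitrary):

using System;
using System.Diagnostics;

static class ConversionBenchmark
{
    // Times many calls of a conversion function; run it once for each
    // version (bit operators vs. arithmetic) and compare the numbers.
    static void Measure(string name, Func<string, string> convert)
    {
        const int iterations = 100_000; // arbitrary, adjust for your machine
        var sw = Stopwatch.StartNew();

        for (int i = 0; i < iterations; i++)
            convert("The quick brown fox jumps over the lazy dog");

        sw.Stop();
        Console.WriteLine($"{name}: {sw.ElapsedMilliseconds} ms");
    }

    static void Main()
    {
        // ASCII_binary and ASCII_binary_arithmetic are the methods shown earlier.
        // Measure("bit operators", ASCII_binary);
        // Measure("arithmetic", ASCII_binary_arithmetic);
    }
}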

  • And how to convert from binary to ASCII?

  • @Moribundochat that’s another question, but something like Convert.ToInt32("1001101", 2) works (see the sketch after these comments).

  • Is that efficient, or would there be a more appropriate way?

  • @Moribundochat it’s the simplest way; I don’t know if it’s as efficient as it could be.
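Building on the Convert.ToInt32 comment above, a small sketch of the reverse direction (a string of bits back to text) could look like this, assuming the input comes in 8-bit groups; the Binary_ASCII name is just made up for the example:

using System;
using System.Text;

static class BinaryToAscii
{
    // Converts a string of 8-bit groups (e.g. "01110100") back to text.
    static string Binary_ASCII(string bits)
    {
        var result = new StringBuilder();

        for (int i = 0; i < bits.Length; i += 8)
        {
            // Convert.ToInt32 with base 2 parses each 8-bit group into a number.
            int value = Convert.ToInt32(bits.Substring(i, 8), 2);
            result.Append((char)value);
        }

        return result.ToString();
    }

    static void Main()
    {
        Console.WriteLine(Binary_ASCII("01110100")); // t
    }
}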

3

Basically what the code is doing on the line

converted += (byteArray[i] & 0x80) > 0 ? "1" : "0";

is comparing the most significant bit of byteArray[i] and checking whether it is 1 or 0 by doing a bitwise and with 0x80.

Then it takes the value of byteArray[i] and shifts the character’s bits one position to the left, repeating this process for all 8 bits of the character.

The binary representation of 0x80 is 10000000.
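You can check that yourself with Convert.ToString using base 2:

// Prints the binary representation of 0x80 (decimal 128): 10000000
Console.WriteLine(Convert.ToString(0x80, 2));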
