TL;DR
Some statements in the question rest on contemporary premises. You have to abandon all of them to understand how the "old" computers worked. And I doubt anything succinct will give a good idea of it; that only comes, concretely, with a lot of study. There was a lot of variation.
0s and 1s can be stored and processed in various ways, and so can their input and output.
"It would take another programming language to create the Assembler."
It wouldn't. You can do it by hand. It's not a pleasant task for a human being, but it was all there was. And that, I think, clears up much of the doubt.
Under the premise of the question, there was no first programming language on computers; it existed only on paper. Someone would write down what the program was supposed to do, turn it right there into numeric code, and then convert that to binary, all by hand, doing exactly what an assembler does today, except that the conversion to binary was trivial because internally everything is binary anyway. Internally the language was binary code.
So Assembly, or a rudimentary form of it, existed only on paper. The assembler was a human. The automated assembler came later, since it made no sense to use the computer to automate every other problem but not its own.
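To make the manual process concrete, here is a minimal sketch of what the human assembler did, using a machine invented purely for this example (the mnemonics, opcodes, and 8-bit word format are hypothetical, not from any real computer):

```python
# A hypothetical machine with invented 4-bit opcodes, only to illustrate
# the translation a human assembler performed on paper.
OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011}

def assemble(line):
    """Turn one 'MNEMONIC operand' line into an 8-bit word:
    4 bits of opcode followed by a 4-bit memory address."""
    mnemonic, operand = line.split()
    word = (OPCODES[mnemonic] << 4) | int(operand)
    return format(word, "08b")

for line in ["LOAD 5", "ADD 6", "STORE 7"]:
    print(line, "->", assemble(line))
# LOAD 5 -> 00010101
# ADD 6 -> 00100110
# STORE 7 -> 00110111
```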
High-level languages were born the same way: people got tired of writing something understandable on paper and then also writing the Assembly that did it.
"Is the result of an assembly another file, in text format, containing only 0s and 1s?"
In a way we can say yes, but it is not a precise definition.
"Or are electrical impulses sent to the valve machine?"
Everything in a computer is electrical pulses. But it's not easy for a human being to generate those pulses directly, so an electromechanical device is used to generate them: the switch. To this day we use keys to enter data, and a panel of keys is a keyboard. Today's keys are simply organized in a way that is much easier for us to use than the switches of the early days.
Binary codes were entered through mechanical on/off switches. Yes, bit by bit.
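As a rough sketch of what that means, assuming an 8-bit word with one switch per bit (the width and the settings are illustrative):

```python
# Eight on/off switches, as on an early front panel: up = 1, down = 0.
switches = [0, 0, 1, 0, 0, 1, 1, 0]   # set by hand, one switch at a time

# The machine reads the row of switches as a single binary word,
# leftmost switch first (most significant bit).
word = 0
for bit in switches:
    word = (word << 1) | bit

print(format(word, "08b"), "=", word)   # 00100110 = 38
```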
Introduction
Digital computers, i.e., electronics based on 0s and 1s, were created in the 1930s/40s, more or less together with programmable computers, that is, with the universal Turing machine. At least that's what the official history tells; there are always those who say it happened a little differently.
Initially all information was stored in rudimentary forms of memory. In fact, each computer tried something a little different, or very different, until one form came to dominate. But memory as we know it today only appeared in the 1970s.
There is dispute over which was the first computer; for a long time the ENIAC was the official answer. Even if that is not true, it is important for understanding the evolution. The Wikipedia article has many details that help you understand how it worked.
The problem with the question is that it assumes a single event that structured everything. In reality there was a lot of experimentation, not all of it became famous, and several paths were taken.
The Mark I (one of the variations) was an important computer associated with John von Neumann, who is credited as the creator of the basic "modern" computer architecture. Another evolution of that line is also important. If you want details, follow the links; it is hard to understand everything without them.
There is a list of computers that started the story, and which, as a rule, were programmed by hand.
Forget this idea of files and text. It was all binary. Think concretely.
How it worked
At some point someone even "programmed" the computer by rearranging valves. But that cannot be considered programming as we know it today. It quickly became apparent that a more flexible form was needed: the program had to be entered somehow (it could even be by punched card), stored, and followed, as dictated by Alan Turing's theory.
You can imagine there wasn't much room for error. The programmer had to make sure the code was error-free; everything had to be checked by eye. Of course, at that time only very capable people worked with this: they were careful because they knew they could not make mistakes, they mastered the whole process (after all, they had invented everything), and, of course, they only built things that were very simple by today's standards.
Data entry has nothing to do with programming itself. Then as now, input serves whatever purpose the program gives it. Whether you were entering numbers to be calculated or numbers instructing the computer what to do made no difference, just as today it makes no difference whether you use the keyboard to send a message or to write a program.
One of the problems of fiddling with switches is that the moment you entered another program, the current one was obviously lost. So punched cards (or paper tape) began to be used for programs. The cards were punched according to the bits that had previously been worked out on paper. From then on a program was never lost (except by deterioration of the material, and even then it could be reproduced).
Of course, the card was only a means of input: it flipped the switches according to whether or not there was a hole. The specific mechanical process changed, but the basic concept and operation stayed the same.
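A minimal sketch of that idea, with a deliberately simplified encoding (real card codes, such as Hollerith's, were more elaborate):

```python
# Sketch: one row of a punched card as a permanent copy of the switch
# settings. Hole = 1, no hole = 0. The encoding is simplified on purpose.
def punch(word, width=8):
    """Render one machine word as a card row: 'O' = hole, '.' = blank."""
    bits = format(word, f"0{width}b")
    return "".join("O" if b == "1" else "." for b in bits)

def read_card(row):
    """The card reader does the reverse: holes become bits again."""
    return int("".join("1" if c == "O" else "0" for c in row), 2)

row = punch(38)
print(row)             # ..O..OO.
print(read_card(row))  # 38
```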
Note that the physical medium doesn't matter much. The abstractions created on top of it are what gave programmers productivity. That is why higher-level languages were created: they were conveniences for the human.
Productivity was also needed in data entry. The gain there was not as great, but it was important. Since keyboards were already used to punch the cards, at least in the more modern keypunches, people realized the keyboard could set the programming switches directly. Of course, by then the switches were no longer mechanical, at least not purely.
Punched cards had already been used in mechanical computers, but for pure data entry, not for programs, which did not yet exist.
The image below starts to show the computer as we know it today; that was the mid-1960s. There was keyboard input, output to a printer or video, and storage by magnetic means (only now becoming obsolete), still sequential, although there may be a random-access medium somewhere not visible. There was little of that because it was very expensive.
Note how what is considered the first microcomputer, the Altair 8800 of 1975, was so simple that it worked like the first modern computers of the 1940s. It could have an optional keyboard, though I don't know whether that was available in the first version.
Take a look at its simulator and try to program it; the sketch below gives a taste.
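Here is a rough simulation of the front-panel examine/deposit cycle (the panel behavior is simplified, and the three bytes are an 8080-style fragment, load a value then halt, not a verified Altair listing):

```python
# A rough simulation of front-panel programming: select an address,
# set the data switches, deposit the word, advance, repeat.
memory = [0] * 256   # a tiny memory, sized arbitrarily for the sketch
address = 0          # the address currently selected on the panel

def examine(addr):
    """Select an address and return its contents (shown on panel lights)."""
    global address
    address = addr
    return memory[address]

def deposit(word):
    """Store the data-switch settings at the current address and advance."""
    global address
    memory[address] = word
    address += 1

examine(0)
# Three 8080 bytes: MVI A,5 (put 5 in register A) followed by HLT (halt).
for word in [0b00111110, 0b00000101, 0b01110110]:
    deposit(word)

print([format(b, "08b") for b in memory[:3]])
```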
The first language
In the sense of the question, I believe binary code was the first programming language, although it is highly likely that something more abstract existed before it.
I have no evidence, but it is possible that some form of assembly was the first language of these modern computers, used not on the computers themselves but only on paper. And if you are going to count what was done on paper for computing machines, then you have to look at what Ada Lovelace used, Note G, shown below.
High-level languages were initially created to give programmers more cognitive power. But they were also a way not to depend on the specific instructions of one computer, so that a program could be understood by any computer given a transformation, manual or not (assemblers were already in use by then, and creating a compiler was the natural next step).
Conclusion
Everything is evolution. You have to take it one step at a time, and there are many steps. You have to create the egg first and then the chicken. Or is it the other way around? Fortunately that dilemma does not exist in computing: first came binary (which is already an abstract concept), then numeric code (a level above it), later textual Assembly, and only in the 1950s did more abstract languages appear, closer to what we humans understand.
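To see those levels side by side, here is a sketch using the hypothetical machine from the earlier examples (everything is invented for illustration):

```python
# The same step seen at each level described above, with the invented
# opcodes from the earlier sketch.
MNEMONICS = {0b0001: "LOAD", 0b0010: "ADD", 0b0011: "STORE"}

def disassemble(word):
    """Recover the paper notation from the stored bits."""
    return f"{MNEMONICS[word >> 4]} {word & 0b1111}"

word = 0b00100110          # level 1: binary, what the machine actually holds
print(word)                # level 2: 38, the numeric code written on paper
print(disassemble(word))   # level 3: 'ADD 6', the Assembly mnemonic
# Level 4: a high-level language hides even this, e.g. 'total = total + x[6]'.
```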
To show this better I would have to go deeper and walk through each step, and that would be too long. The links already help you research further. Don't be too lazy to click on all of them, here and on Wikipedia. Yes, it's tiring, but it's the only way to learn and understand all of it.
There is a question about how modern processors work. It is not so different from how it was in the beginning, although there were "several beginnings".
Also useful: How is a programming language developed?
I didn't quite understand your doubt; you need to be more specific. I would suggest you start with the generations of computers, so you can understand how they work. In detail: the concept of the digital computer (binary code) arose with the transistor, and that is when machine code appeared, which is the binary language and the basis of digital computing.
– gato
I understood it perfectly; it is an interesting question, but not for the format of the site, as it will end up generating very broad answers. I still went to read the answer to see if it could be saved, and honestly I think not. I gave it +1 but voted to close.
– Jorge B.
Maybe learning a little digital electronics will help. All the developments on the software side were just ways of making life easier.
– Bacco
@Denercarvalho Actually, transistors came well after.
– Maniero
I think it will depend on what you accept as a language. For example, a switch is a language: it just says ON/OFF. Look at the walls of your house; there are several of them, and you use them every day to turn lamps on and off. Now try to look at it from another angle and think about how you could improve such a simple mechanism. Most people would say there is nothing to improve, or nothing to be done. A few would say: "I think I can add a gear or slider structure that slides/rotates over a resistor circuit and thus controls the energy levels."
– Daniel Omine