History of programming?
I am looking for clear information on this topic. There is a lot of material on the web, but besides being scattered, it is very general: how and why the significant transitions happened is poorly covered and badly lacking in detail.
I need this, oddly enough, in order to learn to program. In my humble opinion, without this knowledge it is impossible to properly understand any programming language. If someone could clearly shed light on this, or point to reliable sources, that would be great.
I do page layout myself and can solve simple scripting tasks using my brain and Google. But the harder I tried to understand the principles of OOP and the like, the more I came to the conclusion that without understanding where it all comes from, it is impossible to get properly into programming.
Addition:
To clarify: I am interested in the history of computers in general and, by extension, the history of programming.
As far as I understand, the current state of affairs goes back to the Turing machine and the von Neumann architecture. But I have not advanced beyond that understanding, assuming it is even correct. I also have a rough idea of what a Turing machine is, but again, I have not found a clear and detailed explanation of what it is and why it matters.
Well, for example, "History of the Development of Programming Languages"
knowledge.allbest.ru/programming/2c0a65625b2bd69a4...
You can dig up a lot of interesting things on sites like that :)
Almost every good programmer I know mastered this craft without knowing its history.
I'm afraid you're ascribing magical properties to history.
The history of computers and the history of programming do not carry equal weight. In fact, the history of computers includes the history of programming: all these high-level abstractions like OOP arose from the desire to simplify work with assembler; assembler, in turn, simplifies work with machine code; machine code is a layer above logic gates; and gates simplify work with transistors.
I advise you to concentrate on how computer architecture is built up, rather than on its history.
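To make that layering concrete, here is a toy sketch (my own illustration, not from the answer above) of how one level of abstraction is built purely out of the level below it: a one-bit full adder, and then multi-bit addition, constructed from nothing but boolean "gates", the same way hardware builds arithmetic out of transistors.

```python
# Toy illustration of abstraction layers: gates -> adder -> multi-bit addition.
# The gate functions stand in for what transistors implement in hardware.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, carry_in):
    """Add three bits; return (sum_bit, carry_out) - built only from gates."""
    s1 = XOR(a, b)
    sum_bit = XOR(s1, carry_in)
    carry_out = OR(AND(a, b), AND(s1, carry_in))
    return sum_bit, carry_out

def add_4bit(x, y):
    """Add two 4-bit numbers given as bit lists, least significant bit first."""
    result, carry = [], 0
    for a, b in zip(x, y):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    return result, carry

# 3 (LSB-first: 1,1,0,0) + 5 (LSB-first: 1,0,1,0) = 8 (LSB-first: 0,0,0,1)
bits, carry = add_4bit([1, 1, 0, 0], [1, 0, 1, 0])
```

Nobody writing `3 + 5` thinks about carry chains, just as nobody writing in C thinks about gates; each layer exists so you can forget the one below it.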
Recommended Reading:
The whole book is one big and detailed story.
The coolest book; it chews through the very lowest level. Over the course of the book you build, together with the author, a computer out of relays, and at the end swap everything over to transistors. It reads in one breath.
I haven't read this one yet, but I plan to. Judging by the table of contents, it is probably better for complete beginners not to start with it. Although it may turn out to be a quick, compact introduction to modern trends and the state of the art in computing.
First there was Blaise Pascal and his calculating machine. Then there was something else, and then the von Neumann architecture appeared, and the first computers built on it. If I'm not mistaken, that was somewhere in the 1940s.
Then transistors and other electronic components appeared, which made it possible to fit several logic elements of an electronic circuit on a single chip.
Things kept developing, and Intel's 8086 appeared: the processor that introduced the x86 instruction set.
Then the amd64 instruction set appeared - that is, the instruction set and architecture for 64-bit processors.
At first, programs were stored on punched cards.
At first, programs were written in machine code. Then came assembler.
And then there were high-level compiled languages: Fortran, Algol, Pascal, C, C++.
Then came languages running on virtual machines, such as Java and C#, as well as interpreted ones (PHP, Ruby, Python, JavaScript).
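To illustrate what each step in that chain bought us (my own sketch, not part of the answer): the same task, summing the numbers 1 through 5, written first in an "assembler-like" style with explicit registers and a compare-and-jump loop, and then the way a high-level language lets you write it.

```python
# The same computation at two levels of abstraction.

def sum_assembler_style(n):
    acc = 0          # "accumulator register"
    i = 1            # "counter register"
    while True:      # loop body with an explicit compare-and-jump at the end
        acc = acc + i
        i = i + 1
        if i > n:    # compare; "jump" out of the loop if the counter exceeds n
            break
    return acc

def sum_high_level(n):
    # One line: the language and its library hide the counter and the jump.
    return sum(range(1, n + 1))

assert sum_assembler_style(5) == sum_high_level(5) == 15
```

Each new generation of languages removed another piece of bookkeeping the programmer used to do by hand.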