Java
P747, 2020-04-15 23:06:59

Java at the hardware level: how does it differ from C++?

Good afternoon!

I recently started learning Java from Herbert Schildt's book "Java 8: A Beginner's Guide", and it says, I quote:

Java was conceived in 1991 by Sun Microsystems employees James Gosling, Patrick Naughton, Chris Warth, Ed Frank, and Mike Sheridan. It was originally called Oak, but in 1995, when marketers took over its promotion, it was renamed Java. Surprisingly, at first the language's developers did not set themselves the task of developing Internet applications. Their goal was to create a platform-independent language in which to write embedded software for various microprocessor-controlled household appliances, including toasters, microwave ovens, and remote controls. As a rule, devices of this type used controllers built on microprocessors of various architectures, and the executable code generated by the compilers of most programming languages of that time was tied to specific types of processors. A typical example of this is the C++ language.


It is quite possible to create a C++ compiler that would generate bytecode instead of executable code, but the C++ language has a number of properties that prevent its use for developing Internet applications. The most important of these is pointer support. A pointer contains the address of some object in memory. Pointers can be used to access resources outside the program, which creates security holes. Java does not support pointers, so such security problems do not arise.
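
For illustration (a minimal sketch of my own, not from the book): Java code works only with managed references, which cannot be converted into numeric addresses or offset the way a C++ pointer can.

// NoPointers.java - references instead of pointers (illustrative sketch)
public class NoPointers {
    public static void main(String[] args) {
        int[] data = {1, 2, 3};
        int[] ref = data;        // 'ref' is a reference to the same array, not an address
        ref[1] = 42;             // every access goes through the reference and is bounds-checked
        // There is no equivalent of C++'s (ref + 1) or (long) ref:
        // the JVM decides where the object lives and may even move it.
        System.out.println(data[1]);  // prints 42 - both names refer to one object
    }
}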


From here I had the following questions:
1. If Java simply has no way to access memory addresses, how can it be used (or how was it ever used) for hardware?
2. What niche does Java occupy in Internet applications, given that PHP, JavaScript, and third-party programs written in C/C++ (Redis, Node.js, Sphinx, etc.) have long proven themselves in this niche and are also used for building high-load applications?



1 answer
Armenian Radio, 2020-04-15
@P747

The authors hoped that the hardware could be accessed through abstractions: the hardware manufacturer implements a Java machine on its side, and application code runs on top of it.
You're forgetting a large layer of technology history that was going strong before that: 8-bit home computers (there were dozens of them) and the BASIC language.
The computers of that era were wildly different: different media (cassettes, floppy disks, cartridges), different hardware (put a second CPU inside the disk drive? Sure, no problem! Different speeds for the PAL and NTSC versions? Easy!). But BASIC was everywhere and, with some reworking, let you carry programs from machine to machine.
So the authors were working in this historical context (they had spent their whole childhood cuddling some VIC-20 or other). From there it becomes only logical to want the same thing, just cooler: take a modern language, standardize the syntax, and do the same trick - the hardware manufacturer supplies a Java machine with the right level of abstraction, and all programs run on any architecture without modification.
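Roughly speaking, the whole "one bytecode, many machines" idea boils down to this (a minimal sketch, nothing vendor-specific):

// Portable.java - the same compiled .class runs unchanged on any vendor's JVM:
//   javac Portable.java    // produces architecture-neutral bytecode (Portable.class)
//   java Portable          // any JVM - x86, ARM, a set-top box - interprets or JIT-compiles it
public class Portable {
    public static void main(String[] args) {
        // The program never sees the underlying CPU; the JVM supplies the abstraction.
        System.out.println("Running on: " + System.getProperty("os.name")
                + " / " + System.getProperty("os.arch"));
    }
}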
In part, it worked: on old mobile phones in the J2ME era this was exactly Java stuffed into the phone - games, browsers, maps (there was even Yandex.Maps), readers, chat clients, a bit of everything.
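For the curious, a J2ME "application" was just a class like this (a minimal MIDlet sketch; it assumes the javax.microedition CLDC/MIDP classes those phones shipped with and won't compile on a modern desktop JDK):

// HelloMidlet.java - minimal J2ME MIDlet
import javax.microedition.midlet.MIDlet;
import javax.microedition.lcdui.Display;
import javax.microedition.lcdui.Form;

public class HelloMidlet extends MIDlet {
    protected void startApp() {
        // The phone's built-in JVM calls this when the user launches the app.
        Form form = new Form("Hello");
        form.append("The same Java, stuffed into a phone.");
        Display.getDisplay(this).setCurrent(form);
    }
    protected void pauseApp() { }
    protected void destroyApp(boolean unconditional) { }
}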
Another round of evolution that you apparently missed (Lord, I'm old, I'm so old): the attempts to make web pages interactive.
"We've got Java portable everywhere - let's get support for it in the browser in the form of applets - small applications." Here, virus writers and other rabble were sharply activated - thanks to the sprawling and leaky architecture, applets could do all sorts of crazy things.
These days you can still find a Java applet in any old piece of hardware, as the GUI for remote administration of servers, switches, and storage systems. You open the box in a browser and a pile of warnings rains down on you: some terribly leaky software is about to run, hold on to your pants or they'll fly off. Sure? Are you sure? Do you agree to this? Really sure?
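For reference, an applet of that era was about this much code (a minimal sketch; the java.applet API is deprecated and removed from current JDKs, so this is purely illustrative):

// HelloApplet.java - minimal browser applet
import java.applet.Applet;
import java.awt.Graphics;

public class HelloApplet extends Applet {
    public void paint(Graphics g) {
        // The browser plugin instantiated the applet and called paint() to draw it
        // inside the page, roughly where an <applet> or <object> tag appeared.
        g.drawString("A small application embedded in the page", 20, 40);
    }
}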
And only after that did JavaScript come into active use, mainly once Ajax was invented - while everyone kept fiercely stretching the owl (a system for publishing electronic documents for scientists, i.e. HTML) over the globe (the task of building an interactive GUI with a nice layout).
And you haven't even had to refactor a backend written in Perl yet.
