Cloud computing
Daniel, 2021-02-13 12:33:44

HOW TO USE THE RESOURCES OF ANOTHER COMPUTER?

This is purely a question; it's hard to google the details, or maybe I just don't know the basic keywords (something to do with game servers and big companies).
I've heard about cloud computing, but I always thought it was something for big companies. I once wrote a TeamViewer-like program for fun, but now I realize you can get real benefit from this.
For example, I have a laptop and a desktop, and I want to run a program with heavy calculations, but execute them on the other computer and get the results back.

The first question: what ready-made solutions and programs exist for this? Surely I'm not the only one who has thought of it.
The second question: how can I make Visual Studio compile the project on another computer (a server)? I heard somewhere that this is possible.
And is it possible to do this for any program at all? For example, take code like this:

public static void Main(string[] args)
{
    Console.WriteLine(fun());  // there's nothing else here
}

static double fun()
{
    var rnd = new Random();
    double a = 0;
    for (int i = 0; i < 100000; i++)
        for (int j = 0; j < 9999999; j++)
            for (int k = 0; k < 99999; k++)
                a = Math.Sqrt(rnd.Next() * (double)i + Math.Sqrt(j + k)); // that's all
    return a;
} // typed this straight into Toster.

And suppose it's compiled. Is it possible to run the program and make this code execute in the cloud FROM OUTSIDE, that is, even though the program itself was never written with the assumption that part of it could run in the cloud? Maybe with code injection?
Again, I can hardly be the first to think of this; are there existing solutions or implementations?


3 answer(s)
rPman, 2021-02-13
@daniil14056

The cloud is just a way to abstract away from the specific machines and services it consists of and to operate with that abstraction: you say "I launched the application in the cloud" instead of "I launched the application on whichever server is currently least loaded, and it takes its data from this storage service, which I also launched", or something like that.
In practice you write an application (for simplicity, let's assume the development environment on the server is similar or compatible), and then you just run it somehow on remote servers.
For example, you currently run your application locally from the command line; now prepend ssh [email protected] to that command, and your program runs on a remote server while its output is still shown to you locally. From a usability point of view you are still "running everything locally", but the capacity involved adds up across the servers you use. It goes without saying that you need to design your software so that it can, in principle, run on several machines (although there are tools that simulate one machine on top of a cluster, automatically sharding everything including shared memory, giving you something like a many-core machine with a huge amount of RAM, though this is not as efficient as doing it yourself).
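Roughly, in C# terms it could look something like this. This is only a sketch: the user, host name and remote program path are made-up placeholders, and it assumes key-based ssh authentication is already set up.

// Minimal sketch: run a worker program on a remote machine over ssh and read its output locally.
// "user", "worker.example.com" and "./worker" are hypothetical placeholders.
using System;
using System.Diagnostics;

class RemoteRun
{
    static void Main()
    {
        var psi = new ProcessStartInfo
        {
            FileName = "ssh",
            Arguments = "user@worker.example.com ./worker",
            RedirectStandardOutput = true,
            UseShellExecute = false
        };
        using (var p = Process.Start(psi))
        {
            string result = p.StandardOutput.ReadToEnd(); // the heavy work happens on the server
            p.WaitForExit();
            Console.WriteLine(result);
        }
    }
}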
It is quite possible that you will also need some control over this process and, most importantly, over the data your applications work with. With one or two, well, up to five servers you can still do things by hand, watch the results yourself and decide what to do next, but it is more logical to entrust that to a machine too: write an application that controls starting and stopping your services, monitors their results and errors, tracks load and efficiency, and presents all of this to you in some UI; and remember that all of this will keep changing while your applications are being developed and modified.
Data is a separate conversation. If you can share it, dedicate a separate server to it, or use ready-made paid services like Amazon's, that's already good; but if there is little data involved, you can simply copy it to the server and back on every run, for example with rsync (if you store it in files), as sketched below.
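That copy-run-copy cycle can be wrapped the same way; a rough sketch, with the host and paths again made up and rsync/ssh assumed to be installed locally:

// Sketch only: push inputs to the server, run the worker there, pull results back.
using System.Diagnostics;

class SyncAndRun
{
    static void Run(string cmd, string args)
    {
        using (var p = Process.Start(new ProcessStartInfo(cmd, args) { UseShellExecute = false }))
            p.WaitForExit();
    }

    static void Main()
    {
        Run("rsync", "-az ./data/ user@worker.example.com:data/");   // upload inputs
        Run("ssh",   "user@worker.example.com ./worker data out");   // compute remotely
        Run("rsync", "-az user@worker.example.com:out/ ./out/");     // download results
    }
}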
Why has the whole world fallen in love with services like Amazon? Because, like a true cloud, they let you not worry about exactly how your task is implemented, and they also give you tools to automate everything and everyone: even at the level of virtual machines you can automatically deploy new instances, remove unneeded ones, stop them, back them up, take snapshots... For example, if your task can be sped up by parallelization, then no matter how big it is you can, in the end, get the result as quickly as possible by using as much capacity as needed (obviously for the corresponding money, but sometimes the time is worth it).
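And before renting anything, it is worth checking how much plain parallelism on one machine already gives you. For the loop from the question, something along these lines spreads the outer iterations over all local cores with Parallel.For; this is only a sketch, with the loop bounds cut down so it finishes quickly.

using System;
using System.Threading.Tasks;

class ParallelFun
{
    // A reduced version of fun() from the question, parallelized over the outer loop.
    static double fun()
    {
        var partial = new double[1000];
        Parallel.For(0, partial.Length, i =>
        {
            var rnd = new Random(i);   // one Random per iteration: Random is not thread-safe
            double a = 0;
            for (int j = 0; j < 100000; j++)
                a = Math.Sqrt(rnd.Next() * (double)i + Math.Sqrt(j));
            partial[i] = a;            // each iteration writes its own slot, so no locking is needed
        });
        return partial[partial.Length - 1];
    }

    static void Main() => Console.WriteLine(fun());
}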
P.S. Don't try to find a ready-made solution, framework, library or toolkit for everything, especially when the task is this abstract; you can spend more time digging through them than you would spend putting together a simple set of utilities yourself, because no one understands your task better than you do.

Armenian Radio, 2021-02-13
@gbg

Yes, it is possible. There are a huge number of solutions; different tasks call for different ones.
For mathematics, for example, students are taught MPI. For big data, people use Hadoop.
Running programs that were not adapted for the cloud usually results in terrible slowdowns. But you can also do this: just rent a powerful computer and connect to it over RDP.

Griboks, 2021-02-13
@Griboks

Google for "miner botnet".
