How to choose a server for computing on Apache Spark?
Good evening everyone!
I write in Python and I'm new to Spark (I've studied the Apache Spark docs and done basic tasks on my local machine) and to distributed computing in general. I have fairly solid experience in ML, but I've always run all my computations locally and have never dealt with remote machines.
For those who already have experience with Spark: could you recommend which servers are best for a beginner (and by what criteria to choose them), or share links to resources that describe, in detail, the process of connecting to servers and running computations on them?
Thank you
Hello, I've also recently become interested in this topic. I went with AWS: there you can deploy a cluster (a set of machines that pool their computing power for processing), and Apache Spark can be deployed on top of it. A separate question is how you get your data there; AWS offers many options for that as well.