How do I deploy a Hadoop + Apache Spark cluster?
Hey, I'm just getting into architecture.
I need to deploy Hadoop for data storage and Apache Spark for processing.
What will the architecture look like in terms of servers?
Do I need many servers grouped into a cluster, or can everything run on a single cloud machine?
What's the right way to do this?
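For context on the question above: in a typical setup, Spark is installed on the same nodes as Hadoop and submitted to YARN, so storage (HDFS) and compute (Spark executors) share the cluster rather than living on separate servers. A minimal configuration sketch, assuming a hypothetical NameNode/ResourceManager host named `master` (the hostname, memory, and core values are placeholders, not recommendations):

```properties
# Hadoop core-site.xml (as a property): HDFS entry point
# fs.defaultFS = hdfs://master:9000

# spark-defaults.conf: run Spark jobs on the existing YARN cluster
spark.master            yarn
spark.submit.deployMode cluster
spark.executor.memory   4g
spark.executor.cores    2
```

With this in place, jobs are submitted via `spark-submit` and read input directly from HDFS paths such as `hdfs://master:9000/data/...`; adding capacity means adding worker nodes to the cluster, not standing up a separate Spark installation.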