Suppose we have Apache Spark running on a single machine in standalone mode, with 5 workers and 2 cores per worker.
We need to submit three Spark Streaming applications. How can I submit the first application and allocate two workers to it, then submit the second application and allocate two workers to it, and finally submit the third application and allocate the remaining worker to it?
Is there a way to do that?
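For concreteness, here is a sketch of the kind of submission I have in mind, using the standard `--total-executor-cores` flag (equivalent to `spark.cores.max`) to cap each application at the number of cores corresponding to the desired workers. The master URL and application file names below are placeholders; note that this caps total cores across the cluster rather than pinning specific workers, since the standalone scheduler decides executor placement.

```shell
# Placeholders: spark://host:7077 and appN.py are not real values.
# Each worker has 2 cores, so "two workers" corresponds to a 4-core cap.

# App 1: cap at 4 cores (roughly two 2-core workers)
spark-submit --master spark://host:7077 \
  --total-executor-cores 4 \
  app1.py &

# App 2: cap at 4 more cores
spark-submit --master spark://host:7077 \
  --total-executor-cores 4 \
  app2.py &

# App 3: cap at 2 cores (the remaining worker's capacity)
spark-submit --master spark://host:7077 \
  --total-executor-cores 2 \
  app3.py &
```

Is this the right approach, or is there a way to bind an application to specific workers?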