When we talk about Spark on top of Hadoop, we generally mean the Hadoop core with Spark as the compute engine instead of MapReduce, i.e. (HDFS, Spark, YARN). Spark follows a master-slave architecture in which the master is called the Driver and is responsible for ...
blog.yashsrivastava.link