2. The SparkContext object in the driver program coordinates all the distributed processes and negotiates resource allocation with the cluster manager.
3. The Cluster Manager launches executors, which are JVM processes that run the application's logic.
4. The SparkContext object sends the application code to the executors.
5. Finally, the SparkContext sends tasks to each executor to run (see the sketch after this list).
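
The minimal Scala sketch below illustrates this flow: creating a SparkContext in the driver, which registers with a cluster manager, and running an action whose tasks are scheduled on the executors. The application name and the `local[*]` master URL are placeholders chosen for illustration; on a real cluster the master would point at a standalone, YARN, Kubernetes, or Mesos cluster manager.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SparkContextWalkthrough {
  def main(args: Array[String]): Unit = {
    // The SparkConf names the application and points at a cluster manager.
    // "local[*]" is an assumption for local testing; replace it with your
    // cluster manager's master URL in a real deployment.
    val conf = new SparkConf()
      .setAppName("ContextWalkthrough") // hypothetical application name
      .setMaster("local[*]")

    // Creating the SparkContext in the driver registers the application with
    // the cluster manager, which launches executors on the worker nodes.
    val sc = new SparkContext(conf)

    // This action is split into tasks; the SparkContext schedules those tasks
    // on the executors and collects their results back to the driver.
    val total = sc.parallelize(1 to 1000, numSlices = 4)
      .map(_ * 2)
      .reduce(_ + _)

    println(s"Sum of doubled values: $total")

    // Stopping the context releases the executors and cluster resources.
    sc.stop()
  }
}
```

Calling `reduce` triggers the job: the driver breaks it into tasks (one per partition), ships them to the executors along with the application code, and aggregates the partial results.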