By default, spark.executor.memory is 4608m and spark.executor.instances is 2.
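A quick way to confirm the values a given installation actually picks up is to check them from inside a running spark-shell (this is only a minimal check; sc is the SparkContext the shell creates automatically, and getOption returns None when the property has not been set explicitly):

    scala> // Returns Some(value) if the property was set, e.g. via spark-defaults.conf
    scala> sc.getConf.getOption("spark.executor.memory")
    scala> sc.getConf.getOption("spark.executor.instances")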
Change spark.executor.memory to 1608m and run spark-shell. The memory consumption footprint is shown below.
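One way to apply this override, assuming the shell is launched from the command line, is the standard --conf flag (the same property can also be set in spark-defaults.conf):

    # Launch the shell with the executor memory overridden for this run
    spark-shell --conf spark.executor.memory=1608m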
Now change spark.executor.instances to 1 and run spark-shell again. The memory consumption footprint is shown below.
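As a sketch of this step, assuming the memory override from the previous run is kept and that the cluster manager honours spark.executor.instances (it applies on YARN/Kubernetes when dynamic allocation is disabled):

    # Launch the shell with a single executor and the reduced memory setting
    spark-shell --conf spark.executor.memory=1608m --conf spark.executor.instances=1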