Thursday 21 July 2016

Spark Memory Consumption Optimization

Lowering spark.executor.memory and spark.executor.instances brings down Spark's memory consumption on the cluster.

By default on this cluster, spark.executor.memory is set to 4608m and spark.executor.instances to 2.
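
Before changing anything, the effective values can be checked from inside spark-shell. A minimal sketch, using the SparkContext (sc) the shell provides:

// Inside spark-shell: inspect the effective executor settings.
// getOption returns None if the property is not set in the cluster's config.
sc.getConf.getOption("spark.executor.memory")     // e.g. Some(4608m) on this cluster
sc.getConf.getOption("spark.executor.instances")  // e.g. Some(2)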

When I SSH into the cluster and launch spark-shell with these defaults, this is the memory consumption footprint:

[Screenshot: memory consumption with the default settings]
Next, change spark.executor.memory to 1608m and relaunch spark-shell. Below is the memory consumption footprint:
[Screenshot: memory consumption with spark.executor.memory=1608m]
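
Note that executor settings cannot be changed from inside a running shell; they have to be supplied at launch, either in spark-defaults.conf or on the command line, e.g. spark-shell --conf spark.executor.memory=1608m. Once relaunched, a quick sanity check that the override took effect:

// Inside the relaunched spark-shell: confirm the new value is in use.
sc.getConf.get("spark.executor.memory")   // res0: String = 1608m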
Now change spark.executor.instances to 1 as well and relaunch spark-shell. Below is the memory consumption footprint:
[Screenshot: memory consumption with spark.executor.memory=1608m and spark.executor.instances=1]
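
The same two settings can also be applied programmatically when building a standalone application instead of using the shell. A minimal sketch (the app name is made up for illustration; spark.executor.instances takes effect when running on YARN):

import org.apache.spark.{SparkConf, SparkContext}

// Apply the same overrides on a SparkConf before creating the context.
val conf = new SparkConf()
  .setAppName("MemoryFootprintTest")       // hypothetical app name
  .set("spark.executor.memory", "1608m")   // per-executor heap size
  .set("spark.executor.instances", "1")    // number of executors (YARN)
val sc = new SparkContext(conf)

// ... run jobs ...
sc.stop()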