Spark Error “could not reserve enough space for object heap”

Today I tried compiling the latest version of Apache Spark. Spark uses SBT (Simple Build Tool) for compilation, but the build failed with the error “could not reserve enough space for object heap”. I didn’t dig into the root cause, out of laziness 🙂 , but the following workaround fixed it for me.

Step 1: Go to the Spark installation directory (SPARK_HOME)
Step 2: Download sbt manually from the sbt website
Step 3: Extract the downloaded sbt tarball
Step 4: The extracted sbt launcher will then be present at $SPARK_HOME/sbt/bin
Step 5: Execute the following command from SPARK_HOME

sbt/bin/sbt assembly
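The steps above can be condensed into a short shell session. This is only a sketch: the sbt version and download URL below are assumptions, so pick the current release from the sbt website before running it.

```shell
# Sketch of the manual sbt workaround.
# The version number and URL are assumptions -- use the current
# release listed on the sbt website instead.
cd "$SPARK_HOME"
curl -L -O https://github.com/sbt/sbt/releases/download/v0.13.18/sbt-0.13.18.tgz
tar -xzf sbt-0.13.18.tgz       # extracts into $SPARK_HOME/sbt
sbt/bin/sbt assembly           # build Spark with the freshly extracted sbt
```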

This worked for me… 🙂

Jenkins/Hudson not executing jobs (pending – waiting for next executor)


I faced an issue in Jenkins while running multiple jobs in parallel: up to a certain number of jobs would run concurrently, but any job submitted beyond that went into a waiting state.

So Jenkins was executing jobs in parallel only up to some limit, which means the parallelism parameter needed to be increased somewhere.

  • Go to Jenkins -> Manage Jenkins -> Manage Nodes
  • Check whether all nodes are active and healthy
  • Check whether enough disk space is available on the nodes
  • If everything is fine and the machine has enough hardware resources, you can increase the number of executors.

For that, click Configure on the node and increase the number of executors as per your requirement.
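The disk-space check from the list above can also be scripted on a node. A minimal sketch, assuming the node's workspace lives on the root filesystem; the 90% threshold is an example value:

```shell
# Report whether the root filesystem still has headroom.
# THRESHOLD and the "/" mount point are assumed example values.
THRESHOLD=90
USAGE=$(df -P / | awk 'NR==2 { sub(/%/, "", $5); print $5 }')
if [ "$USAGE" -lt "$THRESHOLD" ]; then
  echo "disk ok (${USAGE}% used)"
else
  echo "disk low (${USAGE}% used)"
fi
```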

NB: If Jenkins is not executing any jobs at all, it might be because the master node is offline.
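You can check node status from the command line as well. A hedged sketch against Jenkins’ JSON API (`/computer/api/json` is the real endpoint for node status; the URL, user, and API token below are placeholders for your setup):

```shell
# List each Jenkins node and whether it is offline.
# JENKINS_URL, USER and API_TOKEN are placeholders for your setup.
JENKINS_URL="http://localhost:8080"
curl -s -u "USER:API_TOKEN" "$JENKINS_URL/computer/api/json?pretty=true" \
  | grep -E '"displayName"|"offline"'
```

A node reported with `"offline" : true` (the built-in master shows up as a node here too) explains jobs sitting in the pending state.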

An Introduction to Distributed Systems