Spark Error “could not reserve enough space for object heap”

Today I tried compiling the latest version of Apache Spark. Spark uses sbt (Simple Build Tool) for compilation, but the build failed with the error “could not reserve enough space for object heap”. I didn’t dig into the root cause, out of laziness 🙂 , but I tried an alternate fix and it worked for me.

Step 1: Go to the Spark installation directory (SPARK_HOME)
Step 2: Download sbt manually from the sbt website
Step 3: Extract the downloaded sbt tarball
Step 4: The extracted sbt will now be present inside $SPARK_HOME/sbt/bin
Step 5: Execute the following command from SPARK_HOME

sbt/bin/sbt assembly
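The five steps above can be sketched as a small shell script. The sbt tarball here is a mock created locally (just an echo stub) so the sequence can be tried offline; in practice you would download the real tarball from the sbt website in step 2:

```shell
#!/bin/sh
# Sketch of steps 1-5. The "sbt" tarball is a local mock, not the real
# download; substitute the actual tarball from the sbt website.
set -e
SPARK_HOME="$(mktemp -d)"   # stand-in for your real Spark directory
cd "$SPARK_HOME"            # step 1: go to SPARK_HOME

# Stand-in for step 2 (download): fabricate a tiny sbt launcher stub
mkdir -p sbt/bin
printf '#!/bin/sh\necho "sbt invoked with: $*"\n' > sbt/bin/sbt
chmod +x sbt/bin/sbt
tar -czf sbt.tgz sbt
rm -r sbt

# Step 3: extract the tarball inside SPARK_HOME
tar -xzf sbt.tgz            # step 4: sbt is now in $SPARK_HOME/sbt/bin

# Step 5: run the assembly build from SPARK_HOME
sbt/bin/sbt assembly
```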

This worked for me… 🙂

“cannot remove `libtoolT’: No such file or directory”.

Sometimes while compiling tools or source code on Linux, we may face an error like this:

“cannot remove `libtoolT’: No such file or directory”.

A quick fix for that error: edit the configure file, search for the line containing $RM “$cfgfile”, and replace it with $RM -f “$cfgfile”. The -f flag makes rm ignore a file that doesn’t exist instead of aborting.
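The edit can also be done with a sed one-liner. The stand-in configure file below contains just the problematic line so the substitution is easy to verify; on a real build you would run the same sed command against the actual configure script (keeping the .bak backup it creates):

```shell
#!/bin/sh
set -e
# Stand-in configure file containing only the problematic line;
# a real configure script is generated by autotools.
workdir="$(mktemp -d)"
cd "$workdir"
printf '%s\n' '$RM "$cfgfile"' > configure

# The fix: add -f so rm does not fail when libtoolT/cfgfile is missing.
# -i.bak edits in place and keeps the original as configure.bak.
sed -i.bak 's/\$RM "\$cfgfile"/\$RM -f "\$cfgfile"/' configure
cat configure
```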

I faced this issue while compiling the Apache httpd server and PHP.

This fix worked for me. 🙂