Spark Error “could not reserve enough space for object heap”

Today I tried compiling the latest version of Apache Spark. Spark uses SBT (Simple Build Tool) for its build, but the build failed with the error “could not reserve enough space for object heap”. I didn’t dig into the root cause of the error, out of laziness 🙂 . I tried an alternate fix instead, and it worked for me.
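For context, the JVM prints this message at startup when it cannot allocate the heap it was asked for (its `-Xmx` setting), so another possible fix is to shrink the heap the sbt launcher requests. The exact values below are assumptions, not something from the original post; `SBT_OPTS` is the environment variable the sbt launcher script reads for extra JVM flags:

```shell
# Hypothetical alternative fix (untested in the original post): ask for a
# smaller heap so the JVM can actually reserve it, then build as usual.
# The 512m/256m values are illustrative; MaxPermSize only applies to
# Java 7-era JVMs, which were current when this was written.
export SBT_OPTS="-Xmx512m -XX:MaxPermSize=256m"
# sbt/bin/sbt assembly    # run this from SPARK_HOME
```

The build command is left commented out above since it only makes sense inside a Spark source tree.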

Step 1: Go to the Spark installation directory (SPARK_HOME)
Step 2: Download sbt manually from the sbt website http://www.scala-sbt.org/download.html
Step 3: Extract the downloaded sbt tarball into SPARK_HOME
Step 4: The sbt launcher will then be at $SPARK_HOME/sbt/bin/sbt
Step 5: Execute the following command from SPARK_HOME

sbt/bin/sbt assembly
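The steps above can be sketched as a small script. To keep it runnable anywhere, this sketch builds a mock sbt tarball instead of downloading the real one; in practice you would fetch the actual tarball from http://www.scala-sbt.org/download.html (the exact file name depends on the current sbt release):

```shell
set -e
WORK=$(mktemp -d)          # stands in for SPARK_HOME in this sketch
cd "$WORK"

# --- Mock of step 2: fake the downloaded sbt tarball.
# Replace this section with a real wget/curl of the sbt tarball.
mkdir -p staging/sbt/bin
printf '#!/bin/sh\necho sbt launcher\n' > staging/sbt/bin/sbt
chmod +x staging/sbt/bin/sbt
tar -czf sbt.tgz -C staging sbt

# --- Steps 3-4: extract into SPARK_HOME; the launcher lands at sbt/bin/sbt.
tar -xzf sbt.tgz
test -x sbt/bin/sbt && echo "launcher ready"

# --- Step 5 (real build): sbt/bin/sbt assembly
sbt/bin/sbt                # here the mock launcher just echoes
```

The point of the layout check is that the tarball’s top-level `sbt/` directory must sit directly inside SPARK_HOME, so that `sbt/bin/sbt assembly` resolves.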

This worked for me… 🙂


About amalgjose
I am an Electrical Engineer by qualification, now working as a Software Engineer. I am very interested in the Electrical, Electronics, Mechanical, and now Software fields, and I like exploring things in all of them. I enjoy travelling and long drives, and I am very much addicted to music.

