Run a Scala program in Apache Spark

Hi Readers,

In this post you will learn how to run a Scala program in Apache Spark.

On your machine you need to install sbt (the Scala build tool), Java 8, and have access to a Spark cluster (for example on AWS, or using a Cloudera VM).

Open a command prompt and run ‘sbt new scala/hello-world.g8’ to generate a new Scala project from the hello-world template.
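For example, a typical session looks like the sketch below (the template prompts you for a project name; "hello-spark" is just an assumed example, and the generated layout may vary slightly between template versions):

```
$ sbt new scala/hello-world.g8
name [Hello World template]: hello-spark

Typical generated layout:
hello-spark/
├── build.sbt
├── project/
└── src/main/scala/Main.scala
```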

Write your logic in the src/main/scala/Main.scala file.
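As a minimal sketch of what Main.scala might contain, here is a small word count using the SparkSession API. It assumes the spark-sql dependency is declared in build.sbt; the app name and the in-memory sample data are illustrative only:

```scala
// src/main/scala/Main.scala
import org.apache.spark.sql.SparkSession

object Main {
  def main(args: Array[String]): Unit = {
    // local[*] runs Spark on all local cores; on a real cluster the master
    // is usually supplied by spark-submit rather than hard-coded here.
    val spark = SparkSession.builder()
      .appName("hello-spark")
      .master("local[*]")
      .getOrCreate()

    import spark.implicits._

    // Count words in a small in-memory dataset.
    val lines = Seq("hello spark", "hello scala").toDS()
    val counts = lines
      .flatMap(_.split("\\s+"))
      .groupBy("value")
      .count()

    counts.show()
    spark.stop()
  }
}
```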

To compile, run, or package the Scala project, open a command prompt at the root path of the project:

to compile – sbt compile

to create a package (jar) – sbt package

to run the project – sbt run
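After sbt package, the jar can be submitted to the Spark cluster with spark-submit. A sketch, assuming Scala 2.11 and the example project name above (check target/scala-*/ for the actual jar file name, and adjust the main class and master to your setup):

```
spark-submit \
  --class Main \
  --master yarn \
  target/scala-2.11/hello-spark_2.11-0.1.jar
```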

On the Spark cluster, please check that the Spark version is the same as the Spark version declared in build.sbt.
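A minimal build.sbt sketch showing where those versions are declared. The version numbers here are assumptions for a Java 8 / Spark 2.x setup; replace them with whatever your cluster actually runs:

```scala
// build.sbt -- version numbers are assumptions, adjust to match your cluster
name := "hello-spark"
version := "0.1"
scalaVersion := "2.11.12"  // must match the Scala version Spark was built with

val sparkVersion = "2.4.8" // must match the Spark version on your cluster
libraryDependencies ++= Seq(
  // "provided" keeps Spark out of the jar, since the cluster supplies it
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql"  % sparkVersion % "provided"
)
```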

You can also bookmark this page for future reference.

You can share this page with your friends.

Follow me, Jose Praveen, for notifications about future posts.
