In this post, you will learn how to run a Scala program on Apache Spark.
On your machine, you need to install sbt (the Scala build tool) and Java 8, and you need access to a Spark cluster (for example, on AWS).
Open a command prompt and run ‘sbt new scala/hello-world.g8’ to generate a project from the hello-world template.
Write your logic in the src/main/scala/Main.scala file.
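As a starting point, here is a minimal sketch of what Main.scala might look like. The greet helper is a hypothetical stand-in for "your logic"; the generated template itself only prints "Hello, World!".

```scala
// src/main/scala/Main.scala — minimal sketch based on the hello-world.g8 template.
object Main {
  // Hypothetical example of application logic; replace with your own.
  def greet(name: String): String = s"Hello, $name!"

  def main(args: Array[String]): Unit = {
    println(greet("World"))
  }
}
```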
To compile, run, or package the Scala project, open a command prompt at the root path of the project:
to compile – sbt compile
to create a package – sbt package
to run the project – sbt run
On the Spark cluster, check that the installed Spark version is the same as the Spark version declared in build.sbt.
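For reference, a build.sbt might declare the Spark dependency like this. This is only a sketch: the project name and the Scala and Spark version numbers below are assumptions, so match them to what your cluster actually runs.

```scala
// build.sbt — sketch; the versions shown are assumptions, align them with your cluster.
name := "spark-hello"
version := "0.1"
scalaVersion := "2.12.18" // Spark 3.x is built against Scala 2.12 or 2.13

// "provided" because the cluster supplies the Spark libraries at runtime
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.5.1" % "provided"
```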
You can also bookmark this page for future reference.
You can share this page with your friends.
Follow me for future notifications.