This is a Scala Maven project.
If you are running Spark on Windows (without installing HDFS):

1. Download `winutils.exe` from `spark-tutorial/winutils`.
2. Create a local `bin` folder on a drive of your choice and place the file there, e.g. `C:\SparkDev\bin\winutils.exe`.
3. Set the environment variable in your IDE. For Eclipse: right-click on the project -> Run As -> Run Configurations -> Scala Application. In the Environment tab of the required configuration, add a new variable `HADOOP_HOME` with the value `C:\SparkDev`.
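As an alternative to configuring `HADOOP_HOME` in the IDE, the setup above can also be sketched programmatically by setting the `hadoop.home.dir` system property before creating the Spark session. This is a minimal sketch, assuming winutils is installed at `C:\SparkDev\bin\winutils.exe` as described above; the object name `SetupCheck` is illustrative.

```scala
object SetupCheck {
  def main(args: Array[String]): Unit = {
    // Assumption: winutils.exe lives in C:\SparkDev\bin, per the steps above.
    // Setting hadoop.home.dir here has the same effect as the HADOOP_HOME
    // environment variable configured in the Eclipse run configuration.
    System.setProperty("hadoop.home.dir", "C:\\SparkDev")

    // A SparkSession can then be created as usual, e.g.:
    // val spark = org.apache.spark.sql.SparkSession.builder()
    //   .master("local[*]")
    //   .appName("demo")
    //   .getOrCreate()
  }
}
```

Either approach works; the system property only applies to the current JVM, while the IDE environment variable applies to every run configuration that sets it.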
For general compatibility issues with Scala IDE, refer to this helpful video tutorial: "How to set up a Spark project with Scala IDE, Maven and GitHub" https://www.youtube.com/watch?v=aB4-RD_MMf0