Spark Chef Cookbook

Installs and configures Apache Spark.

Requirements

Java must be installed. This cookbook does not declare a dependency on it, but you are encouraged to use the Java cookbook.
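If you manage this through a wrapper cookbook, one way to pull Java in is via that cookbook's metadata and recipe. This is a minimal sketch assuming the community java cookbook; the wrapper cookbook name is a placeholder, not part of this cookbook:

# metadata.rb of a hypothetical wrapper cookbook
name    'spark-wrapper'
version '0.1.0'
depends 'java'   # community Java cookbook (assumed here)
depends 'spark'  # this cookbook

# recipes/default.rb of the wrapper cookbook
include_recipe 'java'   # install a JDK first
include_recipe 'spark'  # then install and configure Spark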

Attributes

spark::default

| Key | Type | Description | Default |
|-----|------|-------------|---------|
| ['spark']['version'] | String | Apache Spark version | 1.0.0 |
| ['spark']['url'] | String | URL to download the tarball from | n/a |
| ['spark']['home'] | String | Directory to install Spark in | /usr/local/spark |
| ['spark']['username'] | String | User that all Spark daemons run as | spark |
| ['spark']['local_dirs'] | String | Comma-separated list of directories Spark uses to persist shuffle data | "/usr/local/spark/local_dir" |
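These attributes can be overridden in the usual Chef ways, for example from a wrapper cookbook's attributes file. The sketch below is illustrative only; the download URL and directory values are placeholders you would replace with your own:

# attributes/default.rb of a hypothetical wrapper cookbook
override['spark']['version']    = '1.0.0'
override['spark']['url']        = 'http://example.com/spark-1.0.0-bin-hadoop2.tgz' # placeholder URL
override['spark']['home']       = '/opt/spark'
override['spark']['local_dirs'] = '/mnt/spark/local_dir1,/mnt/spark/local_dir2'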

Usage

Just include spark in your node's run_list:

{
  "name":"my_node",
  "run_list": [
    "recipe[spark]"
  ]
}
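Alternatively, the same run_list can be applied through a role, which also keeps attribute overrides in one place. This is a sketch of a hypothetical role file, not something shipped with the cookbook:

# roles/spark.rb (hypothetical role, shown for illustration)
name 'spark'
description 'Installs and configures Apache Spark'
run_list 'recipe[spark]'
override_attributes(
  'spark' => {
    'home'     => '/opt/spark',
    'username' => 'spark'
  }
)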