mongodb/docs-spark-connector

Mongo Spark Connector docs page.

ksambhav opened this issue · 1 comment

Referring to the https://docs.mongodb.com/spark-connector/sparkR/ documentation:

My environment:
OS - Windows 10
Java - 1.8
Spark - 2.0.1
MongoDB - 3.2.1
R - 3.1

Command used to launch SparkR:

sparkR --conf spark.mongodb.input.uri=mongodb://127.0.0.1/pud-dev.financialTransaction?readPreference=primaryPreferred --conf spark.mongodb.output.uri=mongodb://127.0.0.1/pud-dev.financialTransaction --packages org.mongodb.spark:mongo-spark-connector_2.11:1.1.0

PROBLEM:

  1. The doc says "In the sparkR shell, SparkContext is available as sc, SQL context is available as sqlContext." However, I am unable to access sqlContext from the SparkR command line: Error: object 'sqlContext' not found. What is the right way to use it then? How can I create a sqlContext, if needed, from the 'spark' session? It would be helpful if that were mentioned in the documentation.
  2. I had to drop the double quotes (") from the command to get it to run. I am not sure whether it works with double quotes in a Linux shell.
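
For reference, a sketch of the same command as it would typically be quoted in a Linux shell; the quotes keep the shell from interpreting the ? in the readPreference query string (the database/collection names are just the ones from my setup above):

```shell
# Quote each URI so that '?' and '&' are not interpreted by the shell
sparkR \
  --conf "spark.mongodb.input.uri=mongodb://127.0.0.1/pud-dev.financialTransaction?readPreference=primaryPreferred" \
  --conf "spark.mongodb.output.uri=mongodb://127.0.0.1/pud-dev.financialTransaction" \
  --packages org.mongodb.spark:mongo-spark-connector_2.11:1.1.0
```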

The MongoDB doc is based on Spark 1.x, while I was using 2.x.
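
For anyone hitting the same error, a sketch of what I believe is the Spark 2.x equivalent (untested here beyond my setup): in the 2.x sparkR shell the entry point is the SparkSession, bound to `spark`, and `read.df` no longer takes a sqlContext as its first argument:

```r
# In the Spark 2.x sparkR shell the session already exists as `spark`;
# in a standalone R script it can be created with:
sparkR.session()

# Read from MongoDB through the connector's data source class.
# In Spark 2.x, read.df() takes the source directly, not a sqlContext.
df <- read.df("", source = "com.mongodb.spark.sql.DefaultSource")
printSchema(df)

# If a 1.x-style sqlContext object is really needed, the deprecated
# sparkRSQL.init() still exists in 2.x (I have not verified this path):
# sqlContext <- sparkRSQL.init(sc)
```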