
July 20, 2017

Situation: Created a SparkContext twice by mistake while setting up the Spark MongoDB connector configuration.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("local")
  .appName("SparkMongoDB")
  .config("spark.mongodb.input.uri", "mongodb://127.0.0.1/Spark.sparkCollection")
  .config("spark.mongodb.output.uri", "mongodb://127.0.0.1/Spark.sparkCollection")
  .getOrCreate()  // getOrCreate() already starts a SparkContext for this JVM

// Mistake: creating a second SparkContext in the same JVM
// (spark.driver.allowMultipleContexts=true only suppresses this check)
val sparkConf = new SparkConf().setAppName("SparkMongoDB")
val sc = new SparkContext(sparkConf)  // throws SparkException at this line

This produced the following exception in the console:
Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)

Action: Reviewed https://issues.apache.org/jira/browse/SPARK-2243 and removed the unnecessary SparkContext object, reusing the context already created by the SparkSession (see the sketch below).
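For reference, here is a minimal sketch of the corrected setup. It assumes the mongo-spark-connector 2.x package is on the classpath and a local mongod is listening on 127.0.0.1; the object name SparkMongoDBApp is just a placeholder:

import org.apache.spark.sql.SparkSession
import com.mongodb.spark.MongoSpark

object SparkMongoDBApp {  // placeholder name, not from the original post
  def main(args: Array[String]): Unit = {
    // One SparkSession per JVM; getOrCreate() reuses an existing session.
    val spark = SparkSession.builder()
      .master("local")
      .appName("SparkMongoDB")
      .config("spark.mongodb.input.uri", "mongodb://127.0.0.1/Spark.sparkCollection")
      .config("spark.mongodb.output.uri", "mongodb://127.0.0.1/Spark.sparkCollection")
      .getOrCreate()

    // Reuse the context owned by the session instead of calling new SparkContext(...).
    val sc = spark.sparkContext

    // Load the collection named by spark.mongodb.input.uri as a DataFrame.
    val df = MongoSpark.load(spark)
    df.printSchema()

    spark.stop()
  }
}

Because MongoSpark.load(spark) reads the collection named by spark.mongodb.input.uri, no explicit SparkContext construction is needed anywhere in the job.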

Result: The application now runs without any issues.