Q-3: What is the difference between SparkSession and SparkContext in Apache Spark?

Published: 21 October 2024
on the channel: AICoders

SparkContext
Before Spark 2.0, SparkContext was the primary entry point to Spark's capabilities. Once a SparkContext is created, we can use it to create RDDs, broadcast variables, and accumulators.
However, to use the SQL, Hive, and Streaming APIs, we had to create a separate context for each: SQLContext, HiveContext, and StreamingContext.


In Spark 2.0, a new entry point called SparkSession was introduced, built for the Dataset and DataFrame APIs.
It combines SparkContext, SQLContext, HiveContext, and StreamingContext, and all of the APIs available in those contexts are likewise available through SparkSession.