Various Entry Points for Apache Spark
In **Data Engineering**, Apache Spark is probably one of the most popular frameworks for processing huge volumes of data. In this blog post I am going to cover the various entry points for Spark applications and how they have evolved across releases.
Every Spark Application needs an entry point that allows it to communicate with data sources and perform operations such as reading and writing data.
In Spark 1.x, three entry points were introduced:
- SparkContext
- SQLContext
- HiveContext
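For reference, here is a minimal sketch of how these three Spark 1.x entry points were typically created in Scala (the application name, master setting, and JSON path are placeholder assumptions for illustration):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.hive.HiveContext

object EntryPointsExample {
  def main(args: Array[String]): Unit = {
    // SparkContext: the core entry point, built from a SparkConf
    val conf = new SparkConf().setAppName("EntryPointsExample").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // SQLContext: entry point for structured (DataFrame / SQL) processing
    val sqlContext = new SQLContext(sc)

    // HiveContext: a superset of SQLContext that adds Hive support
    // (HiveQL, Hive UDFs, and access to the Hive metastore)
    val hiveContext = new HiveContext(sc)

    // Example usage: read a JSON file as a DataFrame (hypothetical path)
    val df = sqlContext.read.json("path/to/people.json")
    df.show()

    sc.stop()
  }
}
```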
The rest of the blog can be read here: Various Entry Points for Apache Spark
Looking to learn Big Data? Join our Big Data Masters Program training to learn the concepts in depth according to industry standards.