This Introduction to Spark Programming course introduces the Apache Spark distributed computing engine and is suitable for developers, data analysts, architects, technical managers, and anyone who needs to use Spark in a hands-on manner.
The course provides a solid technical introduction to the Spark architecture and how Spark works. It covers the basic building blocks of Spark (e.g., RDDs and the distributed compute engine), as well as higher-level constructs that provide a simpler and more capable interface (e.g., Spark SQL and DataFrames). It also covers more advanced capabilities such as the use of Spark Streaming to process streaming data and provides an overview of Spark ML (machine learning). Finally, the course explores possible performance issues and strategies for optimization.
The course is very hands-on, with many labs. Participants will interact with Spark through the Spark shell (for interactive, ad hoc processing) as well as through programs using the Spark API.
The Apache Spark distributed computing engine is rapidly becoming a primary tool for processing and analyzing large-scale data sets. It has many advantages over existing engines such as Hadoop MapReduce, including runtime speeds that are 10-100x faster, as well as a much simpler programming model. After taking this course, you will be ready to work with Spark in an informed and productive manner.
Delegates will learn how to:
- Understand the need for Spark in data processing
- Understand the Spark architecture and how it distributes computations to cluster nodes
- Become familiar with basic installation / setup / layout of Spark
- Use the Spark shell for interactive and ad hoc operations
- Understand RDDs (Resilient Distributed Datasets), and data partitioning, pipelining, and computations
- Understand/use RDD operations such as map(), filter(), reduce(), groupByKey(), and join()
- Understand Spark’s data caching and its usage
- Write/run standalone Spark programs with the Spark API
- Use Spark SQL / DataFrames to efficiently process structured data
- Use Spark Streaming to process streaming (real-time) data
- Understand performance implications and optimizations when using Spark
- Become familiar with Spark ML
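To give a flavor of the RDD operations listed above, here is a minimal, pure-Python sketch of the classic word-count pipeline that the course's RDD labs build in Spark. It runs locally with no Spark installation; the equivalent PySpark calls are shown in the comments (the `sc` variable there refers to the SparkContext that the Spark shell provides). The sample input lines are illustrative assumptions.

```python
# Pure-Python sketch of a Spark word-count pipeline (no Spark required).
# Each step's PySpark equivalent is noted in the comment above it.

# In Spark: lines = sc.textFile("path/to/file")
lines = ["spark makes big data simple", "big data needs spark"]

# In Spark: words = lines.flatMap(lambda line: line.split())
words = [w for line in lines for w in line.split()]

# In Spark: pairs = words.map(lambda w: (w, 1))
pairs = [(w, 1) for w in words]

# In Spark: counts = pairs.reduceByKey(lambda a, b: a + b)
# Locally, fold the (word, 1) pairs into a dict of totals.
counts = {}
for word, n in pairs:
    counts[word] = counts.get(word, 0) + n

print(counts)
```

In actual Spark, each of these steps is a lazy transformation on a distributed dataset, executed in parallel across cluster nodes only when an action (such as `collect()` or `saveAsTextFile()`) is invoked; the local version above simply shows what each transformation computes.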