What you'll learn:
- Learn about the similarities and differences between Spark and Hadoop.
- Explore the challenges Spark tries to address, which will give you a good idea of the need for Spark.
- Learn how Spark is faster than Hadoop and understand the reasons behind Spark's performance and efficiency.
- Before we talk about what an RDD is, we explain in detail why something like RDD is needed.
- You will get a strong foundation in understanding RDDs in depth, and then we take a step further to point out and clarify some common misconceptions about RDDs among new Spark learners (a short code sketch after this list gives a first look at what basic RDD code looks like).
- You will understand the types of dependencies between RDDs and, more importantly, why these dependencies matter.
- We will walk you through, step by step, how the program you write gets translated into actual execution behind the scenes in a Spark cluster.
- You will get a very good understanding of some of the key concepts behind Spark's execution engine and why it is so efficient.
- Master fault tolerance by simulating a failure and examining how Spark recovers from it.
- You will learn how Spark manages memory and the contents it keeps there.
- Understand the need for a new programming language like Scala.
- Examine object oriented programming vs. functional programming.
- Explore Scala's features and functions.
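To give you a first taste of the RDD and Scala topics listed above, here is a minimal sketch, assuming the Spark core library is on the classpath and the application runs in local mode; the object name StarterKitExample is just for illustration, not part of the course material.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object StarterKitExample {
  def main(args: Array[String]): Unit = {
    // A minimal local configuration; on a real cluster the master URL would differ.
    val conf = new SparkConf().setAppName("StarterKitExample").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Create an RDD from an in-memory collection.
    val numbers = sc.parallelize(1 to 10)

    // Transformations such as map are lazy; nothing executes yet.
    val squares = numbers.map(n => n * n)

    // An action such as reduce triggers the actual execution.
    val sum = squares.reduce(_ + _)
    println(s"Sum of squares: $sum")

    sc.stop()
  }
}
```

In the course you will see why the transformation above stays lazy until the action runs, and how that distinction drives Spark's execution model.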
When our students asked us to create a course on Spark, we looked at other Spark-related courses on the market and at the common questions students ask on sites like Stack Overflow and other forums when they try to learn Spark, and we saw a recurring theme.
Most courses and other online help, including Spark's documentation, do a poor job of helping students understand the foundational concepts. They explain what Spark is, what an RDD is, what "this" is and what "that" is, but students are most interested in understanding the core fundamentals and, more importantly, answering questions like:
- Why do we need Spark when we have Hadoop?
- What is the need for RDDs?
- How is Spark faster than Hadoop?
- How does Spark achieve the speed and efficiency it claims?
- How is memory managed in Spark?
- How does fault tolerance work in Spark?
That is exactly what you will learn in this free Spark Starter Kit course. The aim of this course is to give you a strong foundation in Spark.