Learn how to use the Hadoop ecosystem
Understanding Hadoop is a highly valuable skill for anyone working with large amounts of data. Companies such as Amazon, eBay, Facebook, Google, LinkedIn, Spotify, and Twitter all use Hadoop in some way to process huge volumes of data.
On this three-week course, you’ll become familiar with Hadoop’s ecosystem and understand how to apply Hadoop skills in the real world.
After exploring the history and key terminology of Hadoop, you’ll walk through the installation process on your desktop to help you get started.
Explore Hadoop Distributed File System (HDFS)
With a solid introduction to Hadoop, you’ll learn how to manage big data on a cluster with the Hadoop Distributed File System (HDFS).
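To give you a taste, day-to-day file management on HDFS is often done with the `hdfs dfs` command-line tool. The sketch below, which assumes a working Hadoop installation and uses a hypothetical directory and `sales.csv` file, shows the basic operations scripted from Python:

```python
import subprocess

def hdfs(*args):
    """Run an `hdfs dfs` subcommand and return its output."""
    result = subprocess.run(
        ["hdfs", "dfs", *args],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

# Create a directory on the cluster and copy a local file into it.
hdfs("-mkdir", "-p", "/user/demo/data")        # hypothetical HDFS path
hdfs("-put", "sales.csv", "/user/demo/data/")  # hypothetical local file

# List the directory and show the start of the uploaded file.
print(hdfs("-ls", "/user/demo/data"))
print(hdfs("-cat", "/user/demo/data/sales.csv")[:500])
```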
You’ll also discover MapReduce, understanding what it is and how it’s used, before moving on to programming Hadoop with Pig and Spark.
With this knowledge, you’ll be able to start analysing data on Hadoop.
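To give a flavour of that kind of programming, here is the classic word-count job, often called the “hello world” of MapReduce, expressed with PySpark. It’s a minimal sketch that assumes a local Spark installation and a hypothetical input file:

```python
from pyspark.sql import SparkSession

# Start a local Spark session (assumes PySpark is installed).
spark = (SparkSession.builder
         .appName("WordCount")
         .master("local[*]")
         .getOrCreate())
sc = spark.sparkContext

# Map phase: split each line into words and pair each word with a count of 1.
# Reduce phase: sum the counts for each distinct word.
counts = (
    sc.textFile("hdfs:///user/demo/data/book.txt")  # hypothetical input path
      .flatMap(lambda line: line.split())
      .map(lambda word: (word, 1))
      .reduceByKey(lambda a, b: a + b)
)

# Show the ten most frequent words.
for word, n in counts.takeOrdered(10, key=lambda pair: -pair[1]):
    print(word, n)

spark.stop()
```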
Understand MySQL and NoSQL
Next, you’ll learn how to do more with your data by understanding how to store and query it. To help you do this, you’ll learn how to use applications such as Sqoop, Hive, MySQL, Phoenix, and MongoDB.
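As a taste of querying structured data on Hadoop, the sketch below runs a HiveQL query from Python using the PyHive library; the host, table, and column names are illustrative assumptions rather than part of the course materials:

```python
from pyhive import hive  # pip install 'pyhive[hive]'

# Connect to a HiveServer2 instance (hypothetical host and credentials).
conn = hive.Connection(host="localhost", port=10000, username="demo")
cursor = conn.cursor()

# HiveQL looks much like standard SQL: aggregate orders per customer.
cursor.execute("""
    SELECT customer_id, COUNT(*) AS orders
    FROM sales
    GROUP BY customer_id
    ORDER BY orders DESC
    LIMIT 10
""")

for customer_id, orders in cursor.fetchall():
    print(customer_id, orders)

conn.close()
```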
Develop core data analyst skills
Finally, you’ll hone your data analyst skills by learning how to query data interactively. You’ll also gain an overview of Presto and learn how to install it so you can quickly query data of any size.
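For instance, once Presto is running, interactive queries can be issued from Python with the presto-python-client package. The connection details and table below are assumptions for illustration:

```python
import prestodb  # pip install presto-python-client

# Connect to a Presto coordinator querying the Hive catalog (hypothetical host).
conn = prestodb.dbapi.connect(
    host="localhost", port=8080, user="demo",
    catalog="hive", schema="default",
)
cursor = conn.cursor()

# Presto speaks ANSI SQL and returns results interactively, even on big tables.
cursor.execute("SELECT COUNT(*) FROM sales")
print(cursor.fetchone()[0])
```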
By the end of the course, you’ll have the skills to work effectively with big data using Hadoop and be able to streamline your data processing.
This course is designed for anyone who works with big data.
You don’t need any prior experience of using Hadoop as you’ll start with the very basics.
On this course, we’ll show you how to install the Hadoop environment on your operating system.