About the content
The Apache™ Hadoop® project develops open-source software for reliable, scalable, distributed computing. Learn the fundamental principles behind it, and how you can use its power to make sense of your Big Data.
Lesson 1: What is "Big Data"? The dimensions of Big Data. Scaling problems. HDFS and the Hadoop ecosystem.
Lesson 2: The basics of HDFS, MapReduce, and the Hadoop cluster.
Lesson 3: Writing MapReduce programs to answer questions about data.
Lesson 4: MapReduce design patterns.
Final Project: Answering questions about big sales data and analyzing large website logs.
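To give a flavor of the MapReduce programs covered in Lesson 3, here is a minimal word-count sketch in plain Python. It mimics the map, shuffle/sort, and reduce phases locally; the function names and the sample input are illustrative, not taken from the course materials, and a real job would run these phases across a Hadoop cluster (for example via Hadoop Streaming).

```python
from itertools import groupby
from operator import itemgetter

def mapper(lines):
    # Map phase: emit a (word, 1) pair for every word in the input.
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reducer(pairs):
    # Shuffle/sort is simulated by sorted(); Hadoop groups pairs by key
    # before the reduce phase. Sum the counts for each word.
    for word, group in groupby(sorted(pairs), key=itemgetter(0)):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    sample = ["big data big ideas", "data everywhere"]
    print(dict(reducer(mapper(sample))))
    # → {'big': 2, 'data': 2, 'everywhere': 1, 'ideas': 1}
```

The same map/reduce structure generalizes to the course's final-project questions (e.g., summing sales per store instead of counting words).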
- Sarah Sproehnle is the Vice President of Educational Services at Cloudera, a company that helps develop, manage, and support Apache Hadoop. While she is a geek at heart, her passion is helping people learn complex technology. In addition to teaching people how to use Hadoop, she's taught database administration, various programming languages, and system administration.
- Ian Wrigley is currently the Senior Curriculum Manager at Cloudera, responsible for the team that creates all the company's Hadoop training materials. He's been a tech journalist, an instructor, and a course author for over 20 years, during which time he's taught everything from C programming to copywriting for the Web. He describes his job as "teaching geeks to be geekier".
Udacity is a for-profit educational organization founded by Sebastian Thrun, David Stavens, and Mike Sokolsky offering massive open online courses (MOOCs). According to Thrun, the origin of the name Udacity comes from the company's desire to be "audacious for you, the student". While it originally focused on offering university-style courses, it now focuses more on vocational courses for professionals.