Introduction to Big Data
Starts on March 20, 2017
Ends on April 10, 2017
3 sequences
Level: Introductory
Computer Science & Programming
Language: English
5.4 points
Community reviews: 0 reviews (no rating yet)

Key information

Free course
9 hours of coursework

Summary

Interested in increasing your knowledge of the Big Data landscape? This course is for those new to data science and interested in understanding why the Big Data Era has come to be. It is for those who want to become conversant with the terminology and the core concepts behind big data problems, applications, and systems. It is for those who want to start thinking about how Big Data might be useful in their business or career. It provides an introduction to one of the most common frameworks, Hadoop, that has made big data analysis easier and more accessible -- increasing the potential for data to transform our world!

At the end of this course, you will be able to:

  • Describe the Big Data landscape, including examples of real-world big data problems and the three key sources of Big Data: people, organizations, and sensors.
  • Explain the V's of Big Data (volume, velocity, variety, veracity, valence, and value) and why each impacts data collection, monitoring, storage, analysis, and reporting.
  • Get value out of Big Data by using a 5-step process to structure your analysis.
  • Identify what are and what are not big data problems, and recast big data problems as data science questions.
  • Explain the architectural components and programming models used for scalable big data analysis.
  • Summarize the features and value of core Hadoop stack components, including the YARN resource and job management system, the HDFS file system, and the MapReduce programming model.
  • Install and run a program using Hadoop!

This course is for those new to data science. No prior programming experience is needed, although the ability to install applications and use a virtual machine is necessary to complete the hands-on assignments.

Hardware requirements: (A) quad-core processor (VT-x or AMD-V support recommended), 64-bit; (B) 8 GB RAM; (C) 20 GB of free disk space. To find your hardware information: on Windows, open System by clicking the Start button, right-clicking Computer, and then clicking Properties; on Mac, open the Apple menu and click "About This Mac." Most computers with 8 GB RAM purchased in the last 3 years will meet the minimum requirements. You will also need a high-speed internet connection, because you will be downloading files up to 4 GB in size.

Software requirements: This course relies on several open-source software tools, including Apache Hadoop. All required software can be downloaded and installed free of charge. Supported platforms: Windows 7+, Mac OS X 10.10+, Ubuntu 14.04+, or CentOS 6+, with VirtualBox 5+.
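As a quick self-check against the hardware minimums above, here is a minimal sketch (not part of the course materials) using only the Python standard library. The 4-core / 8 GB / 20 GB thresholds are taken from the requirements listed above; the RAM probe via os.sysconf is an assumption that only holds on Unix-like systems (Linux, macOS), so it is guarded.

    # Hypothetical hardware self-check (not part of the course materials).
    # Compares this machine against the stated minimums: 4 cores, 8 GB RAM,
    # 20 GB of free disk space.
    import os
    import shutil

    MIN_CORES, MIN_RAM_GB, MIN_DISK_GB = 4, 8, 20

    cores = os.cpu_count() or 0
    print(f"CPU cores: {cores} (minimum {MIN_CORES})")

    try:
        # os.sysconf is only available on Unix-like systems (Linux, macOS);
        # on Windows this raises AttributeError and we fall through.
        ram_gb = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES") / 1024 ** 3
        print(f"RAM: {ram_gb:.1f} GB (minimum {MIN_RAM_GB})")
    except (AttributeError, ValueError, OSError):
        print("RAM: could not detect automatically; use the steps described above")

    free_gb = shutil.disk_usage(os.path.expanduser("~")).free / 1024 ** 3
    print(f"Free disk: {free_gb:.1f} GB (minimum {MIN_DISK_GB})")

On Windows the script still reports cores and free disk space; check RAM through the System dialog described above.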


The program

  • Week 1 - Welcome
    Welcome to the Big Data Specialization! We're excited for you to get to know us and we're looking forward to learning about you!
  • Week 1 - Big Data: Why and Where
    Data -- it's been around (even digitally) for a while. What makes data "big" and where does this big data come from?
  • Week 2 - Characteristics of Big Data and Dimensions of Scalability
    You may have heard of the "Big Vs". We'll give examples and descriptions of the commonly discussed 5. But, we want to propose a 6th V and we'll ask you to practice writing Big Data questions targeting this V -- value.
  • Week 2 - Data Science: Getting Value out of Big Data
    We love science and we love computing, don't get us wrong. But the reality is we care about Big Data because it can bring value to our companies, our lives, and the world. In this module we'll introduce a 5-step process for approaching data science problems.
  • Week 3 - Foundations for Big Data Systems and Programming
    Big Data requires new programming frameworks and systems. For this course, we don't require programming knowledge or experience -- but we do want to give you a grounding in some of the key concepts.
  • Week 3 - Systems: Getting Started with Hadoop
    Let's look at some details of Hadoop and MapReduce. Then we'll go "hands on" and actually perform a simple MapReduce task in the Cloudera VM. Pay attention, as we'll guide you in "learning by doing" by diagramming a MapReduce task as a Peer Review. (A minimal sketch of such a task appears just after this list.)
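To give a flavor of what that hands-on MapReduce task involves, here is a minimal, hypothetical word-count sketch in the Hadoop Streaming style -- it is not the course's own assignment code, just an illustration of the mapper/reducer pattern: the mapper emits one "word<TAB>1" line per word, the framework (or a local sort) groups lines by key, and the reducer sums the counts.

    # Hypothetical word-count sketch in the Hadoop Streaming style (illustration
    # only, not the course's assignment code). The mapper emits "word<TAB>1";
    # the reducer assumes its input is sorted by key, as Hadoop guarantees
    # between the map and reduce phases, and sums the counts per word.
    import sys

    def mapper():
        for line in sys.stdin:
            for word in line.strip().split():
                print(f"{word}\t1")

    def reducer():
        current_word, current_count = None, 0
        for line in sys.stdin:
            word, count = line.rstrip("\n").split("\t", 1)
            if word == current_word:
                current_count += int(count)
            else:
                if current_word is not None:
                    print(f"{current_word}\t{current_count}")
                current_word, current_count = word, int(count)
        if current_word is not None:
            print(f"{current_word}\t{current_count}")

    if __name__ == "__main__":
        mapper() if len(sys.argv) > 1 and sys.argv[1] == "map" else reducer()

You can try it locally without Hadoop: cat input.txt | python3 wordcount.py map | sort | python3 wordcount.py reduce. On the Cloudera VM, the same mapper and reducer commands would typically be handed to the Hadoop Streaming jar.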

Instructors

  • Ilkay Altintas, Chief Data Science Officer
    San Diego Supercomputer Center
  • Amarnath Gupta, Director, Advanced Query Processing Lab
    San Diego Supercomputer Center (SDSC)

The creator

UC San Diego is an academic powerhouse and economic engine, recognized as one of the top 10 public universities by U.S. News and World Report. Innovation is central to who we are and what we do. Here, students learn that knowledge isn't just acquired in the classroom—life is their laboratory.

The platform

Coursera is a digital company offering massive open online courses, founded by Stanford computer science professors Andrew Ng and Daphne Koller and based in Mountain View, California.

What most sets it apart from other MOOC platforms is that it works only with the world's top universities and organizations and publishes their content on the web.

