By Edwin Bomela, Big Data Engineering

Building Big Data Pipelines with R & Sparklyr & Tableau

Language: English
All Levels

Course description

Welcome to the Building Big Data Pipelines with R & Sparklyr & Tableau course. In this course we will build a big data analytics solution using big data technologies for R. In our use case we will work with raw earthquake data, applying big data processing techniques to extract, transform and load it into usable datasets. Once the data has been processed and cleaned, we will use it as a data source for building predictive analytics and visualizations.

Tableau Desktop is a powerful data visualization tool used for big data analysis and visualization. It allows for data blending, real-time analysis and collaboration on data. No programming is needed for Tableau Desktop, which makes it an easy yet powerful tool for creating dashboards, apps and reports.

Sparklyr is an open-source library for processing big data in R that provides an interface between R and Apache Spark. It lets you take advantage of Spark's ability to process and analyze large datasets in a distributed and interactive manner, and it also exposes Spark's distributed machine learning algorithms and much more.

In this course:

  • You will learn how to create big data processing pipelines using R
  • You will learn machine learning with geospatial data using the Sparklyr library
  • You will learn data analysis using Sparklyr, R and Tableau
  • You will learn how to manipulate, clean and transform data using Spark dataframes
  • You will learn how to create Geo Maps in Tableau Desktop
  • You will also learn how to create dashboards in Tableau Desktop
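
To give a feel for what such a pipeline looks like in code, here is a minimal sketch of the extract, transform and load steps with Sparklyr. The file path and the column names (latitude, longitude, depth, mag) are placeholders rather than the exact schema of the course dataset.

  library(sparklyr)
  library(dplyr)

  # Connect to a local Spark instance
  sc <- spark_connect(master = "local")

  # Extract: read the raw earthquake CSV into a Spark DataFrame
  quakes_raw <- spark_read_csv(sc, name = "quakes_raw",
                               path = "earthquakes.csv",   # placeholder path
                               header = TRUE, infer_schema = TRUE)

  # Transform: clean the data with dplyr verbs, which sparklyr
  # translates to Spark SQL and runs inside Spark
  quakes_clean <- quakes_raw %>%
    filter(!is.na(latitude), !is.na(longitude), !is.na(mag)) %>%
    mutate(mag = as.numeric(mag)) %>%
    select(latitude, longitude, depth, mag)

  # Load: write the cleaned dataset out as a data source for Tableau
  spark_write_csv(quakes_clean, path = "output/quakes_clean", mode = "overwrite")

  spark_disconnect(sc)

Because the transformations are expressed as ordinary dplyr verbs, the same pipeline can move from a laptop to a full Spark cluster simply by changing the master argument of spark_connect().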

Course overview (20 lessons)

  • Introduction

  • R Installation

  • Installing Apache Spark

  • Installing Java (Optional)

  • Testing Apache Spark Installation

  • Installing Sparklyr (sketched after this outline)

  • Data Extraction

  • Data Transformation

  • Data Exporting

  • Data Pre-processing

  • Building the Predictive Model (sketched after this outline)

  • Creating the Prediction Dataset

  • Installing Tableau

  • Loading the Data Sources

  • Creating a Geo Map

  • Creating a Bar Chart

  • Creating a Donut Chart

  • Creating the Magnitude Chart

  • Creating the Dashboard

  • Source Code
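
For the installation and testing lessons, the setup with Sparklyr typically comes down to a few commands. Here is a minimal sketch, assuming a local-mode Spark installation managed by sparklyr itself.

  # Install the sparklyr package from CRAN
  install.packages("sparklyr")

  library(sparklyr)
  library(dplyr)

  # Download a local Apache Spark distribution for sparklyr to manage.
  # Spark runs on the JVM, so a compatible Java installation must be present.
  spark_install()
  spark_installed_versions()

  # Smoke test: connect to local Spark, materialise a tiny DataFrame, disconnect
  sc <- spark_connect(master = "local")
  sdf_len(sc, 10) %>% collect()
  spark_disconnect(sc)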
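
For the Building the Predictive Model and Creating the Prediction Dataset lessons, below is a minimal sketch of geospatial machine learning through Sparklyr's interface to Spark MLlib. The choice of k-means on latitude and longitude, the file paths and the column names are illustrative assumptions, not necessarily the exact model built in the course.

  library(sparklyr)
  library(dplyr)

  sc <- spark_connect(master = "local")

  # Assume the cleaned earthquake data produced by the ETL step is available
  quakes <- spark_read_csv(sc, name = "quakes_clean",
                           path = "output/quakes_clean",
                           header = TRUE, infer_schema = TRUE)

  # Split the data into training and test sets inside Spark
  splits <- sdf_random_split(quakes, training = 0.7, test = 0.3, seed = 42)

  # Fit a k-means model on the geospatial columns (illustrative choice)
  kmeans_model <- ml_kmeans(splits$training, ~ latitude + longitude, k = 5)

  # Create the prediction dataset: attach a cluster label to each event
  predictions <- ml_predict(kmeans_model, splits$test) %>%
    select(latitude, longitude, mag, prediction)

  # Export the predictions as a second data source for Tableau
  spark_write_csv(predictions, path = "output/quakes_predictions", mode = "overwrite")

  spark_disconnect(sc)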

Meet your instructor

Edwin Bomela
Big Data Engineering
Edwin works in big data engineering and consulting and has been involved in multiple projects spanning business intelligence, software engineering, IoT and big data analytics. His expertise is in building data processing pipelines in the Hadoop and cloud ecosystems, as well as software development. He is currently consulting at one of the top business intelligence consultancies, helping clients build data warehouses, data lakes, cloud data processing pipelines and machine learning pipelines. The technologies he uses to meet client requirements include Hadoop, Amazon S3, Python, Django, Apache Spark, MSBI, Microsoft Azure, SQL Server Data Tools, Talend and Elastic MapReduce.