
Building a Superset Dashboard and Pipeline Using Apache Airflow and Google Cloud SQL

October 28, 2019

This blog post was written by WeCloudData's Data Science Bootcamp student Ryan Kang.

Like Amazon AWS, Google Cloud is a popular cloud platform used by data analytics companies. It supports continuous workflow automation and big-data computation. In this blog, I will briefly introduce how I set up Google Cloud for my workflow.

Each new Google Cloud account includes a $300 free trial credit, and creating a Google Cloud account in the console is as easy as creating a Gmail account, so one should consider signing up now to take advantage of this great tool!

The Google Cloud Console offers a variety of interesting features, such as Compute Engine virtual machines (VMs), App Engine, storage, and more. There are, however, three dominant components I will be expanding on. The following components are used for the workflow:

  • Google Cloud VM, the remote server, which functions like a regular computer
  • Google Cloud Storage, the remote hard disk
  • Google Cloud SQL, the remote SQL server

Key points for the VM:

  1. Better machine configurations cost more, so it is important not to over-provision the VM
  2. Setting the VM to allow access to the Cloud APIs (the "Allow full access to all Cloud APIs" access scope) is very important for data processing
  3. The connection between the local machine and the VM is SSH (Secure Shell), which is conducted via the Google Cloud SDK; we get into the VM through the command line, as shown below
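
With the Google Cloud SDK installed, opening an SSH session takes a single gcloud command; a minimal sketch (the instance and zone names here are placeholders) looks like this:

    # Open an SSH session to the VM; gcloud handles key setup automatically
    gcloud compute ssh my-airflow-vm --zone=us-central1-a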

Once connected, install the environment just as we do when working on a local server. Notice that the VM runs Linux, which makes installation straightforward (for instance, installing Python packages with pip). After the environment is set up, upload files from local to the VM using the gsutil command line. We can now run simple workflows.
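
A common pattern, sketched here assuming a staging bucket (the bucket name is hypothetical), is to push files to Cloud Storage from the local machine and pull them down on the VM:

    # On the local machine: push a script to a Cloud Storage bucket
    gsutil cp pipeline.py gs://my-staging-bucket/

    # On the VM: pull the script down from the bucket
    gsutil cp gs://my-staging-bucket/pipeline.py .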

When the workflow involves SQL or big data, we need a Google Cloud SQL instance to do the job.

Key points for Cloud SQL:

  1. Remember to select a Second Generation Cloud SQL instance
  2. Also, because Cloud SQL instance names and Storage bucket names must be unique, make sure the names you choose are not already taken
  3. After setting up Google Cloud SQL, connect to the instance, as shown below
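
A minimal sketch using the gcloud CLI (the instance name and user are placeholders): the command temporarily whitelists your machine's IP and opens a MySQL client session against the instance.

    # Temporarily whitelist this machine's IP and connect as root
    gcloud sql connect my-sql-instance --user=root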

The workflow is orchestrated using the Python package Apache Airflow, which was originally developed by Airbnb's data engineers. Orchestration includes data collection, database computation, documentation, and storage. Airflow also has a web UI which allows for easy scheduling and monitoring.
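
As a minimal sketch of such a DAG (the task names, script paths, and schedule are illustrative, written against the Airflow 1.x API that was current when this post was published):

    # minimal_dag.py -- a sketch of a two-step Airflow 1.x DAG; commands are placeholders
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash_operator import BashOperator

    dag = DAG(
        dag_id="twitter_pipeline",
        start_date=datetime(2019, 10, 1),
        schedule_interval="@hourly",  # run the pipeline every hour
    )

    # Step 1: collect raw data (placeholder script path)
    collect = BashOperator(
        task_id="collect_tweets",
        bash_command="python /home/airflow/collect_tweets.py",
        dag=dag,
    )

    # Step 2: load results into Cloud SQL (placeholder script path)
    load = BashOperator(
        task_id="load_to_cloud_sql",
        bash_command="python /home/airflow/load_to_sql.py",
        dag=dag,
    )

    collect >> load  # the load task runs only after collection succeeds

Once the file is placed in Airflow's dags/ folder, the scheduler picks it up, and runs can be checked from the web UI.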

Finally, we need data visualization. Visualization runs on your local computer, which connects to Google Cloud SQL. Visualization tools include Tableau, d3.js, and Python-based tools like Apache Superset. In this project, a Superset dashboard presents a real-time visualization of Twitter activity.
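
For reference, Superset registers a database through a SQLAlchemy URI; a sketch for a MySQL-backed Cloud SQL instance reachable at a public IP might look like the following (the host, credentials, and database name are all placeholders, and the PyMySQL driver must be installed):

    # SQLAlchemy URI entered in Superset's database configuration (all values are placeholders)
    mysql+pymysql://report_user:secret-password@35.200.10.20:3306/twitter_db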

You should now have a good idea of the basics of how a workflow runs on Google Cloud. If you have any questions, please visit WeCloudData for more information or comment below.

To find out more about the courses our students have taken to complete these projects and what you can learn from WeCloudData, click here to see our upcoming course schedule.

