- BigQuery (`bq`)
- BigQuery Machine Learning (`BQML`)
- Storage (`gsutil`)
- Google Cloud (`gcloud`)
  - Project (`gcloud projects`): config settings
  - Compute Engine (`gcloud compute`): firewall rules, backend services
  - Pub/Sub (`gcloud pubsub`)
  - Cloud Functions (`gcloud functions`)
  - Container (`gcloud container`): K8s clusters
  - IAM (`gcloud iam`)
  - IoT (`gcloud beta iot`)
- Vision (Vision Annotation)
  - RESTful API
- Speech (Speech to Text)
  - RESTful API
- Translation (translation from one language to another)
  - RESTful API
- Natural Language (classify text into categories)
  - RESTful API
- ML Engine (a.k.a. `gcloud ai-platform`)
- Video Intelligence
  - RESTful API
- Datalab (`datalab`)
A workflow showing how to do end-to-end ML or AI work on the Google Cloud Platform.
- Introduction to APIs in Google (GSP294)
  - Definition of (RESTful) APIs and endpoints.
  - Introduction to Google APIs.
- Predict Taxi Fare with a BigQuery ML Forecasting Model (GSP246)
  - Train and evaluate a linear model via `BQML` on a public `BigQuery` dataset.
  - Use a `BQML` built-in `regression` model.
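The linear model `BQML`'s `linear_reg` model type fits is ordinary least squares; a minimal pure-Python sketch of that fit, on hypothetical distance/fare pairs rather than the lab's actual taxi dataset:

```python
# Ordinary least-squares fit for one feature, the kind of model
# BQML's `linear_reg` trains (toy data, not the lab's dataset).

def fit_linear(xs, ys):
    """Return (slope, intercept) minimizing the squared error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

# Hypothetical (distance_miles, fare_dollars) pairs.
data = [(1.0, 5.5), (2.0, 8.0), (3.0, 10.5), (4.0, 13.0)]
slope, intercept = fit_linear([d for d, _ in data], [f for _, f in data])
print(slope, intercept)  # exactly linear toy data: 2.5, 3.0
```

In the lab itself this fit happens server-side via a `CREATE MODEL ... OPTIONS(model_type='linear_reg')` statement; the sketch only shows what is being optimized.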
- Predict Visitor Purchases with a Classification Model in BQML (GSP229)
  - Train and evaluate a logistic regression model via `BQML` on a public `BigQuery` dataset.
  - Use a `BQML` built-in `classification` model.
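`BQML`'s `logistic_reg` model type trains a logistic regression; a tiny gradient-descent sketch of the same model family on toy one-feature data (the feature and labels here are hypothetical, not the lab's visitor data):

```python
import math

# Tiny logistic-regression trainer (batch gradient descent), the model
# family behind BQML's `logistic_reg` (toy data, not the lab's dataset).

def train_logistic(data, lr=0.5, steps=2000):
    w, b = 0.0, 0.0
    for _ in range(steps):
        gw = gb = 0.0
        for x, y in data:
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # sigmoid
            gw += (p - y) * x
            gb += (p - y)
        w -= lr * gw / len(data)
        b -= lr * gb / len(data)
    return w, b

# Label is 1 when the feature exceeds ~2 (hypothetical rule).
data = [(0.0, 0), (1.0, 0), (3.0, 1), (4.0, 1)]
w, b = train_logistic(data)
predict = lambda x: 1.0 / (1.0 + math.exp(-(w * x + b)))
print(predict(0.5), predict(3.5))  # low probability, high probability
```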
- Detect Labels, Faces, and Landmarks in Images with the Cloud Vision API (GSP037)
  - Use the `curl` command to send a JSON request to the `Cloud Vision API` and parse its response.
  - Use an image from a `Cloud Storage bucket` as the data source for a `Cloud Vision API` request.
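The JSON body `curl` posts in this lab looks roughly like the one built below; a sketch in Python, with the bucket URI as a placeholder:

```python
import json

# Build an `images:annotate` request body for the Cloud Vision API,
# pointing at an image in a Cloud Storage bucket (placeholder URI).
def vision_request(gcs_uri, feature="LABEL_DETECTION", max_results=10):
    return {
        "requests": [{
            "image": {"source": {"gcsImageUri": gcs_uri}},
            "features": [{"type": feature, "maxResults": max_results}],
        }]
    }

body = vision_request("gs://my-bucket/demo-image.jpg")
print(json.dumps(body, indent=2))
# POST this body to https://vision.googleapis.com/v1/images:annotate?key=API_KEY
```

Swapping the feature type for `FACE_DETECTION` or `LANDMARK_DETECTION` covers the lab's other two requests.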
- Awwvision: Cloud Vision API from a Kubernetes Cluster (GSP066) ⭐
  - Deploy and run multiple services on several `containers` of a `Kubernetes cluster` at a time.
  - Create a web-crawler worker that sends requests to the `Vision API`, a web app for viewing results, and a cache database.
- Google Cloud Speech API: Qwik Start (GSP119)
  - Use a `curl` command to request the `Speech API` for speech-to-text transcription.
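The body of that `curl` request can be sketched as follows; the audio URI and encoding are placeholders, not the lab's exact values:

```python
import json

# Build a `speech:recognize` request body for the Cloud Speech-to-Text
# API; audio URI and encoding are placeholders.
def speech_request(gcs_uri, encoding="FLAC", language="en-US"):
    return {
        "config": {"encoding": encoding, "languageCode": language},
        "audio": {"uri": gcs_uri},
    }

body = speech_request("gs://my-bucket/audio.flac")
print(json.dumps(body))
# POST this body to https://speech.googleapis.com/v1/speech:recognize?key=API_KEY
```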
- Speech to Text Transcription with the Cloud Speech API (GSP048)
  - Send a speech-to-text transcription request to the RESTful `Speech API`.
  - Request transcription for multiple languages.
- Translate Text with the Cloud Translation API (GSP049)
  - Make RESTful `Translation API` calls to translate text into a target language, or to detect which language the sent text is in.
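The two request shapes the lab uses (translate and detect) can be sketched like this; the text and language codes are placeholders:

```python
import json

# Request bodies for the Cloud Translation API (v2): one translates a
# string into a target language, one detects a string's language.
def translate_request(text, target, source=None):
    body = {"q": text, "target": target}
    if source:
        body["source"] = source  # omit to let the API auto-detect
    return body

def detect_request(text):
    return {"q": text}

print(json.dumps(translate_request("My name is Steve", "es")))
# POST to https://translation.googleapis.com/language/translate/v2?key=API_KEY
# (detection requests go to .../language/translate/v2/detect)
```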
- Classify Text into Categories with the Natural Language API (GSP063) ⭐
  - Request a `Natural Language API` call for a categorical analysis of a text dataset (from the BBC).
  - Save the results of the requests into a `BigQuery` table and analyze them on `BigQuery`.
  - Create a service account via `Cloud IAM` to allow Python scripts to request the `Natural Language API` and access `BigQuery`.
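The classification call itself takes a small JSON document; a sketch of building it (the sample sentence is a placeholder, and note the API rejects very short inputs):

```python
import json

# Build a `documents:classifyText` request body for the Natural
# Language API (placeholder text, not the lab's BBC dataset).
def classify_request(text):
    return {"document": {"type": "PLAIN_TEXT", "content": text}}

body = classify_request(
    "A dramatic late goal settled the championship final as the home "
    "side lifted the trophy in front of a record crowd.")
print(json.dumps(body))
# POST to https://language.googleapis.com/v1/documents:classifyText
# with an auth token from the lab's service account.
```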
- Entity and Sentiment Analysis with the Natural Language API (GSP038)
  - Create a request and send a sentence to the `Natural Language API` in order to do `entity analysis`, `sentiment analysis`, `entity-sentiment analysis`, `syntax analysis`, and `linguistic analysis`.
- Extract, Analyze, and Translate Text from Images with the Cloud ML APIs (GSP075) ⭐
  - Request a number of machine-learning RESTful APIs in order: first a `Vision API` call for text detection (OCR) on an image, second a `Translation API` call to detect the text's language and translate it, and third a `Natural Language API` call for entity analysis on the translation result.
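The three-step chain above can be sketched as plain function composition, with each HTTP call stubbed out so the flow is visible (all strings here are hypothetical stand-ins for real API responses):

```python
# Sketch of the lab's OCR -> translate -> entities chain. Each stub
# stands in for one REST request; return values are hypothetical.

def ocr_text(image_uri):
    """Stub for a Vision API TEXT_DETECTION request."""
    return "Bonjour le monde"

def translate(text, target="en"):
    """Stub for a Translation API request."""
    return "Hello world"

def entities(text):
    """Stub for a Natural Language API analyzeEntities request:
    naively treat capitalized words as entities."""
    return [w for w in text.split() if w[0].isupper()]

result = entities(translate(ocr_text("gs://my-bucket/sign.jpg")))
print(result)  # ['Hello']
```

The point of the sketch is the data flow: each API's response body feeds the next request, which is exactly how the lab chains its three `curl` calls.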
- Scanning User-generated Content Using the Cloud Video Intelligence and Cloud Vision APIs (GSP138) ⭐
  - An example scenario: first, an image uploaded to a `Cloud Storage bucket` triggers a notification to `Cloud Pub/Sub`, which in turn triggers a Cloud Function; second, the Cloud Function requests both `Vision API` and `Video Intelligence API` calls, and once the API responses are received, the results are written into a `BigQuery` table. Finally, you can analyze the data in `BigQuery`.
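The Cloud Function in the middle of that pipeline can be sketched as a Pub/Sub-triggered background function; the annotation calls and the BigQuery insert are left as comments, and the message fields are assumptions:

```python
import base64
import json

# Sketch of a Pub/Sub-triggered background Cloud Function: the message
# describes a new Cloud Storage object; the function would then call the
# Vision / Video Intelligence APIs and write results to BigQuery.
def process_upload(event, context=None):
    """`event["data"]` is the base64-encoded JSON Pub/Sub payload."""
    payload = json.loads(base64.b64decode(event["data"]))
    uri = f"gs://{payload['bucket']}/{payload['name']}"
    # ... request Vision / Video Intelligence annotations for `uri` ...
    # ... stream the annotation rows into a BigQuery table ...
    return uri

# Simulate a Pub/Sub event locally (hypothetical bucket/object names).
fake = {"data": base64.b64encode(
    json.dumps({"bucket": "uploads", "name": "cat.jpg"}).encode()).decode()}
print(process_upload(fake))  # gs://uploads/cat.jpg
```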
- Classify Images of Clouds in the Cloud with AutoML Vision (GSP223)
  - How to use `AutoML` to train and evaluate a model on data hosted on `Cloud Storage`.
  - How to deploy the model trained with `AutoML`.
- Implementing an AI Chatbot with Dialogflow (GSP078)
  - How to use `Dialogflow` to build a `chatbot`.
  - How to deploy or integrate the `chatbot` with your own service.
- Machine Learning with TensorFlow (GSP273) ⭐
  - Introduce using the TensorFlow `Estimator` API to build a linear classifier from scratch.
  - Introduce how to submit a job to the Google `Cloud AI Platform`.
- TensorFlow for Poets (GSP077)
  - Introduce how to use `transfer learning` to speed up an image-classification retraining task.
- Creating an Object Detection Application Using TensorFlow (GSP141)
  - Train an `object detection` model, or fetch a pre-trained one via the `Object Detection API`.
  - Build a `web app` that serves object-detection requests using the above model.
- AI Platform: Qwik Start (GSP076) ⭐
  - Use a structured dataset as an example to demonstrate a complete machine-learning flow locally, then submit the job to the `AI Platform`.
- Predict Housing Prices with Tensorflow and AI Platform (GSP418)
  - Show how to start a `Datalab` instance and run a Python notebook on it.
- Image Classification of Coastline Images Using TensorFlow on AI Platform (GSP014)
  - Show how to start a `Datalab` instance and run a Python notebook for an image-classification task.
Dataprep: Data Transformation Pipeline via Trifacta | Dataflow: Batch or Streaming Data Processing Pipeline | Dataproc: Hadoop or Spark Computing Core
- Dataprep: Qwik Start (GSP105)
  - This tutorial helps you preprocess datasets on `Dataprep`, which is actually a data-wrangling tool named `Trifacta`.
- Dataprep: Creating a Data Transformation Pipeline with Cloud Dataprep (GSP430) ⭐
  - This tutorial guides you to use the `Dataprep` module (actually `Trifacta`) to preprocess a `BigQuery` table and then export the processed results back into a new `BigQuery` table.
- Dataflow: Qwik Start - Templates (GSP192)
  - This tutorial guides you to use a template in `Dataflow` to process a dataset in `BigQuery` and insert the processed data into a new `BigQuery` table.
- Dataflow: Qwik Start - Python (GSP207) ⭐
  - This tutorial guides you to run a `Python` script on `Dataflow`; it processes a dataset from a `bucket` and then exports the result back to the `bucket`.
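The quickstart's Python pipeline is a wordcount; its core transform is just tokenize-then-count, which Dataflow distributes across workers. A local sketch of that transform (without the Apache Beam plumbing):

```python
import re
from collections import Counter

# The tokenize -> count transform at the heart of the Dataflow Python
# quickstart's wordcount, run locally on in-memory lines.
def count_words(lines):
    words = (w for line in lines
             for w in re.findall(r"[a-z']+", line.lower()))
    return Counter(words)

counts = count_words(["the king was old", "the queen was young"])
print(counts["the"], counts["was"])  # 2 2
```

In the real pipeline the same logic appears as Beam `FlatMap` and `CombinePerKey` steps, with the input read from and the output written to a Cloud Storage bucket.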
- Dataflow: Run a Big Data Text Processing Pipeline in Cloud Dataflow (GSP047)
  - This tutorial guides you to run a processing task on `Dataflow` using a given Maven project.
- Dataproc: Qwik Start - Console (GSP103)
  - This tutorial guides you to use `Dataproc`, a cloud service for `Hadoop` and `Spark`, and to submit a job to run on it.
- Dataproc: Qwik Start - Command Line (GSP104)
  - This tutorial guides you to use shell commands to operate a cluster on `Dataproc` and submit a job to run on it.
- Cloud IoT Core: Building an IoT Analytics Pipeline on Google Cloud Platform (GSP088) ⭐
  - The tutorial shows you how to operate the Cloud IoT Core module and its components (registries and devices, the device manager, and the protocol bridge).
  - The tutorial guides you through integrating Cloud IoT Core with the Pub/Sub module, parsing the subscribed data with Dataflow, and finally writing the data into BigQuery.
- Cloud Pub/Sub: Streaming IoT Kafka to Google Cloud Pub/Sub (GSP285)
  - The tutorial guides you to integrate two different streaming architectures, `Kafka` and `Pub/Sub`.
  - The integrated architecture may not be the best solution, but it can serve various extensions of concatenated systems, for example the `Cloud IoT Core` module.
- ETL Processing on GCP Using Dataflow and BigQuery (GSP290) ⭐
  - This tutorial guides you on how to use the `Dataflow` service for a specific data-processing task, such as ETL, by running Python scripts on it.
  - After that, the script can insert the processed data into a `BigQuery` table.
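The per-record transform at the center of such an ETL job can be sketched as a plain function: parse one raw CSV line into a typed row dict whose keys match the destination BigQuery table's schema (the schema and sample line below are hypothetical):

```python
# Sketch of an ETL row transform a Dataflow job would apply before
# writing to BigQuery (hypothetical schema and sample line).
def to_bq_row(line):
    """Parse one raw CSV line into a typed, schema-shaped dict."""
    state, gender, year, name, number = line.strip().split(",")
    return {"state": state, "gender": gender, "year": int(year),
            "name": name, "number": int(number)}

row = to_bq_row("KY,F,1910,Dorothy,47\n")
print(row)
```

In the lab, a function like this runs inside a Beam `Map` step, with the output dicts streamed into BigQuery by the pipeline's sink.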
The following courses are mainly related to GCP Essentials on Qwiklabs.
- A Tour of Qwiklabs and the Google Cloud Platform (GSP282) ⭐
  - Includes a brief introduction to the `Google Cloud Platform` and its components.
- Creating a Virtual Machine (GSP001)
  - Introduce how to create a VM instance via the `Cloud Shell` or the `Cloud Platform Console`.
- Compute Engine: Qwik Start - Windows (GSP093)
  - Create a `Windows` VM instance via the `Cloud Shell` or the `Google Cloud Platform Console`.
- Getting Started with Cloud Shell & gcloud (GSP002)
  - This tutorial introduces the `Cloud Shell` and `gcloud` commands.
- Kubernetes Engine: Qwik Start (GSP100)
  - This tutorial guides you on how to operate a `Kubernetes` cluster via `gcloud` commands.
- Set Up Network and HTTP Load Balancers (GSP007) ⭐
  - This tutorial guides you on how to establish a network (L3) or HTTP/S (L7) load balancer.
  - This tutorial demonstrates how to establish a cluster of web services through an `Instance Template` and `Managed Instance Groups`.
- Introduction to Docker (GSP055)
  - This tutorial guides you on how to use Docker to run, build, push, and delete images and containers.
- Kubernetes Engine (GSP100)
  - This tutorial guides you on how to use the `gcloud` command to operate a k8s cluster on GCP.
  - This tutorial guides you on how to use the `kubectl` command to manage resources within a k8s cluster.
- Orchestrating Kubernetes on the Cloud (GSP021) ⭐
- Managing Deployments Using Kubernetes Engine (GSP053) ⭐