Create a custom-mode VPC with three subnets, one each in the US, Asia, and Europe. Create appropriate firewall rules.
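The VPC setup above can also be done from Cloud Shell. A minimal gcloud sketch follows; the network name, subnet names, regions, and IP ranges are assumptions, and the Asia subnet is placed in asia-southeast1 because the Dataflow job later requires a subnet in that region.

```shell
# Custom-mode VPC (no auto-created subnets)
gcloud compute networks create terramearth-vpc --subnet-mode=custom

# One subnet per continent; regions and CIDR ranges are illustrative
gcloud compute networks subnets create us-subnet \
  --network=terramearth-vpc --region=us-central1 --range=10.0.1.0/24
gcloud compute networks subnets create asia-subnet \
  --network=terramearth-vpc --region=asia-southeast1 --range=10.0.2.0/24
gcloud compute networks subnets create europe-subnet \
  --network=terramearth-vpc --region=europe-west1 --range=10.0.3.0/24

# Allow SSH and ping into the VPC (tighten the source ranges as needed)
gcloud compute firewall-rules create terramearth-allow-ssh-icmp \
  --network=terramearth-vpc --allow=tcp:22,icmp
```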
-
Open Cloud Shell.
-
Set your project as the current project.
gcloud config set project project-id
-
Create a Cloud Pub/Sub topic.
gcloud pubsub topics create terramearth-topic
-
Create a subscription to receive messages from the Pub/Sub topic.
gcloud pubsub subscriptions create terramearth-sub --topic terramearth-topic
-
Publish a test message to the topic.
gcloud pubsub topics publish terramearth-topic --message hello-world
-
Pull the messages through the subscription.
gcloud pubsub subscriptions pull terramearth-sub --auto-ack
Your Pub/Sub topic is now set up.
Create a service account with the Pub/Sub Admin, Cloud IoT Admin, Monitoring Admin, and Logging Admin roles.
Create a VM with a Debian 9 image, a 10 GB standard persistent disk, and the f1-micro machine type. Attach the service account to this VM, or otherwise enable the all-APIs access scope.
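A gcloud sketch of the two steps above follows. The service account name, VM name, and zone are assumptions; `project-id` is the same placeholder used earlier.

```shell
# Service account with the four roles named above
gcloud iam service-accounts create terramearth-sa --display-name="terramearth-sa"
for role in roles/pubsub.admin roles/cloudiot.admin roles/monitoring.admin roles/logging.admin; do
  gcloud projects add-iam-policy-binding project-id \
    --member="serviceAccount:terramearth-sa@project-id.iam.gserviceaccount.com" \
    --role="$role"
done

# Debian 9 VM, 10 GB standard disk, f1-micro, with the service account attached
gcloud compute instances create terramearth-vm \
  --zone=asia-east1-b --machine-type=f1-micro \
  --image-family=debian-9 --image-project=debian-cloud \
  --boot-disk-size=10GB --boot-disk-type=pd-standard \
  --service-account=terramearth-sa@project-id.iam.gserviceaccount.com \
  --scopes=cloud-platform
```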
-
Create a public/private key pair for Cloud IoT Core.
mkdir ca
cd ca
openssl req -x509 -newkey rsa:2048 -keyout rsa_private.pem -nodes -out rsa_cert.pem -subj "/CN=unused"
Private key file: rsa_private.pem
Public key (certificate) file: rsa_cert.pem
-
Display the contents of the public key file.
cat rsa_cert.pem
-
Copy all of the contents.
-
Go to the Cloud IoT Core console.
-
Create a new registry with the following settings:
Registry ID - terramearth-registry
Region - asia-east1
Cloud Pub/Sub topic - terramearth-topic
Click on the advanced options and keep the default settings.
CA certificate - paste the public key file contents
Click on Create.
Now your registry is ready.
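The same registry can be created from Cloud Shell instead of the console; a sketch, assuming the commands run from the `ca` folder where rsa_cert.pem was generated:

```shell
# Registry in asia-east1, publishing device telemetry to the Pub/Sub topic
gcloud iot registries create terramearth-registry \
  --region=asia-east1 \
  --event-notification-config=topic=terramearth-topic

# Attach the self-signed certificate as a registry-level CA certificate
gcloud iot registries credentials create \
  --region=asia-east1 --registry=terramearth-registry \
  --path=rsa_cert.pem
```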
-
Click on Devices. Create a new device with the following settings:
Device ID - dragline-100
Public key format - RS256_X509
Leave the default settings and click on Create.
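The device can likewise be registered from Cloud Shell; `rsa-x509-pem` is the gcloud key type corresponding to the RS256_X509 format in the console:

```shell
# Register the device with its public certificate
gcloud iot devices create dragline-100 \
  --region=asia-east1 --registry=terramearth-registry \
  --public-key=path=rsa_cert.pem,type=rsa-x509-pem
```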
-
Now install the required software on the VM.
sudo apt install -y git
curl -sL https://deb.nodesource.com/setup_14.x -o nodesource_setup.sh
sudo bash nodesource_setup.sh
sudo apt install -y nodejs
-
Download the application code.
-
Make terramearth the current folder and install the Node.js dependencies.
cd ~/terramearth
npm install
-
Copy the private key file from the ca folder to the terramearth folder.
cp ~/ca/rsa_private.pem ~/terramearth
-
Send messages to Cloud IoT Core with the following command.
node terramearth.js mqttDeviceDemo --projectId=$DEVSHELL_PROJECT_ID --cloudRegion=asia-east1 --registryId=terramearth-registry --deviceId=dragline-100 --privateKeyFile=rsa_private.pem --numMessages=2 --algorithm=RS256
-
Retrieve the messages through the Pub/Sub subscriber.
gcloud pubsub subscriptions pull terramearth-sub --auto-ack
-
If you can retrieve the messages, the device-to-Pub/Sub pipeline is working.
-
Create a golden image from the VM's disk.
-
Create two VMs from this image (same configuration as above), then run the simulator on each:
sudo -s
cd ~/terramearth
node terramearth.js mqttDeviceDemo --projectId=$DEVSHELL_PROJECT_ID --cloudRegion=asia-east1 --registryId=terramearth-registry --deviceId=dragline-100 --privateKeyFile=rsa_private.pem --numMessages=2 --algorithm=RS256
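A gcloud sketch of the imaging steps above, assuming the source VM is named terramearth-vm in zone asia-east1-b (both names are assumptions carried over from the earlier VM step):

```shell
# Stop the source VM so the boot disk is in a consistent state, then image it
gcloud compute instances stop terramearth-vm --zone=asia-east1-b
gcloud compute images create terramearth-golden \
  --source-disk=terramearth-vm --source-disk-zone=asia-east1-b

# Two more simulator VMs from the golden image, same configuration as before
gcloud compute instances create terramearth-vm-2 terramearth-vm-3 \
  --zone=asia-east1-b --machine-type=f1-micro \
  --image=terramearth-golden \
  --boot-disk-size=10GB --boot-disk-type=pd-standard
```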
-
You now have three VMs running as device simulators.
-
This means everything is set up and ready for phase 3.
-
Go to the BigQuery console.
-
Select your project and create a new dataset with the following settings.
Dataset name - terramearth_dataset
Default location - asia-southeast1
Default table expiration - None
Click on Create.
-
Select the dataset and create a new table with the following settings.
Source - Empty table
Table Name - terramearth_table
Schema:
registry:STRING,deviceID:STRING,deviceName:STRING,pressure:FLOAT,speed:INTEGER,engineUpTime:INTEGER,oilLevel:INTEGER,timestamp:TIMESTAMP
Leave the default settings and click on Create.
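The dataset and table above can also be created with the bq CLI; `project-id` is the same placeholder as before:

```shell
# Dataset in asia-southeast1 with no default table expiration
bq mk --dataset --location=asia-southeast1 project-id:terramearth_dataset

# Empty table with the telemetry schema
bq mk --table project-id:terramearth_dataset.terramearth_table \
  registry:STRING,deviceID:STRING,deviceName:STRING,pressure:FLOAT,speed:INTEGER,engineUpTime:INTEGER,oilLevel:INTEGER,timestamp:TIMESTAMP
```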
-
Your BigQuery dataset and table are now ready to receive messages.
-
Now create a Dataflow job from a template. Go to the Dataflow console.
-
Click on Create job from template and use the following settings.
Job name - terramearth-job
Regional endpoint - asia-southeast1
Dataflow template - Pub/Sub Topic to BigQuery
Input Pub/Sub topic - projects/project-id/topics/terramearth-topic
BigQuery output table - project-id:terramearth_dataset.terramearth_table
Temporary location - gs://bucket-id/temp
Max workers - 2
Number of workers - 1
Worker region - asia-southeast1
Machine type - n1-standard-1
Network - custom VPC
subnetwork - https://www.googleapis.com/compute/v1/projects/project-id/regions/asia-southeast1/subnetworks/subnet-id
Note: Make sure you have a subnet in asia-southeast1, otherwise the job will fail.
Click on Run Job.
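The same job can be launched from Cloud Shell against the public Pub/Sub-to-BigQuery template; the network name is the one assumed in the earlier VPC sketch, and `subnet-id` and `bucket-id` remain placeholders:

```shell
# Streaming template: Pub/Sub topic -> BigQuery table
gcloud dataflow jobs run terramearth-job \
  --gcs-location=gs://dataflow-templates/latest/PubSub_to_BigQuery \
  --region=asia-southeast1 \
  --staging-location=gs://bucket-id/temp \
  --num-workers=1 --max-workers=2 \
  --worker-machine-type=n1-standard-1 \
  --network=terramearth-vpc \
  --subnetwork=https://www.googleapis.com/compute/v1/projects/project-id/regions/asia-southeast1/subnetworks/subnet-id \
  --parameters=inputTopic=projects/project-id/topics/terramearth-topic,outputTableSpec=project-id:terramearth_dataset.terramearth_table
```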
-
Make sure your job is running successfully.
-
Send messages from Cloud Shell.
node terramearth.js mqttDeviceDemo --projectId=$DEVSHELL_PROJECT_ID --cloudRegion=asia-east1 --registryId=terramearth-registry --deviceId=dragline-100 --privateKeyFile=rsa_private.pem --numMessages=10 --algorithm=RS256
-
Wait for some time.
-
Go to BigQuery and refresh.
-
Check your table to see the messages.
-
You have successfully streamed messages from Cloud IoT Core to BigQuery through the Dataflow job.
You are now ready for the next phase.