This tutorial code is intended to allow data to flow from Google Cloud Pub/Sub to Google Cloud Datastore. This is part of a larger tutorial for sending data from Particle devices into a hosted database.
You can find the full documentation on the Particle <> Google Cloud integration in our docs.
Before this script will be useful to you, please make sure you have done all of the following:

- Have Particle device(s) collecting data and publishing events
- Have a Google Cloud Platform account with a project and a Pub/Sub topic
- Enabled the Google Cloud Platform integration on Particle
- Created a Google Cloud Platform private key
- Created a Google Cloud Pub/Sub subscription

For all required steps, check out the full tutorial.
Clone this repository:

```
git clone https://github.com/spark/google-cloud-datastore-tutorial.git
```
Create a file in the root of your local repository called `gcp_private_key.json`:

```
touch gcp_private_key.json
```
Next, paste in the contents of your Google Cloud Platform JSON private key. It should look something like this:
```
{
  "type": "service_account",
  "project_id": "[GOOGLE CLOUD PLATFORM PROJECT ID]",
  "private_key_id": "[PRIVATE_KEY_ID]",
  "private_key": "[PRIVATE_KEY]",
  "client_email": "[CLIENT_EMAIL]",
  "client_id": "[CLIENT_ID]",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://accounts.google.com/o/oauth2/token",
  "auth_provider_x509_cert_url": "[CERT_URL]",
  "client_x509_cert_url": "[CLIENT_CERT_URL]"
}
```
Open up `tutorial.js` and update the `config` object with your Google Cloud Platform settings:
```
var config = {
  gcpProjectId: '[YOUR PROJECT ID]',
  gcpPubSubSubscriptionName: '[YOUR PUB/SUB SUBSCRIPTION NAME]',
  gcpServiceAccountKeyFilePath: './gcp_private_key.json'
};
```
- `gcpProjectId`: Your Google Cloud Platform Project ID
- `gcpPubSubSubscriptionName`: Your Google Cloud Platform Pub/Sub topic's subscription name
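For context, a config object like this is usually all the Google Cloud Node.js client libraries need in order to authenticate. The sketch below is illustrative only; it assumes the current official `@google-cloud/pubsub` and `@google-cloud/datastore` packages and is not a copy of what `tutorial.js` does. Both client constructors accept the project ID and the path to the JSON key file:

```
// Illustrative only -- tutorial.js may differ in detail.
const { PubSub } = require('@google-cloud/pubsub');
const { Datastore } = require('@google-cloud/datastore');

const config = {
  gcpProjectId: '[YOUR PROJECT ID]',
  gcpPubSubSubscriptionName: '[YOUR PUB/SUB SUBSCRIPTION NAME]',
  gcpServiceAccountKeyFilePath: './gcp_private_key.json'
};

// Both clients take the project ID and a path to the JSON private key file.
const pubsub = new PubSub({
  projectId: config.gcpProjectId,
  keyFilename: config.gcpServiceAccountKeyFilePath
});

const datastore = new Datastore({
  projectId: config.gcpProjectId,
  keyFilename: config.gcpServiceAccountKeyFilePath
});
```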
- Install dependencies:

  ```
  npm install
  ```

- Run the script:

  ```
  node tutorial.js
  ```
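Once running, the script should pick up messages from your Pub/Sub subscription and write each Particle event into Datastore. The actual implementation lives in `tutorial.js`; the sketch below only illustrates that flow, reusing the `pubsub`, `datastore`, and `config` objects from the previous sketch and assuming the Particle integration delivers the event payload as the message data with metadata (`device_id`, `event`, `published_at`) as message attributes. The `ParticleEvent` kind name is just a placeholder:

```
// Illustrative sketch of the Pub/Sub -> Datastore flow; see tutorial.js for the real code.
const subscription = pubsub.subscription(config.gcpPubSubSubscriptionName);

subscription.on('message', (message) => {
  const entity = {
    key: datastore.key(['ParticleEvent']),   // auto-generated ID under a placeholder "ParticleEvent" kind
    data: {
      deviceId: message.attributes.device_id,
      eventName: message.attributes.event,
      publishedAt: message.attributes.published_at,
      payload: message.data.toString()       // message data arrives as a Buffer
    }
  };

  datastore.save(entity)
    .then(() => message.ack())               // acknowledge only after the write succeeds
    .catch((err) => console.error('Failed to save event', err));
});

subscription.on('error', (err) => console.error('Subscription error', err));
```

Acknowledging each message only after the Datastore write succeeds means failed writes are redelivered by Pub/Sub rather than silently dropped.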