- Demo
- Project requirements
- Clone and install
- Set up new instance
- Run locally
- Build and deploy
- Lint and fix
- Import production data from Cloud Firestore to local Firestore
- Import production data
- Automated Backup with Cloud Functions
- Slack Integration
- Supported Providers
- Common problems
If you would like to check out how the application works, you can go to the demo site and sign in with a test user:
- Site: https://origo-okr-tracker.web.app
- User/pass: testuser@okr.com / testuser
- Node 14.x
- Firebase >=8.x (v9 is not supported)
- Firebase tools >9.x
- Firebase Blaze plan - Pay as you go
Clone repository and run install:
```sh
npm install && cd ./functions && npm install && cd ..
```
Install Firebase CLI:
```sh
npm install -g firebase-tools
```
Follow this guide to set up a new, clean instance of the OKR Tracker. Please read the whole readme first instead of just following it step by step; there are steps throughout the readme that are important when setting up a new instance.
- Create a Google Firebase project.
- Initialize the project with Firebase CLI
- Create a Google service account
- From the Project Overview, select Service accounts
- Click Generate new private key
This key is used for fetching data from Google Sheets (for automatically updating key results). In order to fetch data from Google Sheets, you must set up environment variables for Firebase Functions:
```sh
firebase functions:config:set
  service_account="<service account private key json-file>"
  storage.bucket="<your-storage-bucket-name>"
  slack.active=false
  slack.webhook="YOUR SLACK WEBHOOK HERE" (required if slack.active === true)
  slack.token="YOUR SLACK OAUTH TOKEN HERE" (required if slack.active === true)
  slack.host_url="HOST URL" (required if slack.active === true)
  sheets.impersonator="email-address" (optional)
```
Cat the whole service account private key JSON file into the environment key `service_account`:

```sh
firebase functions:config:set service_account="$(cat origo-okr-tracker-private-key.json)"
```

The `$(cat …)` command substitution works in both zsh and sh.
Note: The private key string needs to have actual line breaks as opposed to \\n
because of an issue with how Firebase stores environment variables. Read more.
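To verify that the key was stored with real line breaks, you can print the stored config back out:

```sh
firebase functions:config:get service_account
```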
We have Slack integrations; you can read about how to use them in the Slack section. If you want to activate them, you need to add the following to the Firebase Functions config. If you do not want to use the Slack integrations, you don't need to do anything.
```sh
firebase functions:config:set slack.active=true
```
We use Google Auth to authenticate users, and this needs to be enabled in the Firebase Console.
NOTE: This does not apply if you are only running this locally. We support Google and Microsoft as authentication providers.
- Navigate to your project in the Firebase console
- Press the Authentication button in the side menu
- Select the Sign-in Method tab
- Enable Google Auth
Get your Firebase SDK snippet from your Firebase Console:
- Navigate to Project settings
- Under Your apps, find Firebase SDK snippet and press Config
- Copy the following secrets to a `.env.production` file in the root directory. You also need a `.env.local` file to run this locally.
| Secret | Description |
| --- | --- |
| `VITE_API_KEY` | From SDK snippet |
| `VITE_AUTH_DOMAIN` | From SDK snippet |
| `VITE_DATABASE_URL` | From SDK snippet |
| `VITE_PROJECT_ID` | From SDK snippet |
| `VITE_STORAGE_BUCKET` | From SDK snippet |
| `VITE_MESSAGING_SENDER_ID` | From SDK snippet |
| `VITE_APP_ID` | From SDK snippet |
| `VITE_MEASUREMENT_ID` | From SDK snippet |
| `VITE_SHEETS_SERVICE_ACCOUNT` | `<service account email>` |
| `VITE_I18N_LOCALE` | `nb-NO` or `en-US` |
| `VITE_REGION` | `europe-west2` |
| `VITE_LOGIN_PROVIDERS` | Allowed login providers, separated with a hyphen. Only `google` and `email` are implemented. Ex: `google-email` |
| `VITE_HOST_URL` | URL that points to the cloud functions set up as API CRUD endpoints |
| `VITE_MICROSOFT_TENANT_ID` | Limits authentication to a certain tenant; otherwise anyone with a Microsoft account could log in |
| `VITE_ORGANIZATION` | Name of the organization |
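For reference, a `.env.production` (or `.env.local`) file might look like the following; every value below is a placeholder, not a real credential:

```sh
# Placeholder values only - copy the real ones from your Firebase SDK snippet.
VITE_API_KEY=your-api-key
VITE_AUTH_DOMAIN=your-project.firebaseapp.com
VITE_DATABASE_URL=https://your-project.firebaseio.com
VITE_PROJECT_ID=your-project
VITE_STORAGE_BUCKET=your-project.appspot.com
VITE_MESSAGING_SENDER_ID=1234567890
VITE_APP_ID=1:1234567890:web:abcdef123456
VITE_MEASUREMENT_ID=G-XXXXXXXXXX
VITE_SHEETS_SERVICE_ACCOUNT=service-account@your-project.iam.gserviceaccount.com
VITE_I18N_LOCALE=en-US
VITE_REGION=europe-west2
VITE_LOGIN_PROVIDERS=google-email
VITE_HOST_URL=https://your-api-gateway-or-functions-url
VITE_MICROSOFT_TENANT_ID=your-tenant-id
VITE_ORGANIZATION=Your Organization
```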
Add the project to your Firebase CLI configuration (this creates a project alias in `.firebaserc`):

```sh
firebase use --add
```
The local development environment uses the Firebase Emulator Suite for Firestore and Cloud Functions. There is no need to do anything other than run the development script; everything is set up with a local user through Google auth.
Retrieve the current Firebase environment configuration. This is needed for certain cloud functions to work locally:

```sh
firebase functions:config:get > ./functions/.runtimeconfig.json
```
Start Firebase emulators, import mock data and run the development server:
```sh
npm run dev
```
If you want to deploy to production or staging, you need to create multiple collections manually. Go to the Firestore Database in the Firebase Cloud Console and create the following collections:
- audit
- departments
- keyResults
- kpis
- objectives
- organizations
- periods
- products
- requestAccess
- slugs
- users
The collection `users` needs one document with the first user. Create a document and add the following fields:
```json
{
  "id": "<email the user is signing in with>",
  "email": "<email the user is signing in with>",
  "superAdmin": true,
  "widgets": {
    "itemHome": {
      "children": true,
      "missionStatement": true,
      "progression": true,
      "team": true
    },
    "keyResultHome": {
      "details": true,
      "notes": true,
      "weights": true
    },
    "objectiveHome": {
      "details": false,
      "progression": true,
      "weights": true
    }
  }
}
```
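If you'd rather script this than click through the console, here is a minimal sketch using the Firebase Admin SDK; using the email as the document ID is an assumption based on the `id` field above, so adjust it to your setup:

```js
// Hedged sketch: create the first super-admin user document.
// The document ID mirroring the email/id field is an assumption.
const admin = require('firebase-admin');
admin.initializeApp();

const email = 'first.user@example.com'; // placeholder: the email the user signs in with

admin
  .firestore()
  .collection('users')
  .doc(email)
  .set({
    id: email,
    email,
    superAdmin: true,
    widgets: {
      itemHome: { children: true, missionStatement: true, progression: true, team: true },
      keyResultHome: { details: true, notes: true, weights: true },
      objectiveHome: { details: false, progression: true, weights: true },
    },
  })
  .then(() => console.log('Created first user'));
```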
After successfully logging in to the OKR Tracker, navigate to the Admin panel. Here you can create new organisations, departments and products to use as your mock data. On each object you can also create periods, objectives, key results and KPIs.
To export your mock data run the following command:
```sh
firebase emulators:export ./mock_data
```
To update existing mock data, simply run the export command above and confirm overwriting the existing export.
Firebase now exports the storage emulator as well, even if you don't use it. These new folders are not checked into git because they are empty, and git does not track empty folders. If you have problems running the mock data, you will need to add two folders to the `mock_data/storage_export` folder: `blobs` and `metadata`.
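For example, from the project root:

```sh
mkdir -p mock_data/storage_export/blobs mock_data/storage_export/metadata
```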
It is possible to set up open API endpoints for users outside of the OKR Tracker frontend to update the progress of key results and KPIs. To do so, you only need to deploy all the functions as usual and then give the users the Cloud Function URL, but we do not recommend calling the Cloud Functions directly. The better approach is to set up a Google Cloud API Gateway and route all the calls to the right Cloud Function.
We have set up an Open API specification which you can check out here.
You can read more about how to set up an API Gateway here.
The TL;DR is (a rough `gcloud` sketch follows this list):

- Enable required services
- Create an API
- Create a new service account which has the correct access rights - we use the roles `API Gateway Admin` and `Cloud Functions Invoker`
- Create an API config
- Create a gateway
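Assuming you follow the linked guide, the steps map roughly onto these `gcloud` commands; all API, config, gateway, region, file, and service account names below are placeholders:

```sh
# Enable the required services
gcloud services enable apigateway.googleapis.com \
  servicemanagement.googleapis.com servicecontrol.googleapis.com

# Create the API
gcloud api-gateway apis create okr-tracker-api

# Create an API config from your OpenAPI spec, using a service account
# that has the Cloud Functions Invoker role
gcloud api-gateway api-configs create okr-config \
  --api=okr-tracker-api \
  --openapi-spec=openapi.yaml \
  --backend-auth-service-account=<service-account-email>

# Create the gateway
gcloud api-gateway gateways create okr-gateway \
  --api=okr-tracker-api \
  --api-config=okr-config \
  --location=<region>
```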
After the API Gateway has been set up, we close the gateway with an API key, which means that you need to create an API key through the Google Cloud Console.
If there are any questions regarding this, do not hesitate to get in contact with us and we will gladly help (i.e. create an issue)
Build and deploy to production:
```sh
npm run deploy
```
Run linter:

```sh
npm run lint
```

Lint styles:

```sh
npm run lint:style
```

Automatically fix lint issues:

```sh
npm run lint:style:fix
```
If you want to use Google Sheets API for automatic key results or automatic KPIs, you will need to enable the Google Sheets API in Google Cloud Console.
If you are using Team Drives with domain-policy (only specific domains have access) then you need to turn on domain-wide delegation on your service accounts and then give that service account access through G Suite Admin. Read more about it here
Based on this tutorial with a few differences for our use case.
The newest version of the OKR Tracker uses the Firebase Local Emulator Suite, where you can play and test your data without being afraid of production changes. It is still in the early stages, which means that auth is still handled by the cloud firebase and not locally.
When you start the local Firestore emulator, you can see that Firestore is completely empty because we don't have any production data. This is a great way of working because you can do whatever you want without doing any damage, but it's real-life data that you most likely want to test and fix against.
We are going to show you how to export your production data to a GCP bucket, or use an existing backed-up bucket, and import it into your local Firestore.
- Firebase CLI
- Google Cloud SDK
How to install Google Cloud SDK and Firebase CLI
Log in to Firebase and Google Cloud:

```sh
firebase login
gcloud auth login
```
See the list of your projects and connect to the one you'd like to export data from:

```sh
firebase projects:list
firebase use <your project id>

gcloud projects list
gcloud config set project <your project id>
```
For the sake of this how-to, we'll be using `okr-tracker-production` (production) for gcloud, and `origo-okr-tracker` (development) for Firebase. The reason is that we use auth from our development Firebase instance, and not from the production instance.
If you don't already have automated backups of your production data, you need to export the production data to a backup on GCP:

```sh
gcloud firestore export gs://okr-tracker-production.appspot.com/<backup-folder-name>
```
Now copy the new folder to your local machine; we are going to do this from our functions folder:

```sh
cd functions
gsutil -m cp -r gs://okr-tracker-production.appspot.com/<backup-folder-name> .
```
If you already have automated backups of your production data, you don't need to export the production data, only import it. For this application, our backup folder is not part of the Firebase storage bucket:

```sh
gsutil -m cp -r gs://okr-tracker-backup/<YYYY-MM-DD> .
```
To import the production data into your local Firebase emulator, you need a metadata file in the root folder, named `firebase-export-metadata.json`:
```json
{
  "version": "8.6.0",
  "firestore": {
    "version": "1.11.5",
    "path": "functions/<backup-folder-name>",
    "metadata_file": "functions/<backup-folder-name>/<backup-folder-name>.overall_export_metadata"
  }
}
```
Start your local Firebase emulator suite with the imported data. Firebase will read the metadata JSON file automatically:

```sh
firebase emulators:start --import=./
```
We use cloud functions to back up our database every night and keep only the last 14 days of backups. If a backup is older than 14 days, it is automatically and permanently deleted from the storage bucket.
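As a rough illustration, a nightly export function could look like the following sketch; this is not necessarily the project's actual implementation, and it assumes the `storage.bucket` Functions config described below:

```js
// Hedged sketch of a nightly Firestore export; the project's actual
// implementation in functions/ may differ.
const functions = require('firebase-functions');
const firestore = require('@google-cloud/firestore');

const client = new firestore.v1.FirestoreAdminClient();

exports.scheduledFirestoreBackup = functions.pubsub
  .schedule('every day 02:00')
  .onRun(() => {
    const projectId = process.env.GCP_PROJECT || process.env.GCLOUD_PROJECT;
    const databaseName = client.databasePath(projectId, '(default)');

    return client.exportDocuments({
      name: databaseName,
      outputUriPrefix: `gs://${functions.config().storage.bucket}`,
      collectionIds: [], // an empty array exports all collections
    });
  });
```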
- Firebase Blaze plan
- Set IAM Permission
- Manually create a storage bucket
- Cloud function
TLDR:
- Navigate to Google Cloud Console and choose your project
- Navigate to IAM & Admin - Your App Engine Service account needs the Cloud Datastore Import Export Admin role
- Navigate to Storage – Create a storage bucket – Give it a lifecycle rule to delete objects that are >14 days old (a sketch follows below)
- Run the command below

```sh
firebase functions:config:set storage.bucket="<your-storage-bucket-name>"
```
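The 14-day deletion rule from the list above can be applied with `gsutil` and a lifecycle configuration file; the bucket name is a placeholder:

```json
{
  "rule": [
    {
      "action": { "type": "Delete" },
      "condition": { "age": 14 }
    }
  ]
}
```

```sh
gsutil lifecycle set lifecycle.json gs://<your-storage-bucket-name>
```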
This is called automated restore, but we still need to manually trigger a cloud function that does the restore from the Google Cloud Console.
TLDR:
- From your Google Cloud Console, navigate to Pub/Sub
- Create a topic and name it `restore-backup`
- Trigger the topic by publishing a message, and the restore will be triggered
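The same trigger can be published from the command line with `gcloud`; the message content here is arbitrary:

```sh
gcloud pubsub topics create restore-backup
gcloud pubsub topics publish restore-backup --message="restore"
```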
Src/Citation: The cloud function blog
We have a Slack integration that is connected with a couple of cloud functions.
There are two cloud functions that integrate with Slack:

- `handleSlackRequest` - users requesting access: the Slack app posts to a channel that someone wants access
- `handleSlackInteractive` - button actions from the channel: a user presses accept/reject/ignore, and the Slack app posts to a cloud function that gives access to the user or rejects it
For these cloud functions to work, you need to add a webhook URL from a Slack app.
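For illustration, posting to a Slack incoming webhook from a cloud function looks roughly like this; the payload shape (`{ email }`) and the message text are assumptions, not the project's actual code:

```js
// Hedged sketch: notify a Slack channel that someone requested access.
// The request payload shape ({ email }) is assumed for illustration.
const functions = require('firebase-functions');
const fetch = require('node-fetch');

exports.handleSlackRequest = functions.https.onRequest(async (req, res) => {
  const { email } = req.body;

  // Slack incoming webhooks accept a JSON body with a `text` field.
  await fetch(functions.config().slack.webhook, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ text: `${email} has requested access to the OKR Tracker` }),
  });

  res.sendStatus(200);
});
```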
Firebase steps:
- Open your gcloud console and go to the IAM section
- Give your Firebase account the `Pub/Sub Subscriber` role
Slack steps:
- Go to the Slack application page and create a new app, or go to your existing app
- Activate `Incoming Webhooks` and create a new webhook URL
- Activate `Interactivity and Shortcuts` and add a new request URL which points to your Cloud Function
Copy the webhook URL and inject it into Firebase as an environment variable:
```sh
firebase functions:config:set slack.webhook="YOUR SLACK WEBHOOK HERE"
```

Request URL: `https://<region>-<firebase-instance>.cloudfunctions.net/slackNotificationInteractiveOnRequest`
We have added an integration with Slack, where a Slack user can subscribe to updates for an Organization, Department or Product.
If you have already set up a Slack integration from the previous point, with Slack request and Slack interactive, you can go to the Slack commands site and add a new slash command. We use `/okr` as the Slack command. Find your app here.
The slash command requires a couple of variables:
```
Command: /okr
Request URL: https://<region>-<firebase-instance>.cloudfunctions.net/okrSlackBot
Short Description: Subscribe to Org/Dep/Product
Usage Hint: subscribe [org/dep/prod] slug
```
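A user in Slack could then subscribe with something like the following; the slug is a placeholder, and the exact arguments follow the usage hint above:

```
/okr subscribe prod my-product-slug
```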
Firebase needs a couple of new configs as well: `slack.token` and `slack.host_url`. The `slack.host_url` is the URL of your OKR Tracker site (for us, it is https://okr.oslo.systems), and the token is an OAuth token from your Slack app settings page, under the sub-page `OAuth & Permissions`; it is a Bot User OAuth Token.
```sh
firebase functions:config:set slack.token="YOUR SLACK OAUTH TOKEN HERE"
firebase functions:config:set slack.host_url="HOST URL"
```
The OKR Tracker supports, for the time being, only three login providers: Microsoft, Google, and email/password. If you are looking for other providers that Firebase supports, we would love for you to open a PR with the needed changes.
For the Microsoft integration, a tenant must be specified via the environment variable `VITE_MICROSOFT_TENANT_ID`.
Anyone with a Google account can log in. To limit the domain, you have to implement this yourself somewhere, e.g. in `set_user.js`: `if (!user.email.toLowerCase().endsWith('oslo.kommune.no')) rejectAccess();`
If there are problems running the project locally, or you get an infinite spinner, inspect the console in the browser, your terminal, or the `firebase-debug.log` file for error messages. Some common messages when firing up the project for the first time:
- "No such file or directory, scandir storage_export/metadata"
- You need to create two directories under
mock_data/storage_export
-blobs
andmetadata
- You need to create two directories under
- It looks like you're trying to access functions.config().service_account but there is no value there
- Check if you have set the config key for service_account correctly. Read the readme again and se how you need to cat the private-key file correctly
- Missing permissions required for functions deploy. You must have permission iam.serviceAccounts.ActAs on service account
- Open the Google Cloud Console (check that you are in the correct project).
- Go to IAM & Admin -> Service Accounts
- Find the service account and click on it
- Click on the "Permissions" panel, then click
Grant Access
- Add your IAM member email address. For the role, select Service Accounts -> Service Account User
- Click Save
- Cannot read property
bucket
of underfined- Set the config key
storage.bucket
. Please read the readme again
- Set the config key