headless-count

Extension to "austere-snap" project: Counting ships indicated by colored dots using Computer Vision

Primary language: Python

In this project, we use an automated script that:

1) counts the number of vessels identified as magenta dots in the .png file provided

--> Reuses assets from the "austere-snap" project

2) runs the aforementioned core feature in a Docker container

--> This requires some ancillary features to be omitted when building the Docker image
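The counting step (1) can be sketched as below. This is a minimal, pure-Python illustration, not the project's actual implementation; the magenta threshold values and the `count_magenta_dots` / `is_magenta` helper names are assumptions for illustration. Each 4-connected cluster of magenta pixels is treated as one vessel dot.

```python
from collections import deque

def is_magenta(pixel, threshold=200):
    # Magenta ~ high red, low green, high blue; these cutoffs are assumed values
    r, g, b = pixel
    return r >= threshold and b >= threshold and g < 100

def count_magenta_dots(pixels):
    """Count connected clusters of magenta pixels in a 2D grid of (r, g, b) tuples."""
    if not pixels:
        return 0
    h, w = len(pixels), len(pixels[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if seen[y][x] or not is_magenta(pixels[y][x]):
                continue
            count += 1  # new dot found; flood-fill to mark all of its pixels
            queue = deque([(y, x)])
            seen[y][x] = True
            while queue:
                cy, cx = queue.popleft()
                for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                    if 0 <= ny < h and 0 <= nx < w and not seen[ny][nx] and is_magenta(pixels[ny][nx]):
                        seen[ny][nx] = True
                        queue.append((ny, nx))
    return count
```

For the real .png, the same logic would run over pixel data loaded with an image library (e.g., Pillow's `Image.open(path).convert("RGB")`); which library the repo actually uses is an assumption here.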

An extension feature is suggested: log the number of vessels to a publicly exposed Google Sheet

Credit to "Jie Jenn" for the following resources:

1) code snippet used in insert_to_sheet.py (shown in this repo)

2) code snippet used in Google.py (referenced, but not shown in this repo)


On first run:

1) Install virtualenv

pip install virtualenv

2) Set up virtual environment in project

python<version> -m venv <virtual-environment-name>

3) Activate virtual environment

On macOS/Linux:

source <virtual-environment-name>/bin/activate

On Windows:

<virtual-environment-name>\Scripts\activate

4) Navigate to the main directory and install from requirements.txt

pip install -r requirements.txt


Note that you should supply your own credentials & values for the following:

1) client_secrets.json

-> This is a preparation step that links the script to a cloud service provider. The sample format below is for a Google Cloud project

    {"installed":
        {"client_id":"",
        "project_id":"",
        "auth_uri":"https://accounts.google.com/o/oauth2/auth",
        "token_uri":"https://oauth2.googleapis.com/token",
        "auth_provider_x509_cert_url":"https://www.googleapis.com/oauth2/v1/certs",
        "client_secret":"",
        "redirect_uris":["http://localhost"]
        }
    }

2) mycreds.txt

-> When running the code for the first time, mycreds.txt is generated after linking to the cloud service provider

-> On subsequent invocations, mycreds.txt is reused, so authentication does not have to be repeated
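The credential flow around mycreds.txt can be sketched as follows, assuming the project follows the standard PyDrive pattern (consistent with the credited "Jie Jenn" Google.py snippet). The function name `get_authorized_gauth` is illustrative, not from this repo:

```python
def get_authorized_gauth(creds_path="mycreds.txt"):
    # Deferred import: requires the pydrive package and a client_secrets.json
    from pydrive.auth import GoogleAuth

    gauth = GoogleAuth()
    gauth.LoadCredentialsFile(creds_path)
    if gauth.credentials is None:
        gauth.LocalWebserverAuth()   # first run: browser login creates the credentials
    elif gauth.access_token_expired:
        gauth.Refresh()              # reuse mycreds.txt, refreshing the expired token
    else:
        gauth.Authorize()            # reuse mycreds.txt as-is
    gauth.SaveCredentialsFile(creds_path)  # (re)generate mycreds.txt for next time
    return gauth
```

The actual auth call only runs with real credentials in place; it is shown here purely to document when mycreds.txt is written and reused.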

3) value for public_folder_id variable in upload.py

-> This is used to upload to a specific Google Drive folder. Setup of this publicly accessible folder should be done before running screengrab.py (and, by extension, upload.py)
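The folder-targeted upload can be sketched as below, again assuming PyDrive is in use; `build_file_metadata` and `upload_png` are illustrative names, not functions from upload.py. PyDrive (Drive API v2) expects a `title` and a `parents` list of folder ids:

```python
def build_file_metadata(title, public_folder_id):
    # 'public_folder_id' is the value you must supply, as noted above
    return {"title": title, "parents": [{"id": public_folder_id}]}

def upload_png(drive, path, public_folder_id):
    # 'drive' is an authenticated pydrive GoogleDrive instance; this call is not
    # executed here because it requires real credentials.
    f = drive.CreateFile(build_file_metadata(path.split("/")[-1], public_folder_id))
    f.SetContentFile(path)
    f.Upload()
    return f
```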

4) value for spreadsheet_id variable in insert_to_sheet.py

-> This is used to append values to the indicated Google Sheet, so that processed values can be logged on a regular basis
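The append step can be sketched with the Google Sheets API v4 Python client as below. The row layout (UTC timestamp, vessel count), the target range `Sheet1!A:B`, and the helper names are assumptions for illustration, not taken from insert_to_sheet.py:

```python
from datetime import datetime, timezone

def build_log_row(count):
    # Hypothetical row layout: [UTC timestamp, vessel count]
    return [datetime.now(timezone.utc).isoformat(timespec="seconds"), count]

def append_count(service, spreadsheet_id, count):
    # 'service' is an authorized Sheets API client from googleapiclient's
    # discovery build; not executed here since it needs real credentials.
    return service.spreadsheets().values().append(
        spreadsheetId=spreadsheet_id,
        range="Sheet1!A:B",
        valueInputOption="USER_ENTERED",
        body={"values": [build_log_row(count)]},
    ).execute()
```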


Integrate Google Cloud CLI with Docker

1) gcloud init

--> Log into gcloud using the CLI; user authentication is done through the browser

2) Select appropriate gcloud project

--> (Optional) It might be necessary to configure a default Compute Region and Zone that corresponds to that of the project's Artifact Registry

3) Configure Docker with credentials for the Artifact Registry host in that region

--> Sample code

gcloud auth configure-docker us-central1-docker.pkg.dev

--> Sample output

{
  "credHelpers": {
    "us-central1-docker.pkg.dev": "gcloud"
  }
}

4) Build the Docker image

Sample code below is for macOS on Apple silicon (M1 and later)

docker buildx build --platform linux/amd64 -t imgName .
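As a sketch of what this build might consume, a minimal Dockerfile could look like the following. The entry script name headless_count.py and the base image are assumptions, not the repo's actual files:

```dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# Copy only the files needed for the core counting feature
COPY headless_count.py insert_to_sheet.py ./
CMD ["python", "headless_count.py"]
```

Ancillary features can be omitted from the image, as noted earlier, simply by not copying their files into the build.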

5) Tag the Docker image

docker tag imgName us-central1-docker.pkg.dev/cloudProjName/artifactRegistry/imgName

imgName should correspond to the one above in Step 4

cloudProjName is determined when creating the project in Google Cloud Console

artifactRegistry is determined when creating the Artifact Registry in Google Cloud Console

This naming convention is used to push the Docker image to the intended repository in Artifact Registry

6) Push the Docker image

docker push us-central1-docker.pkg.dev/cloudProjName/artifactRegistry/imgName