DSC-McMaster-U/Auto-ML

helloworld.py Cloud Function Configuration using Terraform


Follow this Google documentation tutorial, but instead try to deploy a Python helloworld function as done by this. Feel free to experiment with the console like they did in the second tutorial, but try to get this ticket done as a Terraform block in main.tf.

End goal is to implement this part of the architecture:
[architecture diagram]

Resources for later:

@rawanmahdi I am playing around with Cloud Functions first, before even trying to get it to work with Terraform. From what I can tell, Cloud Functions lets you create an endpoint and then you can just go and do stuff with it.

I am playing with it on my own GCP account, cause why not.

So essentially you would code a Python HTTP endpoint to handle the data intake. This is the example from Google, where it processes an HTTP request and looks for a "name" parameter to return Hello "name":

import functions_framework

@functions_framework.http
def hello_http(request):
    request_json = request.get_json(silent=True)
    request_args = request.args

    if request_json and 'name' in request_json:
        name = request_json['name']
    elif request_args and 'name' in request_args:
        name = request_args['name']
    else:
        name = 'World'
    return 'Hello {}!'.format(name)

Hitting the endpoint

For my personal GCP cloud function, the endpoint URL is https://northamerica-northeast2-private-automl-404820.cloudfunctions.net/test_function

So if I hit it with an HTTP GET request to https://northamerica-northeast2-private-automl-404820.cloudfunctions.net/test_function?name=austin

I get in return

'Hello austin!'
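For reference, hitting it from Python looks something like this (quick sketch, assuming the requests library and that the function allows unauthenticated invocation):

```python
import requests

# URL of the deployed test function (from above)
FUNCTION_URL = "https://northamerica-northeast2-private-automl-404820.cloudfunctions.net/test_function"

# Pass the "name" query parameter, same as in the browser
resp = requests.get(FUNCTION_URL, params={"name": "austin"}, timeout=10)
print(resp.status_code)  # 200
print(resp.text)         # Hello austin!
```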

@ausbennett cool! Do you think there's a way for us to have our backend being making these HTTP requests to the cloud function? I'm having a hard time finding any docs/tutorials for this but it looks like this covers it. If thats possible, then we might be able to handle all the datascience and ML logic in our functions. What do you think?

@rawanmahdi If the end goal is to communicate with Cloud Storage to read/write data, using Cloud Functions seems VERY redundant

We would be sending a request from the backend to a Cloud Function, which would then send a request to Cloud Storage?

Is it possible to communicate directly with Cloud Storage?

If so, our backend should handle the reading and writing!
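Something like this minimal sketch, assuming the google-cloud-storage client library and that credentials (e.g. GOOGLE_APPLICATION_CREDENTIALS) are set up; the bucket and object names are made up:

```python
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("automl-demo-bucket")  # hypothetical bucket

# Write an object directly from the backend
blob = bucket.blob("datasets/sample.csv")
blob.upload_from_string("col1,col2\n1,2\n", content_type="text/csv")

# Read it back
print(blob.download_as_text())
```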

@ausbennett So should we stick with the alternative of having everything live in the backend? We'd be implementing the ML, the EDA, and any other GCP APIs we want to use straight out of our FastAPI endpoints. Shall we scrap the cloud functions? I'm seeing a good number of tutorials on FastAPI making requests to GCP, so we could stick with that.
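As a rough sketch of what that could look like, e.g. a FastAPI endpoint that pulls a dataset straight from Cloud Storage and returns a quick EDA summary (assuming pandas and google-cloud-storage; the bucket name and route are placeholders):

```python
import io

import pandas as pd
from fastapi import FastAPI
from google.cloud import storage

app = FastAPI()
client = storage.Client()

@app.get("/eda/{blob_name}")
def eda_summary(blob_name: str):
    # Hypothetical bucket; in practice this would come from config
    blob = client.bucket("automl-demo-bucket").blob(blob_name)
    df = pd.read_csv(io.BytesIO(blob.download_as_bytes()))
    # Very rough EDA summary served straight from the backend
    return {
        "rows": len(df),
        "columns": list(df.columns),
        "describe": df.describe().to_dict(),
    }
```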

@rawanmahdi Unless there is an advantage to Cloud Functions? We would just need to figure out how the backend will know which API URL to talk to

  • Initially I think we would need to pass it as an env variable or something to the Docker backend build process? (rough sketch below)

Going to have to look into it
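As a first pass, the backend could read the function URL from an environment variable at runtime rather than baking it into the image, something like this sketch (the variable name is a placeholder):

```python
import os

# Passed to the container at runtime, e.g.:
#   docker run -e CLOUD_FUNCTION_URL=https://<region>-<project>.cloudfunctions.net/test_function backend
CLOUD_FUNCTION_URL = os.environ.get("CLOUD_FUNCTION_URL")

if CLOUD_FUNCTION_URL is None:
    raise RuntimeError("CLOUD_FUNCTION_URL is not set")
```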