5 votes

I have a Cloud Scheduler job that triggers my Cloud Function via an HTTP call. Inside the Cloud Function I want to build a dynamic query, so I pass some parameters from Cloud Scheduler as a JSON body. However, when the function is triggered it does not pick up the parameter values coming from Cloud Scheduler in the JSON body. Can anyone help me resolve this issue?

JSON body sent from Cloud Scheduler:

{
  "unit": "QA",
  "interval": "3"
}

Cloud function code:

from google.cloud import bigquery

# destination_project, destination_dataset and destination_table are assumed
# to be defined elsewhere (e.g. as environment variables or module constants).

def main(request):
    request_json = request.get_json(silent=True)
    request_args = request.args

    # Read 'unit' from the JSON body, then the query string, else default to 'UAT'
    if request_json and 'unit' in request_json:
        unit = request_json['unit']
    elif request_args and 'unit' in request_args:
        unit = request_args['unit']
    else:
        unit = 'UAT'

    # Read 'interval' from the JSON body, then the query string, else default to 1
    if request_json and 'interval' in request_json:
        interval = request_json['interval']
    elif request_args and 'interval' in request_args:
        interval = request_args['interval']
    else:
        interval = 1

    query = (
        "select * from `myproject.mydataset.mytable` "
        "where unit='{}' and interval={}".format(unit, interval)
    )

    client = bigquery.Client()
    job_config = bigquery.QueryJobConfig()
    dest_dataset = client.dataset(destination_dataset, destination_project)
    dest_table = dest_dataset.table(destination_table)
    job_config.destination = dest_table
    job_config.create_disposition = 'CREATE_IF_NEEDED'
    job_config.write_disposition = 'WRITE_APPEND'

    job = client.query(query, location='US', job_config=job_config)
    job.result()

    return 'OK'

Note: It works when I pass the same values from Cloud Scheduler as query-string arguments in the HTTP URL (https://my-region-test-project.cloudfunctions.net/mycloudfunction?unit=QA&interval=3).
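To see what the function actually receives from Cloud Scheduler before any parsing, here is a minimal debugging sketch (assuming the standard Flask request object that HTTP Cloud Functions are handed; the handler name and log messages are illustrative, not from the original post):

import logging

def main(request):
    # Log what Cloud Scheduler actually sent; output from the logging module
    # shows up in Cloud Logging for Cloud Functions.
    logging.info("Content-Type: %s", request.headers.get("Content-Type"))
    logging.info("Raw body: %r", request.get_data())

    # get_json(silent=True) returns None when the body cannot be parsed as JSON
    # (e.g. because the Content-Type is not application/json) instead of raising.
    logging.info("Parsed JSON: %r", request.get_json(silent=True))
    return "OK"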

It might be either a UTF-8 issue or you need to parse the raw output; for hints see these other answers: stackoverflow.com/questions/53216177/… – Pentium10
@Pentium10 The mentioned link works for me, and you are right, the issue was with UTF-8 and it is now resolved. Thank you so much. I made the following changes in my code: raw_request_data = request.data; string_request_data = raw_request_data.decode("utf-8"); request_json: dict = json.loads(string_request_data) – Kaustubh Ghole

2 Answers

2 votes

The best hint is that this is a UTF-8 issue.

Check out also the situations described in this other thread: HTTP Triggering Cloud Function with Cloud Scheduler
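As a sketch of that hint: one way to sidestep the Content-Type problem is to decode and parse the raw request body yourself, which mirrors the fix the asker describes in the comments above (request is the Flask request object passed to the function; the default values are illustrative):

import json

def main(request):
    # request.data is the raw request body as bytes, regardless of the
    # Content-Type header Cloud Scheduler sets.
    raw_request_data = request.data
    string_request_data = raw_request_data.decode("utf-8")
    request_json = json.loads(string_request_data)

    unit = request_json.get("unit", "UAT")
    interval = request_json.get("interval", 1)
    return "unit={}, interval={}".format(unit, interval)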

4 votes

You can override the default Content-Type by creating the cron job using gcloud with the flag --headers Content-Type=application/json.

For instance:

gcloud scheduler jobs create http my_cron_job \
  --schedule="every 5 hours" \
  --uri="https://${ZONE}-${PROJECT_ID}.cloudfunctions.net/${FUNCTION_NAME}" \
  --http-method=POST \
  --message-body='{"foo": "bar"}' \
  --headers Content-Type=application/json

This doesn't seem to be available from the GCP Console yet.

Update 08/2021: it seems this has now been implemented in the UI:

(screenshot: scheduler headers)


Alternatively, using force=True seems to help if you're using Flask:

request.get_json(force=True)

This is because Cloud Scheduler seems to set the default Content-Type header to application/octet-stream; see the documentation.
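For completeness, a minimal handler using that approach (the handler name and default values are illustrative, not from the original post):

def main(request):
    # force=True makes Flask parse the body as JSON even though Cloud
    # Scheduler sends it with Content-Type: application/octet-stream.
    request_json = request.get_json(force=True)

    unit = request_json.get("unit", "UAT")
    interval = request_json.get("interval", 1)
    return "unit={}, interval={}".format(unit, interval)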