Python

Dannerty, 2019-07-29 14:13:45

How to implement a timeout for BigQuery queries in a script?

Hello. How can I implement a timeout in a script that runs a query in BigQuery?
The situation is this: a bash script that works with tables is launched on the server. The query has always completed within 3-5 minutes, but one day, for no apparent reason, it hung for 6 hours and then failed on Google's internal timeout.
I would like to detect this situation so that a hung query can be restarted. Simply checking whether the script is still running is not enough, because even when the script (and the process that issued the query) is killed, the query itself keeps running on Google's side.
I tested a Python script with the parameters 'query.timeout_ms = 100' and 'query.timeoutMs = 100' - neither worked; apparently a timeout is not supported in that form.
Has anyone run into this problem, and what is the best way to handle it?
Pull out the job_id when a query is executed and cancel the job if it hangs? Or is there some other way?


2 answers
3vi1_0n3, 2019-07-29

Coreutils has a timeout command; most likely it is already installed on the server.
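A minimal sketch of what that looks like (`sleep 10` stands in for the actual long-running query invocation, which is an assumption about the setup); `timeout` exits with status 124 when the limit fires. Note the caveat raised in the question, though: this only kills the local process, and the query may keep running on Google's side.

```shell
# Abort the command after 2 seconds of wall-clock time.
# `sleep 10` is a stand-in for the real long-running query call.
if timeout 2 sleep 10; then
  echo "finished in time"
else
  [ $? -eq 124 ] && echo "timed out"   # 124 means the timeout fired
fi
```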

D
Dannerty, 2019-07-30

In the end I implemented it in a somewhat hacky but working way.
I run the query from Python and enter nested loops: the first checks the status of the job by job_id, the second counts elapsed time (the `a or b` construct in Python does not behave quite the way one would like here). If the query completes within the allotted time, everything is OK; if not, it is killed via client.cancel_job.
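The approach above can be sketched roughly as follows. `run_query_with_timeout` and its parameters are hypothetical names, and `client` is assumed to be a `google.cloud.bigquery.Client` (anything exposing the same `query` / `cancel_job` interface works):

```python
import time

def run_query_with_timeout(client, sql, timeout_s=300, poll_interval=5):
    """Start a BigQuery job and cancel it server-side if it runs too long.

    `client` is assumed to be a google.cloud.bigquery.Client; `client.query`
    returns a job object with `.job_id`, `.done()` and `.result()`.
    """
    job = client.query(sql)
    deadline = time.monotonic() + timeout_s
    while not job.done():                       # poll the job status by job_id
        if time.monotonic() >= deadline:
            # Kill the job on Google's side, not just the local script
            client.cancel_job(job.job_id)
            raise TimeoutError(f"query {job.job_id} cancelled after {timeout_s}s")
        time.sleep(poll_interval)
    return job.result()
```

If the query finishes within `timeout_s`, the rows come back as usual; otherwise the job is cancelled via `client.cancel_job` and a `TimeoutError` is raised so the caller can restart the query.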
