This may be a dumb question, but I can't seem to run the Python google-cloud-bigquery client asynchronously.
My goal is to run multiple queries concurrently and wait for them all to finish.
In fact, I found a way to wrap my query in an async call quite easily thanks to the asyncio.create_task() function.
I just needed to wrap job.result() in a coroutine; here is the implementation. It does run asynchronously now.
```python
import asyncio

from google.cloud import bigquery


class BQApi(object):
    def __init__(self):
        # BQ_CONFIG holds the path to the service-account credentials file.
        self.api = bigquery.Client.from_service_account_json(BQ_CONFIG["credentials"])

    async def exec_query(self, query, **kwargs) -> bigquery.table.RowIterator:
        job = self.api.query(query, **kwargs)
        task = asyncio.create_task(self.coroutine_job(job))
        return await task

    @staticmethod
    async def coroutine_job(job):
        # job.result() waits for the query job to finish and returns its rows.
        return job.result()
```
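One caveat: job.result() is a blocking call, so the coroutine above still holds the event loop while it waits, and two such tasks will not overlap in practice. If the point is to let several queries' waits run concurrently, one option is to hand the blocking call to a worker thread. This is only a minimal sketch of that idea, assuming Python 3.9+ (for asyncio.to_thread); run_queries is a hypothetical helper, not part of the BigQuery API:

```python
import asyncio

from google.cloud import bigquery


async def exec_query_nonblocking(client: bigquery.Client, query: str) -> bigquery.table.RowIterator:
    # Starting the job returns quickly; the slow part is waiting for the result.
    job = client.query(query)
    # Wait for the job in a worker thread so the event loop stays free.
    return await asyncio.to_thread(job.result)


async def run_queries(client: bigquery.Client, queries: list) -> list:
    # Hypothetical helper: launch every query and gather all the results.
    return await asyncio.gather(*(exec_query_nonblocking(client, q) for q in queries))
```

With something like asyncio.run(run_queries(client, [query_1, query_2])), both queries run on the BigQuery side at the same time and the waits overlap instead of queuing up.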