The biggest chunk of my BigQuery billing comes from query consumption. I am trying to optimize this by understanding which datasets/tables consume the most.
It might be easier to use the `INFORMATION_SCHEMA.JOBS_BY_*` views, because you don't have to set up Stackdriver logging and can use them right away.
Example taken & modified from How to monitor query costs in Google BigQuery
DECLARE gb_divisor INT64 DEFAULT 1024*1024*1024;
DECLARE tb_divisor INT64 DEFAULT gb_divisor*1024;
DECLARE cost_per_tb_in_dollar INT64 DEFAULT 5;
DECLARE cost_factor FLOAT64 DEFAULT cost_per_tb_in_dollar / tb_divisor;
SELECT
ROUND(SUM(total_bytes_processed) / gb_divisor,2) as bytes_processed_in_gb,
ROUND(SUM(IF(cache_hit != true, total_bytes_processed, 0)) * cost_factor,4) as cost_in_dollar,
user_email,
FROM (
(SELECT * FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_USER)
UNION ALL
(SELECT * FROM `other-project.region-us`.INFORMATION_SCHEMA.JOBS_BY_USER)
)
WHERE
DATE(creation_time) BETWEEN DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY) and CURRENT_DATE()
GROUP BY
user_email
Some caveats:

- you have to `UNION ALL` all of the projects that you use explicitly
- `JOBS_BY_USER` did not work for me on my private account (supposedly because my login email is @googlemail and BigQuery stores my email as @gmail)
- the `WHERE` condition needs to be adjusted for your billing period (instead of the last 30 days)
- `DECLARE cost_per_tb_in_dollar INT64 DEFAULT 5;` reflects only US costs - other regions might have different costs - see https://cloud.google.com/bigquery/pricing#on_demand_pricing
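Since the original question is about which datasets/tables consume the most, the same views can also be aggregated over their `referenced_tables` column instead of `user_email`. Below is a minimal sketch using the Python BigQuery client; the project name is a placeholder, `JOBS_BY_PROJECT` needs the corresponding permission (fall back to `JOBS_BY_USER` otherwise), and a job that touches several tables has its full byte count attributed to each of them, so treat the per-table numbers as an upper bound:

from google.cloud import bigquery

# Sketch: attribute processed bytes (and an approximate on-demand cost) to the
# tables each job referenced. Assumes application-default credentials;
# "your-project" is a placeholder.
client = bigquery.Client(project="your-project")

sql = """
SELECT
  ref.project_id,
  ref.dataset_id,
  ref.table_id,
  ROUND(SUM(total_bytes_processed) / POW(1024, 3), 2) AS gb_processed,
  ROUND(SUM(IF(cache_hit != true, total_bytes_processed, 0))
        / POW(1024, 4) * 5, 4) AS cost_in_dollar  -- $5/TB, adjust for your region
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT,
     UNNEST(referenced_tables) AS ref
WHERE DATE(creation_time) BETWEEN DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY) AND CURRENT_DATE()
GROUP BY ref.project_id, ref.dataset_id, ref.table_id
ORDER BY gb_processed DESC
"""

for row in client.query(sql).result():
    print(row.dataset_id, row.table_id, row.gb_processed, row.cost_in_dollar)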
Using Stackdriver logs, you could create a sink with a Pub/Sub topic as target for real-time analysis that filters only BigQuery logs, like this:
resource.type="bigquery_resource" AND
proto_payload.method_name="jobservice.jobcompleted" AND
proto_payload.service_data.job_completed_event.job.job_statistics.total_billed_bytes:*
(see example queries here: https://cloud.google.com/logging/docs/view/query-library?hl=en_US#bigquery-filters)
You could create the sink on a specific project, a folder or even an organization. This will retrieve all the queries done in BigQuery in that specific project, folder or organization.
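If you prefer to create a project-level sink programmatically rather than in the console, here is a minimal sketch with the google-cloud-logging client, reusing the filter above. The sink name, project and topic are placeholders, the topic must already exist, and the sink's writer identity needs publish permission on it; folder- and organization-level sinks are more easily created with gcloud or the Logging API directly:

from google.cloud import logging as cloud_logging

# Sketch: create a project-level log sink routing BigQuery job-completed
# entries to a Pub/Sub topic. All names below are placeholders.
BQ_JOB_FILTER = (
    'resource.type="bigquery_resource" AND '
    'proto_payload.method_name="jobservice.jobcompleted" AND '
    'proto_payload.service_data.job_completed_event.job.job_statistics.total_billed_bytes:*'
)

client = cloud_logging.Client(project="your-project")
sink = client.sink(
    "bq-query-costs",  # placeholder sink name
    filter_=BQ_JOB_FILTER,
    destination="pubsub.googleapis.com/projects/your-project/topics/bq-job-costs",
)
sink.create()
# Grant this identity the Pub/Sub Publisher role on the destination topic.
print(sink.writer_identity)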
The field `proto_payload.service_data.job_completed_event.job.job_statistics.total_billed_bytes` will give you the number of bytes billed for the query.
Based on on-demand BigQuery pricing (as of now, $5/TB for most regions, but check for your own region), you could easily estimate the billing in real time. You could create a Dataflow job that aggregates the results in BigQuery, or simply consume the destination Pub/Sub topic with any job you want to perform the pricing calculation:
jobPriceInUSD = totalBilledBytes / 1_000_000_000_000 * pricePerTB
because 1 TB = 1_000_000_000_000 B. As I said before, `pricePerTB` depends on the region (see https://cloud.google.com/bigquery/pricing#on_demand_pricing for the exact price).
Also note that, as of now, the first 1 TB processed each month is free.
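For the consumption side, here is a minimal sketch of a Pub/Sub subscriber that applies the formula above to each job-completed entry delivered by the sink. The subscription path is a placeholder, the camelCase field names are assumptions based on the JSON form of the exported log entry, and the constants are my own names; adjust the price per TB to your region (and subtract the free monthly TB if you want exact numbers):

import json

from google.cloud import pubsub_v1

# Sketch of consuming the sink's Pub/Sub topic and pricing each completed job.
PRICE_PER_TB_USD = 5.0  # on-demand price for most regions; check yours
SUBSCRIPTION = "projects/your-project/subscriptions/bq-job-costs"  # placeholder


def handle_message(message):
    # Each Pub/Sub message carries one exported LogEntry as JSON.
    entry = json.loads(message.data.decode("utf-8"))
    stats = (
        entry.get("protoPayload", {})
        .get("serviceData", {})
        .get("jobCompletedEvent", {})
        .get("job", {})
        .get("jobStatistics", {})
    )
    billed_bytes = int(stats.get("totalBilledBytes", 0) or 0)
    # jobPriceInUSD = totalBilledBytes / 1_000_000_000_000 * pricePerTB
    job_price_usd = billed_bytes / 1_000_000_000_000 * PRICE_PER_TB_USD
    print(f"billed bytes: {billed_bytes}, approx. cost: ${job_price_usd:.4f}")
    message.ack()


subscriber = pubsub_v1.SubscriberClient()
future = subscriber.subscribe(SUBSCRIPTION, callback=handle_message)
future.result()  # blocks; interrupt to stop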