bigquery/WARN/2024_001

BigQuery table does not exceed the limit for imports or query appends

Product: BigQuery
Rule class: WARN - Something that is possibly wrong

Description

BigQuery returns this error message when your table reaches the limit for table operations per day for standard tables. Table operations include the combined total of all load jobs, copy jobs, and query jobs that append to or overwrite a destination table.

You can search in the Logs Explorer with the logging query:

resource.type="bigquery_resource"
severity>=ERROR
"Your table exceeded quota for imports or query appends per table"

Remediation

If you have not yet identified the source of most table operations, do the following:

  1. Make a note of the project, dataset, and table that the failed query, load, or copy job is writing to.

  2. Use INFORMATION_SCHEMA.JOBS_BY_* tables to learn more about jobs that modify the table.

The following example finds the hourly count of jobs grouped by job type for a 24-hour period using JOBS_BY_PROJECT. If you expect multiple projects to write to the table, replace JOBS_BY_PROJECT with JOBS_BY_ORGANIZATION.

SELECT
  TIMESTAMP_TRUNC(creation_time, HOUR) AS hour,
  job_type,
  COUNT(1) AS job_count
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
-- Adjust the time range to the period you want to inspect
WHERE creation_time BETWEEN "2021-06-20 00:00:00" AND "2021-06-21 00:00:00"
  AND destination_table.project_id = "my-project-id"
  AND destination_table.dataset_id = "my_dataset"
  AND destination_table.table_id = "my_table"
GROUP BY 1, 2
ORDER BY 1 DESC
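
To attribute the table operations to specific principals, a similar query grouped by user_email can help. This is a sketch that assumes the same table and time range as the example above:

SELECT
  user_email,
  job_type,
  COUNT(1) AS job_count
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time BETWEEN "2021-06-20 00:00:00" AND "2021-06-21 00:00:00"
  AND destination_table.project_id = "my-project-id"
  AND destination_table.dataset_id = "my_dataset"
  AND destination_table.table_id = "my_table"
GROUP BY 1, 2
-- Principals with the highest job counts first
ORDER BY job_count DESC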

This quota cannot be increased. To resolve this quota error, do the following:

  • If you frequently load data from many small files stored in Cloud Storage using one load job per file, combine them into a single load job. You can load from multiple Cloud Storage URIs with a comma-separated list (for example, gs://my_path/file_1,gs://my_path/file_2) or with wildcards (for example, gs://my_path/*); see the LOAD DATA sketch after this list. For more information, see Batch loading data.
  • If you use load, query, or copy jobs to append single rows of data to a table, consider batching multiple jobs into one; see the batched INSERT sketch after this list. BigQuery doesn’t perform well when used like a transactional relational database. As a best practice, avoid running frequent, single-row append actions.
  • To append data at a high rate, consider using the BigQuery Storage Write API. It is a recommended solution for high-performance data ingestion and has robust features, including exactly-once delivery semantics. For limits and quotas, see Storage Write API; for the costs of using this API, see BigQuery data ingestion pricing.
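
As noted in the first bullet, many per-file load jobs can be combined into one. The following sketch uses the LOAD DATA statement with a wildcard URI; the table name, format, and path are placeholders to adjust for your data:

LOAD DATA INTO my_dataset.my_table
FROM FILES (
  format = 'CSV',
  -- A single wildcard URI pulls in many files as one load job
  uris = ['gs://my_path/*']
);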
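
For single-row appends, batching several rows into one DML statement reduces the number of jobs that count against the quota. This minimal sketch assumes a hypothetical table with id and payload columns:

-- One statement appends three rows but runs as a single query job
INSERT INTO my_dataset.my_table (id, payload)
VALUES
  (1, 'first row'),
  (2, 'second row'),
  (3, 'third row');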

Further information