Google BigQuery Error: CSV table encountered too many errors, giving up. Row: 1 errors: 1

星月不相逢 2020-12-04 02:26

I am trying to run a query on a 12 GB CSV file loaded into Google BigQuery, but I can't run any query on the dataset. I am not sure if the dataset was loaded correctly. It shows the error in the title: "CSV table encountered too many errors, giving up. Row: 1 errors: 1".

8 Answers
  • 2020-12-04 03:03

    I was also getting the same error, with no clue as to the actual problem.

        https://www.googleapis.com/bigquery/v2/projects/****/jobs/job_******?location=******
        {
          reason: invalid, message: Error while reading data, error message: JSON table encountered too many errors, giving up. Rows: 115; errors: 1. Please look into the errors[] collection for more details.
        }
    

    Tried bq --format=prettyjson show -j <job_id> => this also didn't give any more clues.
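    For anyone hitting the same wall, the errors[] collection mentioned in the message can also be read programmatically. A minimal sketch with the google-cloud-bigquery Python client follows; the job ID and location are placeholders, not values from my job:

        # Minimal sketch: read a failed load job's error details with the
        # google-cloud-bigquery Python client. Job ID and location are placeholders.
        from google.cloud import bigquery

        client = bigquery.Client()
        job = client.get_job("job_ABC123", location="US")  # hypothetical job ID/location

        print(job.error_result)        # the summary error shown above
        for err in job.errors or []:   # per-row/per-column details, if any
            print(err)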

    I was trying to transfer data from a database to BigQuery using SAP BODS as the ETL tool. To find the root cause I modified the ETL to transfer column by column, i.e. I first transferred one column, then added a second, and so on. The transfer succeeded for the initial string columns, but as soon as a FLOAT column was included, the transfer failed with the same error.

    When checking the data I found values such as .0345 in the decimal column in the database. For values less than 1, the leading 0 before the decimal point had been dropped, which caused the error during the transfer to BigQuery.

    To rectify this, I applied the to_decimal conversion in BODS:

    to_decimal(column_name, '.', ',', 4) 
    

    "," is the thousand separator

    "." is the decimal separator

    4 specifies the number of digits allowed after the decimal point
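
    If the cleanup has to happen outside BODS, a rough Python sketch of the same normalization (restoring the missing leading zero before the load) could look like this; the file names and the "amount" column are assumptions for illustration:

        # Rough sketch: pad values like ".0345" to "0.0345" before loading.
        # File names and the "amount" column are hypothetical.
        import csv

        with open("source.csv", newline="") as src, open("fixed.csv", "w", newline="") as dst:
            reader = csv.DictReader(src)
            writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
            writer.writeheader()
            for row in reader:
                value = row["amount"]
                if value.startswith(".") or value.startswith("-."):
                    row["amount"] = value.replace(".", "0.", 1)
                writer.writerow(row)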

    Note: I was also transferring the records to Google Cloud Storage in parallel, and that transfer succeeded even before the conversion. Likewise, when I manually used the Cloud Storage file to populate the same BigQuery table, it worked fine.
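
    That manual Cloud-Storage-to-BigQuery load can be scripted as well; a minimal sketch with the Python client is below, where the bucket path, destination table and autodetect setting are all assumptions:

        # Minimal sketch of loading the CSV from Cloud Storage into BigQuery.
        # URI, destination table and autodetect are placeholders/assumptions.
        from google.cloud import bigquery

        client = bigquery.Client()
        job_config = bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.CSV,
            skip_leading_rows=1,
            autodetect=True,
        )
        load_job = client.load_table_from_uri(
            "gs://my-bucket/export.csv",        # hypothetical GCS path
            "my_project.my_dataset.my_table",   # hypothetical destination table
            job_config=job_config,
        )
        load_job.result()  # raises and exposes errors[] if the load fails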

  • 2020-12-04 03:07

    Check your schema - it's possible that you forgot to include the schema for one of the columns - that's what happened to me!
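
    If you want to rule this out explicitly, a minimal sketch of declaring the full schema up front with the google-cloud-bigquery Python client is below; the column names and types are placeholders:

        # Minimal sketch: pass an explicit schema so no column is left out.
        # Column names/types are hypothetical.
        from google.cloud import bigquery

        job_config = bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.CSV,
            skip_leading_rows=1,
            schema=[
                bigquery.SchemaField("id", "INTEGER"),
                bigquery.SchemaField("name", "STRING"),
                bigquery.SchemaField("amount", "FLOAT"),  # the one that is easy to forget
            ],
        )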
