Timeout error when loading a 300 MB file into BigQuery

Time: 2022-06-23 20:49:52

I followed the Node.js example provided on the "loading data with a POST request" page (near the bottom here: https://cloud.google.com/bigquery/loading-data-post-request), but I'm running into a problem with larger files. The example code works for a 13 MB .csv, but when I try bigger files, whether 25 MB or 300 MB, it doesn't work. I see the following error:

events.js:154
      throw er; // Unhandled 'error' event
      ^

Error: ETIMEDOUT
    at null._onTimeout (/Users/Hertig/Development/BitDeliver/BigQuery/node_modules/request/request.js:772:15)
    at Timer.listOnTimeout (timers.js:92:15)

I thought that this method of loading data would support bigger files. Has anyone else experienced this? Any advice on loading much bigger files (around 400 MB or larger) would be appreciated.

1 solution

#1

When loading big files into BigQuery, the best method is to upload them to Google Cloud Storage first, then tell BigQuery to read them from your gs://bucket/path/file*.
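
For reference, here is a minimal sketch of that two-step flow using the current Node.js client libraries (@google-cloud/storage and @google-cloud/bigquery); the bucket, dataset, table, and file names are placeholders you would replace with your own:

    // Sketch only: assumes @google-cloud/storage and @google-cloud/bigquery are
    // installed and Application Default Credentials are configured.
    const {Storage} = require('@google-cloud/storage');
    const {BigQuery} = require('@google-cloud/bigquery');

    const storage = new Storage();
    const bigquery = new BigQuery();

    async function loadLargeCsv() {
      // Placeholder names -- substitute your own bucket, dataset, and table.
      const bucketName = 'my-staging-bucket';
      const localFile = './data/big-export.csv';
      const objectName = 'uploads/big-export.csv';

      // Step 1: upload the large CSV to Cloud Storage (the client performs a
      // resumable upload, so the file is not pushed in a single request).
      await storage.bucket(bucketName).upload(localFile, {destination: objectName});

      // Step 2: start a BigQuery load job that reads the object from
      // gs://<bucketName>/<objectName> and waits for it to finish.
      const [job] = await bigquery
        .dataset('my_dataset')
        .table('my_table')
        .load(storage.bucket(bucketName).file(objectName), {
          sourceFormat: 'CSV',
          skipLeadingRows: 1,
          autodetect: true,
          writeDisposition: 'WRITE_APPEND',
        });

      // The promise resolves when the job finishes; surface any load errors.
      const errors = job.status && job.status.errors;
      if (errors && errors.length > 0) {
        throw new Error(JSON.stringify(errors, null, 2));
      }
      console.log(`Load job ${job.id} completed.`);
    }

    loadLargeCsv().catch(console.error);

If you stage many files, a load job also accepts a wildcard URI such as gs://bucket/path/file* in its source URIs, so a single job can pick them all up.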
