Amazon Redshift COPY command fails with the following error

Time: 2021-03-09 23:09:20

I encountered the following error while trying to load a file from S3 into my Redshift table using the COPY command, and I can't find any clue about what is causing it.

  -----------------------------------------------
  error:  Failed writing body (0 != 776) Cause: Failed to inflate: invalid or incomplete deflate data. zlib error code: -3
  code:      9001
  context:   S3 key being read : s3://redshift-dev-sandbox/Moores.csv
  query:     2565852
  location:  table_s3_scanner.cpp:356
  process:   query0_33 [pid=10565]
  -----------------------------------------------
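
The zlib error code -3 (Z_DATA_ERROR) indicates that Redshift tried to decompress the object but the bytes were not valid deflate data, which typically means the GZIP option was used on a file that is not actually compressed. A quick way to tell is to inspect the first two bytes of the object (gzip data always starts with 0x1f 0x8b); below is a minimal sketch of such a check, assuming boto3 with default AWS credentials and the bucket/key from the error context above:

# Minimal check of whether the S3 object is really gzip-compressed.
# Assumes boto3 is installed and AWS credentials are configured.
import boto3

s3 = boto3.client("s3")

# Fetch only the first two bytes of the object; gzip data starts with 0x1f 0x8b.
resp = s3.get_object(
    Bucket="redshift-dev-sandbox",
    Key="Moores.csv",
    Range="bytes=0-1",
)
magic = resp["Body"].read()

if magic == b"\x1f\x8b":
    print("Object is gzip-compressed: keep the GZIP option in COPY.")
else:
    print("Object is plain data: drop the GZIP option from COPY.")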

1 solution

#1

There is a mix-up: you cannot specify the GZIP option on a plain CSV file.

Either you COPY a plain CSV file:

copy "aw_tushar_allentity".dataset_customerdataset from 's3://redshift-dev-sandbox/Moores.csv' 
credentials 'aws_access_key_id=xxxx;aws_secret_access_key=xxxx'
delimiter ',' CSV IGNOREHEADER 1;

or, if your file is gzip-compressed, you add the GZIP option:

copy "aw_tushar_allentity".dataset_customerdataset from 's3://redshift-dev-sandbox/Moores.csv.gz' 
credentials 'aws_access_key_id=xxxx;aws_secret_access_key=xxxx'
gzip
delimiter ',' CSV IGNOREHEADER 1;
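
For the second form to work, the object on S3 must actually be gzip-compressed so that the GZIP option matches the data. A minimal sketch of producing and uploading such a file, assuming boto3, default AWS credentials, and a local Moores.csv (the bucket and key simply mirror the paths used above):

# Gzip a local CSV and upload it so the GZIP option in COPY matches the data.
# Assumes boto3 is installed, AWS credentials are configured, and Moores.csv exists locally.
import gzip
import shutil
import boto3

# Compress the local CSV into Moores.csv.gz.
with open("Moores.csv", "rb") as src, gzip.open("Moores.csv.gz", "wb") as dst:
    shutil.copyfileobj(src, dst)

# Upload the compressed file to the location referenced by the COPY command.
boto3.client("s3").upload_file("Moores.csv.gz", "redshift-dev-sandbox", "Moores.csv.gz")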
