AWS S3 serves gzipped files but they are unreadable

Time: 2021-06-05 20:41:50

I'm using AWS S3 to host a static webpage; almost all assets are gzipped before being uploaded.

During the upload the "Content-Encoding" header is correctly set to "gzip" (and this is also reflected when actually loading the file from AWS).
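A quick way to confirm what S3 actually sends is to inspect the response headers, for example with curl (the bucket URL below is a placeholder):

curl -sI https://BUCKETNAME.s3.amazonaws.com/js/app.js | grep -iE 'content-(encoding|type)'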

The thing is, the files can't be read and are still in gzip format even though the correct headers are set...

The files are uploaded using npm s3-deploy; here's a screenshot of what the request looks like:

[Screenshot of the upload request]

and the contents of the file in the browser:

[Screenshot of the gzipped file contents in the browser]

If I upload the file manually and set the Content-Encoding header to "gzip" it works perfectly. Sadly I have a couple hundred files to upload for every deployment and cannot do this manually every time (I hope that's understandable ;) ).
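For comparison, the working manual upload is roughly equivalent to the following AWS CLI call (the file name, key, and bucket are placeholders; an explicit --content-type is also passed here, since it would otherwise be guessed from the file extension):

aws s3 cp app.js.gz s3://BUCKETNAME/js/app.js \
    --content-encoding gzip \
    --content-type application/javascript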

Does anyone have an idea of what's going on here? Has anyone worked with s3-deploy who can help?

1 solution

#1

I use my own bash script for S3 deployments; you can try something like this:

webpath='path'
BUCKET='BUCKETNAME'

# Upload every pre-gzipped JS asset with the Content-Encoding header set,
# so browsers transparently decompress the files they receive.
for file in "$webpath"/js/*.gz; do
    aws s3 cp "$file" "s3://$BUCKET/js/" --content-encoding 'gzip' --region 'eu-west-1'
done
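One thing to keep in mind with this approach: the objects keep their .gz suffix in the bucket, and aws s3 cp guesses the Content-Type from the file extension, so you may also want to pass an explicit --content-type (as in the manual example above) if the assets are served directly to browsers.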
