Node.js aws-sdk S3 file upload size

Date: 2022-04-20 23:06:07

When using the aws-sdk npm package for Node.js, I can upload a 50KB PDF to AWS S3 with the following code:

var params = {
    Bucket: BUCKET,
    Key: pdf_key,
    Body: file,
    ContentType: 'application/pdf'
};
var s3 = new AWS.S3();

s3.putObject(params, function(error, data) {
    console.log(data);
    if (error) {
        console.log(error);
        callback(error, null);
    } else {
        callback(null, pdf_key);
    }
});

But when uploading an 11MB PDF, even with the ContentLength specified, the upload just runs forever, even with a timeout of 2 minutes.

The question is: how do I make AWS S3 accept the large PDF file?

UPDATE

I have still not found any documentation or answers for this question.

UPDATE 2

I will accept answers that show this or another framework that can do this. I will need that framework to also allow auth-read of the object.
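For reference, a minimal sketch of this with the aws-sdk itself, assuming "auth-read" means S3's 'authenticated-read' canned ACL (BUCKET, pdf_key and file are the names from the code above):

var params = {
    Bucket: BUCKET,
    Key: pdf_key,
    Body: file,
    ContentType: 'application/pdf',
    ACL: 'authenticated-read' // canned ACL: grants read to any authenticated AWS user
};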

UPDATE 3

I got it working for now, but I haven't found a reason why it shouldn't have worked before.

Thanks in advance!


1 Answer

#1



Connecting to S3 isn't fast, and depending on network fluctuations you can get timeouts and other weird behavior.
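As an aside, a minimal sketch of how the SDK's HTTP timeouts are configured, assuming aws-sdk v2 (the values here are illustrative, not recommendations):

var s3 = new AWS.S3({
    httpOptions: {
        connectTimeout: 5000, // fail fast if the socket can't be opened
        timeout: 120000       // terminate a request after two minutes
    }
});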

The code you provided is fine, but you could take advantage of multipart uploads, which can solve problems especially with files larger than 5MB.
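The simplest route, sketched here under the assumption of aws-sdk v2, is the SDK's managed s3.upload(), which switches to multipart automatically for large bodies (the partSize and queueSize values are illustrative):

var s3 = new AWS.S3();

s3.upload({
    Bucket: BUCKET,
    Key: pdf_key,
    Body: file,
    ContentType: 'application/pdf'
}, {
    partSize: 5 * 1024 * 1024, // split the body into 5MB parts
    queueSize: 4               // upload up to 4 parts concurrently
}, function(error, data) {
    if (error) {
        callback(error, null);
    } else {
        callback(null, pdf_key); // data.Location holds the object URL
    }
});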

I made a rough implementation of a multipart upload and also made it retry the upload of any failing part up to 3 times; this will also work for files smaller than 5MB.
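A rough sketch along those lines, assuming aws-sdk v2 and a Buffer body (the function names, PART_SIZE and MAX_RETRIES are illustrative, not the answer's exact code):

const AWS = require('aws-sdk');
const s3 = new AWS.S3();

const PART_SIZE = 5 * 1024 * 1024; // S3's minimum part size, except for the last part
const MAX_RETRIES = 3;

// Upload one part, retrying up to MAX_RETRIES times before giving up.
async function uploadPartWithRetry(params) {
    for (let attempt = 1; ; attempt++) {
        try {
            return await s3.uploadPart(params).promise();
        } catch (err) {
            if (attempt >= MAX_RETRIES) throw err;
        }
    }
}

async function multipartUpload(bucket, key, buffer, contentType) {
    // 1. Start the multipart upload and remember its UploadId.
    const { UploadId } = await s3.createMultipartUpload({
        Bucket: bucket,
        Key: key,
        ContentType: contentType
    }).promise();

    try {
        // 2. Upload each PART_SIZE slice of the buffer as a numbered part.
        //    A single part may be smaller than 5MB, so small files work too.
        const parts = [];
        for (let start = 0, partNumber = 1; start < buffer.length; start += PART_SIZE, partNumber++) {
            const { ETag } = await uploadPartWithRetry({
                Bucket: bucket,
                Key: key,
                UploadId,
                PartNumber: partNumber,
                Body: buffer.slice(start, start + PART_SIZE)
            });
            parts.push({ ETag, PartNumber: partNumber });
        }

        // 3. Ask S3 to assemble the uploaded parts into the final object.
        return await s3.completeMultipartUpload({
            Bucket: bucket,
            Key: key,
            UploadId,
            MultipartUpload: { Parts: parts }
        }).promise();
    } catch (err) {
        // Abort so the orphaned parts don't keep accruing storage costs.
        await s3.abortMultipartUpload({ Bucket: bucket, Key: key, UploadId }).promise();
        throw err;
    }
}

With a sketch like this, multipartUpload(BUCKET, pdf_key, file, 'application/pdf') would take the place of the putObject call from the question.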
