How do I scp to Amazon S3?

Time: 2022-11-19 09:32:29

I need to send backup files of ~2TB to S3. I guess the most hassle-free option would be the Linux scp command (I've had difficulty with s3cmd and don't want an overkill Java/RoR solution).

However, I am not sure whether it is even possible: how would I use my S3 keys with scp, and what would my destination IP/URL/path be?

I appreciate your hints.

7 Solutions

#1


7  

You can't SCP to S3.

The quickest way, if you don't mind spending money, is probably just to send it to them on a disk and they'll put it up there for you. See their Import/Export service.

#2


29  

As of 2015, SCP/SSH is not supported (and probably never will be for the reasons mentioned in the other answers).

Official AWS tools for copying files to/from S3

  1. Command-line tool (pip3 install awscli). Note that credentials need to be specified; I prefer environment variables rather than a config file: AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY.

    aws s3 cp /tmp/foo/ s3://bucket/ --recursive --exclude "*" --include "*.jpg"
    

    and an rsync-like command:

    aws s3 sync . s3://mybucket
    
  2. Web interface: the S3 Management Console (https://console.aws.amazon.com/s3/).
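Putting the two CLI commands above together, a minimal end-to-end session might look like this (the credentials, region, and bucket names are placeholders, not real values):

```shell
# Placeholder credentials -- substitute your own IAM keys.
export AWS_ACCESS_KEY_ID="your-access-key-id"
export AWS_SECRET_ACCESS_KEY="your-secret-access-key"
export AWS_DEFAULT_REGION="us-east-1"

# Recursive copy that uploads only .jpg files:
aws s3 cp /tmp/foo/ s3://bucket/ --recursive --exclude "*" --include "*.jpg"

# Incremental, rsync-like upload of the current directory:
aws s3 sync . s3://mybucket
```

For a ~2TB transfer, sync has the advantage that it can be re-run after an interruption and will only upload what is still missing.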

Non-AWS methods

Any other solutions depend on third-party executables (e.g. botosync, jungledisk...), which can be great as long as they are supported. But third-party tools come and go as the years go by, so your scripts will have a shorter shelf life.


EDIT: Actually, AWS CLI is based on botocore:

https://github.com/boto/botocore

So botosync deserves a bit more respect as an elder statesman than I perhaps gave it.

#3


10  

Here's just the thing for this: boto-rsync. From any Linux box, install boto-rsync and then use it to transfer /local/path/ to your_bucket/remote/path/:

boto-rsync -a your_access_key -s your_secret_key /local/path/ s3://your_bucket/remote/path/

The paths can also be files.

For an S3-compatible provider other than AWS, use --endpoint:

boto-rsync -a your_access_key -s your_secret_key --endpoint some.provider.com /local/path/ s3://your_bucket/remote/path/

#4


4  

Why don't you scp it to an EBS volume and then use s3cmd from there? As long as your EBS volume and S3 bucket are in the same region, you'll only pay the inbound data transfer charge once (from your network to the EBS volume).

I've found that once you're inside the AWS network, s3cmd is much more reliable and the data transfer rate is far higher than uploading directly to S3 from outside.

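A sketch of this two-hop approach, with a hypothetical instance address, key file, and bucket:

```shell
# Hop 1: scp the backup onto an EC2 instance with an EBS volume
# mounted at /mnt/ebs (host and key file are placeholders).
scp -i my-key.pem /backups/dump.tar.gz ec2-user@ec2-host.example.com:/mnt/ebs/

# Hop 2: from inside AWS, push the file to an S3 bucket in the
# same region using s3cmd.
ssh -i my-key.pem ec2-user@ec2-host.example.com \
  "s3cmd put /mnt/ebs/dump.tar.gz s3://your_bucket/backups/"
```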

#5


2  

I guess you can mount S3 within EC2 (http://michaelaldridge.info/post/12086788604/mounting-s3-within-an-ec2-instance) and do the SCP!

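As a sketch using s3fs (one FUSE implementation of this idea; the bucket name and paths are hypothetical):

```shell
# Mount the bucket as a directory; s3fs reads the access key and
# secret from the passwd file in "ACCESS_KEY:SECRET_KEY" format.
mkdir -p /mnt/s3
s3fs your_bucket /mnt/s3 -o passwd_file=${HOME}/.passwd-s3fs

# Ordinary file tools -- including an scp that targets this host
# and path -- now write through to S3.
cp /backups/dump.tar.gz /mnt/s3/
```

Note that FUSE-backed S3 mounts tend to be slow for multi-terabyte transfers, so this is more convenient than fast.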

#6


1  

There is an amazing tool called Dragon Disk. It even works as a sync tool, not just as a plain scp replacement.

http://www.s3-client.com/

The guide to setting up Amazon S3 is provided there, and after setting it up you can either copy-paste files from your local machine to S3 or set up an automatic sync. The user interface is very similar to WinSCP or FileZilla.

#7


-3  

For our AWS backups we use a combination of duplicity and trickle: duplicity handles the encrypted, rsync-style incremental transfer, and trickle limits the upload speed.

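For example (the paths, bucket, and bandwidth cap are illustrative, and the exact S3 URL scheme depends on your duplicity version):

```shell
# Placeholder credentials for duplicity's S3 backend.
export AWS_ACCESS_KEY_ID="your-access-key-id"
export AWS_SECRET_ACCESS_KEY="your-secret-access-key"

# trickle caps the upload at ~1000 KB/s; duplicity does encrypted,
# incremental, rsync-style backups to the bucket.
trickle -u 1000 duplicity /backups s3://your_bucket/backups
```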
