I am trying to move images for my site from my host to Amazon S3 cloud hosting. These images are of client work sites and cannot be publicly available. I would like them to be displayed on my site preferably by using the PHP SDK available from Amazon.
So far I have been able to script for the conversion so that I look up records in my database, grab the file path, name it appropriately, and send it to Amazon.
//upload to s3
$s3->create_object($bucket, $folder.$file_name_new, array(
    'fileUpload' => $file_temp,
    'acl' => AmazonS3::ACL_PRIVATE, //access denied, grantee only owner
    //'acl' => AmazonS3::ACL_PUBLIC, //image displayed
    //'acl' => AmazonS3::ACL_OPEN, //image displayed, grantee everyone has open permission
    //'acl' => AmazonS3::ACL_AUTH_READ, //image not displayed, grantee auth users have open permissions
    //'acl' => AmazonS3::ACL_OWNER_READ, //image not displayed, grantee only ryan
    //'acl' => AmazonS3::ACL_OWNER_FULL_CONTROL, //image not displayed, grantee only ryan
    'storage' => AmazonS3::STORAGE_REDUCED
));
Before I copy everything over, I have created a simple form to do a test upload and display of the image. If I upload an image using ACL_PRIVATE, I can either grab the plain public URL and get access denied, or grab the URL with a temporary key and have the image display.
<?php
//display the image link
$temp_link = $s3->get_object_url($bucket, $folder.$file_name_new, '1 minute');
?>
<a href='<?php echo $temp_link; ?>'><?php echo $temp_link; ?></a><br />
<img src='<?php echo $temp_link; ?>' alt='finding image' /><br />
Using this method, how will my caching work? I'm guessing that every time I refresh the page, or modify one of my records, I will be pulling that image again, increasing my GET requests.
I have also considered using bucket policies to only allow image retrieval from certain referrers. Do I understand correctly that Amazon is supposed to serve requests only from pages or domains I specify?
I referenced https://forums.aws.amazon.com/thread.jspa?messageID=188183 to set that up, but am still confused as to which permissions I need on my objects. It seemed like if I made them private they still would not display, unless I used the temporary link mentioned previously. If I made them public, I could navigate to them directly, regardless of referrer.
Am I way off base with what I'm trying to do here? Is this not really supported by S3, or am I missing something simple? I have gone through the SDK documentation and done lots of searching, and I feel like this should be a little more clearly documented, so hopefully any input here can help others in this situation. I've read about others who name the file with a unique ID, creating security through obscurity, but that won't cut it in my situation, and it's probably not best practice for anyone trying to be secure.
4 Answers
#1
23
The best way to serve your images is to generate a URL using the PHP SDK. That way the downloads go directly from S3 to your users.
You don't need to download via your servers as @mfonda suggested (you can set any caching headers you like on S3 objects), and if you did, you would be losing some major benefits of using S3.
However, as you pointed out in your question, the URL will always be changing (actually the querystring), so browsers won't cache the file. The easy workaround is simply to always use the same expiry date so that the same querystring is always generated. Or better still, 'cache' the URL yourself (e.g. in the database) and reuse it every time.
You'll obviously have to set the expiry time somewhere far into the future, but you can regenerate these URLs every so often if you prefer. E.g. in your database you would store the generated URL and the expiry date (you could parse that from the URL too). Then either you just use the existing URL or, if the expiry date has passed, generate a new one, and so on.
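A minimal sketch of that caching idea, assuming the same v1 PHP SDK used in the question ($s3 is an AmazonS3 instance), a PDO handle $db, and a hypothetical images table with s3_key, cached_url and url_expires_at columns:
<?php
// Return a browser-cacheable presigned URL for an image record.
// Reuses the stored URL until it expires, then generates and stores a new one.
function get_image_url($db, $s3, $bucket, $image_id)
{
    $stmt = $db->prepare("SELECT s3_key, cached_url, url_expires_at FROM images WHERE id = ?");
    $stmt->execute(array($image_id));
    $row = $stmt->fetch(PDO::FETCH_ASSOC);

    // Reuse the cached URL while it is still valid so the querystring
    // (and therefore the browser's cache key) stays the same.
    if (!empty($row['cached_url']) && strtotime($row['url_expires_at']) > time()) {
        return $row['cached_url'];
    }

    // Otherwise generate a fresh presigned URL, valid for 7 days, and store it.
    $expires_at = strtotime('+7 days');
    $url = $s3->get_object_url($bucket, $row['s3_key'], $expires_at);

    $update = $db->prepare("UPDATE images SET cached_url = ?, url_expires_at = ? WHERE id = ?");
    $update->execute(array($url, date('Y-m-d H:i:s', $expires_at), $image_id));

    return $url;
}
?>
<img src='<?php echo get_image_url($db, $s3, $bucket, $image_id); ?>' alt='client work site' />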
#2
7
You can use bucket policies on your Amazon bucket to allow your application's domain to access the files. In fact, you can even add your local dev domain (ex: mylocaldomain.local) to the access list and you will be able to get your images. Amazon provides sample bucket policies here: http://docs.aws.amazon.com/AmazonS3/latest/dev/AccessPolicyLanguage_UseCases_s3_a.html. This was very helpful in getting my images served.
The policy below solved the problem that brought me to this SO topic:
{
    "Version": "2008-10-17",
    "Id": "http referer policy example",
    "Statement": [
        {
            "Sid": "Allow get requests originated from www.example.com and example.com",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::examplebucket/*",
            "Condition": {
                "StringLike": {
                    "aws:Referer": [
                        "http://www.example.com/*",
                        "http://example.com/*"
                    ]
                }
            }
        }
    ]
}
#3
2
When you talk about security and protecting data from unauthorized users, one thing is clear: you have to check, every time the resource is accessed, that the requester is entitled to it.
That rules out simply generating a URL that can be accessed by anyone (it might be difficult to guess, but still...). The only solution is an image proxy. You can do that with a PHP script.
There is a fine article on Amazon's blog that suggests using readfile: http://blogs.aws.amazon.com/php/post/Tx2C4WJBMSMW68A/Streaming-Amazon-S3-Objects-From-a-Web-Server
readfile('s3://my-bucket/my-images/php.gif');
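The readfile() call relies on the SDK's S3 stream wrapper being registered, as the linked article describes. A minimal sketch of such a proxy script, assuming the v2 SDK (Aws\S3\S3Client), a hypothetical current_user_can_view() authorization check, and placeholder bucket and credential values:
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// Refuse to proxy the object unless the logged-in user is allowed to see it.
// current_user_can_view() is a placeholder for your own authorization logic.
$key = 'my-images/' . basename($_GET['image']); // basename() blocks path traversal
if (!current_user_can_view($key)) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}

// Register the s3:// stream wrapper so readfile() can stream straight from S3.
$s3 = S3Client::factory(array(
    'key'    => 'YOUR_AWS_KEY',
    'secret' => 'YOUR_AWS_SECRET',
));
$s3->registerStreamWrapper();

header('Content-Type: image/jpeg');
readfile('s3://my-bucket/' . $key);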
#4
0
You can download the contents from S3 (in a PHP script), then serve them using the correct headers.
As a rough example, say you had the following in image.php:
$s3 = new AmazonS3();
$response = $s3->get_object($bucket, $image_name);
if (!$response->isOK()) {
    throw new Exception('Error downloading file from S3');
}

header("Content-Type: image/jpeg");
header("Content-Length: " . strlen($response->body));
die($response->body);
Then in your HTML code, you can do
<img src="image.php">
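In practice, image.php also needs to know which object to serve and whether the current user may see it. A hedged variation on the snippet above, assuming a hypothetical key query parameter and user_can_view() check ($bucket comes from your own config, as in the original example):
// image.php?key=worksite-123.jpg  (hypothetical parameter name)
$image_name = basename($_GET['key']);   // strip any path components

if (!user_can_view($image_name)) {      // placeholder for your own per-user check
    header('HTTP/1.1 403 Forbidden');
    exit;
}

$s3 = new AmazonS3();
$response = $s3->get_object($bucket, $image_name);
if (!$response->isOK()) {
    header('HTTP/1.1 404 Not Found');
    exit;
}

header("Content-Type: image/jpeg");
header("Content-Length: " . strlen($response->body));
die($response->body);
And in the HTML: <img src="image.php?key=worksite-123.jpg">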