boto.exception.S3ResponseError: S3ResponseError: 403 Forbidden

Time: 2022-01-20 23:04:00

I'm trying to get Django to upload static files to S3, but instead I'm getting a 403 Forbidden error, and I'm not sure why.

Full Stacktrace:

Traceback (most recent call last):
  File "manage.py", line 14, in <module>
    execute_manager(settings)
  File "/home/levi/Projects/DoneBox/.virtualenv/local/lib/python2.7/site-packages/django/core/management/__init__.py", line 438, in execute_manager
    utility.execute()
  File "/home/levi/Projects/DoneBox/.virtualenv/local/lib/python2.7/site-packages/django/core/management/__init__.py", line 379, in execute
    self.fetch_command(subcommand).run_from_argv(self.argv)
  File "/home/levi/Projects/DoneBox/.virtualenv/local/lib/python2.7/site-packages/django/core/management/base.py", line 191, in run_from_argv
    self.execute(*args, **options.__dict__)
  File "/home/levi/Projects/DoneBox/.virtualenv/local/lib/python2.7/site-packages/django/core/management/base.py", line 220, in execute
    output = self.handle(*args, **options)
  File "/home/levi/Projects/DoneBox/.virtualenv/local/lib/python2.7/site-packages/django/core/management/base.py", line 351, in handle
    return self.handle_noargs(**options)
  File "/home/levi/Projects/DoneBox/.virtualenv/local/lib/python2.7/site-packages/django/contrib/staticfiles/management/commands/collectstatic.py", line 89, in handle_noargs
    self.copy_file(path, prefixed_path, storage, **options)
  File "/home/levi/Projects/DoneBox/.virtualenv/local/lib/python2.7/site-packages/django/contrib/staticfiles/management/commands/collectstatic.py", line 184, in copy_file
    if not self.delete_file(path, prefixed_path, source_storage, **options):
  File "/home/levi/Projects/DoneBox/.virtualenv/local/lib/python2.7/site-packages/django/contrib/staticfiles/management/commands/collectstatic.py", line 115, in delete_file
    if self.storage.exists(prefixed_path):
  File "/home/levi/Projects/DoneBox/.virtualenv/local/lib/python2.7/site-packages/storages/backends/s3boto.py", line 209, in exists
    return k.exists()
  File "/home/levi/Projects/DoneBox/.virtualenv/local/lib/python2.7/site-packages/boto/s3/key.py", line 391, in exists
    return bool(self.bucket.lookup(self.name))
  File "/home/levi/Projects/DoneBox/.virtualenv/local/lib/python2.7/site-packages/boto/s3/bucket.py", line 143, in lookup
    return self.get_key(key_name, headers=headers)
  File "/home/levi/Projects/DoneBox/.virtualenv/local/lib/python2.7/site-packages/boto/s3/bucket.py", line 208, in get_key
    response.status, response.reason, '')
boto.exception.S3ResponseError: S3ResponseError: 403 Forbidden

Contents of settings.py:

import os
DIRNAME = os.path.dirname(__file__)
# Django settings for DoneBox project.

DEBUG = True
TEMPLATE_DEBUG = DEBUG

ADMINS = (
    # ('Your Name', 'your_email@example.com'),
)

MANAGERS = ADMINS

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3', # Add 'postgresql_psycopg2', 'postgresql', 'mysql', 'sqlite3' or 'oracle'.
        'NAME': os.path.join(DIRNAME, "box.sqlite"),                      # Or path to database file if using sqlite3.
        'USER': '',                      # Not used with sqlite3.
        'PASSWORD': '',                  # Not used with sqlite3.
        'HOST': '',                      # Set to empty string for localhost. Not used with sqlite3.
        'PORT': '',                      # Set to empty string for default. Not used with sqlite3.
    }
}

# Local time zone for this installation. Choices can be found here:
# http://en.wikipedia.org/wiki/List_of_tz_zones_by_name
# although not all choices may be available on all operating systems.
# On Unix systems, a value of None will cause Django to use the same
# timezone as the operating system.
# If running in a Windows environment this must be set to the same as your
# system time zone.
TIME_ZONE = 'America/Denver'

# Language code for this installation. All choices can be found here:
# http://www.i18nguy.com/unicode/language-identifiers.html
LANGUAGE_CODE = 'en-us'

SITE_ID = 1

# If you set this to False, Django will make some optimizations so as not
# to load the internationalization machinery.
USE_I18N = True

# If you set this to False, Django will not format dates, numbers and
# calendars according to the current locale
USE_L10N = True

# Absolute filesystem path to the directory that will hold user-uploaded files.
# Example: "/home/media/media.lawrence.com/media/"
MEDIA_ROOT = ''

# URL that handles the media served from MEDIA_ROOT. Make sure to use a
# trailing slash.
# Examples: "http://media.lawrence.com/media/", "http://example.com/media/"
MEDIA_URL = "d1eyn4cjl5vzx0.cloudfront.net"

# Absolute path to the directory static files should be collected to.
# Don't put anything in this directory yourself; store your static files
# in apps' "static/" subdirectories and in STATICFILES_DIRS.
# Example: "/home/media/media.lawrence.com/static/"
STATIC_ROOT = os.path.join(DIRNAME, "static")

# URL prefix for static files.
# Example: "http://media.lawrence.com/static/"
STATIC_URL = "d280kzug7l5rug.cloudfront.net"

# URL prefix for admin static files -- CSS, JavaScript and images.
# Make sure to use a trailing slash.
# Examples: "http://foo.com/static/admin/", "/static/admin/".
ADMIN_MEDIA_PREFIX = '/static/admin/'

# Additional locations of static files
STATICFILES_DIRS = (
    # Put strings here, like "/home/html/static" or "C:/www/django/static".
    # Always use forward slashes, even on Windows.
    # Don't forget to use absolute paths, not relative paths.
    os.path.join(DIRNAME, "main", "static"),
)

# List of finder classes that know how to find static files in
# various locations.
STATICFILES_FINDERS = (
    'django.contrib.staticfiles.finders.FileSystemFinder',
    'django.contrib.staticfiles.finders.AppDirectoriesFinder',
    'django.contrib.staticfiles.finders.DefaultStorageFinder',
)

# Make this unique, and don't share it with anybody.
SECRET_KEY = '<snip>'

# List of callables that know how to import templates from various sources.
TEMPLATE_LOADERS = (
    'django.template.loaders.filesystem.Loader',
    'django.template.loaders.app_directories.Loader',
    'django.template.loaders.eggs.Loader',
)

MIDDLEWARE_CLASSES = (
    'django.middleware.common.CommonMiddleware',
    'django.contrib.sessions.middleware.SessionMiddleware',
    'django.middleware.csrf.CsrfViewMiddleware',
    'django.contrib.auth.middleware.AuthenticationMiddleware',
    'django.contrib.messages.middleware.MessageMiddleware',
)

ROOT_URLCONF = 'DoneBox.urls'

TEMPLATE_DIRS = (
    # Put strings here, like "/home/html/django_templates" or "C:/www/django/templates".
    # Always use forward slashes, even on Windows.
    # Don't forget to use absolute paths, not relative paths.
    os.path.join(DIRNAME, "main", "templates"),
    os.path.join(DIRNAME, "templates"),
    os.path.join(DIRNAME, "basic", "blog", "templates"),
)

INSTALLED_APPS = (
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.sites',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'django.contrib.sitemaps',
    # Uncomment the next line to enable the admin:
    'django.contrib.admin',
    # Uncomment the next line to enable admin documentation:
    'storages',
    'django.contrib.admindocs',
    'main',
    'contacts',
    'piston',
    'registration',
#    'contact_form',
    'basic',
    'basic.blog',
)

# A sample logging configuration. The only tangible logging
# performed by this configuration is to send an email to
# the site admins on every HTTP 500 error.
# See http://docs.djangoproject.com/en/dev/topics/logging for
# more details on how to customize your logging configuration.
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'mail_admins': {
            'level': 'ERROR',
            'class': 'django.utils.log.AdminEmailHandler'
        }
    },
    'loggers': {
        'django.request': {
            'handlers': ['mail_admins'],
            'level': 'DEBUG',
            'propagate': True,
        },
        'django.db.backends': {
            'handlers': ['mail_admins'],
            'level': 'DEBUG',
            'propagate': True,
        }
    }
}

DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_ACCESS_KEY_ID = '<snip>'
AWS_SECRET_ACCESS_KEY = '<snip>'
STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_STORAGE_BUCKET_NAME = "donebox-static"
STATIC_FILES_BUCKET = "donebox-static"
MEDIA_FILES_BUCKET = "donebox-media"
ACCOUNT_ACTIVATION_DAYS = 7

EMAIL_HOST = "email-smtp.us-east-1.amazonaws.com"
EMAIL_HOST_USER = '<snip>'
EMAIL_HOST_PASSWORD = '<snip>'
EMAIL_PORT = 587
EMAIL_USE_TLS = True
TEMPLATE_CONTEXT_PROCESSORS = (
    "django.contrib.auth.context_processors.auth",
     "django.core.context_processors.debug",
     "django.core.context_processors.i18n",
     "django.core.context_processors.media",
     "django.core.context_processors.static",
     "django.contrib.messages.context_processors.messages",
     "DoneBox.main.context_processors_PandC",
     )

Contents of requirements.pip:

django==1.3
django-storages==1.1.4
django-registration==0.8
django-piston==0.2.3
django-tagging==0.3.1
django-extensions==0.8
BeautifulSoup==3.2.1
boto==2.4.1
mysql-python==1.2.3
tweepy==1.9
feedparser==5.1.2
pycrypto==2.6

A Google search for this exception doesn't turn up anything interesting. I suspect I've misconfigured something, although I'm not sure what. Could someone point me in the right direction? Thank you for your time and consideration.

9 Answers

#1


105  

I'm using Amazon IAM for the particular key ID and access key and just bumped into the same 403 Forbidden... Turns out you need to give permissions that target both the bucket root and its subobjects:

{
  "Statement": [
    {
      "Principal": {
          "AWS": "*"
      },
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": ["arn:aws:s3:::bucket-name/*", "arn:aws:s3:::bucket-name"]
    }
  ]
}

#2


47  

I would recommend that you try to test your AWS credentials separately to verify whether the credentials do actually have permission to read and write data to the S3 bucket. The following should work:

>>> import boto
>>> s3 = boto.connect_s3('<access_key>', '<secret_key>')
>>> bucket = s3.lookup('donebox-static')
>>> key = bucket.new_key('testkey')
>>> key.set_contents_from_string('This is a test')
>>> key.exists()
>>> key.delete()

You should try the same test with the other bucket ('donebox-media'). If this works, the permissions are correct and the problem lies in the django-storages code or configuration. If it fails with a 403, then either:
  • The access_key/secret_key strings are incorrect
  • The access_key/secret_key are correct, but that account doesn't have the necessary permissions to write to the bucket

I hope that helps. Please report back your findings.

#3


39  

I had the same problem and finally discovered that the real problem was the SERVER TIME. The clock was misconfigured, and AWS rejects requests whose timestamp is too far off with a 403 FORBIDDEN.

On Debian you can set the clock automatically using NTP:

ntpdate 0.pool.ntp.org

#4


6  

This will also happen if your machine's time settings are incorrect.
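
If you want to check the skew yourself before blaming credentials, here is a minimal sketch (modern Python 3 stdlib, not the Python 2.7 stack from the question; the threshold is S3's roughly 15-minute signature tolerance). It compares a server-supplied HTTP Date header against the local clock:

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

def clock_skew_seconds(server_date_header, local_now=None):
    """Return (local clock - server clock) in seconds, given an HTTP Date header."""
    server_time = parsedate_to_datetime(server_date_header)
    if local_now is None:
        local_now = datetime.now(timezone.utc)
    return (local_now - server_time).total_seconds()

# Deterministic example: a local clock running 16 minutes fast.
skew = clock_skew_seconds(
    "Thu, 20 Jan 2022 23:04:00 GMT",
    local_now=datetime(2022, 1, 20, 23, 20, 0, tzinfo=timezone.utc),
)
print(abs(skew) > 900)  # True: beyond the ~15-minute tolerance
```

In practice you would take the Date header from any HTTPS response (S3 returns one even on a 403) and pass it in.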

#5


3  

It is also possible that the wrong credentials are being used. To verify:

import boto
s3 = boto.connect_s3('<your access key>', '<your secret key>')
bucket = s3.get_bucket('<your bucket>') # does this work?
s3 = boto.connect_s3()
s3.aws_access_key_id  # is the same key being used by default?

If not, take a look at ~/.boto, ~/.aws/config and ~/.aws/credentials.
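
For reference, when no keys are passed to connect_s3(), boto 2 (the version pinned in requirements.pip) falls back to the AWS_ACCESS_KEY_ID/AWS_SECRET_ACCESS_KEY environment variables and then to its config file. A minimal sketch of ~/.boto with placeholder (not real) values:

```ini
; Hypothetical ~/.boto with placeholder values
[Credentials]
aws_access_key_id = AKIAEXAMPLEEXAMPLE
aws_secret_access_key = exampleSecretAccessKeyValue
```

Make sure whichever file is actually being picked up contains the same key pair you tested with above.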

#6


2  

In case this helps anyone, I had to add the following configuration entry for collectstatic to work and not return 403:

AWS_DEFAULT_ACL = ''

#7


1  

Here is a refinement with minimal permissions. In all cases, as discussed elsewhere, s3:ListAllMyBuckets is required on all buckets.

In its default configuration, django-storages uploads files to S3 with public-read permissions; see the django-storages Amazon S3 backend documentation.

Trial and error revealed that in this default configuration the only two permissions required are s3:PutObject, to upload a file in the first place, and s3:PutObjectAcl, to set that object's permissions to public.

No additional actions are required, because from that point on reads of the object are public anyway.

IAM User Policy - public-read (default):

{
   "Version": "2012-10-17",
   "Statement": [
       {
           "Effect": "Allow",
           "Action": "s3:ListAllMyBuckets",
           "Resource": "arn:aws:s3:::*"
       },
       {
           "Effect": "Allow",
           "Action": [
               "s3:PutObject",
               "s3:PutObjectAcl"
           ],
           "Resource": "arn:aws:s3:::bucketname/*"
       }
   ]
}

It is not always desirable to have objects publicly readable; private uploads are configured by setting the relevant property in the settings file.

Django settings.py:

...
AWS_DEFAULT_ACL = "private"
...

Then s3:PutObjectAcl is no longer required, and the minimal permissions are as follows:

IAM User Policy - private:

{
   "Version": "2012-10-17",
   "Statement": [
       {
           "Effect": "Allow",
           "Action": "s3:ListAllMyBuckets",
           "Resource": "arn:aws:s3:::*"
       },
       {
           "Effect": "Allow",
           "Action": [
               "s3:PutObject",
               "s3:GetObject"
           ],
           "Resource": "arn:aws:s3:::bucketname/*"
       }
   ]
}

#8


0  

Another solution avoids custom policies by using AWS's predefined (managed) policies:

  • Add S3 full access permissions to your S3 user.

    • IAM / Users / Permissions and Attach Policy
    • Add policy "AmazonS3FullAccess"

#9


0  

Maybe you actually don't have access to the bucket you're trying to lookup/get/create.

Remember: bucket names have to be unique across the entire S3 ecosystem, so if you try to access (lookup/get/create) a bucket named 'test', it almost certainly belongs to someone else and you will have no access to it.
