How do I do a job.insert in BigQuery with Python? I get "Login Required", but I can list all tables and datasets

Time: 2021-09-27 15:45:57

I've been trying to get the API client working in Ruby to do a job insert that takes data from Cloud Storage and puts it into a table in BigQuery, and I haven't been too successful. In the past, I've looked at the Python API and gotten things going in Ruby, but this one puzzles me.


import httplib2
import pprint
import time
import urllib2

from apiclient.discovery import build
from oauth2client.client import SignedJwtAssertionCredentials


def loadTable(service, projectId, datasetId, targetTableId):
  try:
    jobCollection = service.jobs()
    jobData = {
      'projectId': 'XXXXXXXXX',
      'configuration': {
          'load': {
            'sourceUris': ["gs://person-bucket/person_json.tar.gz"],
            'schema': {
              'fields': [
                  { 'name': 'person_id', 'type': 'integer' },
                  { 'name': 'person_name', 'type': 'string' },
                  { 'name': 'logged_in_at', 'type': 'timestamp' },
                ]
            },
            'destinationTable': {
              'projectId': 'XXXXXXXXX',
              'datasetId': 'personDataset',
              'tableId': 'person'
            },
          }
        }
      }

    insertResponse = jobCollection.insert(projectId=projectId, body=jobData).execute()

    # Poll the job status until it is done, with a short pause between calls.
    while True:
      job = jobCollection.get(projectId=projectId,
                              jobId=insertResponse['jobReference']['jobId']).execute()
      if 'DONE' == job['status']['state']:
        # A finished job can still carry an error, so check errorResult
        # before declaring success.
        if 'errorResult' in job['status']:
          print 'Error loading table: ', pprint.pprint(job)
          return
        print 'Done Loading!'
        return

      print 'Waiting for loading to complete...'
      time.sleep(10)

  except urllib2.HTTPError as err:
    print 'Error in loadTable: ', pprint.pprint(err.resp)



PROJECT_NUMBER = 'XXXXXXXXX'
SERVICE_ACCOUNT_EMAIL = 'XXXXXXXXX@developer.gserviceaccount.com'

f = file('key.p12', 'rb')
key = f.read()
f.close()

credentials = SignedJwtAssertionCredentials(
    SERVICE_ACCOUNT_EMAIL,
    key,
    scope='https://www.googleapis.com/auth/bigquery')

http = httplib2.Http()
http = credentials.authorize(http)

service = build('bigquery', 'v2')
tables = service.tables()
response = tables.list(projectId=PROJECT_NUMBER, datasetId='person_dataset').execute(http)

print(response)
print("-------------------------------")


loadTable(service, PROJECT_NUMBER, "person_dataset", "person_table")

When I ask for the list of tables, I must be authorized, since I can view the table details, yet I can't seem to get a table created with data imported from Cloud Storage.

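For clarity, here are the two calls side by side (repeated from the code above); the only visible difference is what happens at execute time:

# This call succeeds: the authorized http object is passed explicitly.
response = tables.list(projectId=PROJECT_NUMBER,
                       datasetId='person_dataset').execute(http)

# This call returns the 401: execute() gets no http argument, and the
# service was built without one.
insertResponse = jobCollection.insert(projectId=projectId,
                                      body=jobData).execute()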

This is the output I get in the console:


No handlers could be found for logger "oauth2client.util"
{u'totalItems': 2, u'tables': [{u'kind': u'bigquery#table', u'id': u'xxx:xxx.xxx', u'tableReference': {u'projectId': u'xxx', u'tableId': u'xxx', u'datasetId': u'xxx'}}, {u'kind': u'bigquery#table', u'id': u'xxx:xxx.yyy', u'tableReference': {u'projectId': u'xxx', u'tableId': u'yyy', u'datasetId': u'xxx'}}], u'kind': u'bigquery#tableList', u'etag': u'"zzzzzzzzzzzzzzzz"'}
Traceback (most recent call last):
  File "test.py", line 96, in <module>
    loadTable(service, PROJECT_NUMBER, "person_dataset", "person_table")
  File "test.py", line 50, in loadTable
    body=jobData).execute()
  File "/usr/local/lib/python2.7/dist-packages/oauth2client-1.2-py2.7.egg/oauth2client/util.py", line 132, in positional_wrapper
    return wrapped(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/google_api_python_client-1.2-py2.7.egg/apiclient/http.py", line 723, in execute
    raise HttpError(resp, content, uri=self.uri)
apiclient.errors.HttpError: <HttpError 401 when requesting https://www.googleapis.com/bigquery/v2/projects/xxxxxxxx/jobs?alt=json returned "Login Required">

Could someone please tell me what I'm doing wrong or point me in the right direction?


Any help would be really appreciated.


Thanks and have a great day.


2 Solutions

#1



I'm not a Ruby developer, but I believe that when you call build('bigquery', 'v2') you should pass in the authorized http object. The methods used appear to be the same as in Python -- a relevant example is here: https://developers.google.com/api-client-library/python/samples/authorized_api_cmd_line_calendar.py

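For example (a minimal sketch, assuming the same credentials setup as in the question):

# Build the service on top of the authorized http object so that every
# request it issues carries the OAuth credentials.
http = credentials.authorize(httplib2.Http())
service = build('bigquery', 'v2', http=http)

# execute() then needs no explicit http argument:
insertResponse = service.jobs().insert(projectId=PROJECT_NUMBER,
                                       body=jobData).execute()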

#2



Thanks for that. The question is resolved; if anyone else is interested, please look here: How to import a json from a file on cloud storage to Bigquery

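For anyone landing here, a load job configuration for newline-delimited JSON looks roughly like this (a sketch reusing the question's schema; the sourceFormat field is the key addition, since BigQuery assumes CSV when it is omitted):

jobData = {
    'projectId': 'XXXXXXXXX',
    'configuration': {
        'load': {
            'sourceUris': ['gs://person-bucket/person_json.tar.gz'],
            # Without this, BigQuery treats the input as CSV.
            'sourceFormat': 'NEWLINE_DELIMITED_JSON',
            'schema': {
                'fields': [
                    {'name': 'person_id', 'type': 'integer'},
                    {'name': 'person_name', 'type': 'string'},
                    {'name': 'logged_in_at', 'type': 'timestamp'},
                ]
            },
            'destinationTable': {
                'projectId': 'XXXXXXXXX',
                'datasetId': 'personDataset',
                'tableId': 'person',
            },
        }
    }
}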
