Service account credentials for Dataflow pipeline options

Time: 2022-01-16 14:03:57

I'm upgrading from Dataflow 1.9 to Beam 0.4.0. The methods on GcpOptions for setting the service account name (setServiceAccountName) and key file (setServiceAccountKeyFile) are no longer available. The closest alternative is setGcpCredential.


To manually create the GoogleCredential, what are the appropriate scopes to use? My pipelines need to access PubSub, Datastore, and BigQuery, and potentially Cloud Storage.


// Build a service-account credential with google-api-client; HTTP_TRANSPORT,
// JSON_FACTORY, serviceAccount, SCOPES, and p12file are placeholders.
GoogleCredential credential = new GoogleCredential.Builder()
    .setTransport(HTTP_TRANSPORT)
    .setJsonFactory(JSON_FACTORY)
    .setServiceAccountId(serviceAccount)
    .setServiceAccountScopes(SCOPES)  // what will be the scopes?
    .setServiceAccountPrivateKeyFromP12File(p12file)
    .build();
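
For context, the wiring into the pipeline options might look roughly like the sketch below. This is an assumption-laden sketch, not the documented upgrade path: it assumes the Beam 0.4.0 GcpOptions#setGcpCredential accepts the api-client Credential type that GoogleCredential extends, and args stands in for the program arguments.

import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

// Sketch (assumption): hand the manually built credential to the runner via
// setGcpCredential instead of the removed service-account options.
DataflowPipelineOptions options = PipelineOptionsFactory
    .fromArgs(args)
    .withValidation()
    .as(DataflowPipelineOptions.class);
options.setGcpCredential(credential);  // assumes this Beam version takes the api-client Credential
Pipeline pipeline = Pipeline.create(options);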

1 Solution

#1



I believe based on this list that all of those should be accessible using the https://www.googleapis.com/auth/cloud-platform scope.
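
Applied to the builder in the question, that would mean something like the following sketch (SCOPES is the placeholder name from the question; whether the single broad scope is sufficient for your project is an assumption worth verifying against its IAM settings):

import java.util.Collections;
import java.util.List;

// Sketch: one broad scope covering Pub/Sub, Datastore, BigQuery, and Cloud Storage.
private static final List<String> SCOPES =
    Collections.singletonList("https://www.googleapis.com/auth/cloud-platform");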
