Error executing Dataflow template using Cloud Functions

Date: 2022-12-04 20:23:58

I'm getting the error below while trying to execute a custom Dataflow template using a Google Cloud Function.


Error:"problem running dataflow template, error was: { Error: A Forbidden error was returned while attempting to retrieve an access token for the Compute Engine built-in service account. This may be because the Compute Engine instance does not have the correct permission scopes specified. Could not refresh access token".


I have tried supplying all the required permissions and scopes. Could someone please suggest a resolution?


1 Solution

#1



The google-cloud Node.js library does not yet support the Dataflow API, so the current way to use that API from Node.js is the googleapis library.


Following the instructions there, I've tried to launch a Dataflow job with a Google-provided template using an HTTP-triggered function, and had no issues:


const {google} = require('googleapis');
const project = "your-project-id";

exports.launchDataflowTemplate = (req, res) => {
    // Fetch Application Default Credentials for the function's runtime
    // service account.
    google.auth.getApplicationDefault(function(err, authClient, projectId) {
        if (err) {
            throw err;
        }
        // Older auth clients must be explicitly scoped before use; these
        // scopes cover Dataflow and the Compute Engine resources it manages.
        if (authClient.createScopedRequired && authClient.createScopedRequired()) {
            authClient = authClient.createScoped([
                'https://www.googleapis.com/auth/cloud-platform',
                'https://www.googleapis.com/auth/compute',
                'https://www.googleapis.com/auth/compute.readonly',
                'https://www.googleapis.com/auth/userinfo.email'
            ]);
        }
        var dataflow = google.dataflow({
            version: "v1b3",
            auth: authClient
        });

        // Template-specific parameters for Bulk_Decompress_GCS_Files.
        var launchParams = {
            "inputFilePattern": "gs://your-input-bucket/*.gz",
            "outputDirectory": "gs://your-result-bucket/",
            "outputFailureFile": "gs://your-logs-bucket/error.csv"
        };

        // Runtime environment for the Dataflow job.
        var env = {
            "tempLocation": "gs://your-staging-bucket/temp",
            "zone": "us-central1-f"
        };

        var opts = {
            projectId: project,
            gcsPath: "gs://dataflow-templates/latest/Bulk_Decompress_GCS_Files",
            resource: {
                parameters: launchParams,
                environment: env
            }
        };

        dataflow.projects.templates.launch(opts, (err, result) => {
            if (err) {
                throw err;
            }
            res.send(result.data);
        });
    });
};
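As a side note, recent releases of the googleapis package return promises and no longer need the createScopedRequired check, so the same launch can be written more compactly. Below is a minimal sketch under that assumption, reusing the same placeholder project and bucket names as above; the jobName value is an illustrative addition, not part of the original snippet:

const {google} = require('googleapis');
const project = "your-project-id";

exports.launchDataflowTemplate = async (req, res) => {
    try {
        // Application Default Credentials; the cloud-platform scope
        // covers the Dataflow API.
        const authClient = await google.auth.getClient({
            scopes: ['https://www.googleapis.com/auth/cloud-platform']
        });

        const dataflow = google.dataflow({version: "v1b3", auth: authClient});

        // Same template and placeholder buckets as above; jobName here is
        // a hypothetical example value.
        const result = await dataflow.projects.templates.launch({
            projectId: project,
            gcsPath: "gs://dataflow-templates/latest/Bulk_Decompress_GCS_Files",
            requestBody: {
                jobName: "bulk-decompress-" + Date.now(),
                parameters: {
                    inputFilePattern: "gs://your-input-bucket/*.gz",
                    outputDirectory: "gs://your-result-bucket/",
                    outputFailureFile: "gs://your-logs-bucket/error.csv"
                },
                environment: {
                    tempLocation: "gs://your-staging-bucket/temp",
                    zone: "us-central1-f"
                }
            }
        });

        res.send(result.data);
    } catch (err) {
        res.status(500).send(String(err));
    }
};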
