Confirming whether a Dataflow job succeeded from a Cloud Function

Time: 2021-09-14 15:35:31

Is there a way for a Cloud Function to confirm whether a Dataflow job has succeeded or not?

The Cloud Function that I tried:

const google = require('googleapis');

exports.statusJob = function (event, callback) {
  const file = event.data;
  if (file.resourceState === 'exists' && file.name) {
    console.log(file.name);
    console.log(event.data);

    // Authenticate with Application Default Credentials.
    google.auth.getApplicationDefault(function (err, authClient, projectId) {
      if (err) {
        throw err;
      }

      if (authClient.createScopedRequired && authClient.createScopedRequired()) {
        authClient = authClient.createScoped([
          'https://www.googleapis.com/auth/cloud-platform',
          'https://www.googleapis.com/auth/userinfo.email'
        ]);
      }

      const dataflow = google.dataflow({ version: 'v1b3', auth: authClient });

      // Fetch the job to inspect its current state.
      dataflow.projects.jobs.get({
        projectId: 'my-project-id',
        resource: {
          jobId: 'some_number'
        }
      }, function (err, response) {
        if (err) {
          console.error("problem running dataflow template, error was: ", err);
        }
        console.log("Dataflow template response: ", response);
        callback();
      });

    });
  }
};

package.json:

{
  "name": "test",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "",
  "license": "ISC",
  "dependencies": {
    "googleapis": "^18.0.0"
  }
}

The above worked perfectly for me ONCE. The response that I got was:

Dataflow template response:  { id: 'some_number',
  projectId: 'my-project-id',
  name: 'cloud-fn',
  type: 'JOB_TYPE_BATCH',
  environment:
   { userAgent:
      { name: 'Google Cloud Dataflow SDK for Java',
        support: [Object],
        'build.date': '2017-05-23 19:46',
        version: '2.0.0' },
     version: { major: '6', job_type: 'JAVA_BATCH_AUTOSCALING' } },
  currentState: 'JOB_STATE_DONE', ........

Then it gave an error every time after that, saying:

problem running dataflow template, error was: Error: Missing required parameters: jobId
    at createAPIRequest (/user_code/node_modules/googleapis/lib/apirequest.js:110:14)
    at Object.get (/user_code/node_modules/googleapis/apis/dataflow/v1b3.js:670:16)
    at /user_code/index.js:22:29
    at callback (/user_code/node_modules/googleapis/node_modules/google-auth-library/lib/auth/googleauth.js:42:14)
    at /user_code/node_modules/googleapis/node_modules/google-auth-library/lib/auth/googleauth.js:289:13
    at _combinedTickCallback (internal/process/next_tick.js:73:7)
    at process._tickDomainCallback (internal/process/next_tick.js:128:9)
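
Looking at the error, my guess is that jobId needs to be passed as a top-level parameter to projects.jobs.get rather than nested inside resource (which seems to be meant for request bodies). If so, the call above would become something like the following untested sketch, with the same placeholder project and job IDs:

dataflow.projects.jobs.get({
  projectId: 'my-project-id',
  jobId: 'some_number'  // passed directly, not wrapped in `resource`
}, function (err, response) {
  if (err) {
    console.error("problem getting dataflow job, error was: ", err);
  }
  // The returned job object includes currentState, e.g. JOB_STATE_DONE.
  console.log("Dataflow job state: ", response && response.currentState);
  callback();
});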

Does anyone know anything about this?

Thanks

1 Solution

#1

You can use the Dataflow CLI to determine whether a job failed or succeeded. It lets you list jobs and check their status (failed, succeeded, running, or cancelled).
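
For example, to list recent jobs together with their state (the exact output columns may vary by gcloud version):

gcloud beta dataflow jobs list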

Specifically, to check the state of a single job, you may run:

gcloud beta dataflow jobs describe <JOB_ID>
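
The currentState field in the output (for example JOB_STATE_DONE or JOB_STATE_FAILED, matching what the API returned in your log above) tells you how the job ended. If you only want that single field, gcloud's generic --format flag should do it, e.g.:

gcloud beta dataflow jobs describe <JOB_ID> --format='value(currentState)'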

For more info check the docs:

https://cloud.google.com/dataflow/pipelines/dataflow-command-line-intf
