I have a Google Cloud Dataflow job to be executed using the Apache Beam API (0.4.0). The pipeline runs successfully using a local runner. When I submit the job using the Dataflow runner, the job is submitted, but it fails after 32 seconds without displaying a reason anywhere. The logs appear to be empty. The gcloud CLI isn't any help either:
$ gcloud beta dataflow logs list 2017-01-23_12_51_23-5463584243087329795
E 2017-01-23T21:51:53 2017-01-23_12_51_23-5463584243087329795_00000159cd197209 (cdfde4683948d134): Workflow failed.
How can I track down the cause of the error?
1 Answer
#1
This is because the Dataflow API has not been enabled for your project. Do step 3 here: https://cloud.google.com/dataflow/docs/quickstarts/quickstart-java-maven
Dataflow will have a better error message for this soon.
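As an alternative to step 3 of the quickstart (which enables the API through the Cloud Console), the API can also be enabled from the command line. A minimal sketch, assuming a recent gcloud version that supports `gcloud services` and using `my-project` as a placeholder for your project ID:

```shell
# Enable the Dataflow API for the project (replace my-project with your project ID)
gcloud services enable dataflow.googleapis.com --project=my-project

# Verify that the API now appears in the list of enabled services
gcloud services list --enabled --project=my-project
```

After the API is enabled, resubmit the Dataflow job; the workflow should get past the point where it previously failed with no logs.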