How do I restart a cancelled Cloud Dataflow streaming job?

Date: 2022-09-06 15:23:02

I've created a standard PubSub to BigQuery dataflow. However, in order to ensure I wasn't going to run up a huge bill while offline, I cancelled the dataflow. From the GCP console, there doesn't seem to be an option to restart it - is this possible, either through the console, or through the shell (and if so, how)?


1 Answer

#1



Cloud Dataflow currently does not provide a mechanism to restart a Dataflow job that has been stopped or cancelled.


However, for this Pub/Sub -> BigQuery flow, one way to approach this would be to use the Google-provided Pub/Sub to BigQuery template; these templates provide code-free solutions for common data movement patterns using Cloud Dataflow.


You can execute a streaming Dataflow job using this template, via the REST API, using a unique job name to ensure that there is only one instance of this Dataflow job running at any point in time. If the job were cancelled, you could (re)start this streaming Dataflow job by running the same command again.

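For example, a launch via the gcloud CLI might look like the following sketch. The job name, region, Pub/Sub topic, and BigQuery table are placeholders; check the current template path and parameters against the Dataflow templates documentation before relying on them:

```shell
# Launch the Google-provided Pub/Sub to BigQuery streaming template.
# MY_PROJECT, MY_TOPIC, MY_DATASET, and MY_TABLE are placeholders.
gcloud dataflow jobs run pubsub-to-bq \
  --gcs-location gs://dataflow-templates/latest/PubSub_to_BigQuery \
  --region us-central1 \
  --parameters \
inputTopic=projects/MY_PROJECT/topics/MY_TOPIC,\
outputTableSpec=MY_PROJECT:MY_DATASET.MY_TABLE
```

Because active streaming job names must be unique, re-running this command after the job has been cancelled effectively "restarts" it, while an attempt to launch it a second time while the first instance is still running will fail.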
