Can the 25-job limit on Cloud Dataflow be increased?

Time: 2020-12-10 15:36:10

I am currently evaluating Google Cloud Dataflow. According to the documentation, you may run up to 25 concurrent Dataflow jobs per Cloud Platform project.

This seems very low. I could see a lot of value in running hundreds of small to medium-sized jobs, to support parallel deployments and testing and to modularize pipelines (many small jobs seem like they could be easier to update and upgrade, even though Dataflow has some support for in-place pipeline upgrades).

Can this limit be increased in some way, like many quotas can be on GCP? What are common practices to work around this limitation?

1 solution

#1



Yes, you can ask to increase the limit for your project. Can you please write to dataflow-feedback@google.com and give your project ID?

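Until a quota increase is granted, one common way to run many small pipelines without hitting the 25-concurrent-job limit is to gate job submission on the number of currently active jobs in the project. Below is a minimal sketch (an assumption-laden illustration, not an official recommendation) that uses the Dataflow v1b3 REST API through the Google API Python client; PROJECT_ID and launch_next_pipeline() are hypothetical placeholders, and Application Default Credentials are assumed to be configured.

```python
# Minimal sketch: wait until the project has a free slot under its
# concurrent Dataflow job quota before submitting the next pipeline.
import time

from googleapiclient.discovery import build  # pip install google-api-python-client

PROJECT_ID = "my-project-id"   # hypothetical project ID
CONCURRENT_JOB_QUOTA = 25      # default per-project limit from the docs


def count_active_jobs(dataflow, project_id):
    """Count Dataflow jobs currently in an active state for the project."""
    request = dataflow.projects().jobs().list(projectId=project_id, filter="ACTIVE")
    total = 0
    while request is not None:
        response = request.execute()
        total += len(response.get("jobs", []))
        request = dataflow.projects().jobs().list_next(request, response)
    return total


def wait_for_free_slot(poll_seconds=60):
    """Poll until the active job count drops below the quota."""
    dataflow = build("dataflow", "v1b3")
    while count_active_jobs(dataflow, PROJECT_ID) >= CONCURRENT_JOB_QUOTA:
        time.sleep(poll_seconds)


if __name__ == "__main__":
    wait_for_free_slot()
    # launch_next_pipeline()  # hypothetical: submit the next small job here
```

This keeps submissions queued on the client side rather than failing when the project is already at its concurrent-job limit.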
