Is there a way to specify a minimum number of workers for autoscaling Cloud Dataflow?

Time: 2021-07-19 15:35:22

I'd like to specify a minimum number of workers for my job that autoscaling will not go below (akin to how it works for max_num_workers). Is this possible? My reason is that sometimes the worker startup takes long enough that the autoscaling decides to drop the number of workers to one, even though doing so is not optimal for my job. I'd still like to use autoscaling in case the job is larger than my estimated minimum.


2 Solutions

#1


1  

A minimum number of workers is not yet supported. Could you file a ticket with job details so that support can take a look and understand why it downscales to too few workers?


#2


0  

According to the Autoscaling documentation, you can specify the maximum number of workers with the --maxNumWorkers option and the initial number of workers with --numWorkers. You can find a description of these options in this document.
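
For reference, here is a minimal sketch of how those options can be passed with the Apache Beam Python SDK (the snake_case equivalents of the flags above, matching the max_num_workers naming in the question). The project, region, and bucket names are placeholders:

```python
# Minimal sketch: setting the initial and maximum worker counts for a
# Dataflow job with the Apache Beam Python SDK.
# Project, region, and bucket values below are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner='DataflowRunner',
    project='my-project',                 # placeholder
    region='us-central1',                 # placeholder
    temp_location='gs://my-bucket/temp',  # placeholder
    num_workers=5,        # initial number of workers (not a hard minimum)
    max_num_workers=20,   # autoscaling will not exceed this
)

with beam.Pipeline(options=options) as pipeline:
    (pipeline
     | 'Create' >> beam.Create([1, 2, 3])
     | 'Print' >> beam.Map(print))
```

Note that num_workers only sets the starting size; with autoscaling enabled the service may still scale below it, which is exactly the limitation described in solution #1.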

