I am currently performing the migration on a single machine that runs sequentially, reading the Entities from the namespaces one at a time, which is painful. Is it possible to use Google Cloud Dataflow to make the migration easier?
1 Solution
#1
You should be able to use DatastoreIO to manipulate the records and process them in parallel.
PCollection<Entity> entities = p.apply(
    Read.from(DatastoreIO.read()
        .withDataset(datasetId)
        .withQuery(query)
        .withHost(host)));

// Chain the transform and the write off the read's output, not off
// the pipeline itself, so the entities flow through each stage.
entities
    .apply(<Your transform>)
    .apply(DatastoreIO.writeTo(datasetId));

p.run();
As of Dataflow SDK for Java 1.2.0, support for querying and writing Datastore Entities in namespaces has been added.
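As a minimal sketch of what that could look like, the snippet below reads from one namespace on the query side and sets the namespace on the key when building output Entities (in Datastore, the namespace lives in each key's PartitionId, so writes land in whatever namespace the key carries). The withNamespace setter, the DatastoreHelper key-building calls, and the "my-namespace" value are assumptions for illustration, not verified against the 1.2.0 API:
// Read Entities from a specific namespace (assumes a withNamespace
// setter on the DatastoreIO source; "my-namespace" is a placeholder).
PCollection<Entity> entities = p.apply(
    Read.from(DatastoreIO.read()
        .withDataset(datasetId)
        .withNamespace("my-namespace")
        .withQuery(query)
        .withHost(host)));

// Writes honor the namespace stored in each Entity's key, so set the
// target namespace in the key's PartitionId when constructing output.
Key.Builder key = DatastoreHelper.makeKey("MyKind", "my-id");
key.getPartitionIdBuilder().setNamespace("my-namespace");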