Unable to find DEFAULT_INSTANCE when querying Datastore from Dataflow

Date: 2021-07-31 15:35:41

I am basically just following the word count example to pull data from Datastore in Dataflow, like so:

    // Build a Datastore query for the desired entity kind
    DatastoreV1.Query.Builder q = DatastoreV1.Query.newBuilder();
    q.addKindBuilder().setName([entity kind]);
    DatastoreV1.Query query = q.build();

    // Configure the source with the dataset, query, and namespace
    DatastoreIO.Source source = DatastoreIO.source()
            .withDataset([...])
            .withQuery(query)
            .withNamespace([...]);

    // Read the matching entities into a PCollection
    PCollection<DatastoreV1.Entity> collection = pipeline.apply(Read.from(source));

But it keeps failing on:


java.lang.RuntimeException: Unable to find DEFAULT_INSTANCE in com.google.api.services.datastore.DatastoreV1$Query at com.google.protobuf.GeneratedMessageLite$SerializedForm.readResolve(GeneratedMessageLite.java:1065) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at ...


I couldn't find any solution on the internet that seems relevant so far.

Could somebody suggest a general direction on what might be going wrong?

1 Answer

#1



Protocol Buffers have certain restrictions. Among others, you have to link in the protobuf Java runtime that matches the version of the protoc compiler that the code was generated with, and you can (normally) have only one runtime present. This applies to all use cases of Protocol Buffers, and they aren't Dataflow specific.

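The stack trace points at this mismatch directly: on deserialization, protobuf's GeneratedMessageLite$SerializedForm.readResolve reflectively looks up a static DEFAULT_INSTANCE field on the message class, and a class generated by a different protoc generation does not declare a field by that name. A minimal sketch of that lookup (DefaultInstanceCheck and FakeMessage are illustrative names, not part of any library):

```java
import java.lang.reflect.Field;

public class DefaultInstanceCheck {

    // Stand-in for a class generated by a matching protoc: it declares
    // the static DEFAULT_INSTANCE field the runtime expects to find.
    static class FakeMessage {
        static final FakeMessage DEFAULT_INSTANCE = new FakeMessage();
    }

    // Roughly what SerializedForm.readResolve does: reflectively look up
    // DEFAULT_INSTANCE on the message class; absence of the field is what
    // surfaces as the "Unable to find DEFAULT_INSTANCE" RuntimeException.
    static boolean hasDefaultInstance(Class<?> messageClass) {
        try {
            messageClass.getDeclaredField("DEFAULT_INSTANCE");
            return true;
        } catch (NoSuchFieldException e) {
            return false; // a class from a mismatched protoc lands here
        }
    }

    public static void main(String[] args) {
        System.out.println(hasDefaultInstance(FakeMessage.class)); // true
        System.out.println(hasDefaultInstance(String.class));      // false
    }
}
```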

Dataflow SDK for Java, version 1.4.0 and older, depends on protobuf version 2.5 and links in a Datastore client library generated with the corresponding protoc compiler. The easiest solution is not to override any protobuf-java and google-api-services-datastore-protobuf dependencies and let them be brought into your project by the Dataflow SDK.

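In Maven terms, that means declaring only the Dataflow SDK itself and letting its transitive, mutually compatible protobuf and Datastore artifacts come along. A sketch, assuming the usual SDK coordinates and the 1.4.0 version mentioned above:

```xml
<!-- Declare only the Dataflow SDK; do not pin protobuf-java or
     google-api-services-datastore-protobuf yourself, so the SDK's
     compatible transitive versions are the ones on the classpath. -->
<dependency>
  <groupId>com.google.cloud.dataflow</groupId>
  <artifactId>google-cloud-dataflow-java-sdk-all</artifactId>
  <version>1.4.0</version>
</dependency>
```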

If you really have to upgrade to protobuf version 3 for an unrelated reason, you should also upgrade google-api-services-datastore-protobuf to version v1beta2-rev1-4.0.0, because that one was generated with the corresponding protoc compiler. Please note that this is a workaround for Datastore only -- I would expect other dependencies that require protobuf version 2 to break, unless they are upgraded too.

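If you do take that route, the override might look like the following sketch. The group IDs shown are the customary ones for these artifacts, and the protobuf-java version is illustrative; verify both against your actual dependency tree (for example with mvn dependency:tree):

```xml
<!-- Only when the project must move to protobuf 3: pin the Datastore
     client generated with the matching protoc alongside the protobuf 3
     runtime. Other protobuf-2 dependents may break, as noted above. -->
<dependency>
  <groupId>com.google.protobuf</groupId>
  <artifactId>protobuf-java</artifactId>
  <version>3.0.0</version>
</dependency>
<dependency>
  <groupId>com.google.apis</groupId>
  <artifactId>google-api-services-datastore-protobuf</artifactId>
  <version>v1beta2-rev1-4.0.0</version>
</dependency>
```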

Now, we are actively working on upgrading the Dataflow SDK to protobuf version 3. I'd expect this functionality in the next minor release, possibly 1.5.0. Since any version of the Dataflow SDK can support only one protobuf at a time, support for version 2 will break at that time, unless a few dependencies are manually rolled back.

