Problem using the Join class in Apache Beam

Time: 2021-08-22 15:29:35

I was writing code to do a LeftOuterJoin using the class that Apache Beam provides to make this easy: org.apache.beam.sdk.extensions.joinlibrary.Join. The whole code works properly when I use a POJO class or String, Integer, or Long as the KV value, but it fails and throws an exception when I use TableRow in the KV. I have also shared the code below the exception for reference.

Apr 12, 2018 6:26:03 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: ParDo(Anonymous), Create.Values
Exception in thread "main" java.lang.IllegalArgumentException: unable to serialize DoFnAndMainOutput{doFn=org.apache.beam.sdk.extensions.joinlibrary.Join$2@1817f1eb, mainOutputTag=Tag<output>}
    at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:57)
    at org.apache.beam.runners.direct.repackaged.runners.core.construction.ParDoTranslation.translateDoFn(ParDoTranslation.java:440)
    at org.apache.beam.runners.direct.repackaged.runners.core.construction.ParDoTranslation$1.translateDoFn(ParDoTranslation.java:148)
    at org.apache.beam.runners.direct.repackaged.runners.core.construction.ParDoTranslation.payloadForParDoLike(ParDoTranslation.java:656)
    at org.apache.beam.runners.direct.repackaged.runners.core.construction.ParDoTranslation.translateParDo(ParDoTranslation.java:144)
    at org.apache.beam.runners.direct.repackaged.runners.core.construction.ParDoTranslation$ParDoPayloadTranslator.translate(ParDoTranslation.java:108)
    at org.apache.beam.runners.direct.repackaged.runners.core.construction.PTransformTranslation.toProto(PTransformTranslation.java:193)
    at org.apache.beam.runners.direct.repackaged.runners.core.construction.ParDoTranslation.getParDoPayload(ParDoTranslation.java:515)
    at org.apache.beam.runners.direct.repackaged.runners.core.construction.ParDoTranslation.isSplittable(ParDoTranslation.java:525)
    at org.apache.beam.runners.direct.repackaged.runners.core.construction.PTransformMatchers$4.matches(PTransformMatchers.java:194)
    at org.apache.beam.sdk.Pipeline$2.visitPrimitiveTransform(Pipeline.java:278)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:670)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:662)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:662)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$600(TransformHierarchy.java:311)
    at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:245)
    at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:458)
    at org.apache.beam.sdk.Pipeline.replace(Pipeline.java:256)
    at org.apache.beam.sdk.Pipeline.replaceAll(Pipeline.java:209)
    at org.apache.beam.runners.direct.DirectRunner.run(DirectRunner.java:173)
    at org.apache.beam.runners.direct.DirectRunner.run(DirectRunner.java:62)
    at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
    at org.apache.beam.sdk.Pipeline.run(Pipeline.java:297)
    at com.bitwise.StarterPipeline.main(StarterPipeline.java:93)
Caused by: java.io.NotSerializableException: com.google.api.services.bigquery.model.TableRow
    at java.io.ObjectOutputStream.writeObject0(Unknown Source)
    at java.io.ObjectOutputStream.defaultWriteFields(Unknown Source)
    at java.io.ObjectOutputStream.writeSerialData(Unknown Source)
    at java.io.ObjectOutputStream.writeOrdinaryObject(Unknown Source)
    at java.io.ObjectOutputStream.writeObject0(Unknown Source)
    at java.io.ObjectOutputStream.defaultWriteFields(Unknown Source)
    at java.io.ObjectOutputStream.writeSerialData(Unknown Source)
    at java.io.ObjectOutputStream.writeOrdinaryObject(Unknown Source)
    at java.io.ObjectOutputStream.writeObject0(Unknown Source)
    at java.io.ObjectOutputStream.writeObject(Unknown Source)
    at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:53)
    ... 23 more

Code

import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.runners.direct.DirectRunner;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.extensions.joinlibrary.Join;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.google.api.services.bigquery.model.TableRow;

public class StarterPipeline {
  private static final Logger LOG = LoggerFactory.getLogger(StarterPipeline.class);

  static transient TableRow t = new TableRow();

  public static void main(String[] args) {
    DataflowPipelineOptions options = PipelineOptionsFactory.as(DataflowPipelineOptions.class);
    options.setRunner(DirectRunner.class);
    options.setProject("Project Name");
    options.setTempLocation("Location");
    options.setStagingLocation("Location");
    Pipeline p = Pipeline.create(options);

    PCollection<KV<String, String>> leftPcollection = p
        .apply(Create.of("Kishan"))
        .apply(ParDo.of(new DoFn<String, KV<String, String>>() {
          @ProcessElement
          public void processElement(ProcessContext c) {
            c.output(KV.of("Kishan", "Kumar"));
            c.output(KV.of("Kishan1", "Test"));
          }
        }));

    PCollection<KV<String, TableRow>> rightPcollection = p
        .apply(Create.of("Kishan"))
        .apply(ParDo.of(new DoFn<String, KV<String, TableRow>>() {
          @ProcessElement
          public void processElement(ProcessContext c) {
            c.output(KV.of("Kishan", new TableRow().set("Key", "Value")));
          }
        }));

    PCollection<TableRow> joinedPcollection =
        Join.leftOuterJoin(leftPcollection, rightPcollection, t)
            .apply("Tesdt", ParDo.of(new DoFn<KV<String, KV<String, TableRow>>, TableRow>() {
              @ProcessElement
              public void processElement(ProcessContext c) {
                // Processing
              }
            }));

    p.run();
  }
}

1 Answer

#1



This is because your DoFn is serialized with Java serialization in order to distribute and run it, but TableRow cannot be serialized via Java serialization.

I don't see where in your code snippet an actual TableRow value ends up in the closure of a DoFn, but that is surely the cause.

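For reference, the stack trace points at org.apache.beam.sdk.extensions.joinlibrary.Join$2, the anonymous DoFn inside the join library. That DoFn captures the nullValue argument of leftOuterJoin, which in the code above is the static TableRow t, so that is most likely how a TableRow ends up in a DoFn closure. Below is a minimal sketch of one possible workaround (not from the original answer; the tag names and output fields are made up for illustration): do the left outer join manually with CoGroupByKey and build the default TableRow inside processElement, so no TableRow instance needs to be captured by the DoFn.

    // Assumed additional imports for this sketch.
    import org.apache.beam.sdk.transforms.join.CoGbkResult;
    import org.apache.beam.sdk.transforms.join.CoGroupByKey;
    import org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple;
    import org.apache.beam.sdk.values.TupleTag;

    // Tags identifying the two inputs of the CoGroupByKey.
    final TupleTag<String> leftTag = new TupleTag<String>() {};
    final TupleTag<TableRow> rightTag = new TupleTag<TableRow>() {};

    // Group both collections by key; replaces Join.leftOuterJoin(leftPcollection, rightPcollection, t).
    PCollection<KV<String, CoGbkResult>> grouped =
        KeyedPCollectionTuple.of(leftTag, leftPcollection)
            .and(rightTag, rightPcollection)
            .apply(CoGroupByKey.create());

    PCollection<TableRow> joinedPcollection = grouped.apply(
        "ManualLeftOuterJoin",
        ParDo.of(new DoFn<KV<String, CoGbkResult>, TableRow>() {
          @ProcessElement
          public void processElement(ProcessContext c) {
            String key = c.element().getKey();
            CoGbkResult result = c.element().getValue();
            for (String leftValue : result.getAll(leftTag)) {
              Iterable<TableRow> rightValues = result.getAll(rightTag);
              if (!rightValues.iterator().hasNext()) {
                // Left outer join: no matching right-side row, so build the
                // "null" TableRow here instead of capturing one in the closure.
                c.output(new TableRow().set("key", key).set("left", leftValue));
              } else {
                for (TableRow rightValue : rightValues) {
                  c.output(rightValue.clone().set("key", key).set("left", leftValue));
                }
              }
            }
          }
        }));

The underlying point is the same as in the answer: anything stored in (or captured by) a DoFn must be Java-serializable, while TableRow values that merely flow through a PCollection are handled by a coder rather than Java serialization.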