Dataprep job fails when appending data to BigQuery

时间:2021-11-08 15:05:21

It works when exporting to a new table in BigQuery, but fails when appending to an existing BigQuery table. Both tables have exactly the same schema. Here is the error output — does anyone understand what this error means? Thanks.


java.lang.RuntimeException:
org.apache.beam.sdk.util.UserCodeException:
java.lang.ClassCastException:
java.lang.String cannot be cast to com.trifacta.dataflow.types.TimeLike
at com.google.cloud.dataflow.worker.GroupAlsoByWindowsParDoFn$1.output (GroupAlsoByWindowsParDoFn.java:182)
at com.google.cloud.dataflow.worker.GroupAlsoByWindowFnRunner$1.outputWindowedValue (GroupAlsoByWindowFnRunner.java:104)
at com.google.cloud.dataflow.worker.util.BatchGroupAlsoByWindowViaIteratorsFn.processElement (BatchGroupAlsoByWindowViaIteratorsFn.java:121)
at com.google.cloud.dataflow.worker.util.BatchGroupAlsoByWindowViaIteratorsFn.processElement (BatchGroupAlsoByWindowViaIteratorsFn.java:53)
at com.google.cloud.dataflow.worker.GroupAlsoByWindowFnRunner.invokeProcessElement (GroupAlsoByWindowFnRunner.java:117)
at com.google.cloud.dataflow.worker.GroupAlsoByWindowFnRunner.processElement (GroupAlsoByWindowFnRunner.java:74)
at com.google.cloud.dataflow.worker.GroupAlsoByWindowsParDoFn.processElement (GroupAlsoByWindowsParDoFn.java:113)
at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process (ParDoOperation.java:48)
at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process (OutputReceiver.java:52)
at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.runReadLoop (ReadOperation.java:187)
at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.start (ReadOperation.java:148)
at com.google.cloud.dataflow.worker.util.common.worker.MapTaskExecutor.execute (MapTaskExecutor.java:68)
at com.google.cloud.dataflow.worker.DataflowWorker.executeWork (DataflowWorker.java:330)
at com.google.cloud.dataflow.worker.DataflowWorker.doWork (DataflowWorker.java:302)
at com.google.cloud.dataflow.worker.DataflowWorker.getAndPerformWork (DataflowWorker.java:251)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork (DataflowBatchWorkerHarness.java:135)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call (DataflowBatchWorkerHarness.java:115)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call (DataflowBatchWorkerHarness.java:102)
at java.util.concurrent.FutureTask.run (FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:617)
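The `ClassCastException` in the trace means a value that arrived in the pipeline as a `java.lang.String` was force-cast to Trifacta's internal `com.trifacta.dataflow.types.TimeLike` type — typically a symptom of a column being inferred as a date/time type in one context but as plain text in another. A minimal Java sketch of the same failure mode (using `java.time.LocalDate` as a stand-in for the proprietary `TimeLike` class, which is an assumption here):

```java
public class CastDemo {
    public static void main(String[] args) {
        // The value arrives as a plain String (e.g. read from a text column)...
        Object value = "2021-11-08";
        try {
            // ...but downstream code expects a time-typed object and casts blindly.
            // The downcast compiles, but throws ClassCastException at runtime.
            java.time.LocalDate d = (java.time.LocalDate) value;
            System.out.println("cast succeeded: " + d);
        } catch (ClassCastException e) {
            System.out.println("ClassCastException: " + e.getMessage());
        }
    }
}
```

This is only an illustration of the exception's mechanics, not Dataprep's actual code path; the real fix is usually to make the column's inferred type consistent between the source data and the destination table before appending.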

1 solution

#1

Actually, never mind. I ran the same flow again today and it works...

