BigQuery can read from Google Drive as a federated source. See here. I want to be able to read a table in BigQuery into my Dataflow pipeline that is pointing to a Drive document.
Hooking up BigQuery to the file in Drive works perfectly fine:
But, when I then try to read that table into my Dataflow pipeline I (understandably) get the following error:
No suitable credentials found to access Google Drive. Contact the table owner for assistance.
[..]
PCollection<TableRow> results = pipeline.apply("whatever",
        BigQueryIO.Read.fromQuery("SELECT * FROM [CPT_7414_PLAYGROUND.google_drive_test]"))
    .apply(ParDo.of(new DoFn<TableRow, TableRow>() {
[..]
How do I grant Dataflow permission to read from a BigQuery table that points to Drive?
1 Answer
#1
Dataflow does not currently support reading from a federated table backed by Drive, but this is coming soon.