Is there a way to read CSV files from a local computer and write them to BigQuery or Cloud Storage using Google Dataflow? If so, which runner should be used?
All the Google Dataflow examples read from the cloud and write to either Cloud Storage or BigQuery.
I currently use DirectPipelineRunner for reading from and writing to the local computer.
1 Answer
#1
2
The DirectPipelineRunner should work for this; you can use TextIO to read from your local input files and use BigQueryIO to write to BigQuery.
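A minimal sketch of such a pipeline, assuming the Google Cloud Dataflow Java SDK 1.x (the project/dataset/table IDs, the local file path, and the two-column CSV layout below are all illustrative placeholders):

```java
import com.google.api.services.bigquery.model.TableFieldSchema;
import com.google.api.services.bigquery.model.TableRow;
import com.google.api.services.bigquery.model.TableSchema;
import com.google.cloud.dataflow.sdk.Pipeline;
import com.google.cloud.dataflow.sdk.io.BigQueryIO;
import com.google.cloud.dataflow.sdk.io.TextIO;
import com.google.cloud.dataflow.sdk.options.PipelineOptions;
import com.google.cloud.dataflow.sdk.options.PipelineOptionsFactory;
import com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner;
import com.google.cloud.dataflow.sdk.transforms.DoFn;
import com.google.cloud.dataflow.sdk.transforms.ParDo;
import java.util.Arrays;

public class LocalCsvToBigQuery {
  public static void main(String[] args) {
    // DirectPipelineRunner executes locally, so TextIO can read a local path.
    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
    options.setRunner(DirectPipelineRunner.class);
    Pipeline p = Pipeline.create(options);

    // Schema for a hypothetical two-column CSV: name,value
    TableSchema schema = new TableSchema().setFields(Arrays.asList(
        new TableFieldSchema().setName("name").setType("STRING"),
        new TableFieldSchema().setName("value").setType("INTEGER")));

    p.apply(TextIO.Read.from("/path/to/local/file.csv"))
     .apply(ParDo.of(new DoFn<String, TableRow>() {
        @Override
        public void processElement(ProcessContext c) {
          // Naive CSV split; use a real CSV parser for quoted fields.
          String[] parts = c.element().split(",");
          c.output(new TableRow()
              .set("name", parts[0])
              .set("value", Integer.parseInt(parts[1].trim())));
        }
     }))
     .apply(BigQueryIO.Write
         .to("my-project:my_dataset.my_table")
         .withSchema(schema)
         .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED));

    p.run();
  }
}
```

Note that even though the pipeline runs locally, the BigQueryIO sink still needs valid Google Cloud credentials to write to the target table.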