I am trying to import tables from PostgreSQL to HDFS using Sqoop. It works fine, but when I try to import a table where one of the fields is of type json, it shows the error:
ERROR orm.ClassWriter: Cannot resolve SQL type 1111
It seems that Sqoop does not support json as a data type for table fields. Is there a solution for this?
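For context, a minimal sketch of the kind of table that triggers this (the schema, table, and column names follow the answers below, and the psql call assumes default connection settings); the PostgreSQL JDBC driver reports json columns as java.sql.Types.OTHER, which is the constant 1111 in the error:

# Hypothetical table: the json column is what makes a plain "sqoop import" fail
psql -c "CREATE TABLE tracking.msg02 (id integer, wsresp json);"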
2 Answers
#1 (4 votes)
Try this:
sqoop import \
    --connect jdbc:postgresql://XXX.XX.XXX.XXX:5432/iAtlas \
    --table msg02 \
    --username aaaa.bbbbb --password ccccc \
    --schema tracking \
    --map-column-java wsresp=String \
    --map-column-hive wsresp=STRING
Add the --map-column-hive option as well: --map-column-java maps the SQL type to a Java type, and --map-column-hive then maps it to a Hive type.
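If the import is combined with a Hive load (e.g. --hive-import), you can sanity-check the mapping afterwards; this sketch assumes the Hive table takes the source table's name:

# The wsresp column should now be listed with type string
hive -e "DESCRIBE msg02;"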
#2 (1 vote)
I also faced a similar error and was able to resolve it by using the --map-column-java option in the sqoop import command. Map the json type to String at the time of import, like:
--map-column-java wsresp=String
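For reference, a minimal sketch of where the option fits in a full command; the connection details are placeholders, and the table and column names are taken from the answer above:

sqoop import \
    --connect jdbc:postgresql://<host>:5432/<database> \
    --username <user> --password <password> \
    --table msg02 \
    --map-column-java wsresp=String   # import the json column as plain text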