import
--connect
jdbc:mysql://cdh5.hadoop.com:3306/test
--username
root
--password
123456
--table
epc_partgroup
--num-mappers
1
--fields-terminated-by
"\t"
--delete-target-dir
--hive-database
hive_gary
--hive-import
--hive-table
epc_partgroup
=================================================================
>>> Invoking Sqoop command line now >>>
4924 [main] WARN org.apache.sqoop.tool.SqoopTool - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
4971 [main] INFO org.apache.sqoop.Sqoop - Running Sqoop version: 1.4.5-cdh5.3.6
5000 [main] WARN org.apache.sqoop.tool.BaseSqoopTool - Setting your password on the command-line is insecure. Consider using -P instead.
5025 [main] WARN org.apache.sqoop.ConnFactory - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
5172 [main] INFO org.apache.sqoop.manager.MySQLManager - Preparing to use a MySQL streaming resultset.
5180 [main] INFO org.apache.sqoop.tool.CodeGenTool - Beginning code generation
5864 [main] INFO org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM `epc_partgroup` AS t LIMIT 1
5958 [main] INFO org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM `epc_partgroup` AS t LIMIT 1
5966 [main] INFO org.apache.sqoop.orm.CompilationManager - HADOOP_MAPRED_HOME is /opt/soft/cdh5/hadoop-2.5.0-cdh5.3.6
13088 [main] INFO org.apache.sqoop.orm.CompilationManager - Writing jar file: /tmp/sqoop-gary/compile/7eb5d287f10f01ef33f03384f9dcf6b5/epc_partgroup.jar
13167 [main] INFO org.apache.sqoop.tool.ImportTool - Destination directory epc_partgroup deleted.
13167 [main] WARN org.apache.sqoop.manager.MySQLManager - It looks like you are importing from mysql.
13168 [main] WARN org.apache.sqoop.manager.MySQLManager - This transfer can be faster! Use the --direct
13168 [main] WARN org.apache.sqoop.manager.MySQLManager - option to exercise a MySQL-specific fast path.
13168 [main] INFO org.apache.sqoop.manager.MySQLManager - Setting zero DATETIME behavior to convertToNull (mysql)
13172 [main] INFO org.apache.sqoop.mapreduce.ImportJobBase - Beginning import of epc_partgroup
13225 [main] WARN org.apache.sqoop.mapreduce.JobBase - SQOOP_HOME is unset. May not be able to find all job dependencies.
14223 [main] INFO org.apache.sqoop.mapreduce.db.DBInputFormat - Using read commited transaction isolation
Heart beat
48069 [main] INFO org.apache.sqoop.mapreduce.ImportJobBase - Transferred 2.6914 KB in 34.8306 seconds (79.1259 bytes/sec)
48113 [main] INFO org.apache.sqoop.mapreduce.ImportJobBase - Retrieved 102 records.
48220 [main] INFO org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM `epc_partgroup` AS t LIMIT 1
48323 [main] INFO org.apache.sqoop.hive.HiveImport - Loading uploaded data into Hive
Intercepting System.exit(1)
<<< Invocation of Main class completed <<<
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]
Oozie Launcher failed, finishing Hadoop job gracefully
Oozie Launcher, uploading action data to HDFS sequence file: hdfs://cdh5.hadoop.com:8020/user/gary/oozie-gary/0000007-151227120318126-oozie-gary-W/sqoop-node--sqoop/action-data.seq
Oozie Launcher ends
I can see that the two jobs running on YARN completed successfully, but this error appeared at the final step of loading the data into Hive.
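A frequent cause of this failure pattern (the MapReduce import succeeds, then `HiveImport - Loading uploaded data into Hive` is the last log line before exit code 1) is that the Oozie launcher cannot find the Hive client configuration or Hive jars. The fragment below is a hedged sketch, not a confirmed fix for this cluster: it ships `hive-site.xml` alongside the Sqoop action (the HDFS location of `hive-site.xml` is an assumption; the action name `sqoop-node` is taken from the log above).
[code]
<!-- Hypothetical workflow.xml fragment: make hive-site.xml visible to the
     Sqoop action's launcher. The HDFS path below is an assumption. -->
<action name="sqoop-node">
    <sqoop xmlns="uri:oozie:sqoop-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <command>import --connect jdbc:mysql://cdh5.hadoop.com:3306/test ...</command>
        <!-- localize hive-site.xml into the launcher's working directory -->
        <file>/user/gary/oozie-gary/hive-site.xml#hive-site.xml</file>
    </sqoop>
    <ok to="end"/>
    <error to="fail"/>
</action>
[/code]
In `job.properties`, setting `oozie.use.system.libpath=true` (and, on Oozie 4.x, `oozie.action.sharelib.for.sqoop=sqoop,hive`) lets the launcher pick up both the Sqoop and Hive jars; whether your sharelib actually contains the Hive jars can be checked with `oozie admin -shareliblist`.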
9 replies
#1
Without knowing your specific environment, it's hard to pinpoint the exact cause.
#2
Calling Sqoop from Oozie to import MySQL data into Hive:
[code]
import
--connect
jdbc:mysql://cdh5.hadoop.com:3306/test
--username
root
--password
123456
--table
epc_partgroup
--num-mappers
1
--fields-terminated-by
"\t"
--delete-target-dir
--hive-database
hive_gary
--hive-import
--hive-table
epc_partgroup
[/code]
#3
Hi, I'm running into the same problem. How did you end up solving it?
#4
#5
$SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
#6
Same error here:
84723 [uber-SubtaskRunner] INFO org.apache.sqoop.hive.HiveImport - Loading uploaded data into Hive
Intercepting System.exit(1)
<<< Invocation of Main class completed <<<
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]
Oozie Launcher failed, finishing Hadoop job gracefully
#7
I'm hitting this error too. Has anyone solved it? Any pointers would be much appreciated, thanks a lot.
#8
Has this been solved?
#9
Note the warning: `$SQOOP_CONF_DIR has not been set in the environment.` That variable was never set.
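Before digging further into Oozie, it may help to confirm the same import works outside it. A minimal debugging sketch, run on a gateway node with the Sqoop and Hive clients installed (the `SQOOP_CONF_DIR` path is an assumption; all Sqoop arguments are taken from the options above):
[code]
# Assumed location of the Sqoop client configuration -- adjust to your install.
export SQOOP_CONF_DIR=/etc/sqoop/conf

# Same import as in the thread; -P prompts for the password instead of
# putting it on the command line, as the log's own warning recommends.
sqoop import \
  --connect jdbc:mysql://cdh5.hadoop.com:3306/test \
  --username root -P \
  --table epc_partgroup \
  --num-mappers 1 \
  --fields-terminated-by "\t" \
  --delete-target-dir \
  --hive-import \
  --hive-database hive_gary \
  --hive-table epc_partgroup
[/code]
If this succeeds but the Oozie run still fails, the problem is likely the launcher's environment (missing Hive configuration or sharelib jars) rather than the Sqoop arguments themselves.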