Continuously being updated
Hive Errors
Error 1:
org.apache.thrift.transport.TTransportException: java.net.SocketException: Connection reset by peer: socket write error
Fix:
The MySQL metastore connection had been idle too long and was dropped by the server; disconnect and reconnect.
Error 2:
SemanticException [Error 10001]: Line 2:5 Table not found 'student'
Fix:
Semantic exception: the table student cannot be found.
Specify which database student belongs to.
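A minimal sketch of the two ways to qualify the table (the database name mydb is a hypothetical placeholder):

```sql
-- Option 1: switch the current database first
USE mydb;  -- mydb is a placeholder; substitute your actual database
SELECT * FROM student;

-- Option 2: qualify the table name inline
SELECT * FROM mydb.student;
```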
Error 3:
Error while compiling statement: FAILED: SemanticException Line 0:-1 Invalid column reference 'between': (possible column names are: sno, cno, degree)
Fix:
A semantic exception at compile time: invalid column reference, followed by a list of the columns you probably meant.
It means the column I referenced does not exist in that table; replace it with one of the columns listed in parentheses.
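For example, assuming the query was filtering scores with BETWEEN, a valid form might look like this (the columns sno, cno, degree come from the error message; the table name sc is a hypothetical placeholder):

```sql
-- Only reference columns that actually exist in the table
SELECT sno, cno, degree
FROM sc
WHERE degree BETWEEN 60 AND 80;
```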
Error 4:
Error while compiling statement: FAILED: SemanticException [Error 10025]: Line 8:7 Expression not in GROUP BY key 'c_id'
Fix:
It means the column c_id appears in the SELECT list but not in the GROUP BY clause.
Add every non-aggregated column from the SELECT list to the GROUP BY clause.
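A sketch of the rule, with hypothetical table and column names: every SELECT column that is not wrapped in an aggregate function must be listed in GROUP BY.

```sql
-- c_id is selected directly, so it must appear in GROUP BY;
-- degree is inside an aggregate, so it does not
SELECT c_id, AVG(degree) AS avg_degree
FROM sc
GROUP BY c_id;
```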
Error 5:
Error while compiling statement: FAILED: ParseException line 7:11 missing EOF at '(' near 'first_value'
Fix:
"missing EOF" means the parser expected the statement to end but found extra tokens, usually because a comma, parenthesis, or keyword is missing; add the missing punctuation and it compiles.
Error 6:
SQL Error [40000] [42000]: Error while compiling statement: FAILED: NullPointerException null
Fix:
A null pointer exception: first_value() was called without an argument.
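A sketch of a correct first_value call: it takes a column argument and is used with an OVER clause (table and column names here are hypothetical placeholders):

```sql
-- first_value requires an expression argument plus a window specification
SELECT
  sid,
  first_value(chinese) OVER (PARTITION BY academy ORDER BY chinese DESC) AS top_chinese
FROM student;
```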
Error 7:
SQL Error [10035] [42000]: Error while compiling statement: FAILED: SemanticException [Error 10035]: Column repeated in partitioning columns
CREATE TABLE IF NOT EXISTS student(
  sid string,
  name string,
  gender string,
  age int,
  academy string,
  dt date,
  chinese int,
  math int,
  english int
)
PARTITIONED BY (academy string, dt date)
CLUSTERED BY (sid) SORTED BY (sid ASC) INTO 4 BUCKETS
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t';
Fix:
Partition columns must not duplicate columns already defined in the table schema.
Change it to:
CREATE TABLE IF NOT EXISTS student(
  sid string,
  name string,
  gender string,
  age int,
  chinese int,
  math int,
  english int
)
PARTITIONED BY (academy string, dt date)
CLUSTERED BY (sid) SORTED BY (sid ASC) INTO 4 BUCKETS
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t';
Error 8:
SQL Error [10096] [42000]: Error while compiling statement: FAILED: SemanticException [Error 10096]: Dynamic partition strict mode requires at least one static partition column. To turn this off set hive.exec.dynamic.partition.mode=nonstrict
Fix:
Either keep at least one static partition column, or run the following before the insert:
set hive.exec.dynamic.partition.mode=nonstrict;
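A hypothetical sketch of a fully dynamic-partition insert after switching to nonstrict mode (student_staging is a placeholder source table):

```sql
set hive.exec.dynamic.partition=true;
set hive.exec.dynamic.partition.mode=nonstrict;

-- The partition columns (academy, dt) must come last in the SELECT list,
-- in the same order as declared in PARTITION (...)
INSERT OVERWRITE TABLE student PARTITION (academy, dt)
SELECT sid, name, gender, age, chinese, math, english, academy, dt
FROM student_staging;
```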
Error 9:
SQL Error [2] [08S01]: Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
Fix:
When loading data into a partitioned table, list the columns explicitly instead of using SELECT *; otherwise the job fails (in my case with an out-of-memory error).
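A sketch of an insert with an explicit column list into a static partition, instead of SELECT * (table name and partition values are hypothetical placeholders):

```sql
-- Select only the non-partition columns, in the table's declared order;
-- the partition values are supplied in the PARTITION clause
INSERT INTO TABLE student PARTITION (academy = 'cs', dt = '2023-01-01')
SELECT sid, name, gender, age, chinese, math, english
FROM student_staging
WHERE academy = 'cs' AND dt = '2023-01-01';
```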