Error when inserting into a Hive table:
java.lang.RuntimeException: java.lang.UnsupportedOperationException: Currently the writer can only accept BytesRefArrayWritable
	at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:270)
	at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:506)
	at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:447)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.lang.UnsupportedOperationException: Currently the writer can only accept BytesRefArrayWritable
	at org.apache.hadoop.hive.ql.io.RCFile$Writer.append(RCFile.java:880)
	at org.apache.hadoop.hive.ql.io.RCFileOutputFormat$2.write(RCFileOutputFormat.java:140)
	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:606)
The cause appears to be the storage format in the table definition: http://comments.gmane.org/gmane.comp.java.hadoop.hive.user/2849
The table is declared with the RCFile input/output formats, but ROW FORMAT DELIMITED makes Hive use LazySimpleSerDe, which serializes rows as Text; the RCFile writer only accepts BytesRefArrayWritable (produced by ColumnarSerDe), hence the exception.
The original CREATE TABLE statement:
CREATE TABLE client_user_type_installtime(
  userkey string,
  mos string,
  type string)
PARTITIONED BY (
  dt string,
  installtime_type string)
ROW FORMAT DELIMITED
  FIELDS TERMINATED BY '9'
  LINES TERMINATED BY '10'
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.RCFileInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.RCFileOutputFormat'
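The mismatch can be confirmed by inspecting the table's SerDe and input/output formats; a quick check of my own (not from the original post), using standard Hive commands:

DESCRIBE EXTENDED client_user_type_installtime;
-- On later Hive versions the output is easier to read with:
-- DESCRIBE FORMATTED client_user_type_installtime;
-- For this table the SerDe is LazySimpleSerDe while the output format is
-- RCFileOutputFormat, which is exactly the mismatch behind the error.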
Modified CREATE TABLE statement:
CREATE TABLE client_user_type_installtime(
  userkey string,
  mos string,
  type string)
PARTITIONED BY (
  dt string,
  installtime_type string)
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe'
STORED AS INPUTFORMAT 'org.apache.hadoop.mapred.TextInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
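As far as I know, STORED AS TEXTFILE expands to exactly this SerDe/InputFormat/OutputFormat combination, so the same table can be declared more briefly (a shorthand sketch, not verified against the original environment):

CREATE TABLE client_user_type_installtime(
  userkey string,
  mos string,
  type string)
PARTITIONED BY (
  dt string,
  installtime_type string)
STORED AS TEXTFILE;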
After recreating the table, the insert succeeded.
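If RCFile storage is actually wanted, the other way to resolve the mismatch is to keep the RCFile formats and pair them with the columnar SerDe instead of a delimited row format. A sketch of that variant (mine, not from the original post); depending on the Hive version STORED AS RCFILE may set ColumnarSerDe by itself, so it is spelled out explicitly here:

CREATE TABLE client_user_type_installtime(
  userkey string,
  mos string,
  type string)
PARTITIONED BY (
  dt string,
  installtime_type string)
-- ColumnarSerDe serializes each row as BytesRefArrayWritable,
-- which is what RCFile$Writer.append expects.
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe'
STORED AS RCFILE;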