Hi,
We have been trying to import data from Oracle into HDFS using Sqoop, doing an incremental import keyed on a timestamp column. Sqoop throws the following Java exception:
Error: java.io.IOException: SQLException in nextKeyValue
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:553)
at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.sql.SQLDataException: ORA-01841: (full) year must be between -4713 and +9999, and not be 0
Here is the Sqoop command used for the incremental import:
sqoop import --connect 'jdbc:oracle:thin:@(description=(address=(protocol=tcp)(host=xxxxxxxx)(port=1521))(connect_data=(SID=YYYY)))' --username AAAA --password ZZZZ --table TABLE123 --target-dir /user/hive/incremental_table1 -m 1 --check-column xyz_TS --incremental lastmodified --last-value {last_import_date}
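For context, here is a sketch of the timestamp shape we believe Sqoop expects for --last-value in lastmodified mode (the same yyyy-MM-dd HH:mm:ss.f form Sqoop itself prints as the next --last-value after a run). The helper name is ours, not part of Sqoop; our suspicion is that ORA-01841 appears when the substituted value does not match a format Oracle can parse:

```python
from datetime import datetime

# Hypothetical helper: render a datetime the way Sqoop reports
# --last-value after an incremental import (yyyy-MM-dd HH:mm:ss.f,
# i.e. a JDBC-style timestamp with one fractional digit).
def sqoop_last_value(dt: datetime) -> str:
    # strftime("%f") yields six fractional digits; keep only the first.
    return dt.strftime("%Y-%m-%d %H:%M:%S.%f")[:-5]

print(sqoop_last_value(datetime(2015, 6, 1, 12, 30, 0)))
# -> 2015-06-01 12:30:00.0
```

If the {last_import_date} placeholder is substituted with a value in some other format (or not substituted at all), Oracle would reject the generated WHERE clause, which would match the ORA-01841 we are seeing.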
Any help would be highly appreciated.