
Sqoop Export from ORC Hive table to Netezza


Replies: 1

I have a very simple table with 3 string columns and 1 date column in Hive, created with the ORC format. I am trying to export it to Netezza, into a table with 3 char columns and 1 date column, using sqoop export. Below is the command I am using.

sqoop export --direct --connect jdbc:netezza://testnza.catmktg.com:5480/dncusmf1 --username ***** --password **** --table sqoop_test3 --export-dir /apps/hive/warehouse/data_arch.db/sqoop_test3
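Since the table is stored as ORC, I also looked at Sqoop's HCatalog options, which go through the Hive metastore instead of reading the warehouse files directly. Below is a sketch of what I think that would look like; I have not confirmed that my Sqoop version supports HCatalog on export, and as I understand it --hcatalog-table cannot be combined with --export-dir, so --direct and --export-dir are dropped here:

sqoop export --connect jdbc:netezza://testnza.catmktg.com:5480/dncusmf1 --username ***** --password **** --table sqoop_test3 --hcatalog-database data_arch --hcatalog-table sqoop_test3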

I am not able to understand whether the problem is with the date column or with the ORC format. Any help is greatly appreciated.

When I try to export a simple table stored in text file format, with all the columns as string, it works and loads the data into the Netezza table, which also has all string columns.
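Given that, the workaround I am thinking of trying is to stage the ORC table as a plain-text copy in Hive and point the export at that instead (sqoop_test3_txt is just a name I made up for the staging table):

In Hive:

CREATE TABLE data_arch.sqoop_test3_txt STORED AS TEXTFILE AS SELECT * FROM data_arch.sqoop_test3;

Then:

sqoop export --direct --connect jdbc:netezza://testnza.catmktg.com:5480/dncusmf1 --username ***** --password **** --table sqoop_test3 --export-dir /apps/hive/warehouse/data_arch.db/sqoop_test3_txt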

The export from the ORC table itself, however, fails with a very generic error:

Error: java.lang.InterruptedException
at java.lang.Object.wait(Native Method)
at java.lang.Thread.join(Thread.java:1280)
at java.lang.Thread.join(Thread.java:1354)
at org.apache.sqoop.mapreduce.db.netezza.NetezzaExternalTableExportMapper.run(NetezzaExternalTableExportMapper.java:207)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)

14/09/24 17:23:42 INFO mapreduce.Job: Task Id : attempt_1411476976663_0084_m_000001_1, Status : FAILED
Error: java.io.IOException: org.netezza.error.NzSQLException: ERROR: External Table : count of bad input rows reached maxerrors limit
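To see which rows Netezza is actually rejecting, my next step is to rerun the export with the Netezza connector's extra arguments for the error threshold and the external table log location. The flag names below (--log-dir and --max-errors, passed after the -- separator) are what I recall from the Sqoop user guide's Netezza connector section, so treat them as an assumption:

sqoop export --direct --connect jdbc:netezza://testnza.catmktg.com:5480/dncusmf1 --username ***** --password **** --table sqoop_test3 --export-dir /apps/hive/warehouse/data_arch.db/sqoop_test3 -- --log-dir /tmp/sqoop_nz_logs --max-errors 100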

