java - Hive shell throws FileNotFoundException while executing queries, in spite of adding jar files using "ADD JAR" -


1) Added the SerDe jar using "add jar /home/hduser/softwares/hive/hive-serdes-1.0-SNAPSHOT.jar;"

2) Created a table.

3) The table was created successfully.

4) When a SELECT query is executed, it throws a FileNotFoundException, as shown in the session sketch and actual output below:
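
For reference, a minimal sketch of the kind of session described above. The jar path and the table name tab_tweets are from the original post; the column list, the SerDe class name com.cloudera.hive.serde.JSONSerDe, and the table LOCATION are assumptions, since the original CREATE TABLE statement is not shown:

hive> ADD JAR /home/hduser/softwares/hive/hive-serdes-1.0-SNAPSHOT.jar;
hive> CREATE EXTERNAL TABLE tab_tweets (
    >   id BIGINT,       -- hypothetical columns
    >   text STRING
    > )
    > ROW FORMAT SERDE 'com.cloudera.hive.serde.JSONSerDe'  -- assumed SerDe bundled in hive-serdes-1.0-SNAPSHOT.jar
    > LOCATION '/user/hduser/tweets';                       -- hypothetical HDFS location
hive> SELECT COUNT(*) FROM tab_tweets;  -- fails at job submission, as below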

hive> select count(*) from tab_tweets;
Query ID = hduser_20150604145353_51b4def4-11fb-4638-acac-77301c1c1806
Total jobs = 1
Launching Job 1 out of 1
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapreduce.job.reduces=<number>
java.io.FileNotFoundException: File does not exist: hdfs://node1:9000/home/hduser/softwares/hive/hive-serdes-1.0-SNAPSHOT.jar
    at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1122)
    at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1114)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1114)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:99)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:269)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:390)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:483)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1296)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1293)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1293)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
    at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
    at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:428)
    at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:137)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1638)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1397)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1183)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1039)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:207)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:159)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:370)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:754)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)

Job Submission failed with exception 'java.io.FileNotFoundException(File does not exist: hdfs://node1:9000/home/hduser/softwares/hive/hive-serdes-1.0-SNAPSHOT.jar)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask

The ADD JAR command registers a local path, but when Hive submits the MapReduce job, that path is resolved against the default filesystem (hdfs://node1:9000), so the job expects the jar on HDFS at the same path where it sits locally. There are three ways to fix this.

Method 1: Copy the hive-serdes-1.0-SNAPSHOT.jar file from the local filesystem to HDFS:

hadoop fs -mkdir -p /home/hduser/softwares/hive/
hadoop fs -put /home/hduser/softwares/hive/hive-serdes-1.0-SNAPSHOT.jar /home/hduser/softwares/hive/

Note: use hdfs dfs instead of hadoop fs if you are on a recent Hadoop version.
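
After the copy, it is worth confirming that the jar now exists at the exact path from the error message and re-running the failing query (a quick check, using the same paths as above):

hadoop fs -ls /home/hduser/softwares/hive/
# should list hive-serdes-1.0-SNAPSHOT.jar

hive> ADD JAR /home/hduser/softwares/hive/hive-serdes-1.0-SNAPSHOT.jar;
hive> SELECT COUNT(*) FROM tab_tweets;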

Method 2: Change the value of hive.aux.jars.path in hive-site.xml as follows:

<property>
  <name>hive.aux.jars.path</name>
  <value>file:///home/hduser/softwares/hive/hive-serdes-1.0-SNAPSHOT.jar</value>
</property>
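
Since hive-site.xml is read at startup, restart the Hive shell for this change to take effect. As a one-off alternative to editing the file, the Hive CLI also accepts an --auxpath option (a sketch, with the same path assumption as above):

hive --auxpath /home/hduser/softwares/hive/hive-serdes-1.0-SNAPSHOT.jar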

Method 3: Add hive-serdes-1.0-SNAPSHOT.jar to the Hadoop classpath, i.e., add this line to hadoop-env.sh:

export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/home/hduser/softwares/hive/hive-serdes-1.0-SNAPSHOT.jar
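
After editing hadoop-env.sh, re-source it (or restart the Hadoop daemons) and verify that the jar appears on the classpath. A quick check, assuming a standard Hadoop 2.x layout under $HADOOP_HOME:

source $HADOOP_HOME/etc/hadoop/hadoop-env.sh
hadoop classpath | grep hive-serdes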

Note: the paths above assume Hive is installed in /home/hduser/softwares/hive. If Hive is installed elsewhere, change /home/hduser/softwares/hive to point to your Hive installation folder.

