How to Install Apache Hive with Hadoop on CentOS, Ubuntu and LinuxMint

Written by Rahul, Updated on April 20, 2017

What is Apache Hive? The Apache Hive data warehouse software facilitates querying and managing large datasets residing in distributed storage. Hive provides a mechanism to project structure onto this data and to query it using an SQL-like language called HiveQL. Read more on the official Apache Hive site.

This how-to guide will help you install Apache Hive with Hadoop on CentOS/RHEL, Ubuntu, and Linux Mint in a few easy steps.

Step 1: Install JAVA and Hadoop

Apache Hive requires Java 6 or a later version, and Hadoop must be installed before setting up Hive. Use the links below to install them, then verify both as shown afterwards.

Steps to Install JAVA on CentOS and RHEL 7/6/5
Steps to Install Hadoop on Linux
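
Once both are installed, a quick sanity check confirms they are available. This assumes the java and hadoop commands are on your PATH (otherwise use the full path to Hadoop's bin directory); the exact version output will depend on your system.

$ java -version
$ hadoop version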

Step 2: Download Hive Archive

After Hadoop is configured successfully on your Linux system, let's start the Hive setup. First, download the latest Hive binary archive and extract it using the following commands.

# cd /home/hadoop
# wget http://archive.apache.org/dist/hive/hive-0.12.0/hive-0.12.0-bin.tar.gz
# tar xzf hive-0.12.0-bin.tar.gz
# mv hive-0.12.0-bin hive
# chown -R hadoop hive
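
Hive 0.12.0 is used throughout this guide. If you prefer a newer release, the same Apache archive hosts later versions as well; note that the file naming later changed to apache-hive-<version>-bin.tar.gz, and that very old releases such as 0.12 are not compatible with Hadoop 3.x (see the comments below). For example, assuming you picked Hive 1.2.2, the download would look like this:

# wget http://archive.apache.org/dist/hive/hive-1.2.2/apache-hive-1.2.2-bin.tar.gz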

Step 3: Setup Environment Variables

After extracting the Hive archive, switch to the hadoop user and set up the following environment variables.

# su - hadoop
$ export HADOOP_HOME=/home/hadoop/hadoop
$ export HADOOP_PREFIX=/home/hadoop/hadoop
$ export HIVE_HOME=/home/hadoop/hive
$ export PATH=$HIVE_HOME/bin:$PATH
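
These exports only apply to the current shell session. To make them persistent across logins, you can append the same lines to the hadoop user's ~/.bashrc; a minimal sketch, assuming the install paths used above:

$ cat >> ~/.bashrc << 'EOF'
export HADOOP_HOME=/home/hadoop/hadoop
export HADOOP_PREFIX=/home/hadoop/hadoop
export HIVE_HOME=/home/hadoop/hive
export PATH=$HIVE_HOME/bin:$PATH
EOF
$ source ~/.bashrc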

Step 4: Start Hive

Before creating any table in Hive, we need to create the /tmp and /user/hive/warehouse directories in HDFS and make them group-writable (chmod g+w). Use the following commands.

$ cd /home/hadoop/hive
$ $HADOOP_HOME/bin/hadoop fs -mkdir /tmp
$ $HADOOP_HOME/bin/hadoop fs -mkdir /user/hive/warehouse
$ $HADOOP_HOME/bin/hadoop fs -chmod g+w /tmp
$ $HADOOP_HOME/bin/hadoop fs -chmod g+w /user/hive/warehouse
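
If the second mkdir fails with "No such file or directory" (as one reader reports in the comments), the parent directories likely do not exist yet; on Hadoop 2.x and later you can create the whole path in one go with the -p flag and then verify the result:

$ $HADOOP_HOME/bin/hadoop fs -mkdir -p /user/hive/warehouse
$ $HADOOP_HOME/bin/hadoop fs -ls /user/hive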

Now start Hive with the following command.

$ bin/hive

Logging initialized using configuration in jar:file:/home/hadoop/hive/lib/hive-common-0.12.0.jar!/hive-log4j.properties
hive>
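
If the CLI starts but later complains that the metastore is missing, newer Hive releases require the metastore schema to be initialized once before first use (a reader also points this out in the comments). A hedged example, assuming the default embedded Derby metastore:

$ $HIVE_HOME/bin/schematool -dbType derby -initSchema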

Step 5: Create Demo Table and Test

At this stage you have successfully installed Hive. Let's create a sample table with the following command.

hive>  CREATE TABLE demo1 (id int, name string);
OK
Time taken: 6.565 seconds

List the created tables with the command below.

hive> SHOW TABLES;
OK
demo1
Time taken: 0.231 seconds, Fetched: 1 row(s)
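
You can also check the table's column definitions; this should list the id and name columns created above:

hive> DESCRIBE demo1;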

Drop the table with the command below.

hive> DROP TABLE demo1;
OK
Time taken: 2.393 seconds

Read more about SQL operations in Hive on the official Apache Hive site.


I, Rahul Kumar, am the founder and chief editor of TecAdmin.net. I am a Red Hat Certified Engineer (RHCE) and have been working as an IT professional since 2009.

Comments

  1. Christos
    July 25, 2018 at 2:41 pm

    Hello, I am trying to run Hive but I got the following error:

    WARNING: HADOOP_PREFIX has been replaced by HADOOP_HOME. Using value of HADOOP_PREFIX.
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/home/hadoop/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/home/hadoop/hive/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    WARNING: HADOOP_PREFIX has been replaced by HADOOP_HOME. Using value of HADOOP_PREFIX.
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/home/hadoop/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/home/hadoop/hive/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    2018-07-25 10:31:25,026 INFO Configuration.deprecation: mapred.input.dir.recursive is deprecated. Instead, use mapreduce.input.fileinputformat.input.dir.recursive
    2018-07-25 10:31:25,028 INFO Configuration.deprecation: mapred.max.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.maxsize
    2018-07-25 10:31:25,028 INFO Configuration.deprecation: mapred.min.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize
    2018-07-25 10:31:25,028 INFO Configuration.deprecation: mapred.min.split.size.per.rack is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.rack
    2018-07-25 10:31:25,028 INFO Configuration.deprecation: mapred.min.split.size.per.node is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.node
    2018-07-25 10:31:25,028 INFO Configuration.deprecation: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces
    2018-07-25 10:31:25,029 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
    2018-07-25 10:31:25,656 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
    log4j:ERROR Could not instantiate class [org.apache.hadoop.hive.shims.HiveEventCounter].
    java.lang.IllegalArgumentException: Unrecognized Hadoop major version number: 3.1.0
    at org.apache.hadoop.hive.shims.ShimLoader.getMajorVersion(ShimLoader.java:154)
    at org.apache.hadoop.hive.shims.ShimLoader.loadShims(ShimLoader.java:113)
    at org.apache.hadoop.hive.shims.ShimLoader.getEventCounter(ShimLoader.java:98)
    at org.apache.hadoop.hive.shims.HiveEventCounter.<init>(HiveEventCounter.java:34)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at java.lang.Class.newInstance(Class.java:442)
    at org.apache.log4j.helpers.OptionConverter.instantiateByClassName(OptionConverter.java:337)
    at org.apache.log4j.helpers.OptionConverter.instantiateByKey(OptionConverter.java:124)
    at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:785)
    at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
    at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
    at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
    at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
    at org.apache.log4j.PropertyConfigurator.configure(PropertyConfigurator.java:415)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jDefault(LogUtils.java:124)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:77)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4j(LogUtils.java:58)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:630)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:308)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:222)
    log4j:ERROR Could not instantiate appender named “EventCounter”.

    Logging initialized using configuration in jar:file:/home/hadoop/hive/lib/hive-common-0.12.0.jar!/hive-log4j.properties
    Exception in thread “main” java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: java.lang.IllegalArgumentException: Unrecognized Hadoop major version number: 3.1.0
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:286)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:670)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:308)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:222)
    Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: java.lang.IllegalArgumentException: Unrecognized Hadoop major version number: 3.1.0
    at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthenticator(HiveUtils.java:368)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:278)
    … 8 more
    Caused by: java.lang.RuntimeException: java.lang.IllegalArgumentException: Unrecognized Hadoop major version number: 3.1.0
    at org.apache.hadoop.hive.ql.security.HadoopDefaultAuthenticator.setConf(HadoopDefaultAuthenticator.java:53)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:77)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:137)
    at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthenticator(HiveUtils.java:365)
    … 9 more
    Caused by: java.lang.IllegalArgumentException: Unrecognized Hadoop major version number: 3.1.0
    at org.apache.hadoop.hive.shims.ShimLoader.getMajorVersion(ShimLoader.java:154)
    at org.apache.hadoop.hive.shims.ShimLoader.loadShims(ShimLoader.java:113)
    at org.apache.hadoop.hive.shims.ShimLoader.getHadoopShims(ShimLoader.java:80)
    at org.apache.hadoop.hive.ql.security.HadoopDefaultAuthenticator.setConf(HadoopDefaultAuthenticator.java:51)

  2. obiero
    January 17, 2018 at 12:05 pm

    Why am I getting the error "No such file or directory"?

    [email protected] [~/hive]# $HADOOP_HOME/bin/hadoop fs -mkdir /usr/hive/warehouse/
    WARNING: HADOOP_PREFIX has been replaced by HADOOP_HOME. Using value of HADOOP_PREFIX.
    2018-01-17 13:02:33,608 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
    mkdir: `/usr/hive/warehouse/’: No such file or directory
    [email protected] [~/hive]#

  3. jj
    August 4, 2017 at 2:48 pm

    Love this blog, simple and easy to read, I can learn a lot.

    Keep posting Rahul 🙂

  4. sidharth
    April 19, 2017 at 2:30 am

    Can you help me with configuring the $HADOOP_HOME path? Which one should I prefer:
    HADOOP_HOME=/home/hadoop/hadoop
    or
    HADOOP_HOME=/opt/hadoop/hadoop ?

    • Rahul K.
      April 20, 2017 at 9:07 am

      Use HADOOP_HOME=/home/hadoop/hadoop

  5. sankar
    March 1, 2017 at 3:56 am

    Hi Rahul,
    You missed one step here. The command below should be run to create the database; only then will you be able to create tables. If I am correct, please add this step so that future readers of this guide find it helpful:
    schematool -initSchema -dbType derby

    • joe
      June 28, 2019 at 7:34 am

      Thanks, this worked for me after I deleted the existing metastore_db directory.

  6. vinay
    July 18, 2016 at 7:08 am

    waste —- highly waste blog

  7. jems
    February 20, 2016 at 5:19 pm

    In the previous tutorial about installing Hadoop, HADOOP_HOME=/home/hadoop/hadoop,
    but in this tutorial it has become HADOOP_HOME=/opt/hadoop/hadoop. Should I use the first one or change to /opt?

  8. charles
    August 12, 2015 at 7:25 pm

    Hi,

    The wget link is broken.

    Please update if you can.

    Thanks,
    Chuckz

    • Rahul
      October 11, 2015 at 8:26 pm

      wget http://archive.apache.org/dist/hive/hive-0.12.0/hive-0.12.0-bin.tar.gz
