Error accessing Hive in Hortonworks Data Platform (HDP) on Hortonworks Sandbox

Good evening, everyone. I am trying to access Hive on the Hortonworks Data Platform (HDP) sandbox and an error appears. Could you help me?

[root@sandbox-hdp ~]# hive
log4j:WARN No such property [maxBackupIndex] in org.apache.log4j.DailyRollingFileAppender.

Logging initialized using configuration in file:/etc/hive/2.6.5.0-292/0/hive-log4j.properties

hive-log4j.properties:

# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# Define some default values that can be overridden by system properties
hive.log.threshold=ALL
hive.root.logger=INFO,DRFA
hive.log.dir=${java.io.tmpdir}/${user.name}
hive.log.file=hive.log

# Define the root logger to the system property "hadoop.root.logger".
log4j.rootLogger=${hive.root.logger}, EventCounter

# Logging Threshold
log4j.threshold=${hive.log.threshold}

#
# Daily Rolling File Appender
#
# Use the PidDailyerRollingFileAppend class instead if you want to use separate log files
# for different CLI session.
#
# log4j.appender.DRFA=org.apache.hadoop.hive.ql.log.PidDailyRollingFileAppender

log4j.appender.DRFA=org.apache.log4j.DailyRollingFileAppender

log4j.appender.DRFA.File=${hive.log.dir}/${hive.log.file}

# Rollver at midnight
log4j.appender.DRFA.DatePattern=.yyyy-MM-dd

# 30-day backup
log4j.appender.DRFA.MaxBackupIndex= 30
#log4j.appender.DRFA.MaxFileSize = 256MB
log4j.appender.DRFA.layout=org.apache.log4j.PatternLayout


# Pattern format: Date LogLevel LoggerName LogMessage
#log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
# Debugging Pattern format
log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %-5p [%t]: %c{2} (%F:%M(%L)) - %m%n


#
# console
# Add "console" to rootlogger above if you want to use this
#

log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} [%t]: %p %c{2}: %m%n
log4j.appender.console.encoding=UTF-8

#custom logging levels
#log4j.logger.xxx=DEBUG

#
# Event Counter Appender
# Sends counts of logging messages at different severity levels to Hadoop Metrics.
#
log4j.appender.EventCounter=org.apache.hadoop.hive.shims.HiveEventCounter


log4j.category.DataNucleus=ERROR,DRFA
log4j.category.Datastore=ERROR,DRFA
log4j.category.Datastore.Schema=ERROR,DRFA
log4j.category.JPOX.Datastore=ERROR,DRFA
log4j.category.JPOX.Plugin=ERROR,DRFA
log4j.category.JPOX.MetaData=ERROR,DRFA
log4j.category.JPOX.Query=ERROR,DRFA
log4j.category.JPOX.General=ERROR,DRFA
log4j.category.JPOX.Enhancer=ERROR,DRFA


# Silence useless ZK logs
log4j.logger.org.apache.zookeeper.server.NIOServerCnxn=WARN,DRFA
log4j.logger.org.apache.zookeeper.ClientCnxnSocketNIO=WARN,DRFA
  • DailyRollingFileAppender doesn’t support maxBackupIndex; I believe only RollingFileAppender supports it.

  • @Guilhermenascimento Thanks. What should I do in this case?

  • I’m not sure that’s the cause; it might be the version of org.apache.log4j you added. If it’s the one on the server, it may just be an older version. I don’t know much about this.
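Following the comment above: since DailyRollingFileAppender has no maxBackupIndex property, one way to silence the warning (a sketch, assuming you can edit /etc/hive/2.6.5.0-292/0/hive-log4j.properties) is to comment out that line, or switch the DRFA appender to RollingFileAppender, which does support it:

# Option 1: keep DailyRollingFileAppender and remove the unsupported property
#log4j.appender.DRFA.MaxBackupIndex=30

# Option 2: use RollingFileAppender, which supports MaxBackupIndex/MaxFileSize
# (note: RollingFileAppender rolls by size, not daily, so DatePattern no longer applies)
#log4j.appender.DRFA=org.apache.log4j.RollingFileAppender
#log4j.appender.DRFA.MaxFileSize=256MB
#log4j.appender.DRFA.MaxBackupIndex=30

Either way the warning is harmless: log4j simply ignores the property it does not recognize.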

1 answer

First, check in Ambari whether Hive is started, as well as ZooKeeper, Ranger, HDFS, YARN and MapReduce. You can access Ambari at http://localhost:8080/

Once everything is started, access the sandbox via ssh (ssh [email protected] -p 2222) and enter Hive using Beeline as shown below:

beeline -u "jdbc:hive2://sandbox-hdp.hortonworks.com:2181/;serviceDiscoveryMode=zookeeper;zooKeeperNamespace=hiveserver2" -n hive -p hive

If you want to use the Hive CLI instead, you must enter the username and password:

hive -n hive -p hive

If you still can’t, send the complete error.
