Run ./schematool -initSchema -dbType derby.

(For -dbType, specify one of derby|mysql|postgres|oracle.)
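
The same initialization applies when the metastore uses an external RDBMS instead of embedded Derby; only the -dbType value and the JDBC connection settings in hive-site.xml change. A minimal sketch for the MySQL case follows; the host and database name are illustrative assumptions, not values from this environment:

# assumes hive-site.xml already sets javax.jdo.option.ConnectionURL to something like
#   jdbc:mysql://dbhost:3306/hive_metastore?createDatabaseIfNotExist=true
# (plus javax.jdo.option.ConnectionDriverName/ConnectionUserName/ConnectionPassword);
# dbhost and hive_metastore are hypothetical names used only for illustration
./schematool -initSchema -dbType mysql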


If schematool fails and you then run schematool -initSchema again, you get the error "Error: FUNCTION 'NUCLEUS_ASCII' already exists. (state=X0Y68,code=30000)". In that case, find the metastore_db directory with "ls -l | grep meta", delete it, and run initSchema with schematool once more; the schema is then initialized normally.
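
Put together, the recovery steps from the paragraph above look roughly like this (a sketch assuming the embedded Derby metastore and that metastore_db sits in the current working directory, i.e. wherever hive was started):

# locate the leftover embedded Derby metastore directory
ls -l | grep meta
# remove the partially created metastore
rm -rf metastore_db
# re-run the schema initialization
./schematool -initSchema -dbType derby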


* Reference: https://cwiki.apache.org/confluence/display/Hive/Hive+Schema+Tool
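
Once initialization has succeeded, the schema state can be double-checked with the -info option documented on the page above (shown here for the Derby case):

./schematool -info -dbType derby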


----------------- Error message produced when hive is run for the first time without initializing the schema ----------

$ bin/hive

hive: line 86: readlink: command not found

which: 0652-141 There is no hbase in /usr/bin /etc /usr/sbin /usr/ucb /engine/bigdata/bin /usr/bin/X11 /sbin . /usr/java7_64/bin /engine/bigdata/hadoop/bin /engine/bigdata/hive/bin.

SLF4J: Class path contains multiple SLF4J bindings.

SLF4J: Found binding in [jar:file:/engine/bigdata/apache-hive-2.1.0-bin/lib/hive-jdbc-2.1.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in [jar:file:/engine/bigdata/apache-hive-2.1.0-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in [jar:file:/engine/bigdata/hadoop-2.7.2/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]


Logging initialized using configuration in file:/engine/bigdata/apache-hive-2.1.0-bin/conf/hive-log4j2.properties Async: true

Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Hive metastore database is not initialized. Please use schematool (e.g. ./schematool -initSchema -dbType ...) to create the schema. If needed, don't forget to include the option to auto-create the underlying database in your JDBC connection string (e.g. ?createDatabaseIfNotExist=true for mysql))

        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:578)

        at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:518)

        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:705)

        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)

        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95)

        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:56)

        at java.lang.reflect.Method.invoke(Method.java:620)

        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)

        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)

Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Hive metastore database is not initialized. Please use schematool (e.g. ./schematool -initSchema -dbType ...) to create the schema. If needed, don't forget to include the option to auto-create the underlying database in your JDBC connection string (e.g. ?createDatabaseIfNotExist=true for mysql))

        at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:226)

        at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:366)

        at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:310)

        at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:290)

        at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:266)

        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:545)

        ... 9 more

Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Hive metastore database is not initialized. Please use schematool (e.g. ./schematool -initSchema -dbType ...) to create the schema. If needed, don't forget to include the option to auto-create the underlying database in your JDBC connection string (e.g. ?createDatabaseIfNotExist=true for mysql))

        at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3593)

        at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:236)

        at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:221)

        ... 14 more

Caused by: MetaException(message:Hive metastore database is not initialized. Please use schematool (e.g. ./schematool -initSchema -dbType ...) to create the schema. If needed, don't forget to include the option to auto-create the underlying database in your JDBC connection string (e.g. ?createDatabaseIfNotExist=true for mysql))

        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3364)

        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3336)

        at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3590)

        ... 16 more
