1. Download Maven
cd /usr/local

wget http://mirror.apache-kr.org/maven/maven-3/3.2.5/binaries/apache-maven-3.2.5-bin.tar.gz

--2015-04-30 15:32:17--  http://mirror.apache-kr.org/maven/maven-3/3.2.5/binaries/apache-maven-3.2.5-bin.tar.gz
Resolving mirror.apache-kr.org... 182.161.117.136
Connecting to mirror.apache-kr.org|182.161.117.136|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 7956528 (7.6M) [application/x-gzip]
Saving to: “apache-maven-3.2.5-bin.tar.gz”

100%[==========================================================================================================================>] 7,956,528   11.2M/s   in 0.7s    

2015-04-30 15:32:23 (11.2 MB/s) - “apache-maven-3.2.5-bin.tar.gz” saved [7956528/7956528]

2. Extract the archive
tar xvfz apache-maven-3.2.5-bin.tar.gz

3. Create a symlink
 ln -s apache-maven-3.2.5 maven

4. Set the M2_HOME environment variable and add it to PATH
vi /etc/profile
export M2_HOME=/usr/local/maven
export PATH=$PATH:$M2_HOME/bin

* Apply the change: source /etc/profile
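
A quick sanity check that Maven is picked up correctly (assumes the symlink from step 3):
 source /etc/profile
 mvn -version
This should report Apache Maven 3.2.5 and the JDK in use.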

5. Download Oozie 4.1
  wget http://mirror.apache-kr.org/oozie/4.1.0/oozie-4.1.0.tar.gz
5-1. By default the build targets Hadoop 1.1.1, so adjust the build settings so that it compiles against Hadoop 2.5.2 (one way to do this is sketched below).
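
A sketch of one way to do this, assuming the hadoop-2 profile takes its Hadoop version from the hadoop.version Maven property (mkdistro.sh passes extra arguments straight to mvn, so the property can be overridden on the build command line in step 6 instead of editing the poms):
 ./mkdistro.sh -P hadoop-2 -Dhadoop.version=2.5.2 -DskipTests
If the property override is not honored in your source tree, edit the hadoop.version value of the hadoop-2 profile in the top-level pom.xml before building.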

6. Compile (the default build targets Hadoop 1.1.1, so specify -P hadoop-2 to compile against Hadoop 2)
 cd oozie-4.1.0/bin
 ./mkdistro.sh -P hadoop-2 -DskipTests
 (location of the resulting file after the build completes: /home/hadoop/oozie-4.1.0/distro/target/oozie-4.1.0-distro.tar.gz)

7. Copy the build artifact from step 6 to the installation location, then extract it and create a symlink
cp ./oozie-4.1.0-distro.tar.gz /usr/local
cd /usr/local
tar xvfz oozie-4.1.0-distro.tar.gz
ln -s oozie-4.1.0/ oozie

8. Copy the hadooplibs and the other jar files to be added to the war file
mkdir libext
wget -P libext http://extjs.com/deploy/ext-2.2.zip
(if that mirror does not respond, the same archive is available via: wget -P libext http://dev.sencha.com/deploy/ext-2.2.zip)
cp -R ../oozie-4.1.0/hadooplibs/hadoop-2/target/hadooplibs/hadooplib-2.4.1.oozie-4.0.1/* libext
cp mysql-connector~.jar libext

* Rebuild the war file
[root@master oozie]$ oozie-setup.sh prepare-war
* Previously, the -extjs option had to be passed when running oozie-setup.sh; now no separate option is needed: just copy ext-2.2.zip into the libext folder and run oozie-setup.sh, and it finds and registers the library automatically.

  setting CATALINA_OPTS="$CATALINA_OPTS -Xmx1024m"

INFO: Adding extension: /hadoop/oozie/libext/activation-1.1.jar
INFO: Adding extension: /hadoop/oozie/libext/avro-1.7.4.jar
INFO: Adding extension: /hadoop/oozie/libext/commons-beanutils-1.7.0.jar
INFO: Adding extension: /hadoop/oozie/libext/commons-beanutils-core-1.8.0.jar
INFO: Adding extension: /hadoop/oozie/libext/commons-cli-1.2.jar
INFO: Adding extension: /hadoop/oozie/libext/commons-codec-1.4.jar
INFO: Adding extension: /hadoop/oozie/libext/commons-collections-3.2.1.jar
INFO: Adding extension: /hadoop/oozie/libext/commons-compress-1.4.1.jar
INFO: Adding extension: /hadoop/oozie/libext/commons-configuration-1.6.jar
INFO: Adding extension: /hadoop/oozie/libext/commons-digester-1.8.jar
INFO: Adding extension: /hadoop/oozie/libext/commons-httpclient-3.1.jar
INFO: Adding extension: /hadoop/oozie/libext/commons-io-2.1.jar
INFO: Adding extension: /hadoop/oozie/libext/commons-lang-2.4.jar
INFO: Adding extension: /hadoop/oozie/libext/commons-logging-1.1.jar
INFO: Adding extension: /hadoop/oozie/libext/commons-math3-3.1.1.jar
INFO: Adding extension: /hadoop/oozie/libext/commons-net-3.1.jar
INFO: Adding extension: /hadoop/oozie/libext/guava-11.0.2.jar
INFO: Adding extension: /hadoop/oozie/libext/hadoop-annotations-2.3.0.jar
INFO: Adding extension: /hadoop/oozie/libext/hadoop-auth-2.3.0.jar
INFO: Adding extension: /hadoop/oozie/libext/hadoop-client-2.3.0.jar
INFO: Adding extension: /hadoop/oozie/libext/hadoop-common-2.3.0.jar
INFO: Adding extension: /hadoop/oozie/libext/hadoop-hdfs-2.3.0.jar
INFO: Adding extension: /hadoop/oozie/libext/hadoop-mapreduce-client-app-2.3.0.jar
INFO: Adding extension: /hadoop/oozie/libext/hadoop-mapreduce-client-common-2.3.0.jar
INFO: Adding extension: /hadoop/oozie/libext/hadoop-mapreduce-client-core-2.3.0.jar
INFO: Adding extension: /hadoop/oozie/libext/hadoop-mapreduce-client-jobclient-2.3.0.jar
INFO: Adding extension: /hadoop/oozie/libext/hadoop-mapreduce-client-shuffle-2.3.0.jar
INFO: Adding extension: /hadoop/oozie/libext/hadoop-yarn-api-2.3.0.jar
INFO: Adding extension: /hadoop/oozie/libext/hadoop-yarn-client-2.3.0.jar
INFO: Adding extension: /hadoop/oozie/libext/hadoop-yarn-common-2.3.0.jar
INFO: Adding extension: /hadoop/oozie/libext/hadoop-yarn-server-common-2.3.0.jar
INFO: Adding extension: /hadoop/oozie/libext/httpclient-4.2.5.jar
INFO: Adding extension: /hadoop/oozie/libext/httpcore-4.2.4.jar
INFO: Adding extension: /hadoop/oozie/libext/jackson-core-asl-1.8.8.jar
INFO: Adding extension: /hadoop/oozie/libext/jackson-mapper-asl-1.8.8.jar
INFO: Adding extension: /hadoop/oozie/libext/jaxb-api-2.2.2.jar
INFO: Adding extension: /hadoop/oozie/libext/jersey-core-1.9.jar
INFO: Adding extension: /hadoop/oozie/libext/jetty-util-6.1.26.jar
INFO: Adding extension: /hadoop/oozie/libext/jsr305-1.3.9.jar
INFO: Adding extension: /hadoop/oozie/libext/log4j-1.2.16.jar
INFO: Adding extension: /hadoop/oozie/libext/paranamer-2.3.jar
INFO: Adding extension: /hadoop/oozie/libext/postgresql-9.3-1103.jdbc4.jar
INFO: Adding extension: /hadoop/oozie/libext/protobuf-java-2.5.0.jar
INFO: Adding extension: /hadoop/oozie/libext/servlet-api-2.5.jar
INFO: Adding extension: /hadoop/oozie/libext/slf4j-api-1.6.6.jar
INFO: Adding extension: /hadoop/oozie/libext/slf4j-log4j12-1.6.6.jar
INFO: Adding extension: /hadoop/oozie/libext/snappy-java-1.0.4.1.jar
INFO: Adding extension: /hadoop/oozie/libext/stax-api-1.0-2.jar
INFO: Adding extension: /hadoop/oozie/libext/xmlenc-0.52.jar
INFO: Adding extension: /hadoop/oozie/libext/xz-1.0.jar
INFO: Adding extension: /hadoop/oozie/libext/zookeeper-3.4.5.jar

New Oozie WAR file with added 'ExtJS library, JARs' at /hadoop/oozie/oozie-server/webapps/oozie.war


INFO: Oozie is ready to be started

9. Set environment variables (/etc/profile)
export OOZIE_HOME=/hadoop/oozie
export PATH=$PATH:$OOZIE_HOME/bin

* Apply the change: source /etc/profile
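
An optional check that the Oozie client is now on the PATH:
 oozie version
This prints the Oozie client build version.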

10. Edit the DB-related settings (oozie-site.xml)
   <property>
        <name>oozie.db.schema.name</name>
        <value>ooziedb</value>
        <description>
            Oozie DataBase Name
        </description>
    </property>

    <property>
        <name>oozie.service.JPAService.create.db.schema</name>
        <value>false</value>
        <description>
            Creates Oozie DB.

            If set to true, it creates the DB schema if it does not exist. If the DB schema exists is a NOP.
            If set to false, it does not create the DB schema. If the DB schema does not exist it fails start up.
        </description>
    </property>
    <property>
        <name>oozie.service.JPAService.jdbc.driver</name>
        <value>org.postgresql.Driver</value>
        <description>
            JDBC driver class.
        </description>
    </property>

    <property>
        <name>oozie.service.JPAService.jdbc.url</name>
        <value>jdbc:postgresql://node1/${oozie.db.schema.name}</value>
        <description>
            JDBC URL.
        </description>
    </property>


    <property>
        <name>oozie.service.JPAService.jdbc.username</name>
        <value>oozie</value>
        <description>
            DB user name.
        </description>
    </property>

    <property>
        <name>oozie.service.JPAService.jdbc.password</name>
        <value>oozie_pass</value>
        <description>
            DB user password.

            IMPORTANT: if password is emtpy leave a 1 space string, the service trims the value,
                       if empty Configuration assumes it is NULL.
        </description>
    </property>
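
The JDBC settings above assume that a database named ooziedb, owned by a user oozie with password oozie_pass, already exists on the PostgreSQL server node1. A minimal sketch of creating them (run as the postgres account on node1; pg_hba.conf must also allow connections from the Oozie host):
 psql -c "CREATE USER oozie WITH PASSWORD 'oozie_pass';"
 psql -c "CREATE DATABASE ooziedb OWNER oozie ENCODING 'UTF8';"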

* Uncomment the section below and be sure to change the #USER# value to the account that runs Oozie (e.g., hadoop or oozie).
(Does the value specified here have to match the proxyuser configured in core-site.xml? Apparently not; it still runs with oozie here and hadoop in core-site.xml.)

    <property>
        <name>oozie.service.ProxyUserService.proxyuser.#USER#.hosts</name>
        <value>*</value>
        <description>
            List of hosts the '#USER#' user is allowed to perform 'doAs'
            operations.

            The '#USER#' must be replaced with the username o the user who is
            allowed to perform 'doAs' operations.

            The value can be the '*' wildcard or a list of hostnames.

            For multiple users copy this property and replace the user name
            in the property name.
        </description>
    </property>

    <property>
        <name>oozie.service.ProxyUserService.proxyuser.hadoop.groups</name>
        <value>*</value>
        <description>
            List of groups the '#USER#' user is allowed to impersonate users
            from to perform 'doAs' operations.

            The '#USER#' must be replaced with the username o the user who is
            allowed to perform 'doAs' operations.

            The value can be the '*' wildcard or a list of groups.

            For multiple users copy this property and replace the user name
            in the property name.
        </description>
    </property>
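
For example, if the hadoop account is the one running Oozie jobs, the first property above would end up as follows (illustrative; description omitted):
    <property>
        <name>oozie.service.ProxyUserService.proxyuser.hadoop.hosts</name>
        <value>*</value>
    </property>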


11. core-site.xml

Grant proxy permissions to the account that runs Oozie jobs (both <value></value> entries may simply be set to *)


<property>
         <name>hadoop.proxyuser.[userId].hosts</name>
         <value>MasterNode</value>
</property>
<property>
         <name>hadoop.proxyuser.[userId].groups</name>
         <value>[userId]</value>
</property>
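
A filled-in illustration, assuming the hadoop account submits Oozie jobs from the master node (substitute your own account and host, or use * as noted above); the Hadoop daemons must be restarted, or the proxy user configuration refreshed, for core-site.xml changes to take effect:
<property>
         <name>hadoop.proxyuser.hadoop.hosts</name>
         <value>master</value>
</property>
<property>
         <name>hadoop.proxyuser.hadoop.groups</name>
         <value>hadoop</value>
</property>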


12. Create the Oozie DB and its tables, using the settings above that point Oozie at an external DB instead of the embedded Derby
./ooziedb.sh create -sqlfile oozie.sql -run
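
Before running the checks below, start the Oozie server (the same script is used in step 14 to restart it):
 oozied.sh start     (runs Oozie as a daemon; oozied.sh run keeps it in the foreground for debugging)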

13. Verify (from the command line)
[root@master logs]$ oozie admin -oozie http://localhost:11000/oozie -status
System mode: NORMAL

14. Verify via URL
http://master:11000/oozie/
* If, after some time, oozied goes down with an error and catalina.out shows
java.lang.NoClassDefFoundError: org/apache/hadoop/util/ShutdownHookManager,
copy Hadoop's common library into the lib directory of the Oozie installation (/usr/local/hadoop/share/hadoop/common/hadoop-common-2.5.2.jar into /usr/local/oozie/lib), then restart the Oozie daemon with oozied.sh stop followed by oozied.sh start, as sketched below.
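
The fix as commands (paths as given above):
 cp /usr/local/hadoop/share/hadoop/common/hadoop-common-2.5.2.jar /usr/local/oozie/lib
 oozied.sh stop
 oozied.sh start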


