


In a module that uses Quartz to process a job periodically, the errors below kept occurring.

Error (1) occurs once, and then error (2) keeps recurring (apparently because Quartz keeps re-triggering the job).
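This repetition is standard JVM behavior rather than anything Quartz does: when a class's static initializer throws, the JVM marks the class as erroneous, and every later attempt to use it fails with NoClassDefFoundError: Could not initialize class. A minimal Java sketch of that behavior (not the project code; BrokenSchemaHolder is a made-up stand-in for COL_ONEM2M):

public class ClinitFailureDemo {

    // Stand-in for COL_ONEM2M: its static Avro schema parsing dies with
    // NoSuchMethodError when an old jackson-core-asl is on the classpath.
    static class BrokenSchemaHolder {
        static { fail(); }

        static void fail() {
            throw new NoSuchMethodError(
                    "org.codehaus.jackson.JsonFactory.enable(...)");
        }

        static void use() { }
    }

    public static void main(String[] args) {
        for (int i = 1; i <= 3; i++) {   // stands in for repeated Quartz triggers
            try {
                BrokenSchemaHolder.use();
            } catch (Throwable t) {
                // run 1     : java.lang.NoSuchMethodError ...              -> error (1)
                // runs 2, 3 : java.lang.NoClassDefFoundError:
                //             Could not initialize class ...               -> error (2)
                System.out.println("run " + i + ": " + t);
            }
        }
    }
}

This is why only the very first trigger shows the real root cause; once the Jackson version is fixed, both errors disappear together.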

The job uses jackson-core-asl and jackson-mapper-asl, and the version on the classpath was 1.1.1 (far too old). The NoSuchMethodError in error (1) is Avro's Schema class calling JsonFactory.enable(JsonParser.Feature), which that old version does not provide.


So I bumped the version to 1.9.4, recompiled and repackaged, and the job now runs normally.
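In Maven terms, the fix amounts to bumping both artifacts in pom.xml (a sketch assuming a Maven build; the post only mentions compile and package, so the actual build setup may differ):

<dependency>
    <groupId>org.codehaus.jackson</groupId>
    <artifactId>jackson-core-asl</artifactId>
    <version>1.9.4</version>
</dependency>
<dependency>
    <groupId>org.codehaus.jackson</groupId>
    <artifactId>jackson-mapper-asl</artifactId>
    <version>1.9.4</version>
</dependency>

If another copy of Jackson 1.x is still being pulled in transitively (Avro uses it, as the stack trace shows), mvn dependency:tree will show which version actually wins on the classpath.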

(It took a whole day to track this down. Ugh...)


---------- Error (2) ----------

java.lang.NoClassDefFoundError: Could not initialize class com.gooper.icbms.sda.comm.kafka.avro.COL_ONEM2M

        at com.gooper.icbms.sda.sch.service.CollectStatusDataFromSIJobService.collect(CollectStatusDataFromSIJobService.java:203)

        at com.gooper.icbms.sda.sch.service.CollectStatusDataFromSIJobService.execute(CollectStatusDataFromSIJobService.java:290)

        at org.quartz.core.JobRunShell.run(JobRunShell.java:202)

        at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:525)

[2016-08-29 10:24:00] [ErrorLogger] [2185] [ERROR] Job (TG1000.CollectStatusDatatFromSIJob threw an exception. 

org.quartz.SchedulerException: Job threw an unhandled exception. [See nested exception: java.lang.NoClassDefFoundError: Could not initialize class com.gooper.icbms.sda.comm.kafka.avro.COL_ONEM2M]

        at org.quartz.core.JobRunShell.run(JobRunShell.java:213)

        at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:525)

Caused by: java.lang.NoClassDefFoundError: Could not initialize class com.gooper.icbms.sda.comm.kafka.avro.COL_ONEM2M

        at com.gooper.icbms.sda.sch.service.CollectStatusDataFromSIJobService.collect(CollectStatusDataFromSIJobService.java:203)

        at com.gooper.icbms.sda.sch.service.CollectStatusDataFromSIJobService.execute(CollectStatusDataFromSIJobService.java:290)

        at org.quartz.core.JobRunShell.run(JobRunShell.java:202)

        ... 1 more



---------- Error (1) ----------

[2016-08-29 14:03:30] [JobRunShell] [211] [ERROR] Job TG1000.CollectStatusDatatFromSIJob threw an unhandled Exception:  

java.lang.NoSuchMethodError: org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;

        at org.apache.avro.Schema.<clinit>(Schema.java:86)

        at org.apache.avro.Schema$Parser.parse(Schema.java:953)

        at com.gooper.icbms.sda.comm.kafka.avro.COL_ONEM2M.<clinit>(COL_ONEM2M.java:10)

        at com.gooper.icbms.sda.sch.service.CollectStatusDataFromSIJobService.collect(CollectStatusDataFromSIJobService.java:203)

        at com.gooper.icbms.sda.sch.service.CollectStatusDataFromSIJobService.execute(CollectStatusDataFromSIJobService.java:290)

        at org.quartz.core.JobRunShell.run(JobRunShell.java:202)

        at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:525)

[2016-08-29 14:03:30] [ErrorLogger] [2185] [ERROR] Job (TG1000.CollectStatusDatatFromSIJob threw an exception. 

org.quartz.SchedulerException: Job threw an unhandled exception. [See nested exception: java.lang.NoSuchMethodError: org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;]

        at org.quartz.core.JobRunShell.run(JobRunShell.java:213)

        at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:525)

Caused by: java.lang.NoSuchMethodError: org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;

        at org.apache.avro.Schema.<clinit>(Schema.java:86)

        at org.apache.avro.Schema$Parser.parse(Schema.java:953)

        at com.gooper.icbms.sda.comm.kafka.avro.COL_ONEM2M.<clinit>(COL_ONEM2M.java:10)

        at com.gooper.icbms.sda.sch.service.CollectStatusDataFromSIJobService.collect(CollectStatusDataFromSIJobService.java:203)

        at com.gooper.icbms.sda.sch.service.CollectStatusDataFromSIJobService.execute(CollectStatusDataFromSIJobService.java:290)

        at org.quartz.core.JobRunShell.run(JobRunShell.java:202)

        ... 1 more
