What to do when a Spark client program fails on startup with "Error initializing SparkContext"


Run: python3 DataSetCreator.py -i s2rdf/data/sparql.in -s 0.25

=> See http://stackoverflow.com/questions/27792839/spark-fail-when-running-pi-py-example-with-yarn-client-mode
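
The log below shows a yarn-client submission failing in a characteristic way: the ApplicationMaster is allocated and registers, but a few seconds later YARN reports the application as FINISHED, and the SparkContext constructor then dies with a NullPointerException. Before changing any settings, a sensible first step is to pull the AM container log for the application ID shown in the output to see why the AM exited:

Run: yarn logs -applicationId application_1464337540213_0018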

-----------------------------Log output------------------------------
Input RDF file ->"
16/05/27 18:22:57 INFO SparkContext: Running Spark version 1.6.1
16/05/27 18:22:57 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/05/27 18:22:57 WARN SparkConf: Detected deprecated memory fraction settings: [spark.storage.memoryFraction]. As of Spark 1.6, execution and storage memory management are unified. All memory fractions used in the old model are now deprecated and no longer read. If you wish to use the old memory management, you may explicitly enable `spark.memory.useLegacyMode` (not recommended).
16/05/27 18:22:57 INFO SecurityManager: Changing view acls to: hadoop
16/05/27 18:22:57 INFO SecurityManager: Changing modify acls to: hadoop
16/05/27 18:22:57 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
16/05/27 18:22:57 INFO Utils: Successfully started service 'sparkDriver' on port 56181.
16/05/27 18:22:58 INFO Slf4jLogger: Slf4jLogger started
16/05/27 18:22:58 INFO Remoting: Starting remoting
16/05/27 18:22:58 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@XXX.XXX.XXX.43:34384]
16/05/27 18:22:58 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 34384.
16/05/27 18:22:58 INFO SparkEnv: Registering MapOutputTracker
16/05/27 18:22:58 INFO SparkEnv: Registering BlockManagerMaster
16/05/27 18:22:58 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-cdc351b1-92b1-405c-9127-fca2f798daf3
16/05/27 18:22:58 INFO MemoryStore: MemoryStore started with capacity 1247.3 MB
16/05/27 18:22:58 INFO SparkEnv: Registering OutputCommitCoordinator
16/05/27 18:22:58 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/05/27 18:22:58 INFO SparkUI: Started SparkUI at http://XXX.XXX.XXX.43:4040
16/05/27 18:22:58 INFO HttpFileServer: HTTP File server directory is /tmp/spark-de18dde4-d74e-4197-beab-2bc3de517b74/httpd-8faa7605-d0e3-44b9-ba73-d18ce63fe8f1
16/05/27 18:22:58 INFO HttpServer: Starting HTTP Server
16/05/27 18:22:58 INFO Utils: Successfully started service 'HTTP file server' on port 49921.
16/05/27 18:22:58 INFO SparkContext: Added JAR file:/home/hadoop/DataSetCreator/./datasetcreator_2.10-1.1.jar at http://XXX.XXX.XXX.43:49921/jars/datasetcreator_2.10-1.1.jar with timestamp 1464340978585
16/05/27 18:22:58 WARN YarnClientSchedulerBackend: NOTE: SPARK_WORKER_CORES is deprecated. Use SPARK_EXECUTOR_CORES or --executor-cores through spark-submit instead.
16/05/27 18:22:58 INFO ConfiguredRMFailoverProxyProvider: Failing over to rm2
16/05/27 18:22:58 INFO Client: Requesting a new application from cluster with 4 NodeManagers
16/05/27 18:22:58 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (19288 MB per container)
16/05/27 18:22:58 INFO Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
16/05/27 18:22:58 INFO Client: Setting up container launch context for our AM
16/05/27 18:22:58 INFO Client: Setting up the launch environment for our AM container
16/05/27 18:22:58 INFO Client: Preparing resources for our AM container
16/05/27 18:22:59 INFO Client: Uploading resource file:/home/gooper/svc/apps/sda/bin/hadoop/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar -> hdfs://mycluster/user/hadoop/.sparkStaging/application_1464337540213_0018/spark-assembly-1.6.1-hadoop2.6.0.jar
16/05/27 18:23:01 INFO Client: Uploading resource file:/tmp/spark-de18dde4-d74e-4197-beab-2bc3de517b74/__spark_conf__2857474168024892319.zip -> hdfs://mycluster/user/hadoop/.sparkStaging/application_1464337540213_0018/__spark_conf__2857474168024892319.zip
16/05/27 18:23:01 INFO SecurityManager: Changing view acls to: hadoop
16/05/27 18:23:01 INFO SecurityManager: Changing modify acls to: hadoop
16/05/27 18:23:01 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
16/05/27 18:23:01 INFO Client: Submitting application 18 to ResourceManager
16/05/27 18:23:01 INFO YarnClientImpl: Submitted application application_1464337540213_0018
16/05/27 18:23:02 INFO Client: Application report for application_1464337540213_0018 (state: ACCEPTED)
16/05/27 18:23:02 INFO Client: 
         client token: N/A
         diagnostics: N/A
         ApplicationMaster host: N/A
         ApplicationMaster RPC port: -1
         queue: root.hadoop
         start time: 1464340977670
         final status: UNDEFINED
         tracking URL: http://sda2:8088/proxy/application_1464337540213_0018/
         user: hadoop
16/05/27 18:23:03 INFO Client: Application report for application_1464337540213_0018 (state: ACCEPTED)
16/05/27 18:23:04 INFO Client: Application report for application_1464337540213_0018 (state: ACCEPTED)
16/05/27 18:23:04 INFO YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(null)
16/05/27 18:23:04 INFO YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> sda1, PROXY_URI_BASES -> http://sda1:8088/proxy/application_1464337540213_0018), /proxy/application_1464337540213_0018
16/05/27 18:23:04 INFO JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
16/05/27 18:23:05 INFO Client: Application report for application_1464337540213_0018 (state: RUNNING)
16/05/27 18:23:05 INFO Client: 
         client token: N/A
         diagnostics: N/A
         ApplicationMaster host: XXX.XXX.XXX.44
         ApplicationMaster RPC port: 0
         queue: root.hadoop
         start time: 1464340977670
         final status: UNDEFINED
         tracking URL: http://sda2:8088/proxy/application_1464337540213_0018/
         user: hadoop
16/05/27 18:23:05 INFO YarnClientSchedulerBackend: Application application_1464337540213_0018 has started running.
16/05/27 18:23:05 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 44676.
16/05/27 18:23:05 INFO NettyBlockTransferService: Server created on 44676
16/05/27 18:23:05 INFO BlockManagerMaster: Trying to register BlockManager
16/05/27 18:23:05 INFO BlockManagerMasterEndpoint: Registering block manager XXX.XXX.XXX.43:44676 with 1247.3 MB RAM, BlockManagerId(driver, XXX.XXX.XXX.43, 44676)
16/05/27 18:23:05 INFO BlockManagerMaster: Registered BlockManager
16/05/27 18:23:05 INFO EventLoggingListener: Logging events to hdfs://mycluster/user/hadoop/spark/application_1464337540213_0018
16/05/27 18:23:08 INFO YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(null)
16/05/27 18:23:08 INFO YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> sda1, PROXY_URI_BASES -> http://sda1:8088/proxy/application_1464337540213_0018), /proxy/application_1464337540213_0018
16/05/27 18:23:08 INFO JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
16/05/27 18:23:09 ERROR YarnClientSchedulerBackend: Yarn application has already exited with state FINISHED!
16/05/27 18:23:09 INFO SparkUI: Stopped Spark web UI at http://XXX.XXX.XXX.43:4040
16/05/27 18:23:09 INFO YarnClientSchedulerBackend: Shutting down all executors
16/05/27 18:23:09 INFO YarnClientSchedulerBackend: Asking each executor to shut down
16/05/27 18:23:09 INFO YarnClientSchedulerBackend: Stopped
16/05/27 18:23:09 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/05/27 18:23:09 INFO MemoryStore: MemoryStore cleared
16/05/27 18:23:09 INFO BlockManager: BlockManager stopped
16/05/27 18:23:09 INFO BlockManagerMaster: BlockManagerMaster stopped
16/05/27 18:23:09 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/05/27 18:23:09 INFO SparkContext: Successfully stopped SparkContext
16/05/27 18:23:09 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/05/27 18:23:09 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
16/05/27 18:23:09 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
16/05/27 18:23:28 INFO YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after waiting maxRegisteredResourcesWaitingTime: 30000(ms)
16/05/27 18:23:28 ERROR SparkContext: Error initializing SparkContext.
java.lang.NullPointerException
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:584)
        at dataCreator.Settings$.loadSparkContext(Settings.scala:69)
        at dataCreator.Settings$.<init>(Settings.scala:17)
        at dataCreator.Settings$.<clinit>(Settings.scala)
        at runDriver$.main(runDriver.scala:12)
        at runDriver.main(runDriver.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/05/27 18:23:28 INFO SparkContext: SparkContext already stopped.
Exception in thread "main" java.lang.ExceptionInInitializerError
        at runDriver$.main(runDriver.scala:12)
        at runDriver.main(runDriver.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NullPointerException
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:584)
        at dataCreator.Settings$.loadSparkContext(Settings.scala:69)
        at dataCreator.Settings$.<init>(Settings.scala:17)
        at dataCreator.Settings$.<clinit>(Settings.scala)
        ... 11 more
16/05/27 18:23:28 INFO ShutdownHookManager: Shutdown hook called
16/05/27 18:23:28 INFO ShutdownHookManager: Deleting directory /tmp/spark-de18dde4-d74e-4197-beab-2bc3de517b74/httpd-8faa7605-d0e3-44b9-ba73-d18ce63fe8f1
16/05/27 18:23:28 INFO ShutdownHookManager: Deleting directory /tmp/spark-de18dde4-d74e-4197-beab-2bc3de517b74
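
Two lines above tell the story: at 18:23:09 YarnClientSchedulerBackend reports "Yarn application has already exited with state FINISHED!", and the NullPointerException thrown from the SparkContext constructor at 18:23:28 is most likely just the driver tripping over the already-stopped scheduler backend. If the AM log points to a memory kill (a common cause for an AM that exits right after registering), the usual remedy is to give the AM and the executors more headroom. A sketch of the kind of flags involved, with illustrative values rather than ones taken from this post (assuming, as the stack trace suggests, that DataSetCreator.py launches the job through spark-submit):

  spark-submit --master yarn-client \
    --class runDriver \
    --driver-memory 2g \
    --executor-memory 2g \
    --conf spark.yarn.am.memory=1g \
    --conf spark.yarn.am.memoryOverhead=512 \
    datasetcreator_2.10-1.1.jar

While at it, the two deprecation warnings in the log are worth cleaning up: spark.storage.memoryFraction is no longer read as of Spark 1.6, and SPARK_WORKER_CORES should be replaced with --executor-cores (or SPARK_EXECUTOR_CORES).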



^CTraceback (most recent call last):
  File "DataSetCreator.py", line 128, in <module>
    main(sys.argv[1:])
  File "DataSetCreator.py", line 125, in main
    generateDatsets()
  File "DataSetCreator.py", line 83, in generateDatsets
    delay()
  File "DataSetCreator.py", line 45, in delay
    time.sleep(delTime)
KeyboardInterrupt
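
Incidentally, the traceback shows why Ctrl-C was needed: DataSetCreator.py was parked in delay() / time.sleep() and kept waiting even though spark-submit had already failed. A minimal sketch (a hypothetical wrapper, not the project's own code) of running the submit synchronously and failing fast on a non-zero exit code:

  import subprocess
  import sys

  # Hypothetical wrapper: run spark-submit synchronously and stop on failure
  # instead of sleeping in a polling loop as DataSetCreator.py's delay() does.
  cmd = [
      "spark-submit", "--master", "yarn-client",
      "--class", "runDriver",
      "datasetcreator_2.10-1.1.jar",
  ]
  rc = subprocess.call(cmd)
  if rc != 0:
      sys.exit("spark-submit failed with exit code %d; check the YARN AM log" % rc)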