
Bigdata, Semantic IoT, Hadoop, NoSQL

A place to organize what I have learned while working on Bigdata, Hadoop ecosystem, and Semantic IoT projects. It is shared publicly for anyone who needs it. For inquiries, please email gooper@gooper.com.


1. Move to the project folder and run "sbt assembly" in a console; this builds the jar file.

2. Run "sbt eclipse", then "Refresh" in Eclipse; the related jars are picked up and the error/warning markers disappear.

3. "sbt clean update compile" removes all build output, re-downloads the jars, and compiles.
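
The "sbt assembly" and "sbt eclipse" commands above require the corresponding sbt plugins to be declared. A minimal project/plugins.sbt sketch; the plugin versions shown are assumptions and should be matched to your sbt release:

```scala
// project/plugins.sbt -- versions below are illustrative, not prescriptive
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "4.0.0")
```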


---------------build.sbt-------------

import sbtassembly.AssemblyPlugin._


name := "sda-client"


version := "2.0"


javacOptions ++= Seq("-encoding", "UTF-8")


scalaVersion := "2.11.8"

//scalaVersion := "2.10.5"


resolvers += "Akka Repository" at "http://repo.akka.io/releases/"


// sda-common.jar
// note: use forward slashes (or doubled backslashes) in Scala string literals;
// "C:\dev\..." would not compile because \d is an invalid escape sequence

// unmanagedJars in Compile += file("C:/dev/workspace/sda-common/build/libs/sda-common-2.0.jar")


unmanagedJars in Compile += file("C:/dev/workspace/sda-common/target/sda-common-2.0.jar")


libraryDependencies ++= Seq(
  // spark
  ("org.apache.spark" %% "spark-core" % "2.0.0" % "provided").
    exclude("org.mortbay.jetty", "servlet-api").
    exclude("commons-beanutils", "commons-beanutils-core").
    exclude("commons-collections", "commons-collections").
    exclude("commons-logging", "commons-logging").
    exclude("com.esotericsoftware.minlog", "minlog").
    exclude("com.codahale.metrics", "metrics-core"),
  ("org.apache.spark" %% "spark-sql" % "2.0.0" % "provided"),
  ("org.apache.spark" %% "spark-streaming" % "2.0.0" % "provided"),
  // note: in Spark 2.x the Kafka integration was split into
  // spark-streaming-kafka-0-8 / spark-streaming-kafka-0-10;
  // the 1.6.2 artifact below uses the old (Spark 1.x) name
  ("org.apache.spark" %% "spark-streaming-kafka" % "1.6.2" % "provided"),

  // hadoop
  ("org.apache.hadoop" % "hadoop-common" % "2.7.2" % "provided"),
  ("org.apache.hadoop" % "hadoop-mapreduce-client-common" % "2.7.2" % "provided")
)

 

// Exclude the Scala library jars from the assembly (optional)

assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)

 


assemblyExcludedJars in assembly := { 

  val cp = (fullClasspath in assembly).value

  cp filter {_.data.getName == "slf4j-api-1.7.20.jar"}

}
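
The exclusion above matches one exact file name. If you need to drop every slf4j jar regardless of version, a name-prefix filter works the same way; a sketch in the same sbt DSL style as the file above:

```scala
assemblyExcludedJars in assembly := {
  val cp = (fullClasspath in assembly).value
  // drop any jar whose file name starts with "slf4j", whatever the version
  cp filter { _.data.getName.startsWith("slf4j") }
}
```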


assemblyMergeStrategy in assembly := {

    case PathList("javax", "servlet", xs @ _*) => MergeStrategy.last

    case PathList("javax", "activation", xs @ _*) => MergeStrategy.last

    case PathList("org", "apache", xs @ _*) => MergeStrategy.last

    case PathList("com", "google", xs @ _*) => MergeStrategy.last

    case PathList("com", "esotericsoftware", xs @ _*) => MergeStrategy.last

    case PathList("com", "codahale", xs @ _*) => MergeStrategy.last

    case PathList("com", "yammer", xs @ _*) => MergeStrategy.last

    case "about.html" => MergeStrategy.rename

    case "META-INF/ECLIPSEF.RSA" => MergeStrategy.last

    case "META-INF/mailcap" => MergeStrategy.last

    case "META-INF/mimetypes.default" => MergeStrategy.last

    case "plugin.properties" => MergeStrategy.last

    case "log4j.properties" => MergeStrategy.last

    case x =>

        val oldStrategy = (assemblyMergeStrategy in assembly).value

        oldStrategy(x)

}
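
Because the Spark and Hadoop dependencies are marked "provided" and includeScala is false, the assembled jar expects the cluster to supply those libraries (and the Scala runtime) at launch, typically via spark-submit. A sketch of such an invocation; the main class is a hypothetical example, while the jar path is sbt-assembly's default output location for this name/version:

```shell
# --class below is a hypothetical entry point; use your application's own main class
spark-submit \
  --master yarn \
  --class com.gooper.sda.Main \
  target/scala-2.11/sda-client-assembly-2.0.jar
```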

