sbt, Scala, Spark configuration issues

If sbt reports "You probably access the destination server through a proxy server", or jars such as ...org.scala-sbt/main/0.13.2/jars/main.jar fail to download, the cause is most likely a wrong DNS server address configured on the router; change it to 114.114.114.114 or another working DNS server.

On Ubuntu 14 with Eclipse 3.8 and Scala 2.9, the Eclipse Scala IDE does not work properly.
After reinstalling Scala as 2.10.x the IDE works normally (reinstalling requires a forced overwrite: sudo dpkg -i --force-overwrite xxx.deb).

Create a new sbt-scala-spark project; the directory tree is as follows:

sbt-scala-spark
├── build.sbt
├── lib
│   └── spark-assembly-1.3.1-hadoop2.6.0.jar
├── project
│   ├── build.properties
│   ├── plugins.sbt
│   ├── project
│   │   └── target
│   │       └── config-classes
│   └── target
├── src
│   ├── main
│   │   └── scala
│   │       └── SparkPi.scala
│   └── test
│       └── scala
└── target
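
The tree lists src/main/scala/SparkPi.scala but does not show its contents. A minimal sketch of the usual Monte Carlo Pi estimate against the Spark 1.3 API could look like the following (the sample count and the hard-coded local[2] master are assumptions for illustration; in a cluster the master would come from spark-submit instead):

import org.apache.spark.{SparkConf, SparkContext}
import scala.math.random

object SparkPi {
  def main(args: Array[String]): Unit = {
    // local[2] is hard-coded only for a quick local test; spark-submit normally supplies the master.
    val conf = new SparkConf().setAppName("SparkPi").setMaster("local[2]")
    val sc = new SparkContext(conf)
    val n = 100000                       // number of random points to sample
    val count = sc.parallelize(1 to n).map { _ =>
      val x = random * 2 - 1             // random point in the square [-1, 1] x [-1, 1]
      val y = random * 2 - 1
      if (x * x + y * y < 1) 1 else 0    // 1 if the point falls inside the unit circle
    }.reduce(_ + _)
    println("Pi is roughly " + 4.0 * count / n)
    sc.stop()
  }
}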

The build.sbt file is as follows:

lazy val root = (project in file(".")).
  settings(
    name := "sbt-scala-spark",
    version := "1.0",
    scalaVersion := "2.10.4",
    libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.3.0",
    libraryDependencies += "voldemort.store.compress" % "h2-lzf" % "1.0",
    resolvers += "Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/",
    resolvers += "Maven Repository" at "http://repo.maven.apache.org/maven2",
    resolvers += "Akka Repository" at "http://repo.akka.io/repository/"
  )
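
Since scalaVersion is 2.10.4, the spark-core dependency could equivalently be written with sbt's %% operator ("org.apache.spark" %% "spark-core" % "1.3.0"), which appends the Scala binary version automatically; the explicit _2.10 suffix above does the same thing by hand.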

The build.properties file is as follows:
sbt.version=0.13.2
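
project/plugins.sbt appears in the tree but its contents are not shown. If the project is meant to be imported into the Eclipse Scala IDE mentioned earlier, one possible entry (an assumption, not taken from the original project) is the sbteclipse plugin:

// Hypothetical plugins.sbt entry: lets "sbt eclipse" generate Eclipse project files
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.5.0")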

The lib directory needs spark-assembly-1.3.1-hadoop2.6.0.jar (or another Spark dependency jar) added to it.
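
With the dependency jar in place, a typical workflow (assumed here, not spelled out in the original) is to build the project with sbt package and then run the resulting jar through a Spark 1.3.1 installation, for example spark-submit --class SparkPi --master local[2] target/scala-2.10/sbt-scala-spark_2.10-1.0.jar; the jar path follows from the name, version, and scalaVersion settings in build.sbt above.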

Appendix:
1. sbt documentation (English): http://www.scala-sbt.org/documentation.html
2. sbt documentation (Chinese): http://www.scala-sbt.org/0.13/tutorial/zh-cn/index.html
3. sbt configuration reference: http://blog.csdn.net/oopsoom/article/details/38363369
