- MapReduce execution flow (reference links):
  https://langyu.iteye.com/blog/992916
  https://blog.csdn.net/WeiJiFeng_/article/details/79794544
  https://www.cnblogs.com/itboys/p/9226479.html
  https://www.xuebuyuan.com/3228633.html
  https://blog.csdn.net/yu0_zhang0/article/details/80454517
  https://www.cnblogs.com/arachis/p/Spark_Shuffle.html
  https://blog.csdn.net/databatman/article/details/53023818
- Spark MLlib: https://www.cnblogs.com/hd-zg/p/5911454.html

Notes from learning Spark
- map(_._n) works on any tuple: the number n selects the n-th element of the tuple (n is an integer, n >= 1). For example, in the REPL (an equivalent pattern-match version is sketched right after this transcript):

scala> val p = List(("hello", 35, 1.50), ("nihao", 36, 1.78))
p: List[(String, Int, Double)] = List((hello,35,1.5), (nihao,36,1.78))

scala> p.map(_._1)
res10: List[String] = List(hello, nihao)

scala> p.map(_._2)
res11: List[Int] = List(35, 36)

scala> p.map(_._3)
res12: List[Double] = List(1.5, 1.78)
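
The same element selection can also be written with an explicit pattern match, which makes each position self-documenting. A minimal sketch in plain Scala (no Spark needed; the binder names name/age/height are only illustrative):

// Same list as in the transcript above
val p = List(("hello", 35, 1.50), ("nihao", 36, 1.78))
// Destructure each tuple explicitly instead of using _._n
val names = p.map { case (name, age, height) => name }  // List(hello, nihao), same as p.map(_._1)
val ages  = p.map { case (_, age, _)         => age }   // List(35, 36),       same as p.map(_._2)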
- Dense and sparse vectors (Vectors.dense / Vectors.sparse): a vector (1.0, 0.0, 3.0) can be represented in two ways.
  Dense: [1.0,0.0,3.0], which is no different from an ordinary array.
  Sparse: (3,[0,2],[1.0,3.0]), read as (vector size, indices, values); indices start from 0.
  Below is a simple example (the equivalence of the two forms is sketched after the output):
import org.apache.spark.mllib.linalg.Vectors

object Test {
  def main(args: Array[String]): Unit = {
    // Dense vector: values stored directly, like an array
    val vd = Vectors.dense(2, 5, 8)
    println(vd(1))   // access by position
    println(vd)

    // Sparse vector: (vector size, indices, values)
    val vs = Vectors.sparse(4, Array(0, 1, 2, 3), Array(9, 3, 5, 7))
    println(vs(0))   // access by index
    println(vs)

    // Same values, but the indices are listed in a different order:
    // here index 0 -> 9.0, index 2 -> 3.0, index 1 -> 5.0, index 3 -> 7.0
    val vs2 = Vectors.sparse(4, Array(0, 2, 1, 3), Array(9, 3, 5, 7))
    println(vs2(2))  // the value stored at index 2
    println(vs2)
  }
}
# Output
5.0
[2.0,5.0,8.0]
9.0
(4,[0,1,2,3],[9.0,3.0,5.0,7.0])
3.0
(4,[0,2,1,3],[9.0,3.0,5.0,7.0])
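
To tie this back to the (1.0, 0.0, 3.0) example above, here is a minimal sketch showing that the dense and sparse forms describe the same vector; it uses the same org.apache.spark.mllib.linalg.Vectors API, and the object name DenseVsSparse is only illustrative. toArray expands either form into its plain Array[Double]:

import org.apache.spark.mllib.linalg.Vectors

object DenseVsSparse {
  def main(args: Array[String]): Unit = {
    // Dense form of (1.0, 0.0, 3.0)
    val dense  = Vectors.dense(1.0, 0.0, 3.0)
    // Sparse form: size 3, non-zero entries at indices 0 and 2
    val sparse = Vectors.sparse(3, Array(0, 2), Array(1.0, 3.0))

    println(dense)                                        // [1.0,0.0,3.0]
    println(sparse)                                       // (3,[0,2],[1.0,3.0])

    // Both expand to the same array of values
    println(dense.toArray.sameElements(sparse.toArray))   // true
    // Element access works the same way on either form
    println(dense(2) == sparse(2))                        // true
  }
}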