Spark error: Please install psutil to have better support with spilling
A quick note on an error I hit while running Spark on Windows. The following PySpark snippet:

    words = sc.parallelize(['scala', 'java', 'hadoop', 'spark', 'scala', 'hadoop', 'spark', 'scala'])
    words.distinct().count()

still produces the correct result, but it also prints the warning:

    Please install psutil to have better support with spilling
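The warning comes from PySpark's spilling logic: when deciding whether to spill shuffle data to disk, it tries to import psutil to measure memory usage, and on Windows there is no fallback via the Unix-only resource module. Installing psutil into the Python environment Spark uses (pip install psutil) makes the warning go away.

For reference, here is a minimal self-contained version of the snippet above; the SparkContext setup lines are my addition for reproducibility and are not part of the original post:

    # Standalone reproduction of the snippet from the post.
    # Assumes a local PySpark install (pip install pyspark).
    # The warning is harmless; it disappears after: pip install psutil
    from pyspark import SparkContext

    sc = SparkContext('local[*]', 'DistinctCount')

    words = sc.parallelize(['scala', 'java', 'hadoop', 'spark',
                            'scala', 'hadoop', 'spark', 'scala'])
    print(words.distinct().count())  # prints 4 (scala, java, hadoop, spark)

    sc.stop()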