Spark error - IP not reachable

14/11/04 15:02:53 WARN cluster.YarnClientClusterScheduler: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory 


Yesterday, right after installing VMware, a vmnet8 (NAT) interface appeared on the host, and Spark ended up binding to the vmnet8 address, so the workers could not reach the driver and the job never got any resources. The fix is to pin Spark to the correct address, as follows:
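A quick way to confirm the misbinding (a minimal sketch; vmnet1/vmnet8 are the usual VMware interface names, and the hostname check reflects how Spark 1.x picks its default bind address, which may vary by system):

# List interfaces; after a VMware install you will typically see extra
# vmnet1 (host-only) and vmnet8 (NAT) interfaces alongside the real NIC
ip addr show    # or: ifconfig

# Spark 1.x derives its default bind address by resolving the local
# hostname; if this prints the vmnet8 address, workers cannot reach the driver
hostname -i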


Set the following environment variable in .bashrc, pointing it at the machine's real, externally reachable IP:

export SPARK_LOCAL_IP='10.223.1.1'  
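
(10.223.1.1 is the author's own address; substitute the IP of your machine's real network interface.) A sketch of applying and verifying the change:

# Reload .bashrc in the current shell and confirm the variable took effect
source ~/.bashrc
echo $SPARK_LOCAL_IP

# Then restart the Spark driver/shell; once executors can reach the driver
# at this address, the "Initial job has not accepted any resources"
# warning should stop appearing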

