(venv) [root@master pyspark_project]# spark-submit --master spark://10.105.16.240:7077 /home/mysql1/anqu/python/anquProduct/restApi/flask/RestApiServer.py
Error:
16/08/24 15:12:58 WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
Solution:
I found many suggested causes online: 1. the hostname not matching the machine name; 2. insufficient resources; 3. port 4040 already in use because a previously submitted application was still running; and so on. None of these applied here. What worked for me was removing the configuration parameters from spark-env.sh and instead passing them on the command line when submitting the application. The root cause was most likely inconsistent configuration between the master and worker nodes.
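For reference, a minimal sketch of submitting with the resource settings made explicit instead of relying on spark-env.sh (the memory and core values below are illustrative placeholders, not the ones from my cluster; adjust them to what your workers actually have):

(venv) [root@master pyspark_project]# spark-submit \
    --master spark://10.105.16.240:7077 \
    --driver-memory 1g \
    --executor-memory 1g \
    --total-executor-cores 2 \
    /home/mysql1/anqu/python/anquProduct/restApi/flask/RestApiServer.py

If the requested executor memory or cores exceed what any registered worker can offer, the scheduler keeps waiting and prints exactly this "Initial job has not accepted any resources" warning, so the key is keeping the requests within the capacity the workers advertise in the cluster UI.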