The sbin/start-all.sh script starts all Spark standalone daemons: the Master on the local node and the Workers on the nodes listed in the cluster configuration.

The output shows that the Master daemon failed to start with the following error:

  • /opt/module/spark-standalone/bin/spark-class:行71: /opt/module/jdk1.8.0_144/bin/java: 没有那个文件或目录 (line 71: No such file or directory): the Java executable (java) could not be found at the path the script expected. Either the Java Development Kit (JDK) is not installed at that location, or the JDK path Spark is configured with is wrong.
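A quick way to confirm the diagnosis is to check, on each node, whether the path from the error message actually contains a java binary and what JAVA_HOME the shell currently sees. A minimal sketch, reusing the path reported in the log above (substitute your own install location):

```shell
# Path taken from the error message; adjust to your environment.
JAVA_PATH=/opt/module/jdk1.8.0_144/bin/java

# Is there an executable java binary at the expected location?
if [ -x "$JAVA_PATH" ]; then
    echo "java found: $JAVA_PATH"
else
    echo "java missing: $JAVA_PATH"
fi

# What JAVA_HOME does the current shell see, if anything?
echo "JAVA_HOME=${JAVA_HOME:-<unset>}"
```

If the binary is missing on hadoop102 or hadoop103 but present on the Master node, the JDK was likely never distributed to the Workers.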

The Worker daemons on hadoop102 and hadoop103 failed to start with the same error, which indicates the JDK path is wrong (or the JDK is absent) on those nodes as well.

To resolve this issue, install the JDK at the same path on every node in the cluster, and make sure Spark is configured with the correct JDK location.
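One common way to point Spark at the right JDK is to set JAVA_HOME in conf/spark-env.sh, which Spark's launch scripts source on each node. A sketch, assuming the install prefix from the error message (replace it with your actual JDK directory):

```shell
# conf/spark-env.sh -- sourced by Spark's launch scripts on every node.
# The path below is an example taken from the error log; replace it with
# the directory where your JDK is actually installed.
export JAVA_HOME=/opt/module/jdk1.8.0_144
```

After editing the file, copy it to the same location on hadoop102 and hadoop103 (for example with scp or rsync) before re-running sbin/start-all.sh, since each node reads its own copy.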

Spark Standalone Cluster Startup Failure: Missing Java Executable
