Increasing Hadoop/YARN/Spark executor memory

w41d8nur · posted 2021-06-01 in Hadoop

When I run spark-submit with --master yarn-cluster --num-executors 7 --driver-memory 10g --executor-memory 16g --executor-cores 5, I get the error below. I am not sure where to change the heap size; I suspect it is set somewhere in the YARN configuration files. Any advice would be appreciated.

Error:

Invalid maximum heap size: -Xmx10g
The specified size exceeds the maximum representable size.
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.
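
A note on this error: "The specified size exceeds the maximum representable size" is what a 32-bit JVM reports when asked for a heap larger than it can address (roughly 4 GB), so it is worth confirming the JVM bitness before changing any YARN settings. A minimal diagnostic sketch, assuming a Linux shell; the --class and jar arguments are placeholders:

# Verify the JVM is 64-bit: a 32-bit JVM cannot address a 10g heap
# and fails with exactly this "maximum representable size" message.
java -version        # a 64-bit HotSpot build prints "64-Bit Server VM"

# The same submission with the flags spelled out (placeholder class/jar):
spark-submit \
  --master yarn-cluster \
  --num-executors 7 \
  --driver-memory 10g \
  --executor-memory 16g \
  --executor-cores 5 \
  --class com.example.MyApp \
  my-app.jar

If java -version reports a 32-bit VM, the usual fix is to install a 64-bit JDK and point JAVA_HOME at it on the submitting host (and on the YARN nodes), rather than lowering --driver-memory.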
