How to run Spark on Mesos in standalone mode

jq6vz3qz · posted 2021-06-26 in Mesos

I have installed Mesos on my local machine and configured it as described in the Mesos setup guide. Now I want to run Spark on this locally installed Mesos. I have configured Spark according to the official documentation, and I am running a single-node Hadoop cluster on the same machine. The Spark binary package was copied to the HDFS root directory, and I set the following properties in spark-env.sh:

export MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.so
export SPARK_EXECUTOR_URI=hdfs://spark-2.2.0-bin-hadoop2.7.tgz

and in spark-defaults.conf:

spark.executor.uri         hdfs://spark-2.2.0-bin-hadoop2.7.tgz
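
For completeness, the package was copied into the HDFS root with the standard hdfs CLI, roughly as follows (a sketch; the target path must match the URI above):

hdfs dfs -put spark-2.2.0-bin-hadoop2.7.tgz /
hdfs dfs -ls /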

When running Spark:

./bin/spark-shell --master mesos://host:5050

it gives the following error:

ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Could not parse Master URL: 'mesos://host:5050'

Please advise on what I am doing wrong and how to fix it.

ycl3bljg1#

I have successfully set up Apache Spark on Mesos. Please perform the following steps on your Ubuntu machine.


# Update the packages.

$ sudo apt-get update

# Install a few utility tools.

$ sudo apt-get install -y tar wget git

# Install the latest OpenJDK.

$ sudo apt-get install -y openjdk-8-jdk
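
A quick, optional sanity check that the JDK is on the PATH before building:

$ java -version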

# Install autotools (Only necessary if building from git repository).

$ sudo apt-get install -y autoconf libtool

# Install other Mesos dependencies.

$ sudo apt-get -y install build-essential python-dev python-six python-virtualenv libcurl4-nss-dev libsasl2-dev libsasl2-modules maven libapr1-dev libsvn-dev
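
# The steps below assume the Mesos source tree is already present. If it is
# not, fetch it first, either as a release tarball from mesos.apache.org or by
# cloning the repository (the GitHub mirror is used here as an assumption).

$ git clone https://github.com/apache/mesos.git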

# Change working directory.

$ cd mesos

# If you get the error "libz is required to build mesos":

$ sudo apt install zlib1g-dev
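
# If building from a git clone rather than a release tarball, generate the
# autotools files first (this is what the autoconf/libtool packages above are for):

$ ./bootstrap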

# Configure and build.

$ mkdir build
$ cd build
$ ../configure
$ make
$ make check
$ make install
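
# Note: the make step can take a long time; the Mesos build guide suggests
# parallel jobs and quieter output, e.g.:

$ make -j$(nproc) V=0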

# Change into the build directory (if not already there).
$ cd build

# Start the master.

$ ./bin/mesos-master.sh --ip=127.0.0.1 --work_dir=/tmp/mesos
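
# To verify the master came up, query its HTTP endpoint (assuming a Mesos
# version that serves /master/state; very old releases expose /master/state.json instead):

$ curl -s http://127.0.0.1:5050/master/state | head -c 300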

# Start the agent (slave).

$ ./bin/mesos-slave.sh --master=127.0.0.1:5050 --work_dir=/tmp/mesos

# If you hit a systemd permission issue, disable systemd support:

$ ./bin/mesos-slave.sh --master=127.0.0.1:5050 --work_dir=/tmp/mesos --no-systemd_enable_support
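
# Once the agent registers, it should show up in the master's agent list
# (same endpoint assumption as above):

$ curl -s http://127.0.0.1:5050/master/slaves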

# Configuring Spark with Mesos

Build Spark with Mesos support:

$ ./build/mvn -Pmesos -DskipTests clean package
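
If you want a .tgz of this build to point spark.executor.uri at, Spark's packaging script can produce one; a sketch, with the --name label chosen arbitrarily:

$ ./dev/make-distribution.sh --name with-mesos --tgz -Pmesos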

In spark-env.sh:

export MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.so
export SPARK_EXECUTOR_URI=/localpath/to/spark-2.2.0-bin-hadoop2.7.tgz

In spark-defaults.conf:

spark.executor.uri         /localpath/to/spark-2.2.0-bin-hadoop2.7.tgz

Run the Spark shell:

$ ./bin/spark-shell --master mesos://127.0.0.1:5050
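
If the shell comes up, a small job confirms that executors launch on Mesos; summing 1..100 should print something like:

scala> sc.parallelize(1 to 100).sum()
res0: Double = 5050.0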

# Mesos UI

http://127.0.0.1:5050
