Building a Spark Cluster with Docker, Based on the singularities/spark:2.2 Image

Date: 2021-04-15 08:44:20

The singularities/spark:2.2 image ships with:

Hadoop version: 2.8.2

Spark version: 2.2.1

Scala version: 2.11.8

Java version: 1.8.0_151

Pull the image:

[root@localhost docker-spark-2.1.]# docker pull singularities/spark

List the local images:

[root@localhost docker-spark-2.1.]# docker image ls
REPOSITORY                      TAG      IMAGE ID       CREATED      SIZE
docker.io/singularities/spark   latest   84222b254621   months ago   1.39 GB
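
The pull above grabs the latest tag. To pin the exact 2.2 image this post is based on, the tag can be requested explicitly (assuming the 2.2 tag is published on Docker Hub):

[root@localhost docker-spark-2.1.]# docker pull singularities/spark:2.2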

Create a docker-compose.yml file:

[root@localhost home]# mkdir singularitiesCR
[root@localhost home]# cd singularitiesCR
[root@localhost singularitiesCR]# touch docker-compose.yml

Contents:

version: ""

services:
master:
image: singularities/spark
command: start-spark master
hostname: master
ports:
- "6066:6066"
- "7070:7070"
- "8080:8080"
- "50070:50070"
worker:
image: singularities/spark
command: start-spark worker master
environment:
SPARK_WORKER_CORES:
SPARK_WORKER_MEMORY: 2g
links:
- master
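
As an optional sanity check before starting anything, docker-compose config parses the file and prints the effective configuration, so indentation mistakes surface immediately:

[root@localhost singularitiesCR]# docker-compose config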

Running docker-compose up starts a Spark cluster with a single worker node in standalone mode:

[root@localhost singularitiesCR]# docker-compose up -d
Creating singularitiescr_master_1 ... done
Creating singularitiescr_worker_1 ... done
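
Because the worker service holds no state in this setup, more workers can be added by scaling the service; with docker-compose releases of this era the command below should work (newer releases use docker-compose up -d --scale worker=2 instead):

[root@localhost singularitiesCR]# docker-compose scale worker=2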

List the containers:

[root@localhost singularitiesCR]# docker-compose ps
          Name                     Command             State                        Ports
--------------------------------------------------------------------------------------------------------------------
singularitiescr_master_1   start-spark master          Up      0.0.0.0:50070->50070/tcp, 0.0.0.0:6066->6066/tcp,
                                                               0.0.0.0:7070->7070/tcp, 0.0.0.0:8080->8080/tcp,
                                                               plus the image's other exposed (unpublished) ports
singularitiescr_worker_1   start-spark worker master   Up      the image's exposed ports, none published

View the result in a browser: the Spark master web UI is published on port 8080 and the HDFS NameNode UI on port 50070. (Screenshots omitted.)
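
As a quick check from the shell (assuming the stack is up and the published ports are reachable on localhost), the two UIs can also be probed with curl:

[root@localhost singularitiesCR]# curl -s http://localhost:8080 | grep -o "<title>.*</title>"
[root@localhost singularitiesCR]# curl -s http://localhost:50070 | head -n 5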

Stop the containers:

[root@localhost singularitiesCR]# docker-compose stop
Stopping singularitiescr_worker_1 ... done
Stopping singularitiescr_master_1 ... done
[root@localhost singularitiesCR]# docker-compose ps
          Name                     Command             State   Ports
----------------------------------------------------------------------
singularitiescr_master_1   start-spark master          Exit
singularitiescr_worker_1   start-spark worker master   Exit

Remove the containers:

[root@localhost singularitiesCR]# docker-compose rm
Going to remove singularitiescr_worker_1, singularitiescr_master_1
Are you sure? [yN] y
Removing singularitiescr_worker_1 ... done
Removing singularitiescr_master_1 ... done
[root@localhost singularitiesCR]# docker-compose ps
Name Command State Ports
------------------------------
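
Alternatively, docker-compose down combines the stop and rm steps above, removing the containers and the default network in one command:

[root@localhost singularitiesCR]# docker-compose down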

Enter the master container and check the versions:

[root@localhost singularitiesCR]# docker exec -it singularitiescr_master_1 /bin/bash
root@master:/# hadoop version
Hadoop 2.8.2
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r 66c47f2a01ad9637879e95f80c41f798373828fb
Compiled by jdu on 2017-10-19T20:39Z
Compiled with protoc 2.5.0
From source with checksum dce55e5afe30c210816b39b631a53b1d
This command was run using /usr/local/hadoop-2.8.2/share/hadoop/common/hadoop-common-2.8.2.jar
root@master:/# which hadoop
/usr/local/hadoop-2.8.2/bin/hadoop
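
As a quick HDFS smoke test (assuming HDFS is running in the container, which the published NameNode port 50070 suggests), a directory can be created and listed with the standard Hadoop filesystem commands:

root@master:/# hadoop fs -mkdir -p /tmp/smoke-test
root@master:/# hadoop fs -ls /tmp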
root@master:/# spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Spark context Web UI available at http://172.18.0.2:4040
Spark context available as 'sc' (master = local[*], app id = local-).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.2.1
      /_/

Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_151)
Type in expressions to have them evaluated.
Type :help for more information.
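
Note that the banner shows master = local[*]: launched this way, spark-shell runs Spark locally inside the container rather than against the cluster. A minimal sketch of driving the standalone master instead (assuming it listens on the default standalone port 7077 under the hostname master; the port is an assumption, not confirmed by the compose file above) plus a tiny smoke job:

root@master:/# spark-shell --master spark://master:7077

scala> sc.parallelize(1 to 100).sum()
res0: Double = 5050.0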

References:

https://github.com/SingularitiesCR/spark-docker

https://blog.csdn.net/u013705066/article/details/80030732