Configuring the Spark Runtime via Helm

Place Spark/Hadoop config files in a moc/ subdirectory

After expanding the chart, locate the moc/ directory and create a new subdirectory inside it. The subdirectory can have any name; this example uses my-hadoop-conf.

    cd moc
    mkdir my-hadoop-conf

Copy all of your Spark and Hadoop configuration files into this directory.
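For example (the source paths below are hypothetical; copy from wherever your cluster's Spark and Hadoop configuration actually lives):

```shell
# Hypothetical source locations; substitute your cluster's actual config paths.
cp /etc/hadoop/conf/core-site.xml my-hadoop-conf/
cp /etc/hadoop/conf/hdfs-site.xml my-hadoop-conf/
cp /etc/spark/conf/spark-defaults.conf my-hadoop-conf/
```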

Create a ConfigMap from the contents of this directory

Update your Helm override file to include the following:

    moc:
      configMaps:
        hadoop-config: my-hadoop-conf/*

When helm install or helm upgrade is run, a ConfigMap named hadoop-config will be created containing the contents of the my-hadoop-conf directory. Note that the supplied directory name is relative to the moc/ directory; Helm cannot create ConfigMaps from directories outside the moc/ directory.
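After the install or upgrade completes, you can confirm the ConfigMap exists and contains your files with standard kubectl commands (add a namespace flag if the release is not in your current namespace):

```shell
# Inspect the generated ConfigMap; its data keys should match the file
# names placed under moc/my-hadoop-conf/.
kubectl get configmap hadoop-config -o yaml
```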

Configure the Spark runtime to use these files

Update your Helm override file to include the following:

    moc:
      sparkRuntimes:
        "alpha":
          env:
            HADOOP_CONF_DIR: /modelop/hadoop/conf
          configMaps:
            hadoop-config:
              mountPath: /modelop/hadoop/conf

The above configures spark-runtime-alpha with the files from the moc/my-hadoop-conf directory, via the hadoop-config ConfigMap. The files are mounted inside the pod at /modelop/hadoop/conf, and the HADOOP_CONF_DIR environment variable directs the Spark runtime to look for them there.
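To sanity-check the result, you can inspect the running pod. The deployment name spark-runtime-alpha below is an assumption based on the runtime name above; use the name your release actually creates:

```shell
# Confirm the mounted files and environment variable inside the runtime pod.
kubectl exec deploy/spark-runtime-alpha -- ls /modelop/hadoop/conf
kubectl exec deploy/spark-runtime-alpha -- printenv HADOOP_CONF_DIR
```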

Configure SPARK_HOME

By default, the SPARK_HOME environment variable is set to /modelop/bin/spark-2.4.4-bin-hadoop2.6. If a different version of Spark/Hadoop is installed on the image, be sure to set the SPARK_HOME environment variable in your Helm override file.

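A minimal sketch of such an override, assuming the same "alpha" runtime as above; the spark-3.3.2-bin-hadoop3 path is hypothetical, so use whatever path your image actually contains:

```yaml
moc:
  sparkRuntimes:
    "alpha":
      env:
        SPARK_HOME: /modelop/bin/spark-3.3.2-bin-hadoop3  # hypothetical path
```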