Overview

Request 583860 accepted

- Install /etc/spark/spark-env. This script is automatically
read during startup and can be used for custom configuration
- Install /etc/spark/spark-defaults.conf
- Create /run/spark dir via systemd tmpfiles (sketched below)
- Add missing Requires/BuildRequires for systemd
- Drop openstack-suse-macros BuildRequires and use the typical
way to create a spark user/group and homedir
- Add useful description
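For reference, the tmpfiles entry and the user/group creation could look roughly like this; a minimal sketch only, where the modes, home directory and comment string are assumptions rather than what the submitted spec contains:

  # /usr/lib/tmpfiles.d/spark.conf - recreate /run/spark at boot
  d /run/spark 0755 spark spark -

  # %pre scriptlet - the usual way to create a system user/group
  getent group spark >/dev/null || groupadd -r spark
  getent passwd spark >/dev/null || \
      useradd -r -g spark -d /var/lib/spark -s /sbin/nologin \
              -c "Apache Spark" spark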


Ashwin Agate commented:

+- Install /etc/spark/spark-defaults.conf

  • We have tested this and it does not work: whenever a Spark job is submitted, it does not read this file, even when SPARK_CONF_DIR is set to /etc/spark. The file has to be in /usr/share/spark/conf/spark-defaults.conf (we are stuck with that location for Spark 1.6.3).
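One possible way to deal with this, as a sketch (the template path and the symlink variant are assumptions, not what the spec currently does):

  # install directly where Spark 1.6.3 looks for it ...
  install -D -m 644 %{name}-%{version}/dist/conf/spark-defaults.conf.template \
      %{buildroot}%{_datadir}/spark/conf/spark-defaults.conf
  # ... or keep the copy under /etc/spark and point the expected location at it
  install -d %{buildroot}%{_datadir}/spark/conf
  ln -s %{_sysconfdir}/spark/spark-defaults.conf \
      %{buildroot}%{_datadir}/spark/conf/spark-defaults.conf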

+install -D -m 755 %{name}-%{version}/dist/conf/spark-env.sh.template %{buildroot}/%{_sysconfdir}/spark/spark-env

This can be removed. spark-env.sh.template is in a different format: its variables are set with export var=val syntax, whereas a systemd EnvironmentFile such as spark-env expects plain var=val syntax.
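To illustrate the difference (the variable name and values are only examples):

  # spark-env.sh.template is a shell script sourced by Spark's own scripts:
  export SPARK_DAEMON_JAVA_OPTS="-Xms512m -Xmx1g"

  # a systemd EnvironmentFile such as /etc/spark/spark-env needs plain key=value, no "export":
  SPARK_DAEMON_JAVA_OPTS="-Xms512m -Xmx1g"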

ExecStart=/usr/bin/java \
    -cp "/usr/share/spark/lib/*" \
    org.apache.spark.deploy.master.Master

We also want to allow users to set the Java heap size (the -Xmx and -Xms options). I don't think those get passed automatically when you invoke java; they were being set by the $SPARK_DAEMON_JAVA_OPTS variable. I have a feeling the same is true for $SPARK_MASTERS in the spark worker, since that variable contains multiple masters (master1:port,master2:port). Not sure about --ip and --port, whether they get picked up from the environment automagically; will have to test it.

But in general I like calling out options explicitly when invoking java.
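A sketch of what that could look like for the master unit; the EnvironmentFile path and the SPARK_MASTER_IP/SPARK_MASTER_PORT names are assumptions, and systemd word-splits an unquoted $VAR in ExecStart, so an unset SPARK_DAEMON_JAVA_OPTS simply drops out:

  [Service]
  EnvironmentFile=-/etc/spark/spark-env
  ExecStart=/usr/bin/java \
      $SPARK_DAEMON_JAVA_OPTS \
      -cp "/usr/share/spark/lib/*" \
      org.apache.spark.deploy.master.Master \
      --ip $SPARK_MASTER_IP --port $SPARK_MASTER_PORT

The --ip/--port pair only makes sense if those variables are actually set in spark-env; otherwise they would have to be dropped or given defaults.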

Request History

tbechtold created request




dirkmueller accepted request
