Wednesday, October 12, 2016

How to include multiple JAR files for Spark-shell and spark-sql


# Below are the additional JAR files required to connect to Apache Cassandra

[donghua@localhost ~]$ ls  -l /home/donghua/spark-casssandra-connector/*
-rw-rw-r--. 1 donghua donghua  231320 Oct 12 15:39 /home/donghua/spark-casssandra-connector/commons-beanutils-1.8.0.jar
-rw-rw-r--. 1 donghua donghua   38460 Oct 12 15:39 /home/donghua/spark-casssandra-connector/joda-convert-1.2.jar
-rw-rw-r--. 1 donghua donghua  581571 Oct 12 15:39 /home/donghua/spark-casssandra-connector/joda-time-2.3.jar
-rw-rw-r--. 1 donghua donghua   62226 Oct 12 15:40 /home/donghua/spark-casssandra-connector/jsr166e-1.1.0.jar
-rw-rw-r--. 1 donghua donghua 2112017 Oct 12 15:39 /home/donghua/spark-casssandra-connector/netty-all-4.0.33.Final.jar
-rw-rw-r--. 1 donghua donghua 4573750 Oct 12 15:41 /home/donghua/spark-casssandra-connector/scala-reflect-2.11.8.jar
-rw-rw-r--. 1 donghua donghua 6036067 Oct 12 15:39 /home/donghua/spark-casssandra-connector/spark-cassandra-connector-2.0.0-M2-s_2.11.jar
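
For comparison, these JARs could also be shipped per invocation with the --jars option, which takes a comma-separated list. A minimal sketch using the paths listed above (this is the approach the rest of this post avoids):

[donghua@localhost ~]$ $SPARK_HOME/bin/spark-shell \
    --conf spark.cassandra.connection.host=127.0.0.1 \
    --jars /home/donghua/spark-casssandra-connector/commons-beanutils-1.8.0.jar,/home/donghua/spark-casssandra-connector/joda-convert-1.2.jar,/home/donghua/spark-casssandra-connector/joda-time-2.3.jar,/home/donghua/spark-casssandra-connector/jsr166e-1.1.0.jar,/home/donghua/spark-casssandra-connector/netty-all-4.0.33.Final.jar,/home/donghua/spark-casssandra-connector/scala-reflect-2.11.8.jar,/home/donghua/spark-casssandra-connector/spark-cassandra-connector-2.0.0-M2-s_2.11.jar

Listing every JAR on the command line quickly becomes unwieldy, which is why the class path is moved into spark-defaults.conf below.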

[donghua@localhost ~]$ ls spark-2.0.1-bin-hadoop2.7/conf/
docker.properties.template    log4j.properties.template     slaves.template               spark-defaults.conf.template
fairscheduler.xml.template    metrics.properties.template   spark-defaults.conf           spark-env.sh.template
[donghua@localhost ~]$ cat spark-2.0.1-bin-hadoop2.7/conf/spark-defaults.conf
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

# Default system properties included when running spark-submit.
# This is useful for setting default environmental settings.

# Example:
# spark.master                     spark://master:7077
# spark.eventLog.enabled           true
# spark.eventLog.dir               hdfs://namenode:8021/directory
# spark.serializer                 org.apache.spark.serializer.KryoSerializer
# spark.driver.memory              5g
# spark.executor.extraJavaOptions  -XX:+PrintGCDetails -Dkey=value -Dnumbers="one two three"
#
spark.driver.extraClassPath  /home/donghua/spark-casssandra-connector/*
spark.executor.extraClassPath /home/donghua/spark-casssandra-connector/*
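
The same two settings can also be supplied ad hoc on the command line instead of editing spark-defaults.conf; a rough equivalent (same wildcard path as above, quoted to stop the shell expanding it) would be:

[donghua@localhost spark-2.0.1-bin-hadoop2.7]$ $SPARK_HOME/bin/spark-shell \
    --conf spark.cassandra.connection.host=127.0.0.1 \
    --driver-class-path '/home/donghua/spark-casssandra-connector/*' \
    --conf spark.executor.extraClassPath='/home/donghua/spark-casssandra-connector/*'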

# With the extra class paths set in spark-defaults.conf, spark-shell can be started without --jars to include the additional JAR files
[donghua@localhost spark-2.0.1-bin-hadoop2.7]$ $SPARK_HOME/bin/spark-shell --conf spark.cassandra.connection.host=127.0.0.1 
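
Once the shell is up, a quick sanity check is to confirm the class-path setting was picked up and then read a table through the connector's data source; the keyspace and table names below (test / kv) are placeholders for illustration:

scala> sc.getConf.get("spark.driver.extraClassPath")   // should echo the wildcard path from spark-defaults.conf
scala> val df = spark.read.format("org.apache.spark.sql.cassandra").options(Map("keyspace" -> "test", "table" -> "kv")).load()
scala> df.show()

The same spark-defaults.conf entries apply to spark-sql, so it can be started the same way, again without --jars:

[donghua@localhost spark-2.0.1-bin-hadoop2.7]$ $SPARK_HOME/bin/spark-sql --conf spark.cassandra.connection.host=127.0.0.1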
