mysql connector java issue


mysql connector java issue

ismail elhammoud
Hello, 

Guys, I have an issue with mysql-connector-java: even though I declared it in my sbt build file, it doesn't work unless I pass the full path to the jar:

spark-submit --master yarn --driver-class-path /home/node2/Téléchargements/mysql-connector-java-5.1.24-bin.jar ./Sdatahub-assembly-0.1.jar


Regards, 
Isma

Re: mysql connector java issue

Artemis User
In reply to this post by ismail elhammoud

What happened is that you made the mysql jar available only to the Spark driver, not to the executors.  Use the --jars option instead of --driver-class-path to distribute your third-party jars, or copy the third-party jars to Spark's jars directory in your HDFS and point spark-submit at that HDFS path with --archives.
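For example, a minimal sketch of the --jars variant (the jar path, master, and application jar are the ones from the original post; adjust to your setup):

```shell
# --jars ships the listed jars to the driver AND every executor,
# unlike --driver-class-path, which only affects the driver's classpath
spark-submit \
  --master yarn \
  --jars /home/node2/Téléchargements/mysql-connector-java-5.1.24-bin.jar \
  ./Sdatahub-assembly-0.1.jar
```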

-- ND


Re: mysql connector java issue

lec ssmi
If you cannot bundle the JDBC driver into your application's assembly jar, you can put the driver jar directly on the Spark classpath, generally $SPARK_HOME/jars (or $SPARK_HOME/lib in older versions).
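A minimal sketch of that approach (assuming the jar path from the original post, and that every node runs Spark from the same $SPARK_HOME):

```shell
# Jars placed in $SPARK_HOME/jars are on the classpath of both the
# driver and the executors; this must be done on every worker node
cp /home/node2/Téléchargements/mysql-connector-java-5.1.24-bin.jar "$SPARK_HOME/jars/"
```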



Re: mysql connector java issue

Artemis User

Well, this just won't work when you are running Spark on Hadoop/YARN: the executors run in containers on the cluster nodes, so a jar dropped into $SPARK_HOME/jars on the submitting machine won't be visible to them...
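On YARN, one way to avoid per-node copies is to stage the driver jar in HDFS once and reference it by URL with --jars, so YARN localizes it into each container. A sketch (the HDFS directory /libs is hypothetical; the jar name is from the original post):

```shell
# Stage the JDBC driver in HDFS once...
hdfs dfs -put /home/node2/Téléchargements/mysql-connector-java-5.1.24-bin.jar /libs/
# ...then reference it by hdfs:// URL; driver and executors both fetch it
spark-submit \
  --master yarn \
  --jars hdfs:///libs/mysql-connector-java-5.1.24-bin.jar \
  ./Sdatahub-assembly-0.1.jar
```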
