extraClassPath=/opt/prog/hadoop-aws-2.7.1.jar:/opt/prog/aws-java-sdk-1.10.50.jar
The spark-submit job can't find the relevant files on the classpath.
I tried many options, such as --jars and --driver-class-path, but none of them worked.
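A sketch of the kind of invocation being attempted, using the jar paths from the snippet above (the master, deploy details, and application file are assumptions):

```shell
# Pass the AWS jars explicitly: --jars ships them to the executors,
# while extraClassPath prepends them to the driver/executor classpaths.
spark-submit \
  --master yarn \
  --jars /opt/prog/hadoop-aws-2.7.1.jar,/opt/prog/aws-java-sdk-1.10.50.jar \
  --conf spark.driver.extraClassPath=/opt/prog/hadoop-aws-2.7.1.jar:/opt/prog/aws-java-sdk-1.10.50.jar \
  --conf spark.executor.extraClassPath=/opt/prog/hadoop-aws-2.7.1.jar:/opt/prog/aws-java-sdk-1.10.50.jar \
  my_app.py
```

Note that --jars takes a comma-separated list, while the extraClassPath properties use the platform path separator (colon on Linux).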
If the configuration references Java system properties or environment variables not managed by YARN, they should also be set in the Spark application's configuration (driver, executors, and the AM when running in client mode).
Solved: We have Spark installed via Cloudera Manager on a YARN cluster. It appears there is a classpath.txt file in /etc/spark/conf that contains the list of JARs to include on the classpath.
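One workaround, assuming the classpath.txt location above (the exact mechanism may vary with the Cloudera Manager version, and the file would need updating on every node), is to append the missing jars to that file:

```shell
# Append the AWS jars to Cloudera's generated classpath file.
# Jar paths are the ones from the snippet above; needing root and
# repeating this on each node are assumptions.
echo "/opt/prog/hadoop-aws-2.7.1.jar"    | sudo tee -a /etc/spark/conf/classpath.txt
echo "/opt/prog/aws-java-sdk-1.10.50.jar" | sudo tee -a /etc/spark/conf/classpath.txt
```

Keep in mind that Cloudera Manager may regenerate this file on redeploy, so manual edits can be overwritten.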
This directory should allow any Spark user to read/write files and the Spark History Server user to delete files.
Alternatively, is there a way to disable construction of classpath.txt and rely on the standard spark-submit classpath options instead?
callerContext (spark.log.callerContext), default: (none). Application information that will be written into the YARN RM log and HDFS audit log when running on YARN/HDFS.
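For illustration, this property can be set at submit time like any other Spark configuration; the job name and application jar here are made up:

```shell
# Tag the application in the YARN RM log / HDFS audit log
# via the caller context (value "my_etl_job" is hypothetical).
spark-submit \
  --master yarn \
  --conf spark.log.callerContext=my_etl_job \
  app.jar
```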
This article is about modifying the classpath for YARN applications. In this article, the new directory /opt/lzopath/ is added to the Java classpath.
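A minimal sketch of adding /opt/lzopath/ to the classpath using the extraClassPath properties (the article may instead modify yarn.application.classpath in yarn-site.xml; the application jar is an assumption):

```shell
# Prepend /opt/lzopath/ to both the driver and executor classpaths.
spark-submit \
  --master yarn \
  --conf spark.driver.extraClassPath=/opt/lzopath/ \
  --conf spark.executor.extraClassPath=/opt/lzopath/ \
  app.jar
```

If the directory contains jars rather than loose class files, the entry would need to be /opt/lzopath/* instead.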
By default, Spark on YARN uses Spark JAR files that are installed locally.
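To avoid uploading the local installation's JARs on every submission, they can be staged on HDFS once and referenced via spark.yarn.jars; the HDFS path here is an assumption:

```shell
# One-time upload of the local Spark jars to HDFS (path is hypothetical).
hdfs dfs -mkdir -p /user/spark/jars
hdfs dfs -put "$SPARK_HOME"/jars/* /user/spark/jars/

# Point Spark at the HDFS copies instead of the local installation.
spark-submit \
  --master yarn \
  --conf spark.yarn.jars=hdfs:///user/spark/jars/* \
  app.jar
```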