spark 3.1.1 compile error when building with hadoop (version 2.6.0-cdh5.13.1)


jiahong li
Hi everyone,
When I compile Spark 3.1.1 against Hadoop version 2.6.0-cdh5.13.1, using the compile command
./dev/make-distribution.sh --name 2.6.0-cdh5.13.1 --pip --tgz -Phive -Phive-thriftserver -Pyarn -Dhadoop.version=2.6.0-cdh5.13.1
the build fails with errors like this:
[INFO] --- scala-maven-plugin:4.3.0:compile (scala-compile-first) @ spark-core_2.12 ---
[INFO] Using incremental compilation using Mixed compile order
[INFO] Compiler bridge file: .sbt/1.0/zinc/org.scala-sbt/org.scala-sbt-compiler-bridge_2.12-1.3.1-bin_2.12.10__52.0-1.3.1_20191012T045515.jar
[INFO] compiler plugin: BasicArtifact(com.github.ghik,silencer-plugin_2.12.10,1.6.0,null)
[INFO] Compiling 560 Scala sources and 99 Java sources to spark/core/target/scala-2.12/classes ...
[ERROR] [Error] spark/core/src/main/scala/org/apache/spark/ui/HttpSecurityFilter.scala:107: type mismatch;
 found   : K where type K
 required: String
[ERROR] [Error] spark/core/src/main/scala/org/apache/spark/ui/HttpSecurityFilter.scala:107: value map is not a member of V
[ERROR] [Error] spark/core/src/main/scala/org/apache/spark/ui/HttpSecurityFilter.scala:107: missing argument list for method stripXSS in class XssSafeRequest
Unapplied methods are only converted to functions when a function type is expected.
You can make this conversion explicit by writing `stripXSS _` or `stripXSS(_)` instead of `stripXSS`.
[ERROR] [Error] spark/core/src/main/scala/org/apache/spark/ui/PagedTable.scala:307: value startsWith is not a member of K
[ERROR] [Error] spark/core/src/main/scala/org/apache/spark/util/Utils.scala:580: value toLowerCase is not a member of object org.apache.hadoop.util.StringUtils
[ERROR] 5 errors found

How can I compile Spark against Hadoop version 2.6.0-cdh5.13.1? Is there an existing JIRA for this?
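A likely explanation for the errors above, offered as an assumption rather than a confirmed diagnosis: the Spark 3.1.x documentation lists Hadoop 2.7 as the minimum supported version, and 2.6.0-cdh5.13.1 sits below that, so APIs Spark now relies on (for example org.apache.hadoop.util.StringUtils.toLowerCase, flagged in Utils.scala above) are simply absent from the older client jars. A minimal shell sketch of that version check, using the version strings from the build command above and the assumed 2.7 minimum:

```shell
# Sketch: compare the requested Hadoop version against the minimum that the
# Spark 3.1.x docs state support for (2.7). The CDH suffix is stripped so
# sort -V compares plain version numbers.
min="2.7.0"
requested="2.6.0-cdh5.13.1"
base="${requested%%-*}"   # strip the -cdh5.13.1 suffix -> 2.6.0
lowest=$(printf '%s\n%s\n' "$min" "$base" | sort -V | head -n1)
if [ "$lowest" != "$min" ]; then
  echo "Hadoop $requested is below Spark's minimum $min; expect missing-API compile errors"
fi
```

If a CDH-flavoured build is not strictly required, building with a supported profile (e.g. -Phadoop-2.7 in place of -Dhadoop.version=2.6.0-cdh5.13.1) should avoid these errors; whether the resulting distribution runs against a CDH 5 cluster would need separate verification.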

Dereck Li
Apache Spark Contributor   
Continuing Learner  
@Hangzhou,China