After upgrading the Hadoop and HBase versions in the project, the original plain-Maven build was migrated to Spring Boot, to make subsequent connection and integration work on top of Spring Boot easier.
After going live, the job failed with: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class com.bigdata.demo.WordCount$Map not found
The fixes suggested online boil down to:
job.setJarByClass(WordCount.class)
job.setJar("wordcount.jar")
changing the run directory, execution permissions, and so on
None of them worked. In the end, https://www.cnblogs.com/JuncaiF/p/12309672.html pointed me in the right direction: the packaging itself may be the problem, leaving the class unreachable at runtime.
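For reference, the job class named in the error has roughly the following shape (a sketch reconstructed from the class names in the error message, not the actual project code), with the two suggested calls marked:

package com.bigdata.demo;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // The inner class the error message refers to: WordCount$Map
    public static class Map extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            for (String token : value.toString().split("\\s+")) {
                word.set(token);
                context.write(word, ONE);
            }
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "wordcount");
        // Fix 1 suggested online: tell Hadoop which jar carries the job classes
        job.setJarByClass(WordCount.class);
        // Fix 2 suggested online: point at the jar explicitly instead
        // job.setJar("wordcount.jar");
        job.setMapperClass(Map.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}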
Solution:
Do not package with spring-boot-maven-plugin; switch to maven-shade-plugin instead.
Spring Boot's build section uses:
<build>
    <plugins>
        <plugin>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
        </plugin>
    </plugins>
</build>
The maven-shade-plugin configuration is structured as follows:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.1.0</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <createDependencyReducedPom>false</createDependencyReducedPom>
                <transformers>
                    <transformer
                            implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                        <mainClass>com.iptv.bigdata.hbase.WordCount</mainClass>
                    </transformer>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>
Run jar tf wordcount.jar to inspect the directory layout inside each jar:
spring-boot-maven-plugin: BOOT-INF/classes/com/iptv/bigdata/hbase/WordCount.class
maven-shade-plugin: com/iptv/bigdata/hbase/WordCount$Map.class
The two packaging methods produce different jar layouts and manifests: Spring Boot nests application classes under BOOT-INF/classes and relies on its own launcher, declared in the manifest, to load them, while the shade plugin places classes at the jar root. Hadoop's classloader only looks for job classes at the root of the jar, so with the Spring Boot layout it cannot find WordCount$Map at runtime.
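A quick way to confirm which layout you actually shipped is to ask the JVM where it loaded a class from (a small generic snippet; the class name WhereAmI is illustrative):

public class WhereAmI {
    public static void main(String[] args) {
        // Prints the physical location this class was loaded from, e.g.
        //   jar:file:/.../wordcount.jar!/BOOT-INF/classes/WhereAmI.class  (Spring Boot layout)
        //   jar:file:/.../wordcount.jar!/WhereAmI.class                   (shaded layout)
        System.out.println(WhereAmI.class.getResource("WhereAmI.class"));
    }
}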
For the underlying mechanics, see:
https://stackoverflow.com/questions/51206959/classnotfound-when-submit-a-spring-boot-fat-jar-to-spark
https://stackoverflow.com/questions/45189701/submitting-spring-boot-application-jar-to-spark-submit
2. When packaging with maven-shade-plugin you may run into another exception: Exception in thread "main" java.lang.SecurityException: Invalid signature file digest for Manifest main attributes
This happens because the shaded jar merges in signed dependencies, leaving stray *.SF, *.DSA and *.RSA signature files under META-INF whose digests no longer match the merged contents:
find . -name '*.SF'
./META-INF/BCKEY.SF
find . -name '*.DSA'
./META-INF/BCKEY.DSA
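The same check can also be run against the jar directly, without unpacking it first (a small sketch using the standard java.util.jar API; the jar name is illustrative):

import java.util.jar.JarFile;

public class FindSignatureFiles {
    public static void main(String[] args) throws Exception {
        // List leftover signature entries that trigger the
        // "Invalid signature file digest" SecurityException.
        try (JarFile jar = new JarFile(args.length > 0 ? args[0] : "wordcount.jar")) {
            jar.stream()
               .map(entry -> entry.getName())
               .filter(name -> name.startsWith("META-INF/")
                       && (name.endsWith(".SF") || name.endsWith(".DSA") || name.endsWith(".RSA")))
               .forEach(System.out::println);
        }
    }
}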
There are three ways to fix it:
1. Add the following filter to strip the SF/DSA/RSA signature files from every dependency's META-INF:
<filters>
    <filter>
        <artifact>*:*</artifact>
        <excludes>
            <exclude>META-INF/*.SF</exclude>
            <exclude>META-INF/*.DSA</exclude>
            <exclude>META-INF/*.RSA</exclude>
        </excludes>
    </filter>
</filters>
2. Selectively include only the artifacts you need:
<artifactSet>
    <includes>
        <include>org.apache.hadoop:hadoop-hdfs</include>
        <include>org.apache.hadoop:hadoop-client</include>
        <include>org.apache.hbase:hbase-client</include>
        <include>org.apache.hbase:hbase-mapreduce</include>
    </includes>
</artifactSet>
3. Unpack the offending jar, delete the *.SF, *.DSA and *.RSA files under META-INF, and repack it. See: https://blog.csdn.net/u010225915/article/details/79454267
You can even do it in a single command: zip -d xx.jar 'META-INF/*.SF' 'META-INF/*.RSA' 'META-INF/*.DSA'
See: https://www.jianshu.com/p/cd1f1b33a41a
Comparing options 1 and 2:
- Option 1 produces a much larger jar; for this project it was roughly 10x the size of option 2's.
- Option 2 produces a smaller jar, but leads to the exception described in section 3 below: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
- Option 3 is a fringe workaround and not recommended.
3. After repackaging and rerunning, a new error appeared:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
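The error surfaces on the first line of the driver that touches an HBase class, before the job is even submitted, which is why it appears in the "main" thread. HBaseConfiguration ships in the hbase-common artifact, which is likely filtered out by the includes list in option 2 of section 2. A minimal illustration (hypothetical driver code):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;

public class Driver {
    public static void main(String[] args) {
        // Compiles fine (hbase-common is on the compile classpath), but throws
        // NoClassDefFoundError at runtime if the class is missing from both the
        // shaded jar and the client-side classpath.
        Configuration conf = HBaseConfiguration.create();
        System.out.println("hbase.zookeeper.quorum = " + conf.get("hbase.zookeeper.quorum"));
    }
}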
Again there are two ways to fix it:
1. Recommended: set the environment variable so the hadoop launcher picks up the HBase libraries:
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/hdp/3.1.5.0-152/hbase/lib/*
See:
https://community.cloudera.com/t5/Support-Questions/hbase-java-api-code-runtime-error/td-p/210222
https://blog.csdn.net/lucky_huijun/article/details/105337490
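Before resubmitting the real job, you can verify the export took effect by running a tiny probe through the same hadoop launcher, since HADOOP_CLASSPATH is appended to the classpath of the JVM that the hadoop command starts (a sketch; class and jar names are illustrative):

public class ClasspathProbe {
    public static void main(String[] args) {
        // Run with: hadoop jar probe.jar ClasspathProbe
        try {
            Class.forName("org.apache.hadoop.hbase.HBaseConfiguration");
            System.out.println("HBaseConfiguration is visible -- classpath fix worked");
        } catch (ClassNotFoundException e) {
            System.out.println("Still missing -- check HADOOP_CLASSPATH and the lib directory");
        }
    }
}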
2. Not recommended: pull the missing jars in through Maven. This leads to a tedious trial-and-error loop, and your shade configuration may end up looking like this:
<artifactSet>
    <includes>
        <include>org.apache.hbase:hbase-client</include>
        <include>org.apache.hbase:hbase-hadoop-compat</include>
        <include>org.apache.hbase:hbase-common</include>
        <include>org.apache.hbase:hbase</include>
        <include>org.apache.hbase:hbase-protocol</include>
        <include>org.apache.hbase:hbase-prefix-tree</include>
        <include>org.apache.hbase:hbase-server</include>
        <include>org.apache.hadoop:hadoop-mapreduce-client-core</include>
        <include>org.apache.hadoop:hadoop-common</include>
        <include>org.apache.hadoop:hadoop-hdfs</include>
        <include>org.apache.hadoop:hadoop-client</include>
        <include>org.apache.commons:commons-lang3</include>
        <include>com.yammer.metrics:metrics-core</include>
    </includes>
</artifactSet>