# Building Hive 3.1.3 from Source

## Build notes

Using the pre-built packages that Hive officially provides is the most common and recommended way to use Hive, and it suits most users: the pre-built packages are tested and verified, and run correctly in many different environments. In certain situations, however, you may need to compile Hive from source rather than use a pre-built package. Scenarios and reasons for compiling the Hive source include:
1. Custom configuration. If you want to apply specific configuration changes to Hive, such as changing default parameter values, adding a new storage backend, or integrating a new execution engine, building from source lets you modify and customize Hive accordingly.
2. Feature extensions. If you need to extend Hive, for example with custom UDFs (user-defined functions), UDAFs (user-defined aggregate functions), or UDTFs (user-defined table-generating functions), building from source lets you add and build those features.
3. Debugging and bug fixing. If you hit a problem or find a bug while using Hive and want to debug and fix it, building from source gives you the running code so that you can debug and patch it.
4. Latest features and improvements. If you want Hive features, improvements, or optimizations that have not yet shipped in an official pre-built release, you can build the latest version from source to get them.
5. Contributing to the community. If you are interested in Hive and want to contribute to its development, building from source gives you the complete development environment, including build tools, test framework, and source code, for developing and contributing together with the Hive community.

## Building Hive 3.1.3

When using Spark as Hive's execution engine, note that Hive 3.1.3 itself only supports Spark 2.3, so Hive has to be recompiled to support a newer Spark. The plan: compile Hive to support Spark 3.4.0, with Hadoop version 3.1.3.

## Changing the Maven configuration
Edit Maven's settings.xml; decide for your situation whether to add mirror entries like the following (for reference only):

```xml
<!-- 阿里云仓库 -->
<mirror>
  <id>aliyun-central</id>
  <name>阿里云公共仓库</name>
  <url>https://maven.aliyun.com/repository/central</url>
  <mirrorOf>*</mirrorOf>
</mirror>
<!-- 中央仓库 -->
<mirror>
  <id>repo</id>
  <mirrorOf>central</mirrorOf>
  <name>Human Readable Name for this Mirror.</name>
  <url>https://repo.maven.apache.org/maven2</url>
</mirror>
```
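If you keep several settings files around, you can check which mirrors will actually take effect before building. A quick sanity check using the stock maven-help-plugin (the explicit settings path is just an example):

```bash
# Print the merged settings (mirrors, proxies, profiles) Maven will really use
mvn help:effective-settings

# Or point the build at an explicit settings file
mvn -s /path/to/settings.xml help:effective-settings
```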
## Downloading the source

Download the source for the Hive version to be compiled; here that is Hive 3.1.3:
```bash
wget https://archive.apache.org/dist/hive/hive-3.1.3/apache-hive-3.1.3-src.tar.gz
```

Open the apache-hive-3.1.3-src project in IDEA. Right after opening, the project will light up with all kinds of red errors; ignore them.

## Modifying the project pom.xml
1. Change the Hadoop version. Hive 3.1.3 is built against Hadoop 3.1.0, but as I recall each Hive release supports a range of Hadoop versions, so change the Hadoop-related entries only if your setup requires it:

```xml
<!-- before -->
<hadoop.version>3.1.0</hadoop.version>
<!-- after -->
<hadoop.version>3.1.3</hadoop.version>
```

I clearly remember that Hadoop 3.1.3 ships SLF4J 1.7.25, so align the logging version too:
```xml
<!-- before -->
<slf4j.version>1.7.10</slf4j.version>
<!-- after -->
<slf4j.version>1.7.25</slf4j.version>
```

2. Change the guava version. Hive loads Hadoop's dependencies at runtime, so Hive's guava version should be changed to match Hadoop's. Even if you skip this step here, you may well end up swapping the guava jar when deploying Hive anyway (if the versions are close, you can leave it alone):

```xml
<!-- before -->
<guava.version>19.0</guava.version>
<!-- after -->
<guava.version>27.0-jre</guava.version>
```

3. Change the Spark version. Hive 3.1.3 supports Spark 2.3.0 by default; this step is the core of the exercise, making it support Spark 3.4.0. That is a fairly new version, so lower it if your needs call for it. Note that Spark 3.4.0 is built against Scala 2.12 by default, so change the Scala properties together with it:

```xml
<!-- original values -->
<spark.version>2.3.0</spark.version>
<scala.binary.version>2.11</scala.binary.version>
<scala.version>2.11.8</scala.version>

<!-- First attempt: Spark 3.4.0. Damn, far too many pitfalls; had to abandon it later -->
<spark.version>3.4.0</spark.version>
<scala.binary.version>2.12</scala.binary.version>
<scala.version>2.12.17</scala.version>

<!-- Fell into too many holes; lowered the Spark version -->
<spark.version>3.2.4</spark.version>
<scala.binary.version>2.12</scala.binary.version>
<scala.version>2.12.17</scala.version>
```
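Before kicking off a long build, it can be worth confirming that the edited properties resolve to the values you expect. A small sketch using the standard maven-help-plugin (`-DforceStdout` requires maven-help-plugin 3.1.0 or newer; on older versions drop `-q` and read the value from the log output):

```bash
# Print the effective value of each edited property from the root pom
mvn help:evaluate -Dexpression=spark.version -q -DforceStdout
mvn help:evaluate -Dexpression=scala.binary.version -q -DforceStdout
mvn help:evaluate -Dexpression=hadoop.version -q -DforceStdout
mvn help:evaluate -Dexpression=guava.version -q -DforceStdout
```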
## Modifying the Hive source

### Overview of the changes

Modifying the Hive source involves deleting, changing, and adding code. The changes were originally illustrated with Git diff screenshots, which most readers can follow, but to spell it out: a `-` line is code being removed, and a `+` line is code being added or changed. Modifying the source is the core operation here; for exactly which code to change, refer to https://github.com/gitlbo/hive/commits/3.1.2

### The standalone-metastore module
For the concrete changes, see https://github.com/gitlbo/hive/commit/c073e71ef43699b7aa68cad7c69a2e8f487089fd
Create the ColumnsStatsUtils class, with the following code:
```java
/*
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements.  See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership.  The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License.  You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.apache.hadoop.hive.metastore.columnstats;

import org.apache.hadoop.hive.metastore.api.ColumnStatisticsObj;
import org.apache.hadoop.hive.metastore.columnstats.cache.DateColumnStatsDataInspector;
import org.apache.hadoop.hive.metastore.columnstats.cache.DecimalColumnStatsDataInspector;
import org.apache.hadoop.hive.metastore.columnstats.cache.DoubleColumnStatsDataInspector;
import org.apache.hadoop.hive.metastore.columnstats.cache.LongColumnStatsDataInspector;
import org.apache.hadoop.hive.metastore.columnstats.cache.StringColumnStatsDataInspector;

/**
 * Utils class for columnstats package.
 */
public final class ColumnsStatsUtils {

  private ColumnsStatsUtils() {}

  /**
   * Converts to DateColumnStatsDataInspector if it's a DateColumnStatsData.
   * @param cso ColumnStatisticsObj
   * @return DateColumnStatsDataInspector
   */
  public static DateColumnStatsDataInspector dateInspectorFromStats(ColumnStatisticsObj cso) {
    DateColumnStatsDataInspector dateColumnStats;
    if (cso.getStatsData().getDateStats() instanceof DateColumnStatsDataInspector) {
      dateColumnStats = (DateColumnStatsDataInspector) (cso.getStatsData().getDateStats());
    } else {
      dateColumnStats = new DateColumnStatsDataInspector(cso.getStatsData().getDateStats());
    }
    return dateColumnStats;
  }

  /**
   * Converts to StringColumnStatsDataInspector if it's a StringColumnStatsData.
   * @param cso ColumnStatisticsObj
   * @return StringColumnStatsDataInspector
   */
  public static StringColumnStatsDataInspector stringInspectorFromStats(ColumnStatisticsObj cso) {
    StringColumnStatsDataInspector columnStats;
    if (cso.getStatsData().getStringStats() instanceof StringColumnStatsDataInspector) {
      columnStats = (StringColumnStatsDataInspector) (cso.getStatsData().getStringStats());
    } else {
      columnStats = new StringColumnStatsDataInspector(cso.getStatsData().getStringStats());
    }
    return columnStats;
  }

  /**
   * Converts to LongColumnStatsDataInspector if it's a LongColumnStatsData.
   * @param cso ColumnStatisticsObj
   * @return LongColumnStatsDataInspector
   */
  public static LongColumnStatsDataInspector longInspectorFromStats(ColumnStatisticsObj cso) {
    LongColumnStatsDataInspector columnStats;
    if (cso.getStatsData().getLongStats() instanceof LongColumnStatsDataInspector) {
      columnStats = (LongColumnStatsDataInspector) (cso.getStatsData().getLongStats());
    } else {
      columnStats = new LongColumnStatsDataInspector(cso.getStatsData().getLongStats());
    }
    return columnStats;
  }

  /**
   * Converts to DoubleColumnStatsDataInspector if it's a DoubleColumnStatsData.
   * @param cso ColumnStatisticsObj
   * @return DoubleColumnStatsDataInspector
   */
  public static DoubleColumnStatsDataInspector doubleInspectorFromStats(ColumnStatisticsObj cso) {
    DoubleColumnStatsDataInspector columnStats;
    if (cso.getStatsData().getDoubleStats() instanceof DoubleColumnStatsDataInspector) {
      columnStats = (DoubleColumnStatsDataInspector) (cso.getStatsData().getDoubleStats());
    } else {
      columnStats = new DoubleColumnStatsDataInspector(cso.getStatsData().getDoubleStats());
    }
    return columnStats;
  }

  /**
   * Converts to DecimalColumnStatsDataInspector if it's a DecimalColumnStatsData.
   * @param cso ColumnStatisticsObj
   * @return DecimalColumnStatsDataInspector
   */
  public static DecimalColumnStatsDataInspector decimalInspectorFromStats(ColumnStatisticsObj cso) {
    DecimalColumnStatsDataInspector columnStats;
    if (cso.getStatsData().getDecimalStats() instanceof DecimalColumnStatsDataInspector) {
      columnStats = (DecimalColumnStatsDataInspector) (cso.getStatsData().getDecimalStats());
    } else {
      columnStats = new DecimalColumnStatsDataInspector(cso.getStatsData().getDecimalStats());
    }
    return columnStats;
  }
}
```

Next, modify the following files (for the exact diffs, see the commit referenced above):

- standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/columnstats/aggr/DateColumnStatsAggregator.java
- standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/columnstats/aggr/DecimalColumnStatsAggregator.java
- standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/columnstats/aggr/DoubleColumnStatsAggregator.java
- standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/columnstats/aggr/LongColumnStatsAggregator.java
- standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/columnstats/aggr/StringColumnStatsAggregator.java
- standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/columnstats/cache/DateColumnStatsDataInspector.java
- standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/columnstats/cache/DecimalColumnStatsDataInspector.java
- standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/columnstats/cache/DoubleColumnStatsDataInspector.java
- standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/columnstats/cache/LongColumnStatsDataInspector.java
- standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/columnstats/cache/StringColumnStatsDataInspector.java
- standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/columnstats/merge/DateColumnStatsMerger.java
- standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/columnstats/merge/DecimalColumnStatsMerger.java
- standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/columnstats/merge/DoubleColumnStatsMerger.java
- standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/columnstats/merge/LongColumnStatsMerger.java
- standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/columnstats/merge/StringColumnStatsMerger.java
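These files all follow one pattern: direct casts of the thrift column-stats objects to their *Inspector cache types are replaced with calls to the new ColumnsStatsUtils helpers, which wrap the plain object when it is not already an inspector. A representative before/after sketch for the Date case (the exact lines vary per file; the commit linked above has each diff):

```java
// Before: a direct cast, which throws ClassCastException when the field
// holds a plain DateColumnStatsData rather than the inspector subtype:
//   DateColumnStatsDataInspector dateStats =
//       (DateColumnStatsDataInspector) cso.getStatsData().getDateStats();

// After: route through the helper, which wraps the plain type if needed
DateColumnStatsDataInspector dateStats = ColumnsStatsUtils.dateInspectorFromStats(cso);
```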
### The ql module

- ql/src/test/org/apache/hadoop/hive/ql/stats/TestStatsUtils.java
- ql/src/test/org/apache/hadoop/hive/ql/exec/tez/SampleTezSessionState.java
- ql/src/java/org/apache/hadoop/hive/ql/exec/tez/WorkloadManager.java

### The spark-client module

- spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleWriteMetrics.java
- spark-client/src/main/java/org/apache/hive/spark/counter/SparkCounter.java

### The druid-handler module

- druid-handler/src/java/org/apache/hadoop/hive/druid/serde/DruidScanQueryRecordReader.java

### The llap-server module

- llap-server/src/java/org/apache/hadoop/hive/llap/daemon/impl/AMReporter.java
- llap-server/src/java/org/apache/hadoop/hive/llap/daemon/impl/LlapTaskReporter.java
- llap-server/src/java/org/apache/hadoop/hive/llap/daemon/impl/TaskExecutorService.java

### The llap-tez module

- llap-tez/src/java/org/apache/hadoop/hive/llap/tezplugins/LlapTaskSchedulerService.java

### The llap-common module

- llap-common/src/java/org/apache/hadoop/hive/llap/AsyncPbRpcProxy.java

## Building and packaging
Once the source modifications are done, run the build-and-package command:
```bash
mvn clean package -Pdist -DskipTests -Dmaven.javadoc.skip=true
# or
mvn clean package -Pdist -DskipTests
```

Running the build will certainly surface all kinds of problems, and they have to be solved; compare whatever exceptions you hit against the exception collection below.
Points to note:

1. Stale artifacts cached in the local repository can cause dependency-resolution errors. You can purge the project's dependencies from the local Maven repository; this command removes the artifacts referenced by the pom.xml and re-downloads them:

```bash
mvn dependency:purge-local-repository
```

2. After changing version numbers in a pom.xml, editing code, or installing a jar into the local repository, it is advisable to close and reopen IDEA, to guard against stale caches or delayed refreshes.

## Exception collection
Note: all of the exceptions below occurred while building Hive for Spark 3.4.0; the Spark version was lowered later.
### Exception 1
Maven reports that it cannot find or cannot download some jar, or the download takes forever (even with a proxy enabled). For example, the repositories cannot supply hive-upgrade-acid-3.1.3.jar or pentaho-aggdesigner-algorithm-5.1.5-jhyde.jar. The exact error, for reference:
```
[ERROR] Failed to execute goal on project hive-upgrade-acid: Could not resolve dependencies for project org.apache.hive:hive-upgrade-acid:jar:3.1.3: Failure to find org.pentaho:pentaho-aggdesigner-algorithm:jar:5.1.5-jhyde in https://maven.aliyun.com/repository/central was cached in the local repository, resolution will not be reattempted until the update interval of aliyun-central has elapsed or updates are forced -> [Help 1]
```

Solution: search the following repositories for the jars you need, download them manually, and install them into the local repository:

- Repository 1: https://mvnrepository.com/
- Repository 2: https://central.sonatype.com/
- Repository 3: https://developer.aliyun.com/mvn/search
The syntax of the command for installing a jar into the local repository:
```bash
mvn install:install-file -Dfile=<path-to-jar> -DgroupId=<group-id> -DartifactId=<artifact-id> -Dversion=<version> -Dpackaging=<packaging>
```

- path-to-jar: path to the jar file; an absolute path on the local filesystem works.
- group-id: the project's group ID, usually in reverse-domain form, e.g. com.example.
- artifact-id: the project's unique identifier, usually the project name.
- version: the project's version number.
- packaging: the packaging type of the jar, e.g. jar.

```bash
mvn install:install-file -Dfile=./hive-upgrade-acid-3.1.3.jar -DgroupId=org.apache.hive -DartifactId=hive-upgrade-acid -Dversion=3.1.3 -Dpackaging=jar
mvn install:install-file -Dfile=./pentaho-aggdesigner-algorithm-5.1.5-jhyde.jar -DgroupId=org.pentaho -DartifactId=pentaho-aggdesigner-algorithm -Dversion=5.1.5-jhyde -Dpackaging=jar
mvn install:install-file -Dfile=./hive-metastore-2.3.3.jar -DgroupId=org.apache.hive -DartifactId=hive-metastore -Dversion=2.3.3 -Dpackaging=jar
mvn install:install-file -Dfile=./hive-exec-3.1.3.jar -DgroupId=org.apache.hive -DartifactId=hive-exec -Dversion=3.1.3 -Dpackaging=jar
```
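To confirm that a hand-installed artifact now resolves from the local repository, one option is the dependency plugin's `get` goal (the coordinates here match the pentaho jar installed above):

```bash
# Resolves from ~/.m2 first; completes quickly if the artifact is present locally
mvn dependency:get -DgroupId=org.pentaho -DartifactId=pentaho-aggdesigner-algorithm -Dversion=5.1.5-jhyde
```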
### Exception 2

The error mentions bash, which sinks the heart a bit: the build is running on Windows, where bash is not available.
```
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (generate-version-annotation) on project hive-common: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "bash" (in directory "C:\Users\JackChen\Desktop\apache-hive-3.1.3-src\common"): CreateProcess error=2, 系统找不到指定的文件。
[ERROR] around Ant part ...<exec failonerror="true" executable="bash">... @ 4:46 in C:\Users\JackChen\Desktop\apache-hive-3.1.3-src\common\target\antrun\build-main.xml
```

Solution: as a developer you almost certainly have Git installed, and Git ships a Bash window; run the build command from Git Bash instead:

```bash
mvn clean package -Pdist -DskipTests
```
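One detail when switching shells: Git Bash uses POSIX-style paths, so a Windows checkout path gets a `/c/` prefix. For the checkout from the log above, the session would look roughly like this:

```bash
# In Git Bash, C:\Users\JackChen\... becomes /c/Users/JackChen/...
cd /c/Users/JackChen/Desktop/apache-hive-3.1.3-src
mvn clean package -Pdist -DskipTests
```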
### Exception 3

The build currently fails at the Hive Llap Server module:
```
[INFO] Hive Llap Client ................................... SUCCESS [ 4.030 s]
[INFO] Hive Llap Tez ...................................... SUCCESS [ 4.333 s]
[INFO] Hive Spark Remote Client ........................... SUCCESS [ 5.382 s]
[INFO] Hive Query Language ................................ SUCCESS [01:28 min]
[INFO] Hive Llap Server ................................... FAILURE [ 7.180 s]
[INFO] Hive Service ....................................... SKIPPED
[INFO] Hive Accumulo Handler .............................. SKIPPED
[INFO] Hive JDBC .......................................... SKIPPED
[INFO] Hive Beeline ....................................... SKIPPED
```

The specific error:

```
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.6.1:compile (default-compile) on project hive-llap-server: Compilation failure
[ERROR] /C:/Users/JackChen/Desktop/apache-hive-3.1.3-src/llap-server/src/java/org/apache/hadoop/hive/llap/daemon/impl/QueryTracker.java:[30,32] org.apache.logging.slf4j.Log4jMarker is not public in org.apache.logging.slf4j; cannot be accessed from outside package
[ERROR]
[ERROR] - [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <args> -rf :hive-llap-server
```

Solution: in QueryTracker.java, replace the package-private Log4jMarker with a marker obtained from SLF4J's MarkerFactory (this also needs `import org.slf4j.MarkerFactory;` alongside the existing Marker import):

```java
public class QueryTracker extends AbstractService {
  // private static final Marker QUERY_COMPLETE_MARKER = new Log4jMarker(new Log4jQueryCompleteMarker());
  private static final Marker QUERY_COMPLETE_MARKER = MarkerFactory.getMarker("MY_CUSTOM_MARKER");
  // ...
}
```

### Exception 4
The build fails when it reaches the Hive HCatalog Webhcat module:
```
[INFO] Hive HCatalog ...................................... SUCCESS [ 10.947 s]
[INFO] Hive HCatalog Core ................................. SUCCESS [ 7.237 s]
[INFO] Hive HCatalog Pig Adapter .......................... SUCCESS [ 2.652 s]
[INFO] Hive HCatalog Server Extensions .................... SUCCESS [ 9.255 s]
[INFO] Hive HCatalog Webhcat Java Client .................. SUCCESS [ 2.435 s]
[INFO] Hive HCatalog Webhcat .............................. FAILURE [ 7.284 s]
[INFO] Hive HCatalog Streaming ............................ SKIPPED
[INFO] Hive HPL/SQL ....................................... SKIPPED
[INFO] Hive Streaming ..................................... SKIPPED
```

The specific error:
```
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.6.1:compile (default-compile) on project hive-webhcat: Compilation failure
[ERROR] /root/apache-hive-3.1.3-src/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/Main.java:[258,31] no suitable constructor found for FilterHolder(java.lang.Class<org.apache.hadoop.hdfs.web.AuthFilter>)
[ERROR]     constructor org.eclipse.jetty.servlet.FilterHolder.FilterHolder(org.eclipse.jetty.servlet.BaseHolder.Source) is not applicable
[ERROR]       (argument mismatch; java.lang.Class<org.apache.hadoop.hdfs.web.AuthFilter> cannot be converted to org.eclipse.jetty.servlet.BaseHolder.Source)
[ERROR]     constructor org.eclipse.jetty.servlet.FilterHolder.FilterHolder(java.lang.Class<? extends javax.servlet.Filter>) is not applicable
[ERROR]       (argument mismatch; java.lang.Class<org.apache.hadoop.hdfs.web.AuthFilter> cannot be converted to java.lang.Class<? extends javax.servlet.Filter>)
[ERROR]     constructor org.eclipse.jetty.servlet.FilterHolder.FilterHolder(javax.servlet.Filter) is not applicable
[ERROR]       (argument mismatch; java.lang.Class<org.apache.hadoop.hdfs.web.AuthFilter> cannot be converted to javax.servlet.Filter)
[ERROR]
[ERROR] - [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <args> -rf :hive-webhcat
```

Reading the source shows that AuthFilter extends AuthenticationFilter, and AuthenticationFilter implements Filter, so this error really should not occur. Manually editing the source to force the cast was worth a try, but it still failed:

```java
public FilterHolder makeAuthFilter() throws IOException {
  // FilterHolder authFilter = new FilterHolder(AuthFilter.class);
  FilterHolder authFilter = new FilterHolder((Class<? extends Filter>) AuthFilter.class);
  UserNameHandler.allowAnonymous(authFilter);
  // ...
```

Solution:
Building and packaging this module by itself in IDEA turns out to succeed:
```
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 40.755 s
[INFO] Finished at: 2023-08-06T21:39:17+08:00
[INFO] ------------------------------------------------------------------------
```

Which led to an idea:
1. The project is packaged with Maven, and once `mvn package` has run, running the same command again does not re-package modules that have already built.
2. So: first run `clean` against the project, then package the Webhcat module on its own, and finally run the overall packaging without `clean`, i.e. plain `mvn package -Pdist -DskipTests`, as sketched below.
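A possible command-line equivalent of those IDEA steps, as a sketch only: the module directory `hcatalog/webhcat/svr` and the use of the `-pl`/`-am` flags are assumptions about this source tree, so verify the module path in your checkout:

```bash
# 1. Clean once, up front
mvn clean

# 2. Package just the WebHCat server module plus the modules it depends on
mvn package -pl hcatalog/webhcat/svr -am -DskipTests

# 3. Package the full distribution WITHOUT clean, so step 2's output is reused
mvn package -Pdist -DskipTests
```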
Note: after the Spark version was lowered, this problem no longer occurred.
## A successful build
After hours of troubleshooting and a long build, packaging finally succeeded. What a beautiful sight this screen is:
```
[INFO] --- maven-dependency-plugin:2.8:copy (copy) @ hive-packaging ---
[INFO] Configured Artifact: org.apache.hive:hive-jdbc:standalone:3.1.3:jar
[INFO] Copying hive-jdbc-3.1.3-standalone.jar to C:\Users\JackChen\Desktop\apache-hive-3.1.3-src\packaging\target\apache-hive-3.1.3-jdbc.jar
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Hive 3.1.3:
[INFO]
[INFO] Hive Upgrade Acid .................................. SUCCESS [ 5.264 s]
[INFO] Hive ............................................... SUCCESS [ 0.609 s]
[INFO] Hive Classifications ............................... SUCCESS [ 1.183 s]
[INFO] Hive Shims Common .................................. SUCCESS [ 2.239 s]
[INFO] Hive Shims 0.23 .................................... SUCCESS [ 2.748 s]
[INFO] Hive Shims Scheduler ............................... SUCCESS [ 2.286 s]
[INFO] Hive Shims ......................................... SUCCESS [ 1.659 s]
[INFO] Hive Common ........................................ SUCCESS [ 9.671 s]
[INFO] Hive Service RPC ................................... SUCCESS [ 6.608 s]
[INFO] Hive Serde ......................................... SUCCESS [ 6.042 s]
[INFO] Hive Standalone Metastore .......................... SUCCESS [ 42.432 s]
[INFO] Hive Metastore ..................................... SUCCESS [ 2.304 s]
[INFO] Hive Vector-Code-Gen Utilities ..................... SUCCESS [ 1.150 s]
[INFO] Hive Llap Common ................................... SUCCESS [ 3.343 s]
[INFO] Hive Llap Client ................................... SUCCESS [ 2.380 s]
[INFO] Hive Llap Tez ...................................... SUCCESS [ 2.476 s]
[INFO] Hive Spark Remote Client ........................... SUCCESS [31:34 min]
[INFO] Hive Query Language ................................ SUCCESS [01:09 min]
[INFO] Hive Llap Server ................................... SUCCESS [ 7.230 s]
[INFO] Hive Service ....................................... SUCCESS [ 28.343 s]
[INFO] Hive Accumulo Handler .............................. SUCCESS [ 6.179 s]
[INFO] Hive JDBC .......................................... SUCCESS [ 19.058 s]
[INFO] Hive Beeline ....................................... SUCCESS [ 4.078 s]
[INFO] Hive CLI ........................................... SUCCESS [ 3.436 s]
[INFO] Hive Contrib ....................................... SUCCESS [ 4.770 s]
[INFO] Hive Druid Handler ................................. SUCCESS [ 17.245 s]
[INFO] Hive HBase Handler ................................. SUCCESS [ 6.759 s]
[INFO] Hive JDBC Handler .................................. SUCCESS [ 4.202 s]
[INFO] Hive HCatalog ...................................... SUCCESS [ 1.757 s]
[INFO] Hive HCatalog Core ................................. SUCCESS [ 5.455 s]
[INFO] Hive HCatalog Pig Adapter .......................... SUCCESS [ 4.662 s]
[INFO] Hive HCatalog Server Extensions .................... SUCCESS [ 4.629 s]
[INFO] Hive HCatalog Webhcat Java Client .................. SUCCESS [ 4.652 s]
[INFO] Hive HCatalog Webhcat .............................. SUCCESS [ 8.899 s]
[INFO] Hive HCatalog Streaming ............................ SUCCESS [ 4.934 s]
[INFO] Hive HPL/SQL ....................................... SUCCESS [ 7.684 s]
[INFO] Hive Streaming ..................................... SUCCESS [ 4.049 s]
[INFO] Hive Llap External Client .......................... SUCCESS [ 3.674 s]
[INFO] Hive Shims Aggregator .............................. SUCCESS [ 0.557 s]
[INFO] Hive Kryo Registrator .............................. SUCCESS [03:17 min]
[INFO] Hive TestUtils ..................................... SUCCESS [ 1.154 s]
[INFO] Hive Packaging ..................................... SUCCESS [01:58 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 38:22 min (Wall Clock)
[INFO] Finished at: 2023-08-08T22:50:15+08:00
[INFO] ------------------------------------------------------------------------
```

## Summary
Two things mattered most throughout the build-and-package process:

1. When jars cannot be downloaded, or download painfully slowly, find a way to solve it no matter what: the jars are the raw material of the build, and every single one is required.
2. Even with the jar dependencies resolved, compatibility and compilation problems can remain; work through each problem as it appears, one step at a time.