Huawei Cloud MRS 3.1.0 Cluster: Spark & Hudi Client Integration Guide
1 Replace the parquet-related jars under the hudi-archive.zip package (a quick check of the Spark client jar directory follows the list below):
parquet-column-1.12.0-hw-ei-1.0.jar
parquet-common-1.12.0-hw-ei-1.0.jar
parquet-encoding-1.12.0-hw-ei-1.0.jar
parquet-format-structures-1.12.0-hw-ei-1.0.jar
parquet-hadoop-1.12.0-hw-ei-1.0.jar
parquet-jackson-1.12.0-hw-ei-1.0.jar
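Before making any changes, you can confirm that these parquet versions are actually present in the Spark client jar directory (an optional check, assuming the default client install path /opt/Bigdata/client used throughout this guide):
ls /opt/Bigdata/client/Spark2x/spark/jars/parquet-*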
1.1 Fetch the hudi-archive.zip package from HDFS to a local directory, for example /tmp
hadoop fs -get hdfs://hacluster/user/spark2x/jars/8.1.0/hudi-archive.zip /tmp
1.2 Replace the parquet jars
cd /tmp
unzip hudi-archive.zip -d hudi-archive
cd hudi-archive
# remove the old copies of the six parquet jars first; if their file names differ from the client versions, cp alone would leave both versions in the archive
rm -f parquet-column-* parquet-common-* parquet-encoding-* parquet-format-structures-* parquet-hadoop-* parquet-jackson-*
cp /opt/Bigdata/client/Spark2x/spark/jars/parquet-* ./
zip -r hudi-archive.zip *
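To confirm the replacement, list the parquet entries in the repacked archive (optional check); only the versions copied from the Spark client should remain:
unzip -l hudi-archive.zip | grep parquet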
1.3 Upload the repacked hudi-archive.zip to a new HDFS path (so that the original package is not overwritten), for example hdfs://hacluster/user/spark2x/jars_new/8.1.0/
hadoop fs -mkdir -p hdfs://hacluster/user/spark2x/jars_new/8.1.0/
hadoop fs -put hudi-archive.zip hdfs://hacluster/user/spark2x/jars_new/8.1.0/
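Optionally confirm that the upload succeeded:
hadoop fs -ls hdfs://hacluster/user/spark2x/jars_new/8.1.0/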
2 Modify the spark-defaults.conf configuration file so that it references the hudi-archive.zip package at the new path
2.1 Go to the client conf directory, Spark2x/spark/conf
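Before editing, it is a good idea to back up the original file (a minimal sketch, assuming the default client install path /opt/Bigdata/client):
cd /opt/Bigdata/client/Spark2x/spark/conf
cp spark-defaults.conf spark-defaults.conf.bak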
2.2 Modify the spark-defaults.conf file. Four configuration items are involved; for each one, copy the existing setting to a new line, edit it as shown, and comment out the original line (the uncommented lines below are the modified versions; a grep to confirm all four changes follows the list).
Change 1:
#spark.executor.extraClassPath =
spark.executor.extraClassPath = $PWD/hudi/*
Change 2:
#spark.yarn.dist.innerarchives = hdfs://hacluster/user/spark2x/jars/8.1.0/spark-archive-2x-x86.zip#x86,hdfs://hacluster/user/spark2x/jars/8.1.0/spark-archive-2x-arm.zip#arm
spark.yarn.dist.innerarchives = hdfs://hacluster/user/spark2x/jars/8.1.0/spark-archive-2x-x86.zip#x86,hdfs://hacluster/user/spark2x/jars/8.1.0/spark-archive-2x-arm.zip#arm,hdfs://hacluster/user/spark2x/jars_new/8.1.0/hudi-archive.zip#hudi
Change 3:
#spark.yarn.cluster.driver.extraClassPath = /opt/Bigdata/common/runtime/security
spark.yarn.cluster.driver.extraClassPath = $PWD/hudi/*:/opt/Bigdata/common/runtime/security
Change 4:
#spark.driver.extraClassPath = /opt/Bigdata/client/Spark2x/spark/conf/:/opt/Bigdata/client/Spark2x/spark/jars/*:/opt/Bigdata/client/Spark2x/spark/x86/*
spark.driver.extraClassPath = /opt/Bigdata/client/Hudi/hudi/lib/*:/opt/Bigdata/client/Spark2x/spark/conf/:/opt/Bigdata/client/Spark2x/spark/jars/*:/opt/Bigdata/client/Spark2x/spark/x86/*
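After saving the file, a quick grep confirms that all four new hudi-related entries are in place (optional check):
grep -n hudi spark-defaults.conf
The output should include the new spark.executor.extraClassPath, spark.yarn.dist.innerarchives, spark.yarn.cluster.driver.extraClassPath, and spark.driver.extraClassPath lines.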
3 Verify by querying a Hudi table
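On a security-mode cluster, source the client environment and authenticate before launching spark-sql (a minimal sketch, assuming the client is installed under /opt/Bigdata/client; replace the user with your own):
source /opt/Bigdata/client/bigdata_env
kinit <your spark user>
Then start spark-sql and query an existing Hudi table, for example: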
spark-sql --master yarn
select * from delta_demo5_ro limit 10;