His reply:
Enabling LZO in Spark

【Spark client configuration】
Edit the client configuration file /opt/Bigdata/client/Spark2x/spark/conf/core-site.xml and add the following properties:

<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.BZip2Codec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.DeflateCodec,org.apache.hadoop.io.compress.Lz4Codec,org.apache.hadoop.io.compress.SnappyCodec,org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.ZStandardCodec,com.huawei.hadoop.datasight.io.compress.lzc.ZCodec,com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec</value>
</property>
<property>
  <name>io.compression.codec.lzo.class</name>
  <value>com.hadoop.compression.lzo.LzoCodec</value>
</property>

【Submit command】
When submitting, pass the hadoop-lzo jar with --jars, for example:

spark-shell --master yarn --jars /opt/Bigdata/client/Hive/Beeline/lib/hadoop-lzo-0.4.20.jar
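As a quick check once the client is configured, the codec can be exercised from the spark-shell session started with the command above. This is a minimal sketch, assuming an LZO-compressed text file already exists at the hypothetical path hdfs:///tmp/sample.txt.lzo:

// Hadoop's TextInputFormat selects the codec from the .lzo extension,
// so a plain textFile read is enough to verify that decompression works.
val rdd = sc.textFile("hdfs:///tmp/sample.txt.lzo")
rdd.take(5).foreach(println)

If the codecs were not registered correctly in core-site.xml, or the hadoop-lzo jar was not passed via --jars, this read typically fails with a ClassNotFoundException for com.hadoop.compression.lzo.LzopCodec.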