Big Data Technology: HDFS Shell Commands
hdfs dfs -ls behaves much like the Linux ls command.
Running hdfs dfs by itself prints the help for all related commands.
hadoop dfs -ls /
hadoop fs -ls /
These two do the same thing; the difference is only the command's lineage: hadoop dfs is the older, deprecated form, hadoop fs is the generic form that works on any Hadoop-supported filesystem, and hdfs dfs is the current HDFS-specific form.
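As a quick sketch, the built-in help can also be queried per command (assuming a configured Hadoop client on the PATH):

```shell
# List help for all dfs subcommands
hdfs dfs -help
# Show the full help for one command, e.g. ls
hdfs dfs -help ls
# Show just the one-line usage summary
hdfs dfs -usage ls
```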
Creating a directory:
hdfs dfs -mkdir -p /aaa/01/02
hdfs dfs -lsr /aaa
The recursive listing shows both 01 and 02. (Note: -lsr is deprecated; newer releases prefer hdfs dfs -ls -R.)
2. Creating files with dfs
Viewing the file
Result
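One way to create and then inspect an empty file, as a sketch (the output3 directory follows the paths used later in this post):

```shell
# Create the target directory and a zero-length file on HDFS
hdfs dfs -mkdir -p output3
hdfs dfs -touchz output3/file2
# List the directory and print the (still empty) file
hdfs dfs -ls output3
hdfs dfs -cat output3/file2
```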
echo "hello,zhangChenguang">>localfile
hdfs dfs -appendToFile localfile output3/file2
This kept failing with the error below:
Failed to replace a bad datanode on the existing pipeline due to no more good datanodes being available to try.
(Nodes: current=[DatanodeInfoWithStorage[192.168.80.131:50010,DS-ae4d2aae-49b9-4d3f-bbf1-a255217746b8,DISK], DatanodeInfoWithStorage[192.168.80.132:50010,DS-2e26c66c-f173-489a-af5c-671377ec4eae,DISK]],
original=[DatanodeInfoWithStorage[192.168.80.131:50010,DS-ae4d2aae-49b9-4d3f-bbf1-a255217746b8,DISK], DatanodeInfoWithStorage[192.168.80.132:50010,DS-2e26c66c-f173-489a-af5c-671377ec4eae,DISK]]).
The current failed datanode replacement policy is DEFAULT, and a client may configure this via 'dfs.client.block.write.replace-datanode-on-failure.policy' in its configuration.
    at org.apache.hadoop.hdfs.DataStreamer.findNewDatanode(DataStreamer.java:1281)
    at org.apache.hadoop.hdfs.DataStreamer.addDatanode2ExistingPipeline(DataStreamer.java:1351)
    at org.apache.hadoop.hdfs.DataStreamer.handleDatanodeReplacement(DataStreamer.java:1539)
    at org.apache.hadoop.hdfs.DataStreamer.setupPipelineForAppendOrRecovery(DataStreamer.java:1440)
    at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:708)
The cause is node-related: creating a new file succeeds, but appending fails because the client cannot find a replacement datanode for the write pipeline. The fix turned out to be adding the following to hdfs-site.xml:
<property>
  <name>dfs.client.block.write.replace-datanode-on-failure.enable</name>
  <value>true</value>
</property>
<property>
  <name>dfs.client.block.write.replace-datanode-on-failure.policy</name>
  <value>NEVER</value>
</property>
hdfs dfs -cat output3/file2
19/06/25 00:00:22 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
hello,zhangChenguang
hello,HDFS
hello,zhangChenguang
hello,HDFS
Copy command:
Check the result:
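A hedged sketch of the copy and check, reusing the output3/file2 path from above (file2_copy is a hypothetical name):

```shell
# Copy a file to a new name within HDFS
hdfs dfs -cp output3/file2 output3/file2_copy
# Verify that the copy exists and has the same content
hdfs dfs -ls output3
hdfs dfs -cat output3/file2_copy
```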
3. Moving files
Move and rename in one step:
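A sketch of -mv, reusing the /aaa directory created earlier (the file names are hypothetical):

```shell
# Move a file into another HDFS directory, keeping its name
hdfs dfs -mv output3/file2 /aaa/
# Or move and rename in a single command
hdfs dfs -mv output3/file2 /aaa/file2_renamed
```

Unlike cp, mv is a metadata-only operation within one filesystem, so it does not rewrite the blocks.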
4. Uploading files
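A sketch of the upload commands, using the localfile created earlier (the destination names are hypothetical):

```shell
# Upload a local file into an HDFS directory
hdfs dfs -put localfile output3/
# -copyFromLocal is equivalent for local sources
hdfs dfs -copyFromLocal localfile output3/uploaded.txt
# -moveFromLocal uploads and then deletes the local copy
hdfs dfs -moveFromLocal localfile output3/moved.txt
```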
5. Copy command
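A sketch of -cp between HDFS paths, using the /aaa directories created earlier (the file names are hypothetical):

```shell
# Copy within HDFS; -f overwrites the destination if it already exists
hdfs dfs -cp -f /aaa/file2 /aaa/01/
# Several sources can be copied into one destination directory
hdfs dfs -cp /aaa/file2 /aaa/file3 /aaa/02/
```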
6. Download command
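A sketch of the download commands, reusing output3/file2 (the local names are hypothetical):

```shell
# Download an HDFS file to the local filesystem
hdfs dfs -get output3/file2 ./file2.local
# -copyToLocal is equivalent
hdfs dfs -copyToLocal output3/file2 ./file2.copy
# -getmerge concatenates every file in an HDFS directory into one local file
hdfs dfs -getmerge output3 ./merged.txt
```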
7. Viewing HDFS file content
Try each of the three forms: -text, -tail, and -cat.
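The three forms, sketched against the output3/file2 path used earlier:

```shell
# -cat prints the raw bytes of the file
hdfs dfs -cat output3/file2
# -text also decodes known formats (e.g. gzip, SequenceFile) before printing
hdfs dfs -text output3/file2
# -tail prints the last kilobyte of the file
hdfs dfs -tail output3/file2
```

For plain text files, -cat and -text produce identical output; they differ only on compressed or serialized data.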
8. Listing file sizes
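A sketch of the size commands (output3 is the directory used throughout this post):

```shell
# Per-file sizes under a path, in human-readable units
hdfs dfs -du -h output3
# One aggregated total for the whole path
hdfs dfs -du -s -h output3
# Capacity and usage of the filesystem as a whole
hdfs dfs -df -h
```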
9. Deleting a specified file
-rm
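A sketch of -rm against the file used earlier:

```shell
# Delete a single file (moved to the trash if trash is enabled)
hdfs dfs -rm output3/file2
# -skipTrash deletes immediately, bypassing the trash
hdfs dfs -rm -skipTrash output3/file2
```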
10. Deleting a specified directory
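A sketch of recursive deletion, reusing the /aaa directory created at the start:

```shell
# Recursively delete a directory and everything under it
hdfs dfs -rm -r /aaa
# Older releases use the equivalent (now deprecated) -rmr form
hdfs dfs -rmr /aaa
```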