hadoop fs
(run with no arguments to print the usage help)
[-moveFromLocal]
[-moveToLocal]
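For example (a sketch; ./a.txt is an assumed local file):
hadoop fs -moveFromLocal ./a.txt / (uploads the file and then deletes the local copy; note that in Hadoop 2.x -moveToLocal is typically not implemented and only prints a message saying so)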
[-appendToFile]
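For example (a sketch; ./more.txt is an assumed local file, /hamapi.txt is the file used below):
hadoop fs -appendToFile ./more.txt /hamapi.txt (appends the local file's contents to the end of the existing HDFS file)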
[-chgrp] [-chmod] [-chown]
hadoop fs -chown jiaojiao:mygirls /hamapi.txt (HDFS permission control is weak: there is no real user authentication, so whoever you claim the owner is, that is the owner)
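The mode and group variants work the same way (a sketch on the same file):
hadoop fs -chmod 666 /hamapi.txt (set permission bits; like ownership, they are recorded but not backed by real authentication)
hadoop fs -chgrp mygirls /hamapi.txt (change only the group)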
[-cp][-mv][-get]
[-getmerge] (download several HDFS files merged into one local file)
hadoop fs -getmerge /wordcount/input/*.* /merg.file
[-put][-rm]
hadoop fs -rm -r (recursive delete) /aaa/bbb/
[-rmdir] (delete an empty directory only)
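For example (a sketch; /aaa/empty is an assumed empty directory):
hadoop fs -rmdir /aaa/empty (fails if the directory is not empty, unlike -rm -r)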
For files too large to dump at once, pipe -cat through more.
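For example (a sketch; /big.log is an assumed large file):
hadoop fs -cat /big.log | more (page through the output one screen at a time)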
hadoop fs -df -h / (show capacity, used, and free space of the filesystem)
On Linux the equivalent is: df -h
hadoop fs -du -s (summarize) -h (human-readable units) hdfs://mini1:9000/* (report the size of each file or directory)
On Linux the equivalent is: du -sh *
[-count] (count the directories, files, and bytes under a given path)
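For example (a sketch against the /wordcount directory created below):
hadoop fs -count /wordcount (prints directory count, file count, and total bytes under the path)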
[-setrep] (set the replication factor of a file)
hadoop fs -setrep 10 /a.txt (the 10 is only recorded in the NameNode metadata; how many replicas actually exist depends on the number of DataNodes)
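To check how many replicas actually exist (a sketch; fsck reports block and replica details):
hdfs fsck /a.txt -files -blocks -locations (shows the recorded replication factor and the DataNodes that really hold each block)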
hadoop fs -ls /
hadoop fs -put hamapi.txt /
hadoop fs -cat /hamapi.txt
cd /home/hadoop/hdpdata/dfs/data/current/BP-1583774307-192.168.23.201-1500998032274/current/finalized/subdir0/subdir0
cat blk_1073741825 (this is the block file on the DataNode's local disk; the output is identical to the hadoop fs -cat command above)
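To find which block IDs make up a given file without browsing the DataNode directories (a sketch):
hdfs fsck /hamapi.txt -files -blocks (lists the blk_... IDs of the file's blocks)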
hadoop fs -put centos-6.5-hadoop-2.6.4.tar.gz / (this file is larger than 128 MB, the default block size, so HDFS splits it into more than one block)
hadoop fs -get /centos-6.5-hadoop-2.6.4.tar.gz
cat blk1 >> tmp.file (blk1 and blk2 stand for the file's two block files, concatenated in order)
cat blk2 >> tmp.file
tar -zxvf tmp.file (the result is identical to extracting the original tar.gz directly)
hadoop fs -mkdir -p /wordcount/input
hadoop fs -put a.txt b.txt /wordcount/input
hadoop jar hadoop-mapreduce-examples-2.6.4.jar wordcount /wordcount/input/ /wordcount/output
hadoop fs -cat /wordcount/output/part-r-00000
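The output has one word per line followed by a tab and its count; assuming, say, a.txt contains "hello world" and b.txt contains "hello", it would read:
hello	2
world	1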