I have a bunch of directories in HDFS:
stuff A: /stuff/prod/data/inputData
stuff B: /stuff/prod/data/global/holdingpen
stuff C: /stuff/prod/data/global/keepers
stuff D: /stuff/prod/data/global/actionDFiles
stuff E: /stuff/prod/data/global/expired
stuff H: /stuff/prod/data/global/actionHFiles
stuff L: /stuff/prod/data/global/billableSessionTopXRecompute
stuff R: /stuff/prod/data/global/billingIdTimeRecompute
stuff U: /stuff/prod/data/global/uniqueStats
stuff Z: /stuff/prod/data/global/cleanupOldVersions
Right now I just run something like this against each directory to get a count:
hadoop fs -ls /stuff/prod/data/global/actionHFiles | wc -l
20658
I'd like to do this in a bash script so it simply prints out something like:
Stuff A: 183729
Stuff B: 281948
Can anyone help? Thanks.
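A minimal sketch of one way to do it, reusing your own hadoop fs -ls ... | wc -l pipeline. The labels and paths are simply copied from the list above; the associative array requires Bash 4+, and the separate label list is only there to keep the output in A-to-Z order, since associative arrays do not preserve insertion order.

#!/usr/bin/env bash
# Map each label to its HDFS directory (paths taken from the question).
declare -A dirs=(
  ["Stuff A"]="/stuff/prod/data/inputData"
  ["Stuff B"]="/stuff/prod/data/global/holdingpen"
  ["Stuff C"]="/stuff/prod/data/global/keepers"
  ["Stuff D"]="/stuff/prod/data/global/actionDFiles"
  ["Stuff E"]="/stuff/prod/data/global/expired"
  ["Stuff H"]="/stuff/prod/data/global/actionHFiles"
  ["Stuff L"]="/stuff/prod/data/global/billableSessionTopXRecompute"
  ["Stuff R"]="/stuff/prod/data/global/billingIdTimeRecompute"
  ["Stuff U"]="/stuff/prod/data/global/uniqueStats"
  ["Stuff Z"]="/stuff/prod/data/global/cleanupOldVersions"
)

# Iterate in a fixed order and print "Label: count" for each directory.
for label in "Stuff A" "Stuff B" "Stuff C" "Stuff D" "Stuff E" \
             "Stuff H" "Stuff L" "Stuff R" "Stuff U" "Stuff Z"; do
  count=$(hadoop fs -ls "${dirs[$label]}" | wc -l)
  echo "$label: $count"
done

One caveat: hadoop fs -ls on a directory usually prints a "Found N items" header line, so this count (like your original command) is one higher than the number of entries; subtract 1 inside the loop if you want the entry count only.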