Hadoop: too many open files problem

t3psigkw · posted 2021-05-29 · in Hadoop

What is causing this "too many open files" problem?
java.io.IOException: Got error, status message opReadBlock BP-493425312-172.20.178.11-1399995954120:blk_1449181880_375614544 received exception java.io.FileNotFoundException: /hd_data/disk4/hadoop/hdfs/data/current/BP-493425312-172.20.178.11-1399995954120/current/finalized/subdir96/subdir194/blk_1449181880 (Too many open files), for OP_READ_BLOCK, self=/172.20.178.55:39870, remote=am1plccmrhdd05.r1core.r1.aig.net/172.20.178.105:1019, for file /consumer/lhd/americas/policy_admin/alip/inset/pending_workload/rd_ascii_print.tsv, for pool BP-493425312-172.20.178.11-1399995954120 block 1449181880_375614544
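The log line above shows the DataNode at 172.20.178.105:1019 failing to open an on-disk block file with "Too many open files", which in HDFS usually means the DataNode process has reached its operating-system file descriptor limit (the ulimit -n it was started with). As a first check, a minimal sketch like the one below, run as the same user as the DataNode, reads the JVM's descriptor usage through the HotSpot UnixOperatingSystemMXBean; the class name FdUsage and the 80% warning threshold are illustrative choices only, not anything taken from Hadoop itself.

import java.lang.management.ManagementFactory;
import java.lang.management.OperatingSystemMXBean;

import com.sun.management.UnixOperatingSystemMXBean;

// Hypothetical diagnostic class: compares this JVM's currently open file
// descriptors against the per-process limit (the "ulimit -n" value).
public class FdUsage {
    public static void main(String[] args) {
        OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
        if (os instanceof UnixOperatingSystemMXBean) {
            UnixOperatingSystemMXBean unix = (UnixOperatingSystemMXBean) os;
            long open = unix.getOpenFileDescriptorCount();
            long max = unix.getMaxFileDescriptorCount();
            System.out.println("open file descriptors: " + open);
            System.out.println("max file descriptors:  " + max);
            if (open > max * 0.8) {
                // Near the limit: the next file or socket open is likely to fail
                // with "Too many open files", as in the DataNode error above.
                System.out.println("WARNING: descriptor usage above 80% of the limit");
            }
        } else {
            System.out.println("UnixOperatingSystemMXBean not available on this platform/JVM");
        }
    }
}

If the open count sits close to the maximum while jobs are reading this data, the usual directions to investigate are the nofile ulimit of the user running the DataNode and, on the HDFS side, the dfs.datanode.max.transfer.threads setting; appropriate values depend on the cluster and workload.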

No answers yet.
