To fix under-replicated files in HDFS, I have the following lines in a bash script
Reference: https://community.cloudera.com/t5/community-articles/fix-under-replicated-blocks-in-hdfs-manually/ta-p/244746
su hdfs -c "hdfs fsck / | grep 'Under replicated' | awk -F':' '{print \$1}' > /tmp/under_replicated_files"
su hdfs -c "for hdfsfile in `cat /tmp/under_replicated_files`; do echo "Fixing $hdfsfile :" ; hadoop fs -setrep 3 $hdfsfile; done"
The script fails when it is run, with:
: ; hadoop fs -setrep 3 ; done: -c: line 1: syntax error near unexpected token `/user/ambari-qa/.staging/job_1571913174208_0001/job.split'
: ; hadoop fs -setrep 3 ; done: -c: line 1: `/user/ambari-qa/.staging/job_1571913174208_0001/job.split'
when the file /tmp/under_replicated_files contains:
/user/ambari-qa/.staging/job_1571913174208_0001/job.jar
/user/ambari-qa/.staging/job_1571913174208_0001/job.split
/user/ambari-qa/.staging/job_1571913174208_0003/job.jar
What am I doing wrong in the line below?
su hdfs -c "for hdfsfile in `cat /tmp/under_replicated_files`; do echo "Fixing $hdfsfile :" ; hadoop fs -setrep 3 $hdfsfile; done"
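For context, a minimal, hypothetical reproduction of the same class of error, without su or HDFS (the list file and paths below are made up for illustration):
# Hypothetical repro: the back-quoted cat (and $f) are expanded by the *calling* shell,
# so the newline-separated paths are pasted verbatim into the -c string before the inner
# shell ever parses it; the second path then shows up as an unexpected token.
printf '%s\n' /tmp/a.jar /tmp/b.split > /tmp/demo_list
bash -c "for f in `cat /tmp/demo_list`; do echo $f; done"
# fails with: syntax error near unexpected token `/tmp/b.split'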
Note:
When I perform the same steps manually as the hdfs user, there is no issue at all.
Example:
su hdfs
$ hdfs fsck / | grep 'Under replicated' | awk -F':' '{print $1}' > /tmp/under_replicated_files
$ for hdfsfile in `cat /tmp/under_replicated_files`; do echo "Fixing $hdfsfile :" ; hadoop fs -setrep 3 $hdfsfile; done
We also tried
su hdfs -c "for hdfsfile in `cat /tmp/under_replicated_files`; do echo \"Fixing $hdfsfile :\" ; hadoop fs -setrep 3 $hdfsfile; done"
but we get:
bash: -c: line 1: syntax error near unexpected token `/user/ambari-qa/.staging/job_1571913174208_0001/job.split'
bash: -c: line 1: `/user/ambari-qa/.staging/job_1571913174208_0001/job.split'
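For comparison, a minimal sketch of one quoting variant (my own assumption, not taken from the linked article), which leaves the command substitution, the inner double quotes, and $hdfsfile to the shell that su starts instead of to the calling script; it assumes /tmp/under_replicated_files is readable by the hdfs user and that the paths contain no spaces:
# Sketch: single quotes around the -c string, so nothing is expanded by the outer script;
# the inner shell performs the cat, the loop, and the variable expansion itself.
su hdfs -c 'for hdfsfile in $(cat /tmp/under_replicated_files); do echo "Fixing $hdfsfile :"; hadoop fs -setrep 3 "$hdfsfile"; done'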