I want to check whether a directory was created today. If so, upload it to HDFS; but if the directory's modification date is different, report that the directory has already been copied to HDFS.
#!/bin/sh
export DATA_PATH=/data/1/sanket
# Enter the directory whose contents we want to check
cd "$DATA_PATH"
# The directory listing will be written to temp/no_of_files.txt
ls -lh "$DATA_PATH"/ > temp/no_of_files.txt
# Extract the file/directory names from the listing above
nameoffile=$(awk '{print $9}' temp/no_of_files.txt)
# Now we want today's date.
date > temp/date.txt
# Copy the modification date and today's date into variables.
filedate=$(awk '{print $6 $7}' temp/no_of_files.txt)
todaydate=$(awk '{print $2 $3}' temp/date.txt)
export "nameoffile"
export "filedate"
export "todaydate"
rm -fr $DATA_PATH/temp/no_of_files.txt
rm -fr $DATA_PATH/temp/name_of_files.txt
rm -fr $DATA_PATH/temp/date.txt
# Directory on HDFS where we want to copy data
path=sanket_data
# Check whether the directory's modification date matches today's date; if so,
# copy the data to HDFS, otherwise report that the files have already been copied.
if [[ "$filedate" == "$todaydate" ]]; then
for filename in $nameoffile; do
/usr/bin/hadoop fs -put "$DATA_PATH/$filename" /user/sanket/$path
done
elif [[ "$filedate" != "$todaydate" ]]; then
#/usr/bin/hadoop fs -put $DATA_PATH/$filename /user/sanket/$path
echo $filename already copied!
#hdfs dfs -put $filename /user/sanket/$path
fi
1 Answer
What you need are touch, date, and -newer. First, use touch and date to create a temporary empty file whose last-modified date (LMD) is today at 00:00:
Now let's use -newer to test whether $DATA_PATH has an LMD more recent than today0000: