Merging the files in each folder with groupBy when copying from HDFS to S3

ghhkc1vu asked on 2021-06-02 in Hadoop

I have the following folders in HDFS:

hdfs://x.x.x.x:8020/Air/BOOK/AE/DOM/20171001/2017100101
hdfs://x.x.x.x:8020/Air/BOOK/AE/INT/20171001/2017100101
hdfs://x.x.x.x:8020/Air/BOOK/BH/INT/20171001/2017100101
hdfs://x.x.x.x:8020/Air/BOOK/IN/DOM/20171001/2017100101
hdfs://x.x.x.x:8020/Air/BOOK/IN/INT/20171001/2017100101
hdfs://x.x.x.x:8020/Air/BOOK/KW/DOM/20171001/2017100101
hdfs://x.x.x.x:8020/Air/BOOK/KW/INT/20171001/2017100101
hdfs://x.x.x.x:8020/Air/BOOK/ME/INT/20171001/2017100101
hdfs://x.x.x.x:8020/Air/BOOK/OM/INT/20171001/2017100101
hdfs://x.x.x.x:8020/Air/BOOK/Others/DOM/20171001/2017100101
hdfs://x.x.x.x:8020/Air/BOOK/QA/DOM/20171001/2017100101
hdfs://x.x.x.x:8020/Air/BOOK/QA/INT/20171001/2017100101
hdfs://x.x.x.x:8020/Air/BOOK/SA/DOM/20171001/2017100101
hdfs://x.x.x.x:8020/Air/BOOK/SA/INT/20171001/2017100101
hdfs://x.x.x.x:8020/Air/SEARCH/AE/DOM/20171001/2017100101
hdfs://x.x.x.x:8020/Air/SEARCH/AE/INT/20171001/2017100101
hdfs://x.x.x.x:8020/Air/SEARCH/BH/DOM/20171001/2017100101
hdfs://x.x.x.x:8020/Air/SEARCH/BH/INT/20171001/2017100101
hdfs://x.x.x.x:8020/Air/SEARCH/IN/DOM/20171001/2017100101
hdfs://x.x.x.x:8020/Air/SEARCH/IN/INT/20171001/2017100101

Each folder contains close to 50 files. My goal is to merge all the files in each folder into a single file while copying from HDFS to S3. The problem I am running into is the regex for the groupBy option. I tried the following, but it does not seem to work:

s3-dist-cp --src hdfs:///Air/ --dest s3a://HadoopSplit/Air-merged/  --groupBy '.*/(\w+)/(\w+)/(\w+)/.*' --outputCodec lzo

The command itself runs fine, but the files in each folder are not merged into single files, which leads me to believe the problem is in my regex.
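As a side note, one way to sanity-check a groupBy pattern is to try it against a sample path. The minimal Python sketch below does that (Python's `re` engine agrees with Java regex semantics for this simple pattern; the `part-00000` file name is invented for illustration). It shows that, because the leading `.*` is greedy, the three capture groups of the pattern above land on the last three directory components rather than on the BOOK/country/DOM-vs-INT levels:

```python
import re

# The groupBy pattern from the question above.
pattern = re.compile(r'.*/(\w+)/(\w+)/(\w+)/.*')

# One example file inside a folder; "part-00000" is a hypothetical file name.
path = 'hdfs://x.x.x.x:8020/Air/BOOK/AE/DOM/20171001/2017100101/part-00000'

m = pattern.fullmatch(path)
# The greedy leading .* pushes the groups as far right as possible,
# so they capture the *last* three directory components:
print(m.groups())  # ('DOM', '20171001', '2017100101')
```

Note that this key carries no trace of the BOOK/SEARCH level or the country code, so the grouping this pattern expresses is not the per-folder grouping intended above.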


bqjvbblv1#

Figured it out myself… the correct regex is:

.*/Air/(\w+)/(\w+)/(\w+)/.*/.*/.*

and the command to merge and copy is:

s3-dist-cp --src hdfs:///Air/ --dest s3a://HadoopSplit/Air-merged/  --groupBy '.*/Air/(\w+)/(\w+)/(\w+)/.*/.*/.*' --outputCodec lzo
