How do I upload a file to AWS S3 with the Camel aws-s3 producer?

pgky5nke · published 2022-11-07 in Apache

I'm trying to upload a jpg file to an AWS S3 bucket with Camel's aws-s3 producer. Can it be done this way, and if so, how? Right now all I get is an IOException, and I can't figure out what to try next. I know I could do the upload with the TransferManager from the aws-sdk, but for now I'm only interested in Camel's aws-s3 endpoint.
Here is my route, with Camel 2.15.3:

public void configure() {
    from("file://src/data?fileName=file.jpg&noop=true&delay=15m")
    .setHeader(S3Constants.KEY,constant("CamelFile"))
    .to("aws-s3://<bucket-name>?region=eu-west-1&accessKey=<key>&secretKey=RAW(<secret>)");
}

And the exception I get on this route is:

com.amazonaws.AmazonClientException: Unable to create HTTP entity: Stream Closed
at com.amazonaws.http.HttpRequestFactory.newBufferedHttpEntity(HttpRequestFactory.java:244)
at com.amazonaws.http.HttpRequestFactory.createHttpRequest(HttpRequestFactory.java:122)
at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:415)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:273)
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:3660)
at com.amazonaws.services.s3.AmazonS3Client.putObject(AmazonS3Client.java:1432)
at org.apache.camel.component.aws.s3.S3Producer.processSingleOp(S3Producer.java:209)
at org.apache.camel.component.aws.s3.S3Producer.process(S3Producer.java:71)
at org.apache.camel.util.AsyncProcessorConverterHelper$ProcessorToAsyncProcessorBridge.process(AsyncProcessorConverterHelper.java:61)
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:129)
at org.apache.camel.management.InstrumentationProcessor.process(InstrumentationProcessor.java:77)
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:448)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:190)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:118)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:80)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:190)
at org.apache.camel.component.file.GenericFileConsumer.processExchange(GenericFileConsumer.java:439)
at org.apache.camel.component.file.GenericFileConsumer.processBatch(GenericFileConsumer.java:211)
at org.apache.camel.component.file.GenericFileConsumer.poll(GenericFileConsumer.java:175)
at org.apache.camel.impl.ScheduledPollConsumer.doRun(ScheduledPollConsumer.java:174)
at org.apache.camel.impl.ScheduledPollConsumer.run(ScheduledPollConsumer.java:101)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: Stream Closed
at java.io.FileInputStream.readBytes(Native Method)
at java.io.FileInputStream.read(FileInputStream.java:246)
at com.amazonaws.services.s3.internal.RepeatableInputStream.read(RepeatableInputStream.java:167)
at com.amazonaws.internal.SdkFilterInputStream.read(SdkFilterInputStream.java:73)
at com.amazonaws.services.s3.internal.MD5DigestCalculatingInputStream.read(MD5DigestCalculatingInputStream.java:88)
at com.amazonaws.internal.SdkFilterInputStream.read(SdkFilterInputStream.java:73)
at com.amazonaws.event.ProgressInputStream.read(ProgressInputStream.java:151)
at java.io.FilterInputStream.read(FilterInputStream.java:107)
at org.apache.http.util.EntityUtils.toByteArray(EntityUtils.java:136)
at org.apache.http.entity.BufferedHttpEntity.<init>(BufferedHttpEntity.java:63)
at com.amazonaws.http.HttpRequestFactory.newBufferedHttpEntity(HttpRequestFactory.java:242)
... 27 more

0lvr5msh1#

I did some research and found a solution. The route works if you convert the file content to a byte array before passing it to the aws-s3 endpoint, like this:

from("file://src/data?fileName=file.jpg&noop=true&delay=15m")
    .convertBodyTo(byte[].class)
    .setHeader(S3Constants.CONTENT_LENGTH, simple("${in.header.CamelFileLength}"))
    .setHeader(S3Constants.KEY, simple("${in.header.CamelFileNameOnly}"))
    .to("aws-s3://{{awsS3BucketName}}"
            + "?deleteAfterWrite=false&region=eu-west-1"
            + "&accessKey={{awsAccessKey}}"
            + "&secretKey=RAW({{awsAccessKeySecret}})")
    .log("done.");

You also have to set the S3Constants.CONTENT_LENGTH header to the file length in bytes.
The solution above reads the whole file into memory, so it won't suit every use case. Still, it's the simplest way I know of to use the aws-s3 producer endpoint. I'd be happy to hear about other (and better) solutions.
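For context on that memory caveat: `convertBodyTo(byte[].class)` on a file body amounts to buffering the entire file before the S3 PUT begins, roughly like this plain-JDK sketch (the class and method names here are illustrative, not Camel internals):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class FileToBytes {
    // Buffers the whole file in memory, so heap usage grows with file size --
    // this is the trade-off of the convertBodyTo(byte[].class) approach above.
    public static byte[] readFully(Path file) throws IOException {
        return Files.readAllBytes(file);
    }
}
```

For small files this is fine; for large uploads you would want a streaming approach instead, which is why the byte-array workaround doesn't suit every case.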


fykwrbwg2#

Here is an example using the Camel XML DSL. It uploads the contents of a local file to the S3 bucket under the key s3File.txt. I'm using Camel 2.21.1.

<!-- upload file to AWS S3 -->
<setHeader headerName="CamelAwsS3Key">
  <constant>s3File.txt</constant>
</setHeader>
<setBody>
  <!-- path of the local file to be uploaded -->
  <constant>src/localfile.txt</constant>
</setBody>
<convertBodyTo type="java.io.File"/>

<!-- upload the file to the S3 bucket -->
<to uri="aws-s3:bucketName?amazonS3Client=#s3Client"/>
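The `#s3Client` in the endpoint URI refers to an AmazonS3 client looked up from the Camel registry. In a Spring XML context, a minimal registration could look like the sketch below (the bean id must match the URI reference; the property placeholders are illustrative):

```xml
<bean id="s3Client" class="com.amazonaws.services.s3.AmazonS3Client">
  <constructor-arg>
    <bean class="com.amazonaws.auth.BasicAWSCredentials">
      <constructor-arg value="${awsAccessKey}"/>
      <constructor-arg value="${awsSecretKey}"/>
    </bean>
  </constructor-arg>
</bean>
```

Supplying your own client this way also lets you configure region, proxy, or retry settings centrally instead of passing credentials in every endpoint URI.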
