Flink Kafka: configure PKCS12 'ssl.keystore.location=user.p12' without access to the local file system

z8dt9xmd posted on 2022-12-09 in Apache

I can successfully connect to an SSL secured Kafka cluster with the following client properties:

security.protocol=SSL
ssl.truststore.type=PKCS12
ssl.truststore.location=ca.p12
ssl.truststore.password=<redacted>
ssl.keystore.type=PKCS12
ssl.keystore.location=user.p12
ssl.keystore.password=<redacted>

However, I’m writing a Java app that is running in a managed cloud environment, where I don’t have access to the file system. So I can’t just give it a local file path to .p12 files.
Are there any other alternatives, like using loading from S3, or from memory, or from a JVM classpath resource?
Specifically, this is a Flink app running on Amazon's Kinesis Analytics Managed Flink cluster service.

9rnv2umw1#

Sure, you can download whatever you want from wherever you want before handing the KafkaConsumer a Properties object; however, the user running the Java process needs write access to the local file system in order to download the files.
I think it makes more sense to package the files as part of the application JAR, but I don't know of a simple way to reference a classpath resource as if it were a regular file-system path.
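A common workaround for the classpath limitation mentioned above is to copy the bundled resource to a temporary file at startup and point `ssl.keystore.location` at that path. A minimal sketch, assuming the keystore is packaged inside the application JAR; the class name, helper names, and the `/user.p12` resource name are illustrative, not from the original answer:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class KeystoreMaterializer {

    // Copy an arbitrary stream (e.g. a classpath resource) to a temporary
    // file and return its path, usable as ssl.keystore.location.
    public static Path copyToTempFile(InputStream in, String suffix) throws IOException {
        Path tmp = Files.createTempFile("keystore", suffix);
        tmp.toFile().deleteOnExit();
        Files.copy(in, tmp, StandardCopyOption.REPLACE_EXISTING);
        return tmp;
    }

    // Resolve a resource bundled in the application JAR to a real file path.
    public static Path materializeClasspathKeystore(String resourceName) throws IOException {
        try (InputStream in = KeystoreMaterializer.class.getResourceAsStream(resourceName)) {
            if (in == null) {
                throw new IllegalStateException(resourceName + " not found on classpath");
            }
            return copyToTempFile(in, ".p12");
        }
    }

    // Usage (assuming user.p12 is packaged at the root of the JAR):
    //   Properties props = new Properties();
    //   props.setProperty("ssl.keystore.location",
    //       materializeClasspathKeystore("/user.p12").toString());
}
```

The temp file only lives for the duration of the process, which sidesteps the question of where a managed environment allows persistent writes.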

jmo0nnb32#

As a temporary workaround, I uploaded the certificates to a file share and had the application download them from the share during initialization, saving them to a path of my choosing (such as /home/site/ca.p12). The Kafka properties then read ... ssl.truststore.location=/home/site/ca.p12 ...
The following lines of code download and save a certificate.

import com.microsoft.azure.storage.CloudStorageAccount;
import com.microsoft.azure.storage.file.CloudFile;
import com.microsoft.azure.storage.file.CloudFileClient;
import com.microsoft.azure.storage.file.CloudFileDirectory;
import com.microsoft.azure.storage.file.CloudFileShare;

// Parse the storage connection string (obtained from configuration) into an account.
CloudStorageAccount storageAccount = CloudStorageAccount.parse(connectionString);
// Create the Azure Files client.
CloudFileClient fileClient = storageAccount.createCloudFileClient();
// Get a reference to the file share.
CloudFileShare share = fileClient.getShareReference("[SHARENAME]");
// Get a reference to the root directory of the share.
CloudFileDirectory rootDir = share.getRootDirectoryReference();
// Get a reference to the directory containing the certificate.
CloudFileDirectory containerDir = rootDir.getDirectoryReference("[DIRECTORY]");
CloudFile file = containerDir.getFileReference("[FILENAME]");
// Download the certificate to a local path the Kafka client can read.
file.downloadToFile("/home/site/ca.p12");
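Since the question concerns Kinesis Data Analytics on AWS, the same download-at-startup pattern could instead fetch the certificates over HTTPS, for example from a presigned S3 URL. A stdlib-only sketch; the class name, URL, and target path are assumptions, not from the answer above:

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class CertDownloader {

    // Stream the contents of a URL to a local file and return the file's path.
    public static Path downloadTo(URL source, Path target) throws IOException {
        try (InputStream in = source.openStream()) {
            Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
        }
        return target;
    }

    // Usage (hypothetical presigned S3 URL, passed in via configuration):
    //   Path p12 = downloadTo(new URL("https://example-bucket.s3.amazonaws.com/ca.p12?..."),
    //                         Paths.get("/tmp/ca.p12"));
    //   props.setProperty("ssl.truststore.location", p12.toString());
}
```

Whether `/tmp` is writable in the managed Flink environment would need to be verified; the AWS SDK's S3 client is an alternative when presigned URLs are not an option.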
