Usage of org.apache.hadoop.hdfs.server.datanode.DataNode.getDataEncryptionKeyFactoryForBlock() with code examples

x33g5p2x · 2022-01-18

This article collects code examples of the Java method org.apache.hadoop.hdfs.server.datanode.DataNode.getDataEncryptionKeyFactoryForBlock(), showing how it is used in practice. The snippets are extracted from selected open-source projects published on platforms such as GitHub, Stack Overflow, and Maven, and should serve as useful references. Method details:
Package: org.apache.hadoop.hdfs.server.datanode
Class: DataNode
Method: getDataEncryptionKeyFactoryForBlock

About DataNode.getDataEncryptionKeyFactoryForBlock

Returns a new DataEncryptionKeyFactory that generates a key from the BlockPoolTokenSecretManager, using the block pool ID of the given block.
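To make the described behavior concrete, here is a minimal, self-contained sketch of the pattern this method implements: a factory that captures a block's pool ID and defers key generation to a per-pool secret manager. All classes below are simplified, hypothetical stand-ins for Hadoop's real types, kept only to illustrate the documented behavior; they are not the actual Hadoop API.

```java
import java.util.HashMap;
import java.util.Map;

// Simplified stand-in for Hadoop's DataEncryptionKey: just the pool ID
// the key belongs to plus opaque key material.
class DataEncryptionKey {
  final String blockPoolId;
  final byte[] keyMaterial;

  DataEncryptionKey(String blockPoolId, byte[] keyMaterial) {
    this.blockPoolId = blockPoolId;
    this.keyMaterial = keyMaterial;
  }
}

// Stand-in for the factory interface the method returns.
interface DataEncryptionKeyFactory {
  DataEncryptionKey newDataEncryptionKey();
}

// Stand-in for BlockPoolTokenSecretManager: holds one secret per block pool
// and mints keys from it on demand.
class BlockPoolTokenSecretManager {
  private final Map<String, byte[]> secrets = new HashMap<>();

  void addSecret(String blockPoolId, byte[] secret) {
    secrets.put(blockPoolId, secret);
  }

  DataEncryptionKey generateDataEncryptionKey(String blockPoolId) {
    byte[] secret = secrets.get(blockPoolId);
    if (secret == null) {
      throw new IllegalStateException("unknown block pool: " + blockPoolId);
    }
    return new DataEncryptionKey(blockPoolId, secret.clone());
  }
}

class DataNodeSketch {
  private final BlockPoolTokenSecretManager blockPoolTokenSecretManager =
      new BlockPoolTokenSecretManager();

  BlockPoolTokenSecretManager getSecretManager() {
    return blockPoolTokenSecretManager;
  }

  // Mirrors the documented behavior: capture the block's pool ID and defer
  // actual key generation to the secret manager until a key is requested.
  DataEncryptionKeyFactory getDataEncryptionKeyFactoryForBlock(String blockPoolId) {
    return () -> blockPoolTokenSecretManager.generateDataEncryptionKey(blockPoolId);
  }
}
```

Returning a factory rather than a key lets the SASL layer decide late whether a key is needed at all (e.g. only when encryption is negotiated), while the caller only has to know the block.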

Code examples

Code example source: org.apache.hadoop/hadoop-hdfs

// Open a client connection to the given DataNode, negotiating SASL with a
// data-encryption-key factory scoped to the block's pool.
IOStreamPair connectToDN(DatanodeInfo datanodeID, int timeout,
    ExtendedBlock block, Token<BlockTokenIdentifier> blockToken)
    throws IOException {
  return DFSUtilClient.connectToDN(datanodeID, timeout, getConf(),
      saslClient, NetUtils.getDefaultSocketFactory(getConf()), false,
      getDataEncryptionKeyFactoryForBlock(block), blockToken);
}

Code example source: org.apache.hadoop/hadoop-hdfs

// Create a peer connection for block b, wrapping the raw socket with SASL
// using a key factory derived from the block's pool ID.
private Peer newConnectedPeer(ExtendedBlock b, InetSocketAddress addr,
    Token<BlockTokenIdentifier> blockToken, DatanodeID datanodeId)
    throws IOException {
  Peer peer = null;
  boolean success = false;
  Socket sock = null;
  final int socketTimeout = datanode.getDnConf().getSocketTimeout();
  try {
    sock = NetUtils.getDefaultSocketFactory(conf).createSocket();
    NetUtils.connect(sock, addr, socketTimeout);
    peer = DFSUtilClient.peerFromSocketAndKey(datanode.getSaslClient(),
        sock, datanode.getDataEncryptionKeyFactoryForBlock(b),
        blockToken, datanodeId, socketTimeout);
    success = true;
    return peer;
  } finally {
    if (!success) {
      // Clean up the half-open connection if SASL setup failed.
      IOUtils.cleanup(null, peer);
      IOUtils.closeSocket(sock);
    }
  }
}

Code example source: org.apache.hadoop/hadoop-hdfs

// Negotiate SASL on an outbound socket; the key factory is obtained
// from the block b being sent.
InputStream unbufIn = NetUtils.getInputStream(sock);
DataEncryptionKeyFactory keyFactory =
    getDataEncryptionKeyFactoryForBlock(b);
IOStreamPair saslStreams = saslClient.socketSend(sock, unbufOut,
    unbufIn, keyFactory, accessToken, bpReg);

Code example source: org.apache.hadoop/hadoop-hdfs

// Same pattern on a proxy connection: the key factory comes from the
// block being copied through the proxy DataNode.
InputStream unbufProxyIn = NetUtils.getInputStream(proxySock);
DataEncryptionKeyFactory keyFactory =
    datanode.getDataEncryptionKeyFactoryForBlock(block);
IOStreamPair saslStreams = datanode.saslClient.socketSend(proxySock,
    unbufProxyOut, unbufProxyIn, keyFactory, blockToken, proxySource);

Code example source: org.apache.hadoop/hadoop-hdfs

// SASL negotiation on a socket to a target DataNode, again with a key
// factory scoped to the block's pool.
InputStream unbufIn = NetUtils.getInputStream(socket);
DataEncryptionKeyFactory keyFactory =
    datanode.getDataEncryptionKeyFactoryForBlock(block);
IOStreamPair saslStreams = datanode.getSaslClient().socketSend(
    socket, unbufOut, unbufIn, keyFactory, blockToken, target);

Code example source: org.apache.hadoop/hadoop-hdfs

// SASL negotiation on the mirror socket of a write pipeline, sending to
// the first downstream target.
InputStream unbufMirrorIn = NetUtils.getInputStream(mirrorSock);
DataEncryptionKeyFactory keyFactory =
    datanode.getDataEncryptionKeyFactoryForBlock(block);
IOStreamPair saslStreams = datanode.saslClient.socketSend(mirrorSock,
    unbufMirrorOut, unbufMirrorIn, keyFactory, blockToken, targets[0]);

Code example sources: ch.cern.hadoop/hadoop-hdfs and io.prestosql.hadoop/hadoop-apache — these repackaged Hadoop artifacts contain snippets identical to the ones shown above.
