This article collects code examples for the Java method org.deeplearning4j.nn.api.Layer.paramTable(), showing how Layer.paramTable() is used in practice. The examples are extracted from selected projects on platforms such as GitHub, Stack Overflow, and Maven, so they should serve as useful references. Details of Layer.paramTable():
Package: org.deeplearning4j.nn.api
Class: Layer
Method: paramTable
Description: none available
Code example source: org.deeplearning4j/deeplearning4j-nn
@Override
public Map<String, INDArray> paramTable(boolean backpropParamsOnly) {
    return insideLayer.paramTable(backpropParamsOnly);
}
Code example source: org.deeplearning4j/deeplearning4j-nn
@Override
public Map<String, INDArray> paramTable() {
    return insideLayer.paramTable();
}
Code example source: org.deeplearning4j/deeplearning4j-nn
public Map<String, INDArray> paramTable(boolean backpropParamsOnly) {
    // Get all parameters from all layers, prefixing each key with the layer index
    Map<String, INDArray> allParams = new LinkedHashMap<>();
    for (int i = 0; i < layers.length; i++) {
        Map<String, INDArray> paramMap = layers[i].paramTable(backpropParamsOnly);
        for (Map.Entry<String, INDArray> entry : paramMap.entrySet()) {
            String newKey = i + "_" + entry.getKey();
            allParams.put(newKey, entry.getValue());
        }
    }
    return allParams;
}
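As an aside, the index-prefixing pattern above can be shown in a self-contained sketch. Plain String values stand in for INDArray so the snippet runs without DL4J; the layer maps and key names below are made up for illustration:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class ParamMergeSketch {
    public static void main(String[] args) {
        // Per-layer parameter tables; String stands in for INDArray here.
        List<Map<String, String>> layers = new ArrayList<>();
        Map<String, String> layer0 = new LinkedHashMap<>();
        layer0.put("W", "weights-0");
        layer0.put("b", "bias-0");
        Map<String, String> layer1 = new LinkedHashMap<>();
        layer1.put("W", "weights-1");
        layer1.put("b", "bias-1");
        layers.add(layer0);
        layers.add(layer1);

        // Same merging pattern as the snippet above: prefix each key with the
        // layer index so identical keys from different layers cannot collide.
        Map<String, String> allParams = new LinkedHashMap<>();
        for (int i = 0; i < layers.size(); i++) {
            for (Map.Entry<String, String> entry : layers.get(i).entrySet()) {
                allParams.put(i + "_" + entry.getKey(), entry.getValue());
            }
        }
        System.out.println(allParams);
        // {0_W=weights-0, 0_b=bias-0, 1_W=weights-1, 1_b=bias-1}
    }
}
```

Because LinkedHashMap preserves insertion order, the merged table keeps parameters grouped by layer, which matters when the flattened view is later split back per layer.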
Code example source: org.deeplearning4j/deeplearning4j-nn
public Map<String, INDArray> paramTable(boolean backpropParamsOnly) {
    // Get all parameters from all layers, prefixing each key with the layer name
    Map<String, INDArray> allParams = new LinkedHashMap<>();
    for (Layer layer : layers) {
        Map<String, INDArray> paramMap = layer.paramTable(backpropParamsOnly);
        for (Map.Entry<String, INDArray> entry : paramMap.entrySet()) {
            String newKey = layer.conf().getLayer().getLayerName() + "_" + entry.getKey();
            allParams.put(newKey, entry.getValue());
        }
    }
    return allParams;
}
Code example source: org.deeplearning4j/deeplearning4j-nn
public FrozenLayer(Layer insideLayer) {
    this.insideLayer = insideLayer;
    if (insideLayer instanceof OutputLayer) {
        throw new IllegalArgumentException("Output Layers are not allowed to be frozen " + layerId());
    }
    this.zeroGradient = new DefaultGradient(insideLayer.params());
    if (insideLayer.paramTable() != null) {
        for (String paramType : insideLayer.paramTable().keySet()) {
            // Register a null gradient per parameter to save memory
            zeroGradient.setGradientFor(paramType, null);
        }
    }
}
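The zero-gradient bookkeeping in the constructor above amounts to registering a null entry for every parameter key in the layer's paramTable(). A minimal stand-in sketch, using float[] in place of INDArray and a plain map in place of DefaultGradient (both substitutions are assumptions made so the snippet runs without DL4J):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class FrozenGradientSketch {
    public static void main(String[] args) {
        // Stand-in for insideLayer.paramTable(); keys mimic a typical layer.
        Map<String, float[]> paramTable = new LinkedHashMap<>();
        paramTable.put("W", new float[] {0.5f, -0.3f});
        paramTable.put("b", new float[] {0.1f});

        // Mirror of the FrozenLayer constructor: one null gradient per
        // parameter key, so applying the "gradient" updates nothing.
        Map<String, float[]> zeroGradient = new LinkedHashMap<>();
        for (String paramType : paramTable.keySet()) {
            zeroGradient.put(paramType, null);
        }
        System.out.println(zeroGradient.keySet());
        // [W, b]
    }
}
```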
Code example source: org.deeplearning4j/deeplearning4j-modelimport
/**
 * Copy Keras layer weights to a DL4J Layer.
 *
 * @param layer DL4J layer to copy the weights into
 * @throws InvalidKerasConfigurationException if the parameter sets do not match
 */
public void copyWeightsToLayer(org.deeplearning4j.nn.api.Layer layer) throws InvalidKerasConfigurationException {
    if (this.getNumParams() > 0) {
        String dl4jLayerName = layer.conf().getLayer().getLayerName();
        String kerasLayerName = this.getLayerName();
        String msg = "Error when attempting to copy weights from Keras layer " + kerasLayerName + " to DL4J layer "
                        + dl4jLayerName;
        if (this.weights == null)
            throw new InvalidKerasConfigurationException(msg + "(weights is null)");
        Set<String> paramsInLayer = new HashSet<String>(layer.paramTable().keySet());
        Set<String> paramsInKerasLayer = new HashSet<String>(this.weights.keySet());
        /* Check for parameters in the layer for which we don't have weights. */
        paramsInLayer.removeAll(paramsInKerasLayer);
        for (String paramName : paramsInLayer)
            throw new InvalidKerasConfigurationException(
                            msg + "(no stored weights for parameter " + paramName + ")");
        /* Check for parameters NOT in the layer for which we DO have weights. */
        paramsInKerasLayer.removeAll(layer.paramTable().keySet());
        for (String paramName : paramsInKerasLayer)
            throw new InvalidKerasConfigurationException(msg + "(found no parameter named " + paramName + ")");
        /* Copy weights. */
        for (String paramName : layer.paramTable().keySet())
            layer.setParam(paramName, this.weights.get(paramName));
    }
}
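The two-way validation in copyWeightsToLayer() is plain set arithmetic on the two key sets. A sketch with hard-coded key sets (the names "W" and "b" are illustrative, not taken from a real model):

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

public class ParamNameCheckSketch {
    public static void main(String[] args) {
        // Stand-ins for layer.paramTable().keySet() and the Keras weight map's keySet().
        Set<String> paramsInLayer = new HashSet<>(Arrays.asList("W", "b"));
        Set<String> paramsInKerasLayer = new HashSet<>(Arrays.asList("W"));

        // Direction 1: parameters the DL4J layer expects but the Keras weights lack.
        Set<String> missingWeights = new HashSet<>(paramsInLayer);
        missingWeights.removeAll(paramsInKerasLayer);

        // Direction 2: Keras weights with no matching DL4J parameter.
        Set<String> extraWeights = new HashSet<>(paramsInKerasLayer);
        extraWeights.removeAll(paramsInLayer);

        System.out.println("missing=" + missingWeights + " extra=" + extraWeights);
        // missing=[b] extra=[]
    }
}
```

Note the copies via `new HashSet<>(...)` before each `removeAll`: the method above does the same, because mutating the original key sets in place would corrupt the second check.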
Code example source: org.deeplearning4j/deeplearning4j-nn
Set<String> paraNames = currentLayer.conf().getLearningRateByParam().keySet();
for (String aP : paraNames) {
    String paramS = ArrayUtils.toString(currentLayer.paramTable().get(aP).shape());
    paramShape += aP + ":" + paramS + ", ";
Code example source: org.deeplearning4j/deeplearning4j-nn
Map<String, INDArray> paramTable = layer.paramTable();
List<String> paramNames = new ArrayList<>(paramTable.keySet());
int[] paramEnds = new int[paramNames.size()];
Code example source: org.deeplearning4j/deeplearning4j-nn
int currentUpdaterOffset = 0;
for (int i = 0; i < layers.length; i++) {
    Map<String, INDArray> layerParamTable = layers[i].paramTable();
    if (layerParamTable != null) {
        List<String> variables = new ArrayList<>(layerParamTable.keySet()); // From a LinkedHashSet, so iteration order is fixed per layer
Code example source: org.deeplearning4j/deeplearning4j-ui-model
NeuralNetConfiguration conf = l.conf();
Map<String, Double> layerLrs = conf.getLearningRateByParam();
String layerName = conf.getLayer().getLayerName();
Set<String> backpropParams = l.paramTable(true).keySet();
for (Map.Entry<String, Double> entry : layerLrs.entrySet()) {
    if (!backpropParams.contains(entry.getKey()))