This article collects code examples for the Java method org.deeplearning4j.nn.api.Layer.setParam() and shows how Layer.setParam() is used in practice. The examples were extracted from selected projects on platforms such as GitHub, Stack Overflow, and Maven, so they should serve as useful references. Details of Layer.setParam():
Package: org.deeplearning4j.nn.api
Class: Layer
Method: setParam
Description: none available.
Code example source: org.deeplearning4j/deeplearning4j-nn

@Override
public void setParam(String key, INDArray val) {
    insideLayer.setParam(key, val);
}
Code example source: org.deeplearning4j/deeplearning4j-nn

@Override
public void setParam(String key, INDArray val) {
    // Set params for MultiLayerNetwork sub layers.
    // Parameter keys here: same as MultiLayerNetwork.backprop().
    int idx = key.indexOf('_');
    if (idx == -1)
        throw new IllegalStateException("Invalid param key: not have layer separator: \"" + key + "\"");
    int layerIdx = Integer.parseInt(key.substring(0, idx));
    String newKey = key.substring(idx + 1);
    layers[layerIdx].setParam(newKey, val);
}
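The composite key in this example has the form "layerIndex_paramName" (for example, "0_W" for layer 0's weight matrix). The parsing step can be demonstrated with plain Java, independently of DL4J; the splitKey helper below is a hypothetical sketch mirroring the logic above, not part of the DL4J API:

```java
public class LayerParamKeyDemo {
    // Hypothetical helper mirroring the key parsing in setParam() above:
    // splits a composite key such as "2_W" into the sub-layer index and
    // the per-layer parameter name.
    public static Object[] splitKey(String key) {
        int idx = key.indexOf('_');
        if (idx == -1)
            throw new IllegalStateException(
                    "Invalid param key: not have layer separator: \"" + key + "\"");
        int layerIdx = Integer.parseInt(key.substring(0, idx));
        String paramName = key.substring(idx + 1);
        return new Object[] {layerIdx, paramName};
    }

    public static void main(String[] args) {
        Object[] parts = splitKey("2_W");
        // Layer index 2, parameter "W"
        System.out.println(parts[0] + " / " + parts[1]);
    }
}
```

A key without an underscore fails fast with the same IllegalStateException the library code throws, rather than silently targeting the wrong layer.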
Code example source: org.deeplearning4j/deeplearning4j-nn

@Override
public void setParam(String key, INDArray val) {
    // throw new UnsupportedOperationException("Not implemented");
    int idx = key.indexOf('_');
    if (idx == -1)
        throw new IllegalStateException("Invalid param key: not have layer separator: \"" + key + "\"");
    String layerName = key.substring(0, idx);
    String paramType = key.substring(idx + 1);
    getLayer(layerName).setParam(paramType, val);
}
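In this ComputationGraph variant, the key format is "layerName_paramType" (for example, "dense1_W"). A minimal plain-Java sketch of that split (hypothetical helper, not DL4J API):

```java
public class GraphParamKeyDemo {
    // Hypothetical helper: split "layerName_paramType" at the first '_',
    // as done in the ComputationGraph setParam() example above.
    public static String[] splitKey(String key) {
        int idx = key.indexOf('_');
        if (idx == -1)
            throw new IllegalStateException(
                    "Invalid param key: not have layer separator: \"" + key + "\"");
        return new String[] {key.substring(0, idx), key.substring(idx + 1)};
    }

    public static void main(String[] args) {
        String[] parts = splitKey("dense1_W");
        System.out.println(parts[0] + " -> " + parts[1]);
    }
}
```

Note that because indexOf('_') finds the first underscore, a layer name that itself contains an underscore would be split at the wrong position; the example above follows the library code as shown.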
Code example source: org.deeplearning4j/deeplearning4j-modelimport

/**
 * Copy Keras layer weights to DL4J Layer.
 *
 * @param layer DL4J layer to which the stored Keras weights are copied
 * @throws InvalidKerasConfigurationException if weights are missing or their names do not match the layer's parameters
 */
public void copyWeightsToLayer(org.deeplearning4j.nn.api.Layer layer) throws InvalidKerasConfigurationException {
    if (this.getNumParams() > 0) {
        String dl4jLayerName = layer.conf().getLayer().getLayerName();
        String kerasLayerName = this.getLayerName();
        String msg = "Error when attempting to copy weights from Keras layer " + kerasLayerName + " to DL4J layer "
                        + dl4jLayerName;
        if (this.weights == null)
            throw new InvalidKerasConfigurationException(msg + "(weights is null)");
        Set<String> paramsInLayer = new HashSet<String>(layer.paramTable().keySet());
        Set<String> paramsInKerasLayer = new HashSet<String>(this.weights.keySet());
        /* Check for parameters in layer for which we don't have weights. */
        paramsInLayer.removeAll(paramsInKerasLayer);
        for (String paramName : paramsInLayer)
            throw new InvalidKerasConfigurationException(
                            msg + "(no stored weights for parameter " + paramName + ")");
        /* Check for parameters NOT in layer for which we DO have weights. */
        paramsInKerasLayer.removeAll(layer.paramTable().keySet());
        for (String paramName : paramsInKerasLayer)
            throw new InvalidKerasConfigurationException(msg + "(found no parameter named " + paramName + ")");
        /* Copy weights. */
        for (String paramName : layer.paramTable().keySet())
            layer.setParam(paramName, this.weights.get(paramName));
    }
}
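The two-way parameter check in copyWeightsToLayer() is plain set arithmetic and can be sketched without DL4J. The class below is a hypothetical reconstruction of just that validation step, using only java.util.Set:

```java
import java.util.HashSet;
import java.util.Set;

public class WeightNameCheckDemo {
    // Hypothetical sketch of the validation in copyWeightsToLayer():
    // both directions of the set difference must be empty before copying.
    public static boolean namesMatch(Set<String> layerParams, Set<String> kerasWeights) {
        Set<String> missingWeights = new HashSet<>(layerParams);
        missingWeights.removeAll(kerasWeights);   // params with no stored weights
        Set<String> unknownParams = new HashSet<>(kerasWeights);
        unknownParams.removeAll(layerParams);     // weights with no matching param
        return missingWeights.isEmpty() && unknownParams.isEmpty();
    }

    public static void main(String[] args) {
        Set<String> layer = new HashSet<>(Set.of("W", "b"));
        Set<String> keras = new HashSet<>(Set.of("W", "b"));
        System.out.println(namesMatch(layer, keras));  // prints "true"
        keras.add("gamma");
        System.out.println(namesMatch(layer, keras));  // prints "false"
    }
}
```

The copies via new HashSet<>(...) matter: Set.removeAll mutates its receiver, and the library code likewise copies the key sets before subtracting so the layer's own paramTable() is left untouched.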
Code example source: org.deeplearning4j/deeplearning4j-nn (fragment)

layer = clone.getLayer().instantiate(clone, iterationListeners, this.index, paramsView, true);
layer.setParam(DefaultParamInitializer.WEIGHT_KEY, w.transpose().dup());
layer.setParam(DefaultParamInitializer.BIAS_KEY, newB);
if (vb != null)
    layer.setParam(PretrainParamInitializer.VISIBLE_BIAS_KEY, newVB);
} catch (Exception e) {
    throw new RuntimeException("Unable to construct transposed layer: " + layerId(), e);