KerasTuner custom objective function

Asked by ulydmbyx on 2023-06-30

I'm trying to do hyperparameter tuning with Keras Tuner, and I want to maximize the AUC. Can someone help me use kerastuner.Objective with a custom metric?

EXECUTIONS_PER_TRIAL = 5

b_tuner = BayesianOptimization(
    tune_nn_model,
    objective='val_binary_accuracy',
    max_trials=MAX_TRIALS,
    executions_per_trial=EXECUTIONS_PER_TRIAL,
    directory='test_dir101897',
    project_name='b_tune_nn',
    seed=12347
)

I tried defining a custom metric function like this:

from sklearn import metrics
import tensorflow as tf
from keras import backend as K

def auc(y_true, y_pred):
    auc = tf.metrics.auc(y_true, y_pred)[1]
    K.get_session().run(tf.local_variables_initializer())
    return auc

and plugging in

objective='val_auc'

but that doesn't work.
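
The snippet fails because tf.metrics.auc and K.get_session() are TF1-era APIs that no longer exist under TF2's eager execution, and KerasTuner can only track metrics that the model actually logs. As a minimal sketch (assuming TF 2.x and an already-defined model), the built-in AUC metric produces a val_auc history entry that the tuner can target:

import tensorflow as tf
import keras_tuner as kt

# compile with the built-in AUC metric so Keras logs 'auc' during training
# and 'val_auc' during validation
model.compile(loss='binary_crossentropy',
              optimizer='adam',
              metrics=[tf.keras.metrics.AUC(name='auc')])

# 'val_auc' is not a built-in objective name, so the direction is given explicitly
objective = kt.Objective('val_auc', direction='max')

The answer below builds this out into a full LSTM example.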

Answer 1 (w3nuxt5m):

Thanks to the GitHub page @Shiva provided above, I tried getting the AUC on the validation data with Keras Tuner, and it worked. My model is an LSTM, and I created a MyHyperModel class so that batch_size can be tuned as well, as described here. If you want a fixed batch_size, you don't need to do that. You can uncomment any of the other metrics and tune against them in the same way.

# assumes TF 2.x and the KerasTuner package (imported as kt)
import tensorflow as tf
import keras_tuner as kt

# make X_train, y_train, X_valid, y_valid
mask_value = -9999.99
epochs = 200

class MyHyperModel(kt.HyperModel):
    def build(self, hp):
        # hyperparameter search space
        hp_lstm_units = hp.Int('units', min_value=16, max_value=128, step=16)
        hp_dropout_rate = hp.Float('drop_out_rate', min_value=0, max_value=0.6)
        hp_recurrent_dropout_rate = hp.Float('recurrent_dropout_rate', min_value=0, max_value=0.6)
        hp_initial_learning_rate = hp.Float('initial_learning_rate', min_value=1e-3, max_value=1e-1, sampling='log')
        hp_decay = hp.Int('decay', min_value=10, max_value=100, step=10)

        # model
        model = tf.keras.Sequential()
        model.add(tf.keras.layers.Masking(mask_value=mask_value,
                                          input_shape=(X_train.shape[1], X_train.shape[2])))
        model.add(tf.keras.layers.LSTM(hp_lstm_units,
                                       dropout=hp_dropout_rate,
                                       recurrent_dropout=hp_recurrent_dropout_rate))
        model.add(tf.keras.layers.Dense(1, activation='sigmoid'))
        model.compile(loss=tf.keras.losses.BinaryCrossentropy(from_logits=False),
                      optimizer=tf.keras.optimizers.SGD(learning_rate=hp_initial_learning_rate,
                                                        decay=hp_decay),
                      metrics=[
                          # tf.keras.metrics.TruePositives(name='tp'),
                          # tf.keras.metrics.FalsePositives(name='fp'),
                          # tf.keras.metrics.TrueNegatives(name='tn'),
                          # tf.keras.metrics.FalseNegatives(name='fn'),
                          # tf.keras.metrics.BinaryAccuracy(name='accuracy'),
                          # tf.keras.metrics.Precision(name='precision'),
                          # tf.keras.metrics.Recall(name='recall'),
                          tf.keras.metrics.AUC(name='auc'),
                      ])
        return model

    # fit() is a method of the HyperModel, not of build(); KerasTuner calls it
    # with the freshly built model and the arguments passed to tuner.search()
    def fit(self, hp, model, *args, **kwargs):
        hp_batch_size = hp.Int('batch_size', min_value=8, max_value=128, step=8)
        return model.fit(*args, batch_size=hp_batch_size, **kwargs)

tuner = kt.BayesianOptimization(
    MyHyperModel(),
    objective=kt.Objective('val_auc', direction='max'),
    overwrite=True,
    max_trials=100,
    directory="MyDirectory",
    project_name="MyProject",
)

tuner.search(X_train, y_train, epochs=200, validation_data=(X_valid, y_valid))
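
Once the search completes, the standard KerasTuner retrieval calls can pull out the winning trial; a short follow-up sketch (variable names as above, the num_trials values are just examples):

# summarize the top trials, ranked by val_auc
tuner.results_summary(num_trials=3)

# rebuild the model from the best hyperparameters and retrain it through the
# overridden fit(), so the tuned batch_size is applied
best_hps = tuner.get_best_hyperparameters(num_trials=1)[0]
best_model = tuner.hypermodel.build(best_hps)
tuner.hypermodel.fit(best_hps, best_model,
                     X_train, y_train, epochs=epochs,
                     validation_data=(X_valid, y_valid))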
