Trying to wrap a keras model in a flask REST application, but getting a ValueError

whlutmcx · posted 2023-02-23

I can create a simple keras model by running

python create-flask-model.py

**create-flask-model.py**

##points in square that are in or out of a quarter circle
import random
import math
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

training_size = 8000
testing_size = 2000
batch_size = 10
epoch_no = 30
modelStructureFileName = 'simple-flask.json'
modelWeightFileName = 'simple-flask.h5'

def get_model():
    model = Sequential()
    model.add(Dense(4, input_dim=2, activation='tanh'))
    model.add(Dense(4, activation='tanh'))
    model.add(Dense(1, activation='sigmoid'))
    model.compile(loss='binary_crossentropy',optimizer='rmsprop')
    return model

def get_data_instances(size):
    result = []
    for i in range(0, size):
        number_1 = random.uniform(0,1)
        number_2 = random.uniform(0,1)
        squares = math.pow(number_1,2) + math.pow(number_2,2)
        target = 0
        if squares < 0.49:
            target = 1
        line = number_1,number_2,target
        result.append(line)
    return np.array(result)

##create data and split in to training and test, features and targets
data_instances = get_data_instances(training_size+testing_size)
train_x, train_y = data_instances[:training_size,0:2], data_instances[:training_size,-1]
test_x, test_y = data_instances[training_size:,0:2], data_instances[training_size:,-1]

##load model and train
model = get_model()
history = model.fit(train_x, train_y, batch_size=batch_size, epochs=epoch_no, validation_data=(test_x, test_y))

##save the model
model_json = model.to_json()
with open(modelStructureFileName, 'w') as json_file:
    json_file.write(model_json)
model.save_weights(modelWeightFileName)

##how to get prediction for an instance
#instance = np.array([0.3, 0.6])
#instance = instance.reshape(1,2)
#yhat = model.predict(instance)
#print(yhat)
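Since the structure is written out with to_json and the weights with save_weights, the round trip can be sanity-checked outside of flask first. A minimal sketch of reloading the saved files, assuming they sit in the current directory (this snippet is not part of the original script):

##reload the saved structure and weights, then predict on one instance
import numpy as np
from keras.models import model_from_json

with open('simple-flask.json', 'r') as json_file:
    reloaded_model = model_from_json(json_file.read())
reloaded_model.load_weights('simple-flask.h5')

instance = np.array([0.3, 0.6]).reshape(1, 2)
print(reloaded_model.predict(instance))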

I want to load the resulting model into a flask application, pass an instance in as a JSON object, have a prediction made, and get the result back. I run

python flask-app.py

from the same directory as the model json and h5 files.

**flask-app.py**

import json
import numpy as np
from flask import Flask
from keras.models import model_from_yaml

app = Flask(__name__)
model = None
modelStructureFileName = 'simple-flask.json'
modelWeightFileName = 'simple-flask.h5'

def load_model():
    yaml_file = open(modelStructureFileName, 'r')
    loaded_model_yaml = yaml_file.read()
    yaml_file.close()
    global model
    model = model_from_yaml(loaded_model_yaml)
    model.load_weights(modelWeightFileName)

@app.route('/flask/<input>', methods=['GET'])
def predict(input):
    input_array = json.loads(input)
    instance = np.array(input_array)
    instance = instance.reshape(1,2)
    yhat = model.predict(instance)
    return str(yhat)

if __name__ == '__main__':
    load_model()
    app.run(port = 9000, debug = True)

If I navigate to http://localhost:9000/flask/[0.3,0.6], I get the error

builtins.ValueError
ValueError: Tensor Tensor("dense_3/Sigmoid:0", shape=(?, 1), dtype=float32) is not an element of this graph.

I think this has something to do with the scope of the model inside the app, but I can't work it out. If I load the model inside the request method it works once, but then fails with a different error. I only want to load the model once. How can I get the flask app to work as expected?
EDIT: I ended up using Bottle instead of Flask, and it works without problems.

**bottle-app.py**

from bottle import route, run
import json
import numpy as np
from keras.models import model_from_yaml

modelStructureFileName = 'simple-flask.json'
modelWeightFileName = 'simple-flask.h5'

yaml_file = open(modelStructureFileName, 'r')
loaded_model_yaml = yaml_file.read()
yaml_file.close()
model = model_from_yaml(loaded_model_yaml)
model.load_weights(modelWeightFileName)
print('model loaded')

@route('/bottle/<input>')
def predict(input):
    input_array = json.loads(input)
    instance = np.array(input_array)
    instance = instance.reshape(1,2)
    yhat = model.predict(instance)
    print(input_array, yhat)
    return str(yhat[0][0])

run(host='localhost', port=9000, debug=True)

63lcw9qa 1#

The reason this happens is that, by default, Flask runs with threading enabled, and a TensorFlow-backed model does not work well when it is used from multiple threads.

The following workaround worked for me:

import tensorflow as tf

# capture the default graph once, right after the model is loaded
global graph
graph = tf.get_default_graph()

# compile/train inside that graph
with graph.as_default():
    model.compile()
    model.fit()

# and run every prediction inside the same graph
with graph.as_default():
    model.predict()
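Applied to flask-app.py above, the workaround could look roughly like this. This is only a sketch, assuming Keras on a TensorFlow 1.x backend (where tf.get_default_graph() is available); model_from_json is used here because the structure file was written with to_json:

import json
import numpy as np
import tensorflow as tf
from flask import Flask
from keras.models import model_from_json

app = Flask(__name__)
model = None
graph = None
modelStructureFileName = 'simple-flask.json'
modelWeightFileName = 'simple-flask.h5'

def load_model():
    global model, graph
    with open(modelStructureFileName, 'r') as json_file:
        model = model_from_json(json_file.read())
    model.load_weights(modelWeightFileName)
    # remember the graph the model was loaded into
    graph = tf.get_default_graph()

@app.route('/flask/<input>', methods=['GET'])
def predict(input):
    instance = np.array(json.loads(input)).reshape(1, 2)
    # run the prediction inside the graph captured at load time
    with graph.as_default():
        yhat = model.predict(instance)
    return str(yhat[0][0])

if __name__ == '__main__':
    load_model()
    app.run(port=9000, debug=True)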

ghg1uchk 2#

This answer relates to the Bottle API edit.
The problem is that the Flask API works only once and after that it gives an error. So, in that case, you should call K.clear_session() at the end of the API, just before the return statement.
Don't forget to add the line from keras import backend as K at the top.
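A minimal sketch of what that might look like, assuming the model is loaded inside the request handler (the variant the asker says works exactly once); model_from_json is used because the structure file was saved with to_json, and this is not the answerer's exact code:

import json
import numpy as np
from flask import Flask
from keras import backend as K
from keras.models import model_from_json

app = Flask(__name__)
modelStructureFileName = 'simple-flask.json'
modelWeightFileName = 'simple-flask.h5'

@app.route('/flask/<input>', methods=['GET'])
def predict(input):
    # load the model for this request
    with open(modelStructureFileName, 'r') as json_file:
        model = model_from_json(json_file.read())
    model.load_weights(modelWeightFileName)

    instance = np.array(json.loads(input)).reshape(1, 2)
    yhat = model.predict(instance)

    # clear the Keras/TensorFlow session so the next request starts from a clean state
    K.clear_session()
    return str(yhat[0][0])

if __name__ == '__main__':
    app.run(port=9000, debug=True)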
