TensorFlow Keras input shape declared as (None, None) while the printed shapes are (20, 30)

wtlkbnrh · asked 2023-10-19

I am trying to build a CNN-LSTM model with Keras, and I want to feed data to the model through a keras.utils.Sequence (the data is sequential, and I want every available window in the data to be used for training).
However, Keras keeps complaining about the shape of my input data.
The error:

Epoch 1/100
WARNING:tensorflow:Model was constructed with shape (None, 20, 30) for input KerasTensor(type_spec=TensorSpec(shape=(None, 20, 30), dtype=tf.float32, name='input_7'), name='input_7', description="created by layer 'input_7'"), but it was called on an input with incompatible shape (None, None).

Cell 5 line 2
     15 dataSeq = TrainDataFeedSequence(train, INPUT_SIZE)
     17 # for i in range(0, len(dataSeq)):
     18 #     curInputShape = dataSeq.__getitem__(i)[0].shape
     19 #     if (curInputShape[0] == None or curInputShape[1] == None or curInputShape == (None, None)):
     20 #         print(dataSeq.__getitem__(i)[0])
---> 22 model.fit(x=dataSeq, epochs=100, verbose=1)

in filter_traceback.<locals>.error_handler(*args, **kwargs)
     67     filtered_tb = _process_traceback_frames(e.__traceback__)
     68     # To get the full stack trace, call:
     69     # `tf.debugging.disable_traceback_filtering()`
---> 70     raise e.with_traceback(filtered_tb) from None
     71 finally:
     72     del filtered_tb

in outer_factory.<locals>.inner_factory.<locals>.tf__train_function(iterator)
     13 try:
     14     do_return = True
---> 15     retval_ = ag__.converted_call(ag__.ld(step_function), (ag__.ld(self), ag__.ld(iterator)), None, fscope)
     16 except:
     17     do_return = False

ValueError: Exception encountered when calling layer "sequential_6" (type Sequential).
    
    Input 0 of layer "conv1d_3" is incompatible with the layer: expected min_ndim=3, found ndim=2. Full shape received: (None, None)

The code for my Sequence:

class TrainDataFeedSequence(tf.keras.utils.Sequence):

    def __init__(self, data, batch_size):
        self.data = data
        self.batch_size = batch_size

    def __len__(self):
        return len(self.data) - self.batch_size

    def __getitem__(self, idx):
        inputs = self.data.iloc[idx:idx+self.batch_size,:]
        targets = self.data.iloc[idx+self.batch_size,:1]
        return (inputs, targets)

dataSeq = TrainDataFeedSequence(train, 20)

# this prints out nothing
# for i in range(0, len(dataSeq)):
#     curInputShape = dataSeq.__getitem__(i)[0].shape
#     if (curInputShape[0] == None or curInputShape[1] == None or curInputShape == (None, None)):
#         print(dataSeq.__getitem__(i)[0])

model.fit(x=dataSeq, epochs=100)

I am fairly sure that no item in my Sequence has shape (None, None), so what is going wrong? I tried printing the shape of every input item, and they are all (20, 30).
Versions: Keras == 2.10.0, tensorflow == 2.10.0, python == 3.9.18
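A quick way to see what is actually being handed to Keras (a minimal sketch, assuming `train` is a pandas DataFrame, which the `.iloc` calls suggest; the data here is hypothetical): the items the Sequence yields have a `.shape` that prints (20, 30), but they are pandas objects, not `np.ndarray`.

```python
import numpy as np
import pandas as pd

# Hypothetical stand-in for the real training data: 100 rows, 30 features.
train = pd.DataFrame(np.random.rand(100, 30).astype("float32"))

batch = train.iloc[0:20, :]   # what __getitem__ returns as inputs
print(type(batch))            # <class 'pandas.core.frame.DataFrame'>
print(batch.shape)            # (20, 30) -- looks right...

arr = np.array(batch)         # ...but an explicit ndarray has the same shape
print(type(arr), arr.shape)   # <class 'numpy.ndarray'> (20, 30)
```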

vdzxcuhz #1

For anyone wondering: this happens because TrainDataFeedSequence does not return the inputs and targets as np.array.
Wrapping both the inputs and the targets in np.array before returning them fixed the error.
Working version:

import numpy as np
import tensorflow as tf

class TrainDataFeedSequence(tf.keras.utils.Sequence):

    def __init__(self, data, batch_size):
        self.data = data
        self.batch_size = batch_size

    def __len__(self):
        return len(self.data) - self.batch_size

    def __getitem__(self, idx):
        # one batch: a (batch_size, n_features) window plus the next row's first column
        inputs = self.data.iloc[idx:idx+self.batch_size, :]
        targets = self.data.iloc[idx+self.batch_size, :1]
        # convert the pandas slices to np.ndarray so Keras can infer the shapes
        return np.array(inputs), np.array(targets)

dataSeq = TrainDataFeedSequence(train, 20)

# this prints out nothing
# for i in range(0, len(dataSeq)):
#     curInputShape = dataSeq.__getitem__(i)[0].shape
#     if (curInputShape[0] == None or curInputShape[1] == None or curInputShape == (None, None)):
#         print(dataSeq.__getitem__(i)[0])

model.fit(x=dataSeq, epochs=100)

For what it's worth, this exact hand-rolled Sequence still ended up messing up my training process, so it is better not to use it as-is, but it should help as a starting point.
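One possible alternative (a sketch, not part of the original answer): the same window/target pairs can be built up front with NumPy's `sliding_window_view`, avoiding a hand-rolled Sequence entirely. Here `data` is a hypothetical stand-in, assumed to be a (n_rows, n_features) float array.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

WINDOW = 20  # same role as batch_size in the Sequence above

# Hypothetical stand-in for the real data: 100 rows, 30 features.
data = np.random.rand(100, 30).astype("float32")

# All length-20 windows over the row axis; sliding_window_view appends the
# window axis last, so transpose to get (n_windows, WINDOW, n_features).
windows = sliding_window_view(data, WINDOW, axis=0).transpose(0, 2, 1)

# Window i covers rows i..i+19 and predicts column 0 of row i+20,
# so drop the last window (it has no following target row).
X = windows[:-1]        # shape (80, 20, 30)
y = data[WINDOW:, :1]   # shape (80, 1)

# These arrays can then be passed directly: model.fit(X, y, ...)
```

Building the arrays once like this also sidesteps the per-item pandas-to-NumPy conversion cost inside `__getitem__`.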
