Paddle ResourceExhaustedError: Out of memory error on GPU 0

jrcvhitl · posted 2022-11-13 · Other

Problem description

I'm training the SETR model on a custom dataset, but it throws the error below even though the GPU has enough memory. I've tried reducing the batch size, but nothing seems to work. Below is the complete error message.

Traceback (most recent call last):
File "train.py", line 232, in
main(args)
File "train.py", line 208, in main
train(
File "/home/usr/SETR/PaddleSeg/paddleseg/core/train.py", line 206, in train
logits_list = ddp_model(images) if nranks > 1 else model(images)
File "/home/usr/.local/lib/python3.8/site-packages/paddle/fluid/dygraph/layers.py", line 930, in call
return self._dygraph_call_func(*inputs, **kwargs)
File "/home/usr/.local/lib/python3.8/site-packages/paddle/fluid/dygraph/layers.py", line 915, in _dygraph_call_func
outputs = self.forward(*inputs, **kwargs)
File "/home/usr/SETR/PaddleSeg/paddleseg/models/setr.py", line 93, in forward
feats, _shape = self.backbone(x)
File "/home/usr/.local/lib/python3.8/site-packages/paddle/fluid/dygraph/layers.py", line 930, in call
return self._dygraph_call_func(*inputs, **kwargs)
File "/home/usr/.local/lib/python3.8/site-packages/paddle/fluid/dygraph/layers.py", line 915, in _dygraph_call_func
outputs = self.forward(*inputs, **kwargs)
File "/home/usr/SETR/PaddleSeg/paddleseg/models/backbones/vision_transformer.py", line 276, in forward
x = blk(x)
File "/home/usr/.local/lib/python3.8/site-packages/paddle/fluid/dygraph/layers.py", line 930, in call
return self._dygraph_call_func(*inputs, **kwargs)
File "/home/usr/.local/lib/python3.8/site-packages/paddle/fluid/dygraph/layers.py", line 915, in _dygraph_call_func
outputs = self.forward(*inputs, **kwargs)
File "/home/usr/SETR/PaddleSeg/paddleseg/models/backbones/vision_transformer.py", line 119, in forward
x = x + self.drop_path(self.attn(self.norm1(x)))
File "/home/usr/.local/lib/python3.8/site-packages/paddle/fluid/dygraph/layers.py", line 930, in call
return self._dygraph_call_func(*inputs, **kwargs)
File "/home/usr/.local/lib/python3.8/site-packages/paddle/fluid/dygraph/layers.py", line 915, in _dygraph_call_func
outputs = self.forward(*inputs, **kwargs)
File "/home/usr/SETR/PaddleSeg/paddleseg/models/backbones/vision_transformer.py", line 77, in forward
attn = (q.matmul(k.transpose((0, 1, 3, 2)))) * self.scale
File "/home/usr/.local/lib/python3.8/site-packages/paddle/fluid/dygraph/math_op_patch.py", line 217, in impl
return scalar_method(self, other_var)
File "/home/usr/.local/lib/python3.8/site-packages/paddle/fluid/dygraph/math_op_patch.py", line 197, in scalar_mul
return scalar_elementwise_op(var, value, 0.0)
File "/home/usr/.local/lib/python3.8/site-packages/paddle/fluid/dygraph/math_op_patch.py", line 124, in scalar_elementwise_op
return _C_ops.scale(var, 'scale', scale, 'bias', bias)
SystemError: (Fatal) Operator scale raises an paddle::memory::allocation::BadAlloc exception.
The exception content is
:ResourceExhaustedError:

Out of memory error on GPU 0. Cannot allocate 1.001954GB memory on GPU 0, 11.685364GB memory has been allocated and available memory is only 230.125000MB.

Please check whether there is any other process using GPU 0.

  1. If yes, please stop them, or start PaddlePaddle on another GPU.
  2. If no, please decrease the batch size of your model.
    If the above ways do not solve the out of memory problem, you can try to use CUDA managed memory. The command is export FLAGS_use_cuda_managed_memory=false .
    (at /paddle/paddle/fluid/memory/allocation/cuda_allocator.cc:87)
    . (at /paddle/paddle/fluid/imperative/tracer.cc:307)
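Note where the allocation fails: inside the ViT backbone's self-attention, at `q.matmul(k.transpose((0, 1, 3, 2)))`. The attention-score tensor there grows quadratically with the number of patches, so a large training crop size can exhaust memory even after the batch size is reduced. A rough back-of-the-envelope estimate (all sizes below are hypothetical illustrations, not values read from the poster's config):

```python
def attention_score_bytes(batch, heads, crop, patch, dtype_bytes=4):
    """Memory for one ViT attention-score tensor of shape
    (batch, heads, n_patches, n_patches) in float32."""
    n_patches = (crop // patch) ** 2  # tokens per image (class token ignored)
    return batch * heads * n_patches * n_patches * dtype_bytes

# Hypothetical SETR-like settings: 768x768 crop, 16x16 patches, 16 heads
gib = attention_score_bytes(batch=4, heads=16, crop=768, patch=16) / 1024**3
print(f"{gib:.2f} GiB per attention layer")

# Halving the crop side cuts the score tensor by 16x
gib_small = attention_score_bytes(batch=4, heads=16, crop=384, patch=16) / 1024**3
print(f"{gib_small:.2f} GiB with a 384x384 crop")
```

Since the batch size has already been reduced, shrinking the training crop size (or switching to a smaller ViT variant) is often the more effective lever for SETR-style models.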
rhfm7lfc · answer #1

Hi! We've received your issue; please be patient while we respond. We will arrange for technicians to answer your questions as soon as possible. Please make sure you have posted enough information to reproduce your problem. You may also check the API docs, FAQ, GitHub Issues, and the AI community to find an answer. Have a nice day!
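For anyone hitting the same error, the three remedies listed in the error message translate into commands roughly like the following (the device index and config path are placeholders; also note that enabling CUDA managed memory requires setting the flag to `true`, even though this Paddle version's message prints `false`):

```shell
# 1. Check whether another process is holding memory on GPU 0
nvidia-smi

# 2. If GPU 0 is busy, point Paddle at a free GPU (index 1 is just an example)
export CUDA_VISIBLE_DEVICES=1

# 3. As a last resort, try CUDA managed memory, which lets allocations
#    oversubscribe GPU memory (the flag must be *true* to enable it)
export FLAGS_use_cuda_managed_memory=true

python train.py --config your_config.yml
```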
