KDLPro
Error 1
Apr 24th, 2024
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
File D:\ProgramData\Miniconda_3.9\envs\rnn-sample-py3.9\lib\site-packages\torchsummary\torchsummary.py:140, in summary(model, input_data, batch_dim, branching, col_names, col_width, depth, device, dtypes, verbose, *args, **kwargs)
    139     with torch.no_grad():
--> 140         _ = model.to(device)(*x, *args, **kwargs)  # type: ignore[misc]
    141 except Exception as e:

File D:\ProgramData\Miniconda_3.9\envs\rnn-sample-py3.9\lib\site-packages\torch\nn\modules\module.py:1518, in Module._wrapped_call_impl(self, *args, **kwargs)
   1517 else:
-> 1518     return self._call_impl(*args, **kwargs)

File D:\ProgramData\Miniconda_3.9\envs\rnn-sample-py3.9\lib\site-packages\torch\nn\modules\module.py:1527, in Module._call_impl(self, *args, **kwargs)
   1524 if not (self._backward_hooks or self._backward_pre_hooks or self._forward_hooks or self._forward_pre_hooks
   1525         or _global_backward_pre_hooks or _global_backward_hooks
   1526         or _global_forward_hooks or _global_forward_pre_hooks):
-> 1527     return forward_call(*args, **kwargs)
   1529 try:

TypeError: forward() missing 1 required positional argument: 'buffer_in'

The above exception was the direct cause of the following exception:

RuntimeError                              Traceback (most recent call last)
Cell In[20], line 2
      1 from torchsummary import summary
----> 2 summary(gps_rnn, (32,14,2))

File D:\ProgramData\Miniconda_3.9\envs\rnn-sample-py3.9\lib\site-packages\torchsummary\torchsummary.py:143, in summary(model, input_data, batch_dim, branching, col_names, col_width, depth, device, dtypes, verbose, *args, **kwargs)
    141 except Exception as e:
    142     executed_layers = [layer for layer in summary_list if layer.executed]
--> 143     raise RuntimeError(
    144         "Failed to run torchsummary. See above stack traces for more details. "
    145         "Executed layers up to: {}".format(executed_layers)
    146     ) from e
    147 finally:
    148     if hooks is not None:

RuntimeError: Failed to run torchsummary. See above stack traces for more details. Executed layers up to: []
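The root cause is the inner TypeError: the forward() method of gps_rnn requires a second positional argument, buffer_in, which summary(gps_rnn, (32,14,2)) never supplies. The traceback itself shows that summary forwards any extra *args and **kwargs straight into the model call (`model.to(device)(*x, *args, **kwargs)`), so passing a suitable buffer_in tensor as an extra argument to summary should avoid the crash. A minimal stdlib-only sketch of the mechanism (FakeModule is a hypothetical stand-in for gps_rnn; no torch required):

```python
class FakeModule:
    """Hypothetical stand-in for gps_rnn: forward() needs a second
    positional argument, buffer_in, just like the model in the traceback."""

    def forward(self, x, buffer_in):
        return x

    def __call__(self, *args, **kwargs):
        # Mirrors torch's Module._call_impl: all call args go to forward().
        return self.forward(*args, **kwargs)

model = FakeModule()

# summary() only passes the input tensor, so buffer_in is missing:
try:
    model("input-tensor")
except TypeError as e:
    print(e)  # ... missing 1 required positional argument: 'buffer_in'

# Supplying the extra positional argument, the way
# summary(model, input_data, extra_arg) would forward it, succeeds:
print(model("input-tensor", "buffer"))
```

This only sketches the failure mode; with the real model you would pass an actual buffer tensor of whatever shape gps_rnn's forward() expects.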