In the code I'm studying, the author writes the call as super(RandomWalker, self).__init__()
, but I always see it said that in Python 3 it can be written simply as super().__init__()
:
import torch.nn as nn
import torch
from .randomwalker2D import RandomWalker2D as RW2D

class RandomWalker(nn.Module):
    def __init__(self, num_grad, max_backprop=True):
        super(RandomWalker, self).__init__()
        self.rw = RW2D
        self.num_grad = num_grad
        self.max_backprop = max_backprop
But when I switch to super().__init__()
and run it, it seems to get stuck in a loop. Why does this happen if they are technically the same thing?
All the code is at https://github.com/hci-unihd/pytorch-LearnedRandomWalker.
I've had this doubt for a while, but I've never found an answer that explains what I really need to know.
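To check the equivalence claim in isolation, here is a minimal sketch (independent of the linked repo, with made-up class names) showing that in Python 3 the two spellings of super() do the same thing:

```python
class Base:
    def __init__(self):
        self.initialized = True


class ExplicitChild(Base):
    def __init__(self):
        # Python 2-compatible form: class and instance passed explicitly
        super(ExplicitChild, self).__init__()


class ImplicitChild(Base):
    def __init__(self):
        # Python 3 zero-argument form: the compiler supplies the
        # enclosing class and the first argument of the method
        super().__init__()


print(ExplicitChild().initialized)  # True
print(ImplicitChild().initialized)  # True
```

Both constructors reach Base.__init__ and set the same attribute, so any difference you observe when running the real code would have to come from somewhere else, not from the form of the super() call itself.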
Ideally you would post a reduced version of the superclass and subclass that reproduces the problem with as little code as possible. Pointing to external repositories is not considered good practice on SO.
– epx
I understand. It turns out it was my machine that was slow; I'm getting the same performance with both forms (as expected). Thanks for clarifying, now I have a better idea of how to use super().
– Aldimir Bruzadin