I tried to follow the example of using the MAGRAD optimizer:

```python
import torch_optimizer as optim

optimizer = optim.MAGRAD(model.parameters(), lr=0.1)
optimizer.zero_grad()
loss_fn(model(input), target).backward()
optimizer.step()
```

but got `AttributeError: module 'torch_optimizer' has no attribute 'MAGRAD'`.
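When an `AttributeError` like this comes from a likely misspelling, `difflib` from the standard library can suggest the closest name the module actually exposes. The sketch below uses a hard-coded sample of optimizer names rather than importing `torch_optimizer` (the real list comes from `dir(torch_optimizer)` and depends on the installed version, so the names here are illustrative assumptions):

```python
import difflib

# Sample of names as they might appear in dir(torch_optimizer);
# the actual list depends on the installed version (assumption for
# illustration -- replace with dir(torch_optimizer) in practice).
available = ["MADGRAD", "AdamP", "Lamb", "RAdam", "Yogi"]

# Find the closest match to the name that raised the AttributeError.
print(difflib.get_close_matches("MAGRAD", available, n=1))
# → ['MADGRAD']
```

If the close match differs from what you typed, the fix is usually just correcting the spelling in the example before filing a bug.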