When I train simple linear models, the loss function oscillates wildly. For example, using `InSituAdam`:

```python
model_linear = neu.Sequential([
    neu.ClementsLayer(N),
    neu.Activation(neu.Abs(N)),
    neu.DropMask(N, keep_ports=range(N_classes))
])

losses = neu.InSituAdam(
    model_linear,
    neu.CategoricalCrossEntropy,
    step_size=step_size
).fit(x_train_flattened, y_train_onehot, epochs=n_epochs, batch_size=batch_size)
```
This may be a sign that the gradients are incorrect; I should double-check them.
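One way to double-check is a finite-difference gradient check. Here is a minimal sketch in plain NumPy, assuming hypothetical `loss_fn(params)` and `grad_fn(params)` helpers that wrap the model's loss and analytic gradient for a flat parameter vector (these names are not part of the `neu` API):

```python
import numpy as np

def finite_difference_check(loss_fn, grad_fn, params, eps=1e-6):
    """Compare analytic gradients against central finite differences.

    loss_fn: callable mapping a flat parameter vector to a scalar loss
    grad_fn: callable mapping the same vector to its analytic gradient
    (both are hypothetical wrappers around the model, not neu API calls).
    """
    analytic = grad_fn(params)
    numeric = np.zeros_like(params)
    for i in range(params.size):
        step = np.zeros_like(params)
        step[i] = eps
        numeric[i] = (loss_fn(params + step) - loss_fn(params - step)) / (2 * eps)
    # If the analytic gradients are correct, the relative error should be
    # small (e.g. below ~1e-5 in double precision).
    return np.linalg.norm(analytic - numeric) / (
        np.linalg.norm(analytic) + np.linalg.norm(numeric) + 1e-12
    )
```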
I think it's fine. This is probably due to the way the batching works: the reported loss is evaluated per mini-batch, so it fluctuates from batch to batch even when the gradients are correct.
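As an illustration (plain NumPy, unrelated to `neu`), per-batch losses on an ordinary linear regression trained with mini-batch gradient descent jump around while the full-dataset loss still decreases across epochs:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(256, 4))
y = X @ np.array([1.0, -2.0, 0.5, 3.0]) + 0.1 * rng.normal(size=256)

w = np.zeros(4)
lr, batch_size = 0.05, 16

for epoch in range(5):
    perm = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]
        Xb, yb = X[idx], y[idx]
        resid = Xb @ w - yb
        batch_loss = 0.5 * np.mean(resid ** 2)  # noisy: depends on which batch was drawn
        w -= lr * Xb.T @ resid / len(idx)       # plain mini-batch gradient step
    full_loss = 0.5 * np.mean((X @ w - y) ** 2) # smooth: decreases across epochs
    print(f"epoch {epoch}: last batch loss {batch_loss:.4f}, full loss {full_loss:.4f}")
```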
Just got the email notification; I don't think you would see this issue if you removed the `Abs` nonlinearity and used a mean squared error loss.
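Something like the following, a minimal sketch assuming the library exposes a mean-squared-error loss alongside `neu.CategoricalCrossEntropy` (the name `neu.MeanSquareError` here is a guess, not a confirmed API):

```python
# Purely linear model: no Abs activation between the mesh and the drop mask.
model_linear = neu.Sequential([
    neu.ClementsLayer(N),
    neu.DropMask(N, keep_ports=range(N_classes))
])

# neu.MeanSquareError is hypothetical; substitute the library's actual MSE loss.
losses = neu.InSituAdam(
    model_linear,
    neu.MeanSquareError,
    step_size=step_size
).fit(x_train_flattened, y_train_onehot, epochs=n_epochs, batch_size=batch_size)
```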
9d35920