We might consider encapsulating the concept of a 'trainer' into the neural network, just as we encapsulated the idea of an 'activation function' object into the neural layer.
This would make training and error propagation the job of the trainer, handled polymorphically, rather than something the neural network itself does directly. That should clean up the name/enum switch issue while also leaving room to add new training types in the future.
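A minimal sketch of what that could look like, assuming a Java-style codebase; the names `Trainer`, `Backpropagation`, `NeuralNetwork`, and `Sample` are hypothetical and not taken from the current code:

```java
import java.util.List;

/** Polymorphic training strategy the network delegates to. (Hypothetical names throughout.) */
interface Trainer {
    void train(NeuralNetwork network, List<Sample> data, int epochs);
}

/** One concrete strategy; error propagation lives here, not in the network. */
class Backpropagation implements Trainer {
    private final double learningRate;

    Backpropagation(double learningRate) {
        this.learningRate = learningRate;
    }

    @Override
    public void train(NeuralNetwork network, List<Sample> data, int epochs) {
        for (int epoch = 0; epoch < epochs; epoch++) {
            for (Sample sample : data) {
                double[] output = network.feedForward(sample.inputs());
                // compute the output error against sample.targets() and propagate it
                // backwards, adjusting weights by learningRate (details omitted)
            }
        }
    }
}

/** A single training example. */
record Sample(double[] inputs, double[] targets) {}

/** The network only knows it has *a* trainer, not which kind. */
class NeuralNetwork {
    private final Trainer trainer;

    NeuralNetwork(Trainer trainer) {
        this.trainer = trainer;
    }

    double[] feedForward(double[] inputs) {
        // forward pass through the layers (details omitted)
        return inputs;
    }

    void train(List<Sample> data, int epochs) {
        trainer.train(this, data, epochs); // no name/enum switch needed
    }
}
```

With this shape, adding a new training type would just mean adding another `Trainer` implementation and passing it to the network, with no changes to the network itself or to any switch statement.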