Question about backprop #54
Hi Matt,
Is it possible to get the gradient from backpropagation using Core ML or Metal? I am trying to implement an adversarial attack on a Core ML ResNet-50 and don't know how to go about it.
By the way, the tutorials you post are excellent.
Comments
Core ML: no, gradients are not exposed in the API. Metal: yes, but you'll have to re-implement the model using the lowest-level MPS primitives, such as MPSCNNConvolutionGradient.
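As a rough sketch of that route: the forward kernel and its gradient counterpart are built from the same data source, so they share weights. (`ConvDataSource` is a hypothetical type, spelled out further down this thread; `convWeights`/`convBiases` stand in for parameters exported from the trained model.)

```swift
import MetalPerformanceShaders

// Sketch only: ConvDataSource is a hypothetical MPSCNNConvolutionDataSource
// implementation (see the later comment); convWeights/convBiases are
// placeholder [Float] arrays exported from the trained model.
let device = MTLCreateSystemDefaultDevice()!
let dataSource = ConvDataSource(weights: convWeights, biases: convBiases,
                                kernel: 3, inChannels: 3, outChannels: 64)

// Forward convolution and the matching backward (gradient) kernel.
let conv = MPSCNNConvolution(device: device, weights: dataSource)
let convGrad = MPSCNNConvolutionGradient(device: device, weights: dataSource)
```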
I see. So it would be:
1. Implement the model using MPS primitives.
2. Load the weights and biases into the MPS model.

What happens next? Are there Metal functions for the forward and backward passes, or would those also need to be implemented?
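Regarding step 2: MPS pulls the parameters through the MPSCNNConvolutionDataSource protocol rather than having you push them in, so loading the weights mostly means implementing that protocol. A minimal sketch (the `ConvDataSource` name and the layer shape are placeholders, not from this thread); for a full ResNet-50 you would repeat this per convolution layer:

```swift
import MetalPerformanceShaders

// Hypothetical data source: MPS calls load()/weights()/biasTerms() when it
// needs the parameters, so the buffers must stay valid until purge().
class ConvDataSource: NSObject, MPSCNNConvolutionDataSource {
    private let desc: MPSCNNConvolutionDescriptor
    private let weightsBuffer: UnsafeMutablePointer<Float>
    private let biasBuffer: UnsafeMutablePointer<Float>

    init(weights: [Float], biases: [Float],
         kernel: Int, inChannels: Int, outChannels: Int) {
        desc = MPSCNNConvolutionDescriptor(kernelWidth: kernel,
                                           kernelHeight: kernel,
                                           inputFeatureChannels: inChannels,
                                           outputFeatureChannels: outChannels)
        weightsBuffer = UnsafeMutablePointer<Float>.allocate(capacity: weights.count)
        weightsBuffer.initialize(from: weights, count: weights.count)
        biasBuffer = UnsafeMutablePointer<Float>.allocate(capacity: biases.count)
        biasBuffer.initialize(from: biases, count: biases.count)
    }

    deinit {
        weightsBuffer.deallocate()
        biasBuffer.deallocate()
    }

    func dataType() -> MPSDataType { .float32 }
    func descriptor() -> MPSCNNConvolutionDescriptor { desc }
    func weights() -> UnsafeMutableRawPointer { UnsafeMutableRawPointer(weightsBuffer) }
    func biasTerms() -> UnsafeMutablePointer<Float>? { biasBuffer }
    func load() -> Bool { true }    // everything is already in memory
    func purge() { }                // nothing to release between uses
    func label() -> String? { "conv" }

    // NSCopying requirement on newer SDKs; a shared read-only source is fine.
    func copy(with zone: NSZone? = nil) -> Any { self }
}
```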
It's possible a nicer API is available these days (I haven't used MPS in a while), but in the past you had to implement both the forward and backward pass yourself. So a very simple model would be MPSLinearLayer -> MPSLossFunction -> MPSLinearLayerGradient, where MPSLinearLayerGradient is the backward pass.
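In terms of the real MPSCNN kernels, the encode order for that chain looks roughly like the sketch below. This continues the earlier sketches (`device`, `conv`, `convGrad`) and assumes the iOS 11.3-era training API; the exact encode variants may differ on newer OS versions, and the image sizes are placeholders.

```swift
import MetalPerformanceShaders

// Forward -> loss -> backward encode order, sketched with real MPSCNN kernels.
let queue = device.makeCommandQueue()!
let cmdBuf = queue.makeCommandBuffer()!

let inDesc  = MPSImageDescriptor(channelFormat: .float32,
                                 width: 224, height: 224, featureChannels: 3)
let outDesc = MPSImageDescriptor(channelFormat: .float32,
                                 width: 224, height: 224, featureChannels: 64)
let input        = MPSImage(device: device, imageDescriptor: inDesc)
let output       = MPSImage(device: device, imageDescriptor: outDesc)
let lossGradient = MPSImage(device: device, imageDescriptor: outDesc) // dL/d(output)

// Forward pass: also capture the state the gradient kernel will need later.
let state = conv.resultState(sourceImage: input, sourceStates: nil,
                             destinationImage: output)!
conv.encode(commandBuffer: cmdBuf, sourceImage: input,
            destinationState: state, destinationImage: output)

// ...encode a loss kernel here (e.g. MPSCNNLoss) to fill `lossGradient`...

// Backward pass: dL/d(output) plus the saved state -> dL/d(input).
let inputGradient = convGrad.encode(commandBuffer: cmdBuf,
                                    sourceGradient: lossGradient,
                                    sourceImage: input,
                                    gradientState: state)
cmdBuf.commit()
```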
That sounds like it's still the current API. Does your book (or any of your articles) have an example implementation of this?
I don't have any examples for this, unfortunately.
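On the adversarial-attack angle of the original question: once dL/d(input) has been read back to the CPU, the classic approach is the fast gradient sign method, x_adv = x + ε · sign(∇ₓL). A minimal plain-Swift sketch (the arrays and the `epsilon` value are assumptions, not from this thread):

```swift
// Hypothetical helper: applies FGSM on the CPU, assuming `image` and
// `gradient` (dL/d(input)) were copied out of their MPSImages into [Float].
func fgsmPerturb(image: [Float], gradient: [Float], epsilon: Float) -> [Float] {
    zip(image, gradient).map { pixel, grad in
        // x_adv = x + epsilon * sign(dL/dx), clamped to the 0...1 pixel range
        let sign: Float = grad > 0 ? 1 : (grad < 0 ? -1 : 0)
        return min(max(pixel + epsilon * sign, 0), 1)
    }
}

// Usage sketch: `pixels` and `inputGradientValues` are assumed to have been
// read back from the GPU; epsilon around 0.01-0.1 is typical for 0...1 inputs.
let adversarial = fgsmPerturb(image: pixels, gradient: inputGradientValues,
                              epsilon: 0.03)
```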