
Complex number support #646

Open
ErikQQY opened this issue Nov 29, 2024 · 10 comments
Labels
core Related to the core utilities of the package

Comments

@ErikQQY
Contributor

ErikQQY commented Nov 29, 2024

Found this while implementing SciML/BoundaryValueDiffEq.jl#258

This MWE is just a proof of concept and may not be meaningful on its own, but I think it still shows that DI currently lacks support for sparse Jacobians of complex numbers.

using ForwardDiff, DifferentiationInterface, SparseMatrixColorings, ADTypes
const DI = DifferentiationInterface
#backend = AutoForwardDiff() # this works
backend = AutoSparse(
    AutoForwardDiff(),
    sparsity_detector = ADTypes.KnownJacobianSparsityDetector(ones(3, 3)),
    coloring_algorithm = ConstantColoringAlgorithm(ones(3, 3), ones(Int64, 3))
)
u0 = [1.0; 0.0; 0.0] .+ 1im
jac_cache = DI.prepare_jacobian(nothing, similar(u0), backend, u0)

Stack trace:

ERROR: MethodError: no method matching DifferentiationInterfaceSparseMatrixColoringsExt.PushforwardSparseJacobianPrep(::DifferentiationInterface.BatchSizeSettings{…}, ::SparseMatrixColorings.ColumnColoringResult{…}, ::Matrix{…}, ::Vector{…}, ::Vector{…}, ::DifferentiationInterfaceForwardDiffExt.ForwardDiffTwoArgPushforwardPrep{…})

Closest candidates are:
  DifferentiationInterfaceSparseMatrixColoringsExt.PushforwardSparseJacobianPrep(::BS, ::C, ::M, ::S, ::R, ::E) where {BS<:DifferentiationInterface.BatchSizeSettings, C<:(AbstractColoringResult{:nonsymmetric, :column}), M<:(AbstractMatrix{<:Real}), S<:(AbstractVector{<:Tuple{Vararg{T, N}} where {N, T}}), R<:(AbstractVector{<:Tuple{Vararg{T, N}} where {N, T}}), E<:DifferentiationInterface.PushforwardPrep}
   @ DifferentiationInterfaceSparseMatrixColoringsExt ~/.julia/packages/DifferentiationInterface/bulUW/ext/DifferentiationInterfaceSparseMatrixColoringsExt/jacobian.jl:11

Stacktrace:
 [1] _prepare_sparse_jacobian_aux_aux(::DifferentiationInterface.BatchSizeSettings{…}, ::SparseMatrixColorings.ColumnColoringResult{…}, ::Vector{…}, ::Tuple{…}, ::AutoSparse{…}, ::Vector{…})
   @ DifferentiationInterfaceSparseMatrixColoringsExt ~/.julia/packages/DifferentiationInterface/bulUW/ext/DifferentiationInterfaceSparseMatrixColoringsExt/jacobian.jl:107
 [2] _prepare_sparse_jacobian_aux(::DifferentiationInterface.PushforwardFast, ::Vector{…}, ::Tuple{…}, ::AutoSparse{…}, ::Vector{…})
   @ DifferentiationInterfaceSparseMatrixColoringsExt ~/.julia/packages/DifferentiationInterface/bulUW/ext/DifferentiationInterfaceSparseMatrixColoringsExt/jacobian.jl:81
 [3] prepare_jacobian(::Nothing, ::Vector{…}, ::AutoSparse{…}, ::Vector{…})
   @ DifferentiationInterfaceSparseMatrixColoringsExt ~/.julia/packages/DifferentiationInterface/bulUW/ext/DifferentiationInterfaceSparseMatrixColoringsExt/jacobian.jl:49
 [4] top-level scope
   @ ~/Random/test2.jl:27
Some type information was truncated. Use `show(err)` to see complete types.

I believe the culprit is that the type annotation of the compressed matrix is too restrictive in the SparseMatrixColorings extension; I locally changed these annotations to more generic ones and the errors are gone.

struct PushforwardSparseJacobianPrep{
    BS<:BatchSizeSettings,
    C<:AbstractColoringResult{:nonsymmetric,:column},
    M<:AbstractMatrix{<:Real},
    S<:AbstractVector{<:NTuple},
    R<:AbstractVector{<:NTuple},
    E<:PushforwardPrep,
} <: SparseJacobianPrep
    batch_size_settings::BS
    coloring_result::C
    compressed_matrix::M
    batched_seeds::S
    batched_results::R
    pushforward_prep::E
end

struct PullbackSparseJacobianPrep{
    BS<:BatchSizeSettings,
    C<:AbstractColoringResult{:nonsymmetric,:row},
    M<:AbstractMatrix{<:Real},
    S<:AbstractVector{<:NTuple},
    R<:AbstractVector{<:NTuple},
    E<:PullbackPrep,
} <: SparseJacobianPrep
    batch_size_settings::BS
    coloring_result::C
    compressed_matrix::M
    batched_seeds::S
    batched_results::R
    pullback_prep::E
end

@gdalle
Member

gdalle commented Nov 29, 2024

Note that DI currently doesn't guarantee anything vis-à-vis complex number support (this should probably be emphasized in the docs; PR welcome). The reason is that many backends don't support complex numbers at all, and the remaining ones probably have wildly diverging conventions.
So while we can relax the type annotation, complex support is not part of the API at the moment (which also means it is not tested).
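To see why conventions can diverge, consider a non-holomorphic function like conj: it has no complex derivative df/dz, yet its real 2×2 Jacobian is perfectly well defined. The following is a hedged sketch (not DI API, names are illustrative) that computes that real Jacobian by viewing ℂ as ℝ²; backends that instead expose a single complex "derivative" must pick a convention (e.g. Wirtinger ∂/∂z vs ∂/∂z̄) and will disagree on such functions.

```julia
using ForwardDiff

f(z::Complex) = conj(z)  # not holomorphic: df/dz does not exist

# Realify: treat the complex input as a pair (re, im) and return
# the (re, im) pair of the output, so ForwardDiff sees only reals.
realified(v) = (w = f(v[1] + im * v[2]); [real(w), imag(w)])

J = ForwardDiff.jacobian(realified, [1.0, 2.0])
# J == [1.0 0.0; 0.0 -1.0], the Jacobian of (x, y) -> (x, -y)
```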

@ErikQQY
Contributor Author

ErikQQY commented Nov 29, 2024

I can understand that different AD backends may support this differently, so what is the current solution? I can open a PR to relax these type annotations and add proper tests (which I believe will resolve the ForwardDiff and ReverseDiff cases). We have to start somewhere, right? 🤠

@gdalle
Member

gdalle commented Nov 29, 2024

Feel free to open a PR replacing any <:Real annotation in the package with <:Number, since it will solve your immediate problem. But I'm not ready to add complex number support to the promises of DI's public API yet, precisely because we cannot guarantee the answers will be the same across backends. And the concepts themselves are not necessarily well defined.
Do you have an idea which backends support complex numbers and how their support differs? I've never looked into it.

@gdalle gdalle changed the title Complex number support for sparse jacobian Complex number support Nov 29, 2024
@ErikQQY
Contributor Author

ErikQQY commented Nov 29, 2024

From my testing, the funny thing is that dense AD through DI just works when dealing with complex numbers (only tested with ForwardDiff); it's only sparse AD that runs into problems.
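For reference, a hedged sketch of the dense counterpart to the MWE above: same complex input, but with plain `AutoForwardDiff()` instead of `AutoSparse`. Per the `# this works` comment in the MWE, preparation reportedly succeeds on this path; the function `f` here is a hypothetical stand-in added for illustration.

```julia
using ForwardDiff, DifferentiationInterface, ADTypes
const DI = DifferentiationInterface

f(u) = u .^ 2                      # hypothetical test function
u0 = [1.0; 0.0; 0.0] .+ 1im
backend = AutoForwardDiff()        # dense backend, no sparsity machinery
prep = DI.prepare_jacobian(f, backend, u0)
```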

@gdalle
Member

gdalle commented Nov 29, 2024

Sure, but what I mean is that even if it "just works" (because DI forwards everything without constraining types), different backends may have different interpretations of what the complex derivatives mean. And I don't know if it is wise to expose these different interpretations under a common interface, possibly confusing users by letting them believe the operations are semantically the same. Does that make sense?
There's a long Discourse thread about this which I never really read in detail, but I might have to dive back in: https://discourse.julialang.org/t/taking-complex-autodiff-seriously-in-chainrules/39317

@ErikQQY
Contributor Author

ErikQQY commented Nov 30, 2024

And I don't know if it is wise to expose these different interpretations under a common interface, possibly confusing users by letting them believe the operations are semantically the same. Does that make sense?

Yeah, that makes sense. For now I'll just relax the type annotations of the compressed matrix for my specific use case.

@gdalle
Member

gdalle commented Dec 5, 2024

Added a docs warning in #651

@gdalle gdalle added the core Related to the core utilities of the package label Dec 5, 2024
@wsmoses

wsmoses commented Dec 9, 2024

This feels like something that would be really great to have in DI (e.g. have a DI convention, and convert the corresponding package gradient/etc. calls to be able to use it).

@gdalle
Member

gdalle commented Dec 9, 2024

But even if we tried to homogenize the outputs/interpretation, there is nothing we can do about inputs. For instance, ForwardDiff's gradient will fail on a function with complex input, even if the function is real-valued.
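A hedged workaround sketch for exactly this case (names are illustrative, not DI API): since ForwardDiff requires real inputs, a real-valued function of a complex variable can still be differentiated by realifying the input first.

```julia
using ForwardDiff

f(z::Complex) = abs2(z)             # real-valued, complex input

# ForwardDiff.gradient(f, ...) cannot be called with a complex input,
# so differentiate the equivalent function on R^2 instead:
g(v) = f(v[1] + im * v[2])
grad = ForwardDiff.gradient(g, [1.0, 2.0])
# grad == [2.0, 4.0], i.e. (2x, 2y) for z = x + iy
```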

@wsmoses

wsmoses commented Dec 9, 2024

Sure, just throw a nicer error message and mark it as unsupported in your "is backend X supported on function Y" table.
