feat: add PositiveDefinite
#89
base: main
Conversation
…underlying model
….,hcat,eachcol(...))` for compatibility with GPUArrays
…avor of comparing absolute difference with a small threshold
Force-pushed from 0445dd8 to ebf0efe
src/layers/containers.jl (Outdated)

```julia
"""
@concrete struct PositiveDefinite <: AbstractLuxWrapperLayer{:model}
    model <: AbstractLuxLayer
    x0 <: AbstractVector
```
Don't store a vector here. Instead, pass in an initialization function (ideally from WeightInitializers.jl) and construct the vector inside `initialstates`.
I believe I've resolved this, but I'd like you to check whether I did what you were asking when you get the chance.
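For reference, a hedged sketch (pseudocode in Julia syntax) of the pattern I understand the reviewer to be suggesting — the struct layout and field names here are illustrative, not taken from this PR:

```julia
# Sketch only: store an init function instead of a concrete vector,
# and materialize x0 when the states are initialized.
@concrete struct PositiveDefinite <: AbstractLuxWrapperLayer{:model}
    model <: AbstractLuxLayer
    init_x0          # e.g. zeros32 from WeightInitializers.jl
    in_dims::Int
end

function Lux.initialstates(rng::AbstractRNG, pd::PositiveDefinite)
    # Build the reference point here so it lives in the states, not in the layer.
    return (; model=Lux.initialstates(rng, pd.model),
              x0=pd.init_x0(rng, pd.in_dims))
end
```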
src/layers/containers.jl (Outdated)

```julia
    in_val <: AbstractVector
    out_val <: AbstractVector
```
same as above
…dices to leave alone.
src/layers/containers.jl (Outdated)

```julia
end
function PositiveDefinite(model; in_dims::Integer, ψ=Base.Fix1(sum, abs2),
        r=Base.Fix1(sum, abs2) ∘ -)
    return PositiveDefinite(model, () -> zeros(in_dims), ψ, r)
```
pass in `zeros32`

?

`() -> copy(x0)`

Here you can pass a dummy function that takes in `(rng, in_dims)` and ignores them.
Do you mean something like the change I just made?
```julia
end

function (pd::PositiveDefinite)(x::AbstractMatrix, ps, st)
    ϕ0, _ = pd.model(st.x0, ps, st.model)
```
don't ignore the returned states here
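A hedged sketch (pseudocode in Julia syntax) of what that fix could look like — the body is abbreviated and `y` stands in for whatever the forward pass computes; only the state handling is the point:

```julia
function (pd::PositiveDefinite)(x::AbstractMatrix, ps, st)
    # Capture the updated model states instead of discarding them.
    ϕ0, st_model = pd.model(st.x0, ps, st.model)
    # ... rest of the forward pass, abbreviated ...
    # Thread the updated sub-model states back into the returned states.
    return y, merge(st, (; model=st_model))
end
```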
```julia
end

function (s::ShiftTo)(x::AbstractMatrix, ps, st)
    ϕ0, _ = s.model(st.in_val, ps, st.model)
```
same as above
If `model` is the mathematical function `ShiftTo` is supposed to represent (`PositiveDefinite` does something similar), then in a sense… Is it possible to combine the calls? I suppose I could call with `hcat(x, st.in_val)`, but I don't know how I'd separate the last column back out without using the scalar indexing that CUDA hates so much.
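One GPU-friendly option (a stdlib-only sketch, with a random matrix standing in for the batched model output on `hcat(x, st.in_val)`): range indexing on columns returns array views/copies without per-element access, so it avoids the scalar indexing CUDA forbids.

```julia
# Stand-in for the model output on hcat(x, st.in_val):
# 4 data columns from x plus 1 reference column from in_val.
ϕ = rand(3, 5)
ϕx = ϕ[:, 1:end-1]   # columns corresponding to the batch x
ϕ0 = ϕ[:, end:end]   # reference column, kept as a 3×1 matrix so it broadcasts against ϕx
@assert size(ϕx) == (3, 4) && size(ϕ0) == (3, 1)
```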
```diff
@@ -11,7 +11,7 @@ using Static: Static

 using ForwardDiff: ForwardDiff

-using Lux: Lux, LuxOps, StatefulLuxLayer
+using Lux: Lux, LuxOps, StatefulLuxLayer, WeightInitializers
```
Import it as a package, not from Lux.
A `PositiveDefinite` container wraps an underlying model and produces a model that returns a positive number whenever the input is nonzero (or not equal to a different point specified when defining the container). This is useful in, among other applications, neural Lyapunov methods.
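A hedged usage sketch (pseudocode in Julia syntax, assuming the `PositiveDefinite(model; in_dims, ...)` constructor shown earlier in this PR; the wrapped network and dimensions are made up for illustration):

```julia
using Lux, Random

# Hypothetical example: wrap a small MLP so its output is positive away from x0.
model = Chain(Dense(2 => 16, tanh), Dense(16 => 4))
pd = PositiveDefinite(model; in_dims=2)   # reference point defaults to zeros(2)

ps, st = Lux.setup(Random.default_rng(), pd)
y, st = pd(rand(Float32, 2, 8), ps, st)   # y should be positive for nonzero input columns
```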