I was experimenting with alternatives to FluxML/Optimisers.jl#57 when I
encountered the following weird issue.
Look at the number of allocations when computing the gradient of `loss1`:

```julia
function loss1(m)
    ls = 0f0
    for l in Functors.fleaves(m)
        if l isa AbstractArray{<:Number}
            ls += sum(l)
        end
    end
    return ls
end

function loss2(m)
    sum(sum(l) for l in Functors.fleaves(m) if l isa AbstractArray{<:Number})
end

function loss3(m)
    sum([sum(l) for l in Functors.fleaves(m) if l isa AbstractArray{<:Number}])
end

function perf()
    m = Chain(Dense(128 => 128, relu), BatchNorm(3), Dense(128 => 10))
    @btime gradient(loss1, $m)[1]
    @btime gradient(loss2, $m)[1]
    @btime gradient(loss3, $m)[1]
    println()
end

perf(); # 1st call
perf(); # 2nd call
perf(); # 3rd call
```

Output:

```
154.795 ms (1022652 allocations: 39.16 MiB)
1.734 ms (7605 allocations: 352.62 KiB)
1.314 ms (5948 allocations: 288.08 KiB)

258.556 ms (1658450 allocations: 63.37 MiB)
1.735 ms (7605 allocations: 352.62 KiB)
1.316 ms (5948 allocations: 288.08 KiB)

336.418 ms (2154374 allocations: 82.29 MiB)
1.739 ms (7605 allocations: 352.62 KiB)
1.319 ms (5948 allocations: 288.08 KiB)
```
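One way to check whether `Functors.fleaves` or the Flux layers are involved at all is to run the same loop-vs-generator pattern on a plain vector of arrays. A minimal sketch (the `loop_sum`/`gen_sum` names and this harness are mine, not part of the original report):

```julia
using Zygote, BenchmarkTools

# Same shape as loss1: explicit loop with scalar accumulation.
function loop_sum(xs)
    s = 0f0
    for x in xs
        if x isa AbstractArray{<:Number}
            s += sum(x)
        end
    end
    return s
end

# Same shape as loss2: a generator fed directly to sum.
gen_sum(xs) = sum(sum(x) for x in xs if x isa AbstractArray{<:Number})

xs = [rand(Float32, 128) for _ in 1:100]
@btime gradient(loop_sum, $xs)
@btime gradient(gen_sum, $xs)
```

If the allocation gap reproduces here, it would point at how Zygote differentiates the explicit loop with scalar accumulation rather than at `fleaves` itself; if it doesn't, the leaf traversal becomes the suspect.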
What's going on?