Error in hyperelasticity demo: function NNlib.σ must be explicitly imported
I just tried to test the hyperelasticity demo, but I get the error below on this line:
σ(∇u) = (1.0/J(F(∇u)))*F(∇u)⋅S(∇u)⋅(F(∇u))'
ERROR: LoadError: error in method definition: function NNlib.σ must be explicitly imported to be extended
Stacktrace:
[1] top-level scope
@ none:0
I am using Julia Version 1.6.3 on Ubuntu 20.04 with VScode.
Hi @Kevin-Mattheus-Moerman, thanks for reporting!
It seems that some dependency has introduced a function σ that is conflicting with our definition, but I don't understand why...
Are you explicitly calling using NNlib? Or are you just running the tutorial verbatim in a fresh Julia session? (This last detail is important.)
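For reference, a minimal sketch of the kind of clash being described (assuming the name σ comes in through Flux, which re-exports NNlib's sigmoid):

# In a fresh Julia session:
using Flux        # re-exports NNlib.σ (the sigmoid), so the name σ is now taken in Main

σ(x) = 2x         # any new method definition for the name σ then fails with:
# ERROR: error in method definition: function NNlib.σ must be explicitly imported to be extended

# Two ways around it (sketches, not an official fix):
# 1. give the local function a different name, e.g. sigma_cauchy(∇u) = ..., or
# 2. start a fresh Julia session that never loads Flux before running the tutorial.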
@fverdugo when I restart Julia, the script runs fine. It seems that it fails if I first run a script featuring Flux.jl, see below. That script uses a symbol σ, is that the problem then? Is there a way to clear definitions before running a script? (As you can see, I am new to Julia, so I did not know these two scripts could "bite" each other in this way.)
using CairoMakie, Flux, Statistics
## Auxiliary functions for generating our data
function generate_real_data(n)
    x1 = rand(1,n) .- 0.5
    x2 = (x1 .* x1)*3 .+ randn(1,n)*0.1
    return vcat(x1,x2)
end
function generate_fake_data(n)
    θ = 2*π*rand(1,n)
    r = rand(1,n)/3
    x1 = @. r*cos(θ)
    x2 = @. r*sin(θ) + 0.5
    return vcat(x1,x2)
end
function NeuralNetwork()
    return Chain(
        Dense(2, 25, relu),
        Dense(25, 1, x -> σ.(x))   # σ here is the sigmoid exported by Flux/NNlib
    )
end
## Creating our data
train_size = 5000
real = generate_real_data(train_size)
fake = generate_fake_data(train_size)
## Visualizing
fig1 = Figure()
ax1 = Axis(fig1[1,1],title = "Some dots")
scatter!(ax1,real[1,1:500],real[2,1:500])
scatter!(ax1,fake[1,1:500],fake[2,1:500])
fig1
# Organizing the data in batches
X = hcat(real,fake)
Y = vcat(ones(train_size),zeros(train_size))
data = Flux.Data.DataLoader((X, Y'), batchsize=100,shuffle=true);
# Defining our model, optimization algorithm and loss function
m = NeuralNetwork()
opt = Descent(0.05)
loss(x, y) = sum(Flux.Losses.binarycrossentropy(m(x), y))
# Training Method 1
ps = Flux.params(m)
epochs = 20
for i in 1:epochs
    Flux.train!(loss, ps, data, opt)
end
println(mean(m(real)), " ", mean(m(fake)))   # print mean model prediction for each class
# Visualizing the model predictions
fig2 = Figure()
ax2 = Axis(fig2[1,1], title = "Model predictions")
scatter!(ax2, real[1,1:100], real[2,1:100], color = vec(m(real[:,1:100])))   # Makie uses color, not zcolor
scatter!(ax2, fake[1,1:100], fake[2,1:100], color = vec(m(fake[:,1:100])))
fig2
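For what it's worth, one way to keep two such scripts from stepping on each other (a sketch only; the file and module names are made up for illustration) is to run the Flux script inside its own module, so the σ it pulls in via using Flux never lands in Main:

# Hypothetical wrapper: keep the classifier's namespace separate from Main.
module FluxDemo
    include("flux_classifier.jl")   # hypothetical file holding the script above
end

# Main is then still free to define its own σ, e.g. the Cauchy stress from the tutorial:
# σ(∇u) = (1.0/J(F(∇u)))*F(∇u)⋅S(∇u)⋅(F(∇u))'

The simplest fix, though, remains what was observed above: run each script in its own fresh Julia session.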
This is resolved, thanks!