Commit 026573f
Minor: printed chart in multi-branch nn tutorial
sylvaticus committed Jul 7, 2023
1 parent 8bf69fc commit 026573f
Showing 1 changed file with 3 additions and 1 deletion.
@@ -88,7 +88,7 @@ l8 = DenseLayer(15,1,f=relu,rng=copy(AFIXEDRNG))

# Finally we put the layers together and we create our `NeuralNetworkEstimator` model:
layers = [l1,l2,l3,l4,l5,l6,l7,l8]
-m = NeuralNetworkEstimator(layers=layers,opt_alg=ADAM(),epochs=100,verbosity=HIGH,rng=copy(AFIXEDRNG))
+m = NeuralNetworkEstimator(layers=layers,opt_alg=ADAM(),epochs=100,rng=copy(AFIXEDRNG))

# ## Fitting the model
println(now(), " ", "- model fitting..." ) #src
@@ -99,12 +99,14 @@ Ŷ = fit!(m,X,Y)
println(now(), " ", "- assessing the model quality..." ) #src
# We can compute the relative mean error between the "true" Y and the Y estimated by the model.
rme = relative_mean_error(Y,Ŷ)
@test rme <0.1 #src

# Of course we know there is no actual relation here between the X and the Y, as both are randomly generated. The result above just tells us that the network has been able to find a path between the X and the Y used for training, but we hope that in a real application this learned path represents a true, general relation between the inputs and the outputs.

# Finally we can also plot Y against Ŷ and visualize how the average loss decreased during training:
scatter(Y,Ŷ,xlabel="vol observed",ylabel="vol estimated",label=nothing,title="Est vs. obs volumes")

#-
loss_per_epoch = info(m)["loss_per_epoch"]

plot(loss_per_epoch, xlabel="epoch", ylabel="loss per epoch", label=nothing, title="Loss per epoch")
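For context on the `relative_mean_error` check added in the diff above: a common definition of relative mean error scales the mean absolute deviation by the mean magnitude of the observed values. The sketch below (in Python for illustration) follows that common definition; it is an assumption, not necessarily BetaML's exact formula.

```python
import numpy as np

def relative_mean_error(y, y_hat):
    """Mean absolute deviation between observed and estimated values,
    scaled by the mean magnitude of the observations. One common
    definition; BetaML's implementation may differ in detail."""
    y = np.asarray(y, dtype=float)
    y_hat = np.asarray(y_hat, dtype=float)
    return np.mean(np.abs(y_hat - y)) / np.mean(np.abs(y))

# A perfect estimate gives zero error:
print(relative_mean_error([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # 0.0
```

Under this definition, the `@test rme < 0.1` assertion requires the average estimation error to stay below 10% of the average observed magnitude.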

2 comments on commit 026573f

@sylvaticus (Owner, Author)
@JuliaRegistrator register

Release notes:

  • NN: added ReplicatorLayer, GroupedLayer and a related tutorial to implement and train multi-branch deep networks
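The multi-branch building block mentioned in the release note can be illustrated conceptually. The sketch below is a hand-rolled NumPy approximation of what a "grouped" layer does, assuming it splits its input across parallel sub-layers and concatenates their outputs; it is not BetaML's actual GroupedLayer implementation, whose API may differ.

```python
import numpy as np

def dense(w, b, f=np.tanh):
    """Return a simple dense-layer function: x -> f(w @ x + b)."""
    return lambda x: f(w @ x + b)

def grouped_forward(branches, split_points, x):
    """Conceptual multi-branch ('grouped') forward pass: split the input
    vector at the given indices, feed each chunk to its own branch, and
    concatenate the branch outputs (illustrative only)."""
    chunks = np.split(x, split_points)
    return np.concatenate([b(c) for b, c in zip(branches, chunks)])

rng = np.random.default_rng(123)
# Two branches: the first maps 2 inputs to 3 outputs, the second 2 to 1.
branch1 = dense(rng.normal(size=(3, 2)), np.zeros(3))
branch2 = dense(rng.normal(size=(1, 2)), np.zeros(1))

x = np.array([0.5, -1.0, 2.0, 0.1])
y = grouped_forward([branch1, branch2], [2], x)
print(y.shape)  # (4,) -- 3 outputs from branch 1 plus 1 from branch 2
```

The point of such a layer is that each branch sees only its own slice of the input, so separate sub-networks can process separate groups of features before being merged downstream.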

@JuliaRegistrator

Registration pull request created: JuliaRegistries/General/87060

After the above pull request is merged, it is recommended that a tag is created on this repository for the registered package version.

This will be done automatically if the Julia TagBot GitHub Action is installed, or can be done manually through the github interface, or via:

git tag -a v0.10.2 -m "<description of version>" 026573f5ea5c9476ccd27066acdb17c7636a7f64
git push origin v0.10.2
