Problem with Examples: Activation Units
filmo opened this issue · 2 comments
filmo commented
Hi, just installed this a few minutes ago and ended up using the example code on the preliminary site (rather than the included code). Looks like the old activation methods are still being used which causes an error with the current version of deeppy.
For example:
dp.Activation('relu')
http://andersbll.github.io/deeppy-website/examples/index.html
May want to update code on site in case anybody else runs across this.
(The example code on GitHub does work)
andersbll commented
Hey, thank you for the info! I have a larger update of the codebase pending and will make sure this is fixed at that point.
Ziul commented
For anyone trying the examples, here are the updates needed as far as I've seen (version `deeppy==0.1.dev0`):

- `dataset.arrays` replaces `dataset.data`
- `dp.SupervisedFeed` replaces `dp.SupervisedInput`
- `dp.Feed` replaces `dp.Input`
- `dp.ReLU()` replaces `dp.Activation('relu')`
- `dp.GradientDescent` takes different params; it looks like `dp.GradientDescent.train_epochs` replaces the old `max_epochs` argument, as `max_epochs` isn't a param of `dp.GradientDescent` anymore.
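
To keep the renames in one place, here's a small Python sketch of a lookup helper. It is purely illustrative (not part of deeppy); the old→new pairs come from this thread and are only checked against `deeppy==0.1.dev0`:

```python
# Renames collected from this thread (deeppy 0.1.dev0).
# Keys are the old calls used in the site examples; values are the
# current equivalents. Not verified against any other deeppy release.
DEEPPY_RENAMES = {
    "dataset.data": "dataset.arrays",
    "dp.SupervisedInput": "dp.SupervisedFeed",
    "dp.Input": "dp.Feed",
    "dp.Activation('relu')": "dp.ReLU()",
}

def port_hint(old_call):
    """Return the suggested replacement for an old deeppy call,
    or the call unchanged if no rename is known."""
    return DEEPPY_RENAMES.get(old_call, old_call)

print(port_hint("dp.Activation('relu')"))  # dp.ReLU()
```

(`dp.GradientDescent` is left out of the table because it isn't a simple rename: the `max_epochs` constructor argument is gone and `train_epochs` is used instead.)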