How to decode the transform result?
y9c opened this issue · 7 comments
How to decode the transform result, and how to calculate the similarity between the decoded result and the original matrix?
At present there is no subroutine (is it really needed?) defined to get the decoded result. The autoencoder gives you the latent features only. However, you can easily get the decoded result.
Check this commented line in the code:
https://github.com/rajarsheem/libsdae/blob/master/deepautoencoder/stacked_autoencoder.py#L129
Also, I feel that a similarity check is not a suitable subroutine. In fact, the loss itself is a metric that evaluates the similarity. But again, you can always fork the repository and add your modifications.
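For example, here is a minimal sketch of such a check done outside the library, assuming you already have the original input and the decoded reconstruction as numpy arrays (the names x_orig and x_decoded are hypothetical, not part of the library):

import numpy as np

def reconstruction_mse(x_orig, x_decoded):
    # Mean squared error between the input and its reconstruction.
    # A small value means the decoded output is close to the original
    # matrix, which is the same thing the training loss already measures.
    return np.mean((x_orig - x_decoded) ** 2)

# Hypothetical usage, once you have captured the decoded output:
# print('reconstruction MSE:', reconstruction_mse(x_orig, x_decoded))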
Hi, @rajarsheem
I tried to read the code. It seems that loss calculation is not included in the transform function.
BTW, after uncommenting line 129, this error occurs.
127 print('epoch {0}: global loss = {1}'.format(i, l))
128 # debug
--> 129 print('Decoded', sess.run(decoded, feed_dict={x: self.data_x_})[0])
130 self.weights.append(sess.run(encode['weights']))
131 self.biases.append(sess.run(encode['biases']))
AttributeError: 'StackedAutoEncoder' object has no attribute 'data_x_'
I am a beginner with tensorflow, and it is a little difficult for me to fork this code. Can you show me how to include the similarity check in the transform subroutine?
Thank you for your reply.
The transform function gives you latent features (in other words, it converts the input of a layer into its hidden features); it doesn't calculate loss.
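To make that concrete, here is a rough numpy sketch of a single encoding step of the kind transform applies per layer, assuming a sigmoid activation (the names are illustrative, not the library's internals):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def encode_layer(x, weights, biases):
    # Map the input of one layer to its hidden (latent) features.
    # transform() chains such steps layer by layer; it never applies
    # the decoder, which is why it reports no loss.
    return sigmoid(x @ weights + biases)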
Regarding the error, omit the self (the input is a local variable in that routine, not an attribute of the object). Check this. You have to run decoded in the same way as done in line 132.
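In other words, a minimal sketch of the corrected debug line, assuming the local input array inside the training routine is named data_x (a local variable, so it takes no self. prefix; adjust the name to whatever the routine actually receives):

# after `decoded` is defined and while the session is still open:
print('Decoded', sess.run(decoded, feed_dict={x: data_x})[0])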
@rajarsheem thank you very much. I am trying to edit the code now.
I hope your issue is resolved. I am closing it for now. Have a nice day!
Thank you @rajarsheem,
With line 129 added I can print the decoded result now, though not the loss. I ran the test.py script, and the printed output is as below. I wonder whether these values are too big.
# for the testing set
Decoded [ 0.19410324 -0.80855119 0.04349327 0.45655292 0.7569443 0.29246199
0.57239246 -0.73703629 1.04319215 0.78330296 0.08215475 -0.03966716
0.22089517 0.29183397 1.06575072 0.49130109 -0.89586771 0.07742095
0.18414375 0.29854727 -0.00219518 0.41591752 -0.3404398 0.82669973
-0.37267661 -0.5081231 -1.1587857 0.300556 1.51589501 -0.56638539
-0.21519709 -0.27730778 -0.04051077 0.08296168 0.25760895 -0.6262818
0.37675434 0.48922759 0.76240593 0.06220865 0.08346117 -0.21002531
-0.12697542 -0.4284333 -0.20329994 0.1062052 -0.29259431 -0.15798257
0.81781769 -0.56369209 0.26571798 0.60078782 0.33278495 0.14875543
0.52916014 -0.16898535 -0.42866033 -0.84731263 0.41544652 -0.12549052
0.38813394 -0.26518467 -0.0144836 -0.12156664 0.58113551 0.21133372
-0.39966226 -1.18121099 -1.70309532 -0.23637436 -0.15937304 -0.300762
0.14584368 -0.62949514 -0.01828933 0.82925105 -0.15204427 0.21565779
-0.02752241 1.42342949 -1.79351115 0.70898336 0.22771528 1.5639832
0.40643287 1.02484238 -0.52410579 0.44012505 0.19901231 -0.21501279
-0.8748368 0.16990572 -0.10545719 -0.09983534 -0.00859177 -0.66635561
....
# for the training set
Decoded [ -5.16670704e-01 1.60838604e-01 -1.29701644e-02 -9.31391358e-01
-1.66932344e-01 -1.05433667e+00 3.31253409e-01 -1.71889424e+00
-1.33675396e-01 -3.03752482e-01 1.79831088e-01 4.81573641e-01
3.58204812e-01 -4.08629298e-01 -1.06527507e-01 5.44087410e-01
1.63092101e+00 4.42952484e-01 -3.21321845e-01 -2.01633215e-01
1.69724226e-02 6.64343536e-01 -5.90095997e-01 -7.56379366e-01
-6.84591353e-01 -1.10545361e+00 -4.00293201e-01 -5.05989790e-01
4.01353598e-01 6.78894699e-01 4.29206967e-01 -1.05870366e+00
-1.55221611e-01 7.53084540e-01 -4.03234839e-01 5.29231727e-01
-2.10052341e-01 -6.83722973e-01 -4.72686857e-01 5.33213258e-01
1.11254203e+00 -7.47130215e-01 8.98506641e-02 -3.84994984e-01
-5.52200556e-01 4.95311201e-01 5.02683222e-01 2.13940725e-01
-1.73815727e-01 1.03989553e+00 -1.56957597e-01 1.28602672e+00
5.88526547e-01 8.94701481e-01 -5.63555896e-01 -5.79340816e-01
1.36770725e-01 1.11180925e+00 -1.31206870e-01 2.35631168e-01
2.93583632e-01 5.05746007e-02 -1.62091553e-01 -1.03108120e+00
-4.33603138e-01 -5.78906775e-01 1.33094633e+00 4.42742586e-01
5.32514811e-01 2.55604625e-01 5.79367936e-01 2.70208329e-01
-1.52480209e+00 4.66503739e-01 2.95481324e-01 -1.32633328e+00
5.40029466e-01 1.22016490e+00 2.08209664e-01 2.91971147e-01
3.95809412e-01 1.01738572e-01 9.45007801e-03 2.62386739e-01
...
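As a quick sanity check on whether decoded values like these are too big, one can compare their scale against the original input; a minimal numpy sketch, assuming both arrays are at hand (x_orig and x_decoded are hypothetical names):

import numpy as np

def summarize(name, a):
    # Print simple scale statistics so the input and its reconstruction
    # can be compared side by side.
    print('{0}: min={1:.3f} max={2:.3f} mean={3:.3f} std={4:.3f}'.format(
        name, a.min(), a.max(), a.mean(), a.std()))

# Hypothetical usage:
# summarize('original', x_orig)
# summarize('decoded', x_decoded)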