bearpaw/pytorch-pose

Why is the loss always very small?

Closed this issue · 1 comment

Hi everyone. I have read the code and trained the models of this pytorch-pose repo. I am confused about why the training loss and validation loss are so small (always around 1e-3). It looks different from training other deep learning models. Below is my training log for stacks=8, blocks=1. Thank you for the help!
Epoch LR Train Loss Val Loss Train Acc Val Acc
1.000000 0.000100 0.040519 0.006604 0.066519 0.154301
2.000000 0.000100 0.005778 0.005863 0.196297 0.295626
3.000000 0.000100 0.005601 0.007943 0.249709 0.333515
4.000000 0.000100 0.005433 0.006463 0.292266 0.382962
5.000000 0.000100 0.005278 0.011357 0.340348 0.433160
6.000000 0.000100 0.005151 0.013878 0.379125 0.458898
7.000000 0.000100 0.005042 0.011793 0.414802 0.508453
8.000000 0.000100 0.004942 0.006337 0.445820 0.515834
9.000000 0.000100 0.004853 0.008923 0.476248 0.553932
10.000000 0.000100 0.004755 0.005029 0.510092 0.593099
11.000000 0.000100 0.004662 0.004275 0.542758 0.619362
12.000000 0.000100 0.004590 0.004533 0.563967 0.634098
13.000000 0.000100 0.004530 0.004842 0.581655 0.632911
14.000000 0.000100 0.004477 0.004521 0.595700 0.646019
15.000000 0.000100 0.004435 0.004461 0.606501 0.667473
16.000000 0.000100 0.004382 0.004361 0.619863 0.679526
17.000000 0.000100 0.004341 0.004131 0.630043 0.691884
18.000000 0.000100 0.004302 0.004430 0.638425 0.689532
19.000000 0.000100 0.004270 0.004337 0.645869 0.695146
20.000000 0.000100 0.004236 0.004448 0.652779 0.694888
21.000000 0.000100 0.004204 0.004129 0.658970 0.716643
22.000000 0.000100 0.004180 0.004079 0.664444 0.710762
23.000000 0.000100 0.004145 0.004864 0.673128 0.702015
24.000000 0.000100 0.004116 0.004012 0.677755 0.719457
25.000000 0.000100 0.004089 0.004043 0.683418 0.726996
26.000000 0.000100 0.004075 0.004792 0.687907 0.717509
27.000000 0.000100 0.004049 0.004221 0.691811 0.729067
28.000000 0.000100 0.004026 0.004235 0.698158 0.728886
29.000000 0.000100 0.004002 0.003933 0.703373 0.743552
30.000000 0.000100 0.003983 0.003977 0.707411 0.738715
31.000000 0.000100 0.003959 0.003856 0.712823 0.749882
32.000000 0.000100 0.003947 0.004130 0.716247 0.750792
33.000000 0.000100 0.003936 0.004193 0.717466 0.743339
34.000000 0.000100 0.003905 0.004147 0.722726 0.748920
35.000000 0.000100 0.003887 0.003915 0.727711 0.749792
36.000000 0.000100 0.003873 0.004049 0.727161 0.759734
37.000000 0.000100 0.003859 0.004056 0.731356 0.740315
38.000000 0.000100 0.003834 0.003991 0.737066 0.766997
39.000000 0.000100 0.003833 0.004084 0.736956 0.761718
40.000000 0.000100 0.003819 0.003690 0.739787 0.769920
41.000000 0.000100 0.003797 0.003952 0.745627 0.760283
42.000000 0.000100 0.003791 0.003901 0.746524 0.772981
43.000000 0.000100 0.003773 0.004090 0.748357 0.766894
44.000000 0.000100 0.003758 0.003963 0.751360 0.774523
45.000000 0.000100 0.003753 0.004113 0.752556 0.775152
46.000000 0.000100 0.003728 0.004195 0.758551 0.782942
47.000000 0.000100 0.003711 0.003826 0.759776 0.781414
48.000000 0.000100 0.003717 0.003780 0.758977 0.783643
49.000000 0.000100 0.003705 0.004291 0.762976 0.783146
50.000000 0.000100 0.003684 0.003696 0.765159 0.782737
51.000000 0.000100 0.003675 0.003813 0.768794 0.788034
52.000000 0.000100 0.003665 0.003854 0.770016 0.793802
53.000000 0.000100 0.003661 0.003855 0.770352 0.787637
54.000000 0.000100 0.003640 0.003734 0.774723 0.790245
55.000000 0.000100 0.003636 0.003884 0.775233 0.794752
56.000000 0.000100 0.003624 0.003924 0.776930 0.785818
57.000000 0.000100 0.003613 0.003705 0.779447 0.796602
58.000000 0.000100 0.003604 0.003853 0.781621 0.795611
59.000000 0.000100 0.003594 0.003764 0.782702 0.798791
60.000000 0.000100 0.003591 0.003811 0.783326 0.797562
61.000000 0.000010 0.003265 0.003198 0.801716 0.814483
62.000000 0.000010 0.003238 0.003191 0.806262 0.815575
63.000000 0.000010 0.003230 0.003192 0.808299 0.814489
64.000000 0.000010 0.003216 0.003176 0.811464 0.815816
65.000000 0.000010 0.003214 0.003177 0.809532 0.817346
66.000000 0.000010 0.003201 0.003169 0.813702 0.817257
67.000000 0.000010 0.003202 0.003175 0.814226 0.814987
68.000000 0.000010 0.003195 0.003175 0.814527 0.816386
69.000000 0.000010 0.003190 0.003159 0.815462 0.819750
70.000000 0.000010 0.003196 0.003170 0.814338 0.817474
71.000000 0.000010 0.003187 0.003166 0.816677 0.820087
72.000000 0.000010 0.003186 0.003158 0.817255 0.820635
73.000000 0.000010 0.003181 0.003165 0.816634 0.818904
74.000000 0.000010 0.003184 0.003162 0.817163 0.819837
75.000000 0.000010 0.003176 0.003158 0.818142 0.818684
76.000000 0.000010 0.003172 0.003158 0.819772 0.820288
77.000000 0.000010 0.003169 0.003159 0.820208 0.819323
78.000000 0.000010 0.003167 0.003153 0.820191 0.820840
79.000000 0.000010 0.003164 0.003160 0.821483 0.820544
80.000000 0.000010 0.003164 0.003154 0.820176 0.820650
81.000000 0.000010 0.003160 0.003155 0.821318 0.822272
82.000000 0.000010 0.003157 0.003146 0.822337 0.823513
83.000000 0.000010 0.003163 0.003163 0.821482 0.819525
84.000000 0.000010 0.003157 0.003154 0.822139 0.822554
85.000000 0.000010 0.003152 0.003150 0.823072 0.824411
86.000000 0.000010 0.003151 0.003149 0.824456 0.823629
87.000000 0.000010 0.003150 0.003155 0.822820 0.822656
88.000000 0.000010 0.003152 0.003143 0.824378 0.824872
89.000000 0.000010 0.003146 0.003139 0.824860 0.823768
90.000000 0.000010 0.003136 0.003139 0.826952 0.825458
91.000000 0.000001 0.003135 0.003131 0.825520 0.825208
92.000000 0.000001 0.003128 0.003131 0.826755 0.824568
93.000000 0.000001 0.003131 0.003133 0.827131 0.825824
94.000000 0.000001 0.003126 0.003132 0.827385 0.824603
95.000000 0.000001 0.003128 0.003133 0.826717 0.824171
96.000000 0.000001 0.003131 0.003135 0.828059 0.824281
97.000000 0.000001 0.003127 0.003130 0.827289 0.824826
98.000000 0.000001 0.003121 0.003131 0.828627 0.823672
99.000000 0.000001 0.003127 0.003132 0.827220 0.825334
100.000000 0.000001 0.003126 0.003133 0.828195 0.823772
101.000000 0.000001 0.003122 0.003133 0.828492 0.825362
102.000000 0.000001 0.003123 0.003134 0.827998 0.825257
103.000000 0.000001 0.003120 0.003131 0.829216 0.825391
104.000000 0.000001 0.003131 0.003134 0.826828 0.824552
105.000000 0.000001 0.003121 0.003130 0.828140 0.826133
106.000000 0.000001 0.003124 0.003133 0.826996 0.824674
107.000000 0.000001 0.003125 0.003131 0.827876 0.826003
108.000000 0.000001 0.003122 0.003129 0.827146 0.826141
109.000000 0.000001 0.003118 0.003126 0.829902 0.827371
110.000000 0.000001 0.003120 0.003130 0.828066 0.825935
111.000000 0.000001 0.003116 0.003127 0.828986 0.825615
112.000000 0.000001 0.003125 0.003123 0.827381 0.826918
113.000000 0.000001 0.003123 0.003127 0.828666 0.824989
114.000000 0.000001 0.003119 0.003130 0.827995 0.825304
115.000000 0.000001 0.003121 0.003125 0.828034 0.825756
116.000000 0.000001 0.003119 0.003129 0.829143 0.825079
117.000000 0.000001 0.003122 0.003127 0.829562 0.826239
118.000000 0.000001 0.003120 0.003125 0.828884 0.825675
119.000000 0.000001 0.003118 0.003129 0.829933 0.824718
120.000000 0.000001 0.003122 0.003132 0.827937 0.823845
121.000000 0.000000 0.003118 0.003127 0.828167 0.826449
122.000000 0.000000 0.003114 0.003124 0.830013 0.826401
123.000000 0.000000 0.003120 0.003126 0.828216 0.826354
124.000000 0.000000 0.003115 0.003124 0.828879 0.826535
125.000000 0.000000 0.003113 0.003128 0.828978 0.826570
126.000000 0.000000 0.003124 0.003129 0.827313 0.824960
127.000000 0.000000 0.003120 0.003130 0.828555 0.825859
128.000000 0.000000 0.003121 0.003130 0.828872 0.825361
129.000000 0.000000 0.003122 0.003129 0.829600 0.824821
130.000000 0.000000 0.003112 0.003127 0.830041 0.826295
131.000000 0.000000 0.003110 0.003128 0.830714 0.825847
132.000000 0.000000 0.003117 0.003129 0.830742 0.824571
133.000000 0.000000 0.003118 0.003126 0.829283 0.827287
134.000000 0.000000 0.003123 0.003127 0.828451 0.825599
135.000000 0.000000 0.003118 0.003127 0.828554 0.826196
136.000000 0.000000 0.003119 0.003130 0.827563 0.825281

@heroxx2011 Because the loss is computed element-wise over the heatmaps and then averaged, the numbers come out small. You can change the MSELoss parameters such as "size_average" or "reduce". See http://pytorch.org/docs/0.3.1/nn.html#mseloss for details.
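To illustrate the point (a minimal sketch, not the repo's actual training code): the target heatmaps are mostly zeros with small Gaussian peaks, so when the squared error is averaged over every element of every heatmap the result is tiny, while summing instead of averaging gives a much larger number. The shapes below (batch of 2, 16 joints, 64x64 heatmaps) are assumed for illustration; in PyTorch 0.3.x the switch is `size_average=True/False`, in newer versions it is `reduction='mean'/'sum'`.

```python
import torch
import torch.nn as nn

# Illustrative shapes: batch of 2, 16 joints, 64x64 heatmaps (assumed, not from the repo).
pred = torch.rand(2, 16, 64, 64) * 0.1        # stand-in for network output
target = torch.zeros(2, 16, 64, 64)
target[:, :, 30:34, 30:34] = 1.0              # crude stand-in for the Gaussian peaks

mean_loss = nn.MSELoss(reduction='mean')(pred, target)  # averaged over all 2*16*64*64 elements
sum_loss = nn.MSELoss(reduction='sum')(pred, target)    # same error, summed instead of averaged

print(mean_loss.item())  # small value, roughly the 1e-3 scale seen in the log
print(sum_loss.item())   # equals mean_loss * (2*16*64*64), so it looks "normal" sized
```

So the small values in the log are just a consequence of the element-wise mean; the optimization itself behaves the same either way, only the reported scale changes.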