Ideas to improve the training accuracy
Leslie-Fang opened this issue · 4 comments
Leslie-Fang commented
I am trying to fine-tune the model on the UCF-101 dataset.
With the SGD optimizer and learning-rate decay, I get about 91.79% accuracy after 6000 steps with batch size 32.
global_step = tf.Variable(0, trainable=False)
learning_rate = tf.compat.v1.train.exponential_decay(3, global_step, 100, 0.96)
optimizer = tf.compat.v1.train.GradientDescentOptimizer(learning_rate).minimize(loss, global_step=global_step)
From the paper, I see the accuracy could be 94.x%. Any ideas on how to improve my accuracy?
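For reference, `tf.compat.v1.train.exponential_decay` with the default `staircase=False` computes `initial_lr * decay_rate ** (global_step / decay_steps)`, so with the values in the snippet above the learning rate only decays from 3 to roughly 0.26 by step 6000, which is still quite high for fine-tuning. A minimal pure-Python sketch of that schedule (the starting rate of 3 is taken from the snippet; whether a smaller starting rate helps is an assumption worth testing):

```python
def exponential_decay(initial_lr, global_step, decay_steps, decay_rate):
    # Mirrors tf.compat.v1.train.exponential_decay with staircase=False:
    # decayed_lr = initial_lr * decay_rate ** (global_step / decay_steps)
    return initial_lr * decay_rate ** (global_step / decay_steps)

# With the values from the snippet above, the learning rate at step 6000
# is 3 * 0.96 ** 60, i.e. still around 0.26.
lr_at_6000 = exponential_decay(3.0, 6000, 100, 0.96)
```

Plotting or printing this schedule before training is a cheap way to sanity-check that the learning rate actually reaches fine-tuning-sized values within your step budget.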
joaoluiscarreira commented
How many frames are being used in training and testing?
Joao
Leslie-Fang commented
@joaoluiscarreira Thanks for looking into my issue.
Training: about 9,500 frames in one epoch.
Testing: about 3,700 frames.
BTW, I am using depth 64.
joaoluiscarreira commented
Let me clarify: what we did was use 64 frames in training and 250 frames in testing. If you're testing on only 64 frames per video, that will have a negative impact.
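To illustrate the point (this is not the paper's exact evaluation code): one way to approximate testing on longer videos is to average the model's class scores over sliding 64-frame windows that cover all ~250 test frames, rather than scoring a single 64-frame clip. A hedged NumPy sketch, where `predict_clip` is a hypothetical stand-in for the trained model:

```python
import numpy as np

def sliding_window_scores(frames, predict_clip, clip_len=64, stride=32):
    """Average per-clip class scores over sliding windows of one video.

    frames: array of shape (num_frames, H, W, C)
    predict_clip: callable mapping a (clip_len, H, W, C) clip to class scores
    """
    num_frames = frames.shape[0]
    # Window start indices; stride < clip_len gives overlapping windows.
    starts = range(0, max(num_frames - clip_len, 0) + 1, stride)
    scores = [predict_clip(frames[s:s + clip_len]) for s in starts]
    # Average the per-window scores into one prediction for the video.
    return np.mean(scores, axis=0)
```

Averaging over windows is one common approximation; running the network fully convolutionally over all 250 frames at once, as done for I3D evaluation, is another option.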
Joao
jzq0102 commented
I am trying to fine-tune the model with the UCF-101 dataset. With the SGD optimizer and lr decay, I can get almost 91.79% accuracy after 6000 steps with BS: 32.
global_step = tf.Variable(0, trainable=False) learning_rate = tf.compat.v1.train.exponential_decay(3, global_step, 100, 0.96) optimizer = tf.compat.v1.train.GradientDescentOptimizer(learning_rate).minimize(loss, global_step=global_step)
From the paper, I see the accuracy could be 94.x%. Any ideas to improve my accuracy?
Could you share the code?