openai/finetune-transformer-lm

BUG in utils.py

YichenGong opened this issue · 2 comments

def get_ema_if_exists(v, gvs):
    name = v.name.split(':')[0]
    ema_name = name+'/ExponentialMovingAverage:0'
    ema_v = [v for v in gvs if v.name == ema_name]
    if len(ema_v) == 0:
        ema_v = [v]
    return ema_v[0]

Here is a piece of code from utils.py, lines 117-123.
In the list comprehension, the loop variable v overwrites the function argument v.
So when ema_v is empty, what gets returned is always the last tensor in gvs, not the original argument.

This may cause problems: in this case, the same variable would end up being used for every normalization function.
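A minimal, TensorFlow-free sketch of a shadow-safe version (renaming the comprehension variable so the code is correct even under Python 2's leaking comprehension scope; `Var` is a hypothetical stand-in for tf.Variable, since only the `.name` attribute matters here):

```python
from collections import namedtuple

# Hypothetical stand-in for tf.Variable -- only .name is used.
Var = namedtuple('Var', ['name'])

def get_ema_if_exists(v, gvs):
    name = v.name.split(':')[0]
    ema_name = name + '/ExponentialMovingAverage:0'
    # Distinct loop variable `g`: no risk of clobbering the argument `v`,
    # regardless of Python version.
    ema_v = [g for g in gvs if g.name == ema_name]
    if len(ema_v) == 0:
        ema_v = [v]
    return ema_v[0]

gvs = [Var('w/ExponentialMovingAverage:0'), Var('b:0')]
print(get_ema_if_exists(Var('w:0'), gvs).name)  # the EMA variable is found
print(get_ema_if_exists(Var('x:0'), gvs).name)  # no EMA: falls back to the argument itself
```

With the original code under Python 2, the second call would instead return `Var('b:0')`, the last tensor in gvs.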

Newmu commented

Great catch!

Thankfully, in Python 3 the loop variables of list comprehensions do not leak into the enclosing scope. In Python 2 this would be a bug, though.

For example:

>>> v = 'test'
>>> v2 = [v for v in ['oh', 'no']]
>>> v
'test'
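For contrast, the same rebinding can still be reproduced in Python 3 with a plain for loop, which is exactly the behavior Python 2 comprehensions shared (a quick sketch, not from the original thread):

```python
# Python 3: the comprehension's loop variable is scoped to the
# comprehension itself...
v = 'test'
v2 = [v for v in ['oh', 'no']]
print(v)  # 'test' -- unchanged

# ...but an ordinary for loop still rebinds the outer name, which is
# how comprehensions behaved in Python 2 as well.
for v in ['oh', 'no']:
    pass
print(v)  # 'no' -- the loop leaked
```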
Newmu commented

This is vestigial code that shouldn't have been committed in the first place, so I've decided to just remove it. I don't know whether this codebase even works in Python 2, so best to play it safe.