metatable tab-completions don't seem to work as expected
szym opened this issue · 2 comments
szym commented
I like most of trepl's features, but tab-completion of methods doesn't seem to work properly:
require 'optim'
l = optim.Logger('out.log')
l:<TAB>
luajit alone gives:
l:add( l:epsfile l:idx. l:names. l:plot( l:style( l:symbols.
l:empty l:file: l:name l:new( l:setNames( l:styles.
th (luajit with trepl) gives:
_G break false io module pairs repeat sys utils
_LAST collectgarbage for ipairs monitor_G paths repl_linenoise table while
_RESULTS coroutine function jit newproxy pcall repl_readline then who
_VERSION debug gcinfo l next print require tonumber xerror
...
Tab-completions of l.<TAB> are likewise wrong.
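For comparison, here is a minimal sketch of what a method completer has to enumerate for l:<TAB> to be useful (plain Lua; the helper name is hypothetical, and this ignores any torch-specific metatable protection):

-- Collect candidate method names from the object's metatable
-- instead of from the globals table _G.
local function method_candidates(obj)
  local candidates = {}
  local mt = getmetatable(obj)
  if type(mt) == 'table' then
    for key in pairs(mt) do
      candidates[#candidates + 1] = key
    end
  end
  return candidates
end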
szym commented
The fix is good, but not perfect:
require 'nn'
model = nn.Sequential()
model:<TAB>
trepl:
__add __init __pow add share
__call __le __sub evaluate size
__concat __len __tostring get training
__div __lt __tostring__ insert updateGradInput
__eq __mod __unm new updateOutput
__factory __mul accGradParameters parameters updateParameters
__index __newindex accUpdateGradParameters reset zeroGradParameters
luajit:
model:accGradParameters( model:float( model:sharedAccUpdateGradParameters(
model:accUpdateGradParameters( model:forward( model:size(
model:add( model:get( model:training(
model:backward( model:getParameters( model:type(
model:backwardUpdate( model:insert( model:updateGradInput(
model:clone( model:modules. model:updateOutput(
model:cuda( model:new( model:updateParameters(
model:double( model:parameters( model:zeroGradParameters(
model:evaluate( model:reset(
model:findModules( model:share(
Tab-completions for nn.Module work as expected, so it seems that trepl completions just need to follow the inheritance chain. I can't say I understand how inheritance works in torch, but I think torch.getmetatable(torch.typename(t.__metatable)) would allow recursing into the super-metatable.
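To illustrate, a hedged sketch of that recursion (torch.typename and torch.getmetatable are real torch7 functions; the assumption that getmetatable() on a class metatable yields the parent class's metatable, and the helper name, are mine):

-- Walk the metatable chain so that inherited methods
-- (e.g. nn.Module methods on an nn.Sequential) are included.
local function all_method_keys(obj)
  local keys = {}
  local mt = torch.getmetatable(torch.typename(obj))
  while type(mt) == 'table' do
    for key in pairs(mt) do
      keys[key] = true        -- set semantics dedupe overridden methods
    end
    mt = getmetatable(mt)     -- assumed: steps up to the super-metatable
  end
  return keys
end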