RecursionError: maximum recursion depth exceeded
Closed this issue · 5 comments
Hi,
I am trying to use the expr.to('excel') utility of SKompiler, and I am getting the following error:
RecursionError: maximum recursion depth exceeded
The error stack is as follows:
---------------------------------------------------------------------------
RecursionError Traceback (most recent call last)
<ipython-input-150-7dcc4a57e7a5> in <module>
1 expr = skompile(final_model.predict)
2 print(expr)
----> 3 expr.to('excel')
4 #expr.to('sqlalchemy/sqlite')
/remote/vghawk1/nayaka/soft/anaconda3/envs/tf2_pytorch/lib/python3.7/site-packages/skompiler/ast.py in to(self, target, *args, **kw)
266 mod = import_module(module)
267 translator = getattr(mod, callable)
--> 268 return translator(self, *dialect, *args, **kw)
269 #endregion
270
/remote/vghawk1/nayaka/soft/anaconda3/envs/tf2_pytorch/lib/python3.7/site-packages/skompiler/fromskast/excel.py in translate(node, component, multistage, assign_to, multistage_subexpression_min_length, _max_subexpression_length)
62 multistage_subexpression_min_length=multistage_subexpression_min_length,
63 _max_subexpression_length=_max_subexpression_length)
---> 64 result = writer(node)
65 if component is not None:
66 result = result[component]
/remote/vghawk1/nayaka/soft/anaconda3/envs/tf2_pytorch/lib/python3.7/site-packages/skompiler/fromskast/_common.py in __call__(self, node, **kw)
31
32 def __call__(self, node, **kw):
---> 33 return getattr(self, node.__class__.__name__)(node, **kw)
34
35
/remote/vghawk1/nayaka/soft/anaconda3/envs/tf2_pytorch/lib/python3.7/site-packages/skompiler/fromskast/_common.py in LFold(self, node, **kw)
80 if not node.elems:
81 raise ValueError("LFold expects at least one element")
---> 82 return self(reduce(lambda x, y: BinOp(node.op, x, y), node.elems), **kw)
83
84 def Let(self, node, **kw):
/remote/vghawk1/nayaka/soft/anaconda3/envs/tf2_pytorch/lib/python3.7/site-packages/skompiler/fromskast/_common.py in __call__(self, node, **kw)
31
32 def __call__(self, node, **kw):
---> 33 return getattr(self, node.__class__.__name__)(node, **kw)
34
35
/remote/vghawk1/nayaka/soft/anaconda3/envs/tf2_pytorch/lib/python3.7/site-packages/skompiler/fromskast/_common.py in BinOp(self, node, **kw)
64 the operation elementwise and returns a list."""
65
---> 66 left = self(node.left, **kw)
67 right = self(node.right, **kw)
68 op = self(node.op, **kw)
... last 2 frames repeated, from the frame below ...
/remote/vghawk1/nayaka/soft/anaconda3/envs/tf2_pytorch/lib/python3.7/site-packages/skompiler/fromskast/_common.py in __call__(self, node, **kw)
31
32 def __call__(self, node, **kw):
---> 33 return getattr(self, node.__class__.__name__)(node, **kw)
34
35
RecursionError: maximum recursion depth exceeded
Please help in resolving this issue.
Regards
Keki
Update:
I got the same error for expr.to('sqlalchemy/sqlite').
Try doing
import sys
sys.setrecursionlimit(10000)
before the expr.to(...) call.
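For context, here is a minimal, self-contained sketch of the suggested fix applied end to end. The toy data and the RandomForestRegressor below are hypothetical stand-ins for the reporter's final_model, only there to make the snippet runnable:

import sys
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from skompiler import skompile

# Hypothetical stand-in for the reporter's fitted model
X, y = make_regression(n_samples=200, n_features=5, random_state=0)
final_model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

# Raise the recursion limit first: the expression tree generated for a forest
# can be very deep, and Python's default limit (~1000) is easily exceeded
sys.setrecursionlimit(10000)

expr = skompile(final_model.predict)
formula = expr.to('excel')            # previously raised RecursionError
# sql = expr.to('sqlalchemy/sqlite')  # the same fix applies to this target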
Thanks a lot @konstantint
That fixed the error. However, I am not sure I am using SKompiler correctly for my purpose; it would be a great help if you could guide me on this.
In my project I am using a RandomForestRegressor to predict certain outputs from given inputs. My aim is to check and analyse (preferably visually) the contributions/weights of my model's input features that govern a particular output prediction, something like what one can do with a neural network (https://playground.tensorflow.org/).
Can SKompiler help me do that? If not, do you by any chance have an idea how I can achieve this?
I think you should instead check out treeinterpreter.
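For reference, a minimal sketch of the typical treeinterpreter usage for per-prediction feature contributions; the model and data below are hypothetical, not taken from this thread:

from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from treeinterpreter import treeinterpreter as ti

# Hypothetical single-output model and data, only to illustrate the call pattern
X, y = make_regression(n_samples=200, n_features=5, random_state=0)
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

# treeinterpreter decomposes each prediction into a bias term plus one
# contribution per input feature: prediction = bias + sum(contributions)
prediction, bias, contributions = ti.predict(model, X[:3])
print(prediction)
print(bias)
print(contributions)  # one contribution value per feature, per sample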
Thanks @konstantint
I had already started working with treeinterpreter, but I have multi-label predictions, and with treeinterpreter I get the following message:
ValueError: Multilabel classification trees not supported
I am trying to work around this. I'll close this thread now, as the original question raised in this thread is resolved. Thanks a lot for your help :).
Regards
Keki
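For anyone hitting the same multilabel limitation: one possible workaround, not discussed in this thread and purely illustrative, is to train one single-output regressor per target and run treeinterpreter on each of them separately:

from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from treeinterpreter import treeinterpreter as ti

# Hypothetical multi-output regression problem
X, Y = make_regression(n_samples=200, n_features=5, n_targets=3, random_state=0)

# One forest per target, so each model is single-output and supported by treeinterpreter
for k in range(Y.shape[1]):
    model_k = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, Y[:, k])
    pred_k, bias_k, contrib_k = ti.predict(model_k, X[:1])
    print(f"target {k}: prediction={pred_k[0]}, bias={bias_k[0]}")
    print(f"  per-feature contributions: {contrib_k[0]}")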