After setup.py, I got these messages. What should I do?
Closed this issue · 3 comments
TakuNishiumi commented
Describe the bug
After running git clone and python setup.py, I got the messages below.
In [1]: import pytransit
/home/nishiumi/.conda/envs/py35/lib/python3.6/site-packages/llvmlite/binding/ffi.py:138: UserWarning: Module pytransit was already imported from /home/nishiumi/.conda/envs/py35/lib/python3.6/site-packages/PyTransit-2.0-py3.6.egg/pytransit/__init__.py, but /home/nishiumi/PyTransit is being added to sys.path
from pkg_resources import resource_filename
/home/nishiumi/.conda/envs/py35/lib/python3.6/site-packages/numba/compiler.py:602: NumbaPerformanceWarning:
The keyword argument 'parallel=True' was specified but no transformation for parallel execution was possible.
To find out why, try turning on parallel diagnostics, see http://numba.pydata.org/numba-doc/latest/user/parallel.html#diagnostics for help.
File "../.conda/envs/py35/lib/python3.6/site-packages/PyTransit-2.0-py3.6.egg/pytransit/orbits/orbits_py.py", line 168:
@njit("f8[:](f8[:], f8, f8, f8, f8)", parallel=True)
def ta_newton_v(t, t0, p, e, w):
^
self.func_ir.loc))
/home/nishiumi/.conda/envs/py35/lib/python3.6/site-packages/numba/compiler.py:602: NumbaPerformanceWarning:
The keyword argument 'parallel=True' was specified but no transformation for parallel execution was possible.
To find out why, try turning on parallel diagnostics, see http://numba.pydata.org/numba-doc/latest/user/parallel.html#diagnostics for help.
File "../.conda/envs/py35/lib/python3.6/site-packages/PyTransit-2.0-py3.6.egg/pytransit/orbits/orbits_py.py", line 176:
@njit("f8[:](f8[:],f8,f8,f8,f8)", parallel=True)
def ta_iter_v(t, t0, p, e, w):
^
self.func_ir.loc))
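For context, the first warning only means that pytransit had already been imported from the installed egg while the checkout directory /home/nishiumi/PyTransit was being added to sys.path, so two copies of the package were visible. The NumbaPerformanceWarnings come from Numba itself: they are emitted whenever a function compiled with parallel=True contains nothing Numba can actually parallelize. A minimal, hypothetical sketch (not PyTransit code) that triggers the same kind of warning:

```python
from numba import njit

# Hypothetical sketch: the explicit signature makes Numba compile eagerly at
# decoration time, and because the body is purely scalar work there is no
# transformation for parallel execution, so a NumbaPerformanceWarning is emitted.
@njit("f8(f8, f8)", parallel=True)
def add_scalars(a, b):
    return a + b

print(add_scalars(1.0, 2.0))
# To see why nothing was parallelized, set the environment variable
# NUMBA_PARALLEL_DIAGNOSTICS=4, as the diagnostics link in the warning suggests.
```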
Desktop:
- OS: Ubuntu 18.04
- conda, Python 3.6
hpparvi commented
Hi Taku,
This wasn't anything critical, but it's fixed now. Pull the latest changes, reinstall, and the warnings should be gone :)
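For anyone finding this later, that amounts to something like the following from the clone directory (the path and the exact install command are assumptions, not something specified in this thread):

```
cd ~/PyTransit            # wherever the repository was cloned
git pull                  # pull the latest changes
python setup.py install   # reinstall; pip install . should work as well
```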
Cheers,
Hannu
TakuNishiumi commented
Hello, Hannu.
Thank you!
I will try.
Sincerely,
Taku
TakuNishiumi commented
Solved!
Thank you very much.
Taku