rst.animate_nodal_solution doesn't work off notebook
CesarRodriguezPereira opened this issue · 4 comments
Before submitting the issue
- I have searched among the existing issues
- I am using a Python virtual environment
Description of the bug
Launching animations from .ipynb files inside VSCode requires calling `animate_nodal_solution` with the `notebook=False` keyword argument; otherwise you only get the first frame and the animation gets stuck. However, the latest version of the `rst` Result class no longer accepts this kwarg, which completely eliminates the possibility of animating results in this manner.
The error happens once `animate_nodal_solution` calls `_plot_point_scalars`, as this tries to pass on the `notebook` argument to `plotter.add_mesh`. This keyword argument obviously isn't meant for that function, and thus you get a `TypeError`.
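For illustration, here is a minimal, hypothetical sketch of the failure mode (the `plot_scalars` wrapper below is not the library's actual code): a `notebook` option that belongs to the `pyvista.Plotter` constructor leaks through `**kwargs` into `Plotter.add_mesh`, which rejects unknown keyword arguments.

```python
import pyvista as pv

# Hypothetical wrapper illustrating the failure mode: ``notebook`` is a
# pyvista.Plotter() constructor option, but it is forwarded blindly through
# **kwargs into Plotter.add_mesh(), which rejects unknown keyword arguments.
def plot_scalars(mesh, scalars=None, **kwargs):
    # A fix would consume the argument here instead, e.g.:
    #   notebook = kwargs.pop("notebook", None)
    #   plotter = pv.Plotter(notebook=notebook)
    plotter = pv.Plotter()
    plotter.add_mesh(mesh, scalars=scalars, **kwargs)  # TypeError if 'notebook' leaks through
    plotter.show()

plot_scalars(pv.Sphere(), notebook=False)  # raises a TypeError like the traceback below
```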
----> result.animate_nodal_solution(1, displacement_factor=1000,notebook=False)
File c:\Users\redacted\anaconda3\envs\redacted\lib\site-packages\ansys\mapdl\reader\rst.py:1028, in Result.animate_nodal_solution(self, rnum, comp, node_components, element_components, sel_type_all, add_text, displacement_factor, n_frames, loop, movie_filename, progress_bar, **kwargs)
1025 else:
1026 grid = self.grid
-> 1028 return self._plot_point_scalars(scalars, rnum=rnum, grid=grid,
1029 add_text=add_text,
1030 animate=True,
1031 node_components=node_components,
1032 element_components=element_components,
1033 sel_type_all=sel_type_all,
1034 n_frames=n_frames,
1035 displacement_factor=displacement_factor,
1036 movie_filename=movie_filename,
1037 loop=loop,
1038 progress_bar=progress_bar,
1039 **kwargs)
File c:\Users\redacted\anaconda3\envs\redacted\lib\site-packages\ansys\mapdl\reader\rst.py:2864, in Result._plot_point_scalars(self, scalars, rnum, grid, show_displacement, displacement_factor, add_text, animate, n_frames, overlay_wireframe, node_components, element_components, sel_type_all, movie_filename, treat_nan_as_zero, progress_bar, **kwargs)
2860 plotter.add_mesh(self.grid, style='wireframe', color='w',
2861 opacity=0.5)
2863 copied_mesh = mesh.copy()
...
345 grammar = "are invalid keyword arguments"
346 message = f"{bad_arguments} {grammar} for `{caller}`"
--> 347 raise TypeError(message)
I'll try downgrading version by version once I have the time to see if that fixes it, and I'll update here if I can solve it by downgrading only the reader (although incompatibilities with core may arise).
Steps to reproduce
- Run a modal analysis.
- Store the modal solution with the idiomatic `result = mapdl.result`.
- Run the solution animation with `result.animate_nodal_solution(0, notebook=False)`.
- Get a `TypeError` (see the reproduction sketch below).
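A hedged end-to-end reproduction sketch, assuming a local MAPDL installation reachable through `launch_mapdl`; the cantilever-beam model and its property values below are illustrative placeholders, not the reporter's model:

```python
from ansys.mapdl.core import launch_mapdl

mapdl = launch_mapdl()

# Illustrative cantilever-beam modal analysis (placeholder model and values)
mapdl.prep7()
mapdl.et(1, "BEAM188")
mapdl.mp("EX", 1, 2e11)          # Young's modulus [Pa]
mapdl.mp("DENS", 1, 7800)        # density [kg/m^3]
mapdl.mp("PRXY", 1, 0.3)         # Poisson's ratio
mapdl.sectype(1, "BEAM", "RECT")
mapdl.secdata(0.01, 0.01)        # 10 mm x 10 mm cross section
mapdl.k(1, 0, 0, 0)
mapdl.k(2, 1, 0, 0)
mapdl.l(1, 2)
mapdl.lesize("ALL", ndiv=20)
mapdl.lmesh("ALL")
mapdl.dk(1, "ALL")               # clamp one end
mapdl.finish()

mapdl.slashsolu()
mapdl.antype("MODAL")
mapdl.modopt("LANB", 6)
mapdl.mxpand(6)                  # expand modes so nodal results reach the RST file
mapdl.solve()
mapdl.finish()

result = mapdl.result
result.animate_nodal_solution(0, notebook=False)  # TypeError on the affected reader versions
```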
Which operating system are you using?
Windows
Which Python version are you using?
3.8
Installed packages
aiohttp==3.8.1
aiosignal==1.2.0
ansys-api-mapdl==0.5.1
ansys-api-platform-instancemanagement==1.0.0b3
ansys-corba==0.1.1
ansys-mapdl-core==0.62.1
ansys-mapdl-reader==0.51.14
ansys-platform-instancemanagement==1.0.2
appdirs==1.4.4
asttokens @ file:///home/conda/feedstock_root/build_artifacts/asttokens_1618968359944/work
async-timeout==4.0.2
attrs==21.4.0
backcall @ file:///home/conda/feedstock_root/build_artifacts/backcall_1592338393461/work
backports.functools-lru-cache @ file:///home/conda/feedstock_root/build_artifacts/backports.functools_lru_cache_1618230623929/work
bleach @ file:///home/conda/feedstock_root/build_artifacts/bleach_1656355450470/work
bokeh @ file:///D:/bld/bokeh_1652969844800/work
brotlipy @ file:///D:/bld/brotlipy_1648854320485/work
certifi==2022.6.15
cffi @ file:///D:/bld/cffi_1656782950753/work
charset-normalizer @ file:///home/conda/feedstock_root/build_artifacts/charset-normalizer_1655906222726/work
colorama @ file:///home/conda/feedstock_root/build_artifacts/colorama_1655412516417/work
colorcet @ file:///home/conda/feedstock_root/build_artifacts/colorcet_1638280441091/work
cryptography @ file:///D:/bld/cryptography_1657174149370/work
cycler @ file:///home/conda/feedstock_root/build_artifacts/cycler_1635519461629/work
debugpy @ file:///D:/bld/debugpy_1649586564473/work
decorator @ file:///home/conda/feedstock_root/build_artifacts/decorator_1641555617451/work
entrypoints @ file:///home/conda/feedstock_root/build_artifacts/entrypoints_1643888246732/work
executing @ file:///home/conda/feedstock_root/build_artifacts/executing_1646044401614/work
fonttools @ file:///D:/bld/fonttools_1657249571382/work
frozenlist==1.3.0
geomdl==5.3.1
googleapis-common-protos==1.56.4
grpcio==1.47.0
holoviews @ file:///home/conda/feedstock_root/build_artifacts/holoviews_1657227861300/work
idna @ file:///home/conda/feedstock_root/build_artifacts/idna_1642433548627/work
imageio==2.19.3
importlib-metadata==4.12.0
ipykernel @ file:///D:/bld/ipykernel_1657295151080/work
ipython @ file:///D:/bld/ipython_1653755022901/work
jedi @ file:///D:/bld/jedi_1649067326950/work
Jinja2 @ file:///home/conda/feedstock_root/build_artifacts/jinja2_1654302431367/work
jupyter-client @ file:///home/conda/feedstock_root/build_artifacts/jupyter_client_1654730843242/work
jupyter-core @ file:///D:/bld/jupyter_core_1652365473010/work
kiwisolver @ file:///D:/bld/kiwisolver_1655141745440/work
Markdown @ file:///home/conda/feedstock_root/build_artifacts/markdown_1651821407140/work
MarkupSafe @ file:///D:/bld/markupsafe_1648737751065/work
matplotlib @ file:///D:/bld/matplotlib-suite_1651609674242/work
matplotlib-inline @ file:///home/conda/feedstock_root/build_artifacts/matplotlib-inline_1631080358261/work
multidict==6.0.2
munkres==1.1.4
nest-asyncio @ file:///home/conda/feedstock_root/build_artifacts/nest-asyncio_1648959695634/work
numpy @ file:///D:/bld/numpy_1657483969318/work
packaging @ file:///home/conda/feedstock_root/build_artifacts/packaging_1637239678211/work
pandas @ file:///D:/bld/pandas_1656001298215/work
panel @ file:///home/conda/feedstock_root/build_artifacts/panel_1653429099596/work
param @ file:///home/conda/feedstock_root/build_artifacts/param_1655892834767/work
parso @ file:///home/conda/feedstock_root/build_artifacts/parso_1638334955874/work
pickleshare @ file:///home/conda/feedstock_root/build_artifacts/pickleshare_1602536217715/work
Pillow @ file:///D:/bld/pillow_1657007269509/work
prompt-toolkit @ file:///home/conda/feedstock_root/build_artifacts/prompt-toolkit_1656332401605/work
protobuf==3.20.1
protoc-gen-swagger==0.1.0
psutil @ file:///D:/bld/psutil_1653089356250/work
pure-eval @ file:///home/conda/feedstock_root/build_artifacts/pure_eval_1642875951954/work
pycparser @ file:///home/conda/feedstock_root/build_artifacts/pycparser_1636257122734/work
pyct==0.4.6
Pygments @ file:///home/conda/feedstock_root/build_artifacts/pygments_1650904496387/work
pyiges==0.2.1
pyOpenSSL @ file:///home/conda/feedstock_root/build_artifacts/pyopenssl_1643496850550/work
pyparsing @ file:///home/conda/feedstock_root/build_artifacts/pyparsing_1652235407899/work
PySocks @ file:///D:/bld/pysocks_1648857453381/work
python-dateutil @ file:///home/conda/feedstock_root/build_artifacts/python-dateutil_1626286286081/work
pytz @ file:///home/conda/feedstock_root/build_artifacts/pytz_1647961439546/work
pyvista==0.35.1
pyviz-comms @ file:///home/conda/feedstock_root/build_artifacts/pyviz_comms_1648629824948/work
pywin32==303
PyYAML @ file:///D:/bld/pyyaml_1648757284286/work
pyzmq @ file:///D:/bld/pyzmq_1656183679173/work
requests @ file:///home/conda/feedstock_root/build_artifacts/requests_1656534056640/work
scipy==1.8.1
scooby==0.5.12
six @ file:///home/conda/feedstock_root/build_artifacts/six_1620240208055/work
stack-data @ file:///home/conda/feedstock_root/build_artifacts/stack_data_1655315839047/work
tornado @ file:///D:/bld/tornado_1656937958156/work
tqdm @ file:///home/conda/feedstock_root/build_artifacts/tqdm_1649051611147/work
traitlets @ file:///home/conda/feedstock_root/build_artifacts/traitlets_1655411388954/work
typing_extensions @ file:///home/conda/feedstock_root/build_artifacts/typing_extensions_1656706066251/work
unicodedata2 @ file:///D:/bld/unicodedata2_1649112097357/work
urllib3 @ file:///home/conda/feedstock_root/build_artifacts/urllib3_1657224465922/work
vtk==9.1.0
wcwidth @ file:///home/conda/feedstock_root/build_artifacts/wcwidth_1600965781394/work
webencodings==0.5.1
win-inet-pton @ file:///D:/bld/win_inet_pton_1648771891747/work
wincertstore==0.2
wslink==1.6.6
yarl==1.7.2
zipp==3.8.1
Adding a comment to note: this TypeError happens with every version back to 0.51.6. Versions before that give a different error, a KeyError during the normals computation; my bet is that it is caused either by PyVista updates or by this specific model being composed entirely of beam elements (which breaks something in the normals calculation). Now I'll see whether downgrading ansys-mapdl-core fixes this, and later I'll try downgrading pyvista (animation is a key feature I'll be using to understand how to run and post-process some transient models directly within Python). Anyway, hopefully it's an easy fix and we can get this functionality back in the latest releases!
EDIT (for the sake of completeness): downgraded to `ansys-mapdl-core==0.60.3` and `ansys-mapdl-reader==0.51.5`, keeping `pyvista==0.35.1`, and `notebook=False` is working again, with no problems with arguments or normals.
Sorry about getting back to you so late regarding this. We're really trying to shift people over to DPF and this isn't getting as much love as it used to.
However, as long as people are using it, I'm happy to support it. This should be fixed in #149 and `ansys-mapdl-reader==0.51.14`, which should be out tonight if I'm lucky.
I guess animation is missing from both PyDPF-Core and PyDPF-Post.
Great that it works, thanks! When I can, I'll have a deep dive into DPF-Core and Post, but yeah, animations just bring so much understanding when performing modal or transient analyses.
I'd say the biggest hurdle (for me, at least) that has prevented me from taking a dive into those libraries is the integration with PyMAPDL: a simple `result = mapdl.result` gets you an easy post-processing object that fits many needs, as most examples show, and can also be easily attached to other objects (allowing user-designed objects that contain everything in a model in a single instance, i.e. attaching results to their inputs).
That's on the to-do list for sure; it's just a matter of getting around to it. I agree with having simple integration, and that's a feature being implemented in ansys/pymapdl#1298.