Export 3D animation to files?
chilipeppr opened this issue · 3 comments
Hi there,
I'm wondering if there might be a way to export the 3D Two Gaussian Wells to 3D files like STEP, STL, FBX, or GLTF. I realize it's a probability cloud so the edges of the 3D object are fuzzy, but perhaps a cut-off value could be used to "pick" a surface in the 3D coordinates and have that represent the final solid 3D output.
We're trying to import the 3D Two Gaussian Wells animation that you have so beautifully visualized here into BabylonJS so it can be played with more easily inside that environment. The format BabylonJS supports most natively is GLTF, and GLTF files can now even include animation timelines.
Thanks for the help, if any.
-John
Hi John!
The array property of the Eigenstates object (an instance is returned by H.solve on line 22 of 3D_four_gaussian_wells.py) contains all of the wave function data. For 3D systems, eigenstates.array is just a NumPy array with four indices: eigenstates.array[0] gives you a 3D array that corresponds to the ground state wave function, eigenstates.array[1] gives you the first excited state, and so on. You can then just plot these as a volume render.
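For example (just a quick sketch, assuming you have already run one of the 3D example scripts so that eigenstates exists in scope):

import numpy as np

# eigenstates.array has shape (number of states, Nx, Ny, Nz) for a 3D system
psi0 = eigenstates.array[0]          # ground state wave function on the 3D grid
psi1 = eigenstates.array[1]          # first excited state
print(eigenstates.array.shape, psi0.shape)

# probability density of the ground state, which is what you would volume render
density0 = np.abs(psi0) ** 2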
Searching around, I think the module pygltflib can help you with GLTF files, but I’ve never actually used it. Personally, at least for the initial prototype I would try Plotly first since it’s a browser-based renderer that works directly with Python.
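If you do want an actual mesh file, one route (my own suggestion, not something built into qmsolve, and I haven't tested it against this data) would be to pick an iso-level for the density, which is essentially the "cut-off" you mentioned, extract the surface with scikit-image's marching cubes, and write STL or binary GLTF with trimesh (or hand the vertices/faces to pygltflib):

import numpy as np
from skimage import measure     # scikit-image
import trimesh

density = np.abs(eigenstates.array[0]) ** 2          # ground-state probability density
level = 0.1 * density.max()                          # the "cut-off" that defines the solid surface

# extract the isosurface as a triangle mesh
verts, faces, normals, values = measure.marching_cubes(density, level=level)

# build a mesh and let trimesh write STL or binary GLTF (.glb)
mesh = trimesh.Trimesh(vertices=verts, faces=faces, vertex_normals=normals)
mesh.export("ground_state.stl")
mesh.export("ground_state.glb")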
Thanks so much for the advice. I'm working on a different angle based on your suggestions. In particular, I'm trying to see if I can get Mayavi to export transparent PNGs while controlling the 3D viewing angle for each state in each exported frame. I only want the 3D visualization itself, not the bounding box or axis labels, so I need to get to know Mayavi better to control that and to see whether transparent PNGs are even possible. I'd also need to control the colors being used.
The goal is to overlay this imagery onto live video edits, so having transparent PNGs is key.
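In case it helps anyone else trying the same thing, here's the rough shape of what I'm attempting (a sketch only; the scene-building part depends on how qmsolve sets up its Mayavi figure, and I still need to verify the transparency part):

from mayavi import mlab

# ... build the Mayavi scene for one eigenstate here (skipping mlab.outline()/mlab.axes()
# style calls should leave just the bare volume, without the bounding box or labels) ...

n_frames = 120
for i in range(n_frames):
    # rotate the camera a little each frame, then write the frame to disk
    mlab.view(azimuth=360.0 * i / n_frames, elevation=70, distance='auto')
    mlab.savefig("frame_%03d.png" % i)

# mlab.savefig writes an opaque background; for transparency I'm hoping something like
# mlab.screenshot(mode='rgba') will give an RGBA array I can save myself, but I haven't
# yet confirmed that the alpha channel comes out usable on every backend.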
Also, it's been VERY difficult to get pip install qmsolve[with_mayavi]
to work anywhere, including Windows 11, Ubuntu, or WSL2 Ubuntu on Windows. It typically fails while trying to build Mayavi, with a 'build_src' error. I finally managed to fix it, and got it running in all three of those environments, by installing Mayavi directly from its GitHub repo...
First do...
pip install https://github.com/enthought/mayavi/zipball/master
Then do...
pip install qmsolve[with_mayavi]
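After that, a quick sanity check is just importing Mayavi from Python:

import mayavi
from mayavi import mlab   # this is the part that fails when the build is broken
print("Mayavi", mayavi.__version__, "imported OK")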
FYI, here it is running on Windows 11 natively, not in WSL2.
Also, I added a cache so I could keep re-running examples without recomputing the eigenstates every time. I had to place cache.py into the examples/eigenstate solver examples directory.
Here are the contents of cache.py:
import json
import pickle
import os.path
from pathlib import Path
import inspect

# This file allows you to cache your eigenstates by utilizing Python's pickle serializer/deserializer
# Usage example:
# import cache as uc
#
# # See if we have cached data. If so use it, otherwise generate it.
# eigenstates = None
# if uc.is_exists(uc.pickle_cache_name()):
#     print("We have a cache. Using it.")
#     eigenstates = uc.deserialize_pickle(eigenstates, uc.pickle_cache_name())
# else:
#     print("No cache, so solving eigenstates")
#     eigenstates = H.solve(max_states = 30)
#     uc.serialize_pickle(eigenstates, uc.pickle_cache_name())


def create_directory():
    if os.path.exists("cache"):
        # print("Cache folder already exists")
        pass
    else:
        # print("Creating cache dir")
        os.mkdir("cache")


def pickle_cache_name():
    create_directory()
    return "cache/" + curname() + ".pickle"


def curname():
    # Name of the top-level script that imported this module, e.g. "3D_hydrogen_atom"
    # return Path(os.path.basename(__file__)).stem
    frame = inspect.stack()[len(inspect.stack()) - 1]
    module = inspect.getmodule(frame[0])
    filename = module.__file__
    # frame = inspect.stack()[1]
    # filename = frame[0].f_code.co_filename
    return Path(filename).stem


def is_exists(path=None):
    return os.path.isfile(path)


def serialize_pickle(instance=None, path=None):
    print("Serializing to cache file", path)
    with open(path, 'wb') as handle:
        pickle.dump(instance, handle, protocol=pickle.HIGHEST_PROTOCOL)


def deserialize_pickle(instance=None, path=None):
    print("Deserializing cache from file", path)
    with open(path, 'rb') as handle:
        b = pickle.load(handle)
        return b


def serialize_json(instance=None, path=None):
    dt = {}
    dt.update(vars(instance))
    with open(path, "w") as file:
        json.dump(dt, file)


def deserialize_json(cls=None, path=None):
    def read_json(_path):
        with open(_path, "r") as file:
            return json.load(file)

    data = read_json(path)
    instance = object.__new__(cls)
    for key, value in data.items():
        setattr(instance, key, value)
    return instance


def test():
    print("Test completed")


print("Loaded cache util")

if __name__ == "__main__":
    print("curname:", curname())
    print("is_exists:", is_exists(os.path.basename(__file__)))
    print("is_exists: (should be false)", is_exists(curname()))
    print("pickle cache file name:", pickle_cache_name())
Then in any file where you want to cache, like in 3D_hydrogen_atom.py, you change the line:
eigenstates = H.solve( max_states = 14, method ='lobpcg')
To the lines (after adding import cache as uc near the top of the file):
# See if we have cached data. If so use it, otherwise generate it.
eigenstates = None
if uc.is_exists(uc.pickle_cache_name()):
    print("We have a cache. Using it.")
    eigenstates = uc.deserialize_pickle(eigenstates, uc.pickle_cache_name())
else:
    print("No cache, so solving eigenstates")
    eigenstates = H.solve( max_states = 14, method ='lobpcg')
    uc.serialize_pickle(eigenstates, uc.pickle_cache_name())