DepthAnything/Depth-Anything-V2

Only Producing 8 bit integer precision from command line for some reason?

petermg opened this issue · 1 comment

When I use the command line, I get a depth map with only 8-bit integer precision, so when I use it to deform a 2D plane in Blender I get stair-stepping.
HOWEVER, when I use the web ui extension from here:
https://github.com/graemeniedermayer/stable-diffusion-webui-normalmap-script
on the depthanything_v2 branch, I get much better results: the depth maps are 16-bit integer precision, which produces smooth output when applied to a 2D plane in Blender.

Does anyone know why the command line isn't producing depth maps with better than 8-bit integer precision, or how to fix it? 8-bit output looks so bad when used for mesh deformation on 2D planes in Blender that it isn't practical.

It should be possible to save a 16-bit copy of the depth map by adding the following two lines just after the depth prediction is made (i.e. below the infer_image(...) call), assuming cv2, numpy (as np), and os are already imported at the top of the script:

# Rescale the prediction to the full 16-bit range and save it as a PNG
depth_uint16 = ((depth - depth.min()) / (depth.max() - depth.min()) * 65535.0).astype(np.uint16)
cv2.imwrite(os.path.join(args.outdir, os.path.splitext(os.path.basename(filename))[0] + '_uint16.png'), depth_uint16)

The first line rescales the prediction to the full 16-bit range (0 to 65535) and casts it to uint16; the second line builds the output path from the input filename and saves the PNG. (cv2.imwrite writes a 16-bit PNG automatically when given a uint16 array.)
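To see why the extra precision matters, here is a minimal, self-contained sketch (using a synthetic smooth ramp in place of the real model prediction, which in run.py comes from infer_image) that counts how many distinct depth levels survive each quantization:

```python
import numpy as np

# Synthetic smooth depth ramp standing in for a model prediction;
# the real `depth` array is a float map returned by infer_image(...).
depth = np.linspace(0.0, 1.0, 512 * 512, dtype=np.float32).reshape(512, 512)

def normalize(d, max_val):
    """Rescale a float depth map to the range [0, max_val]."""
    d = d.astype(np.float64)
    return (d - d.min()) / (d.max() - d.min()) * max_val

depth_uint8 = normalize(depth, 255.0).astype(np.uint8)
depth_uint16 = normalize(depth, 65535.0).astype(np.uint16)

# An 8-bit map can hold at most 256 distinct depth levels, a 16-bit
# map up to 65536 -- the extra levels are what remove the visible
# stair-stepping when the map displaces a plane in Blender.
print(np.unique(depth_uint8).size)   # 256
print(np.unique(depth_uint16).size)  # 65536
```

With only 256 levels, neighboring pixels on a smooth gradient are forced onto the same quantized step, which Blender's displacement turns into visible terraces; the 16-bit version keeps the gradient effectively continuous.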