{
  "cells": [
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "h94CikXccQsj"
      },
      "source": [
        "[![Map Pipeline](https://github.com/YertleTurtleGit/depth-from-normals/actions/workflows/map_pipeline.yml/badge.svg)](https://github.com/YertleTurtleGit/depth-from-normals/actions/workflows/map_pipeline.yml)\n",
        "[![Lint](https://github.com/YertleTurtleGit/depth-from-normals/actions/workflows/lint.yml/badge.svg)](https://github.com/YertleTurtleGit/depth-from-normals/actions/workflows/lint.yml)\n",
        "<a target=\"_blank\" href=\"https://colab.research.google.com/github/YertleTurtleGit/depth-from-normals/blob/main/README.ipynb\">\n",
        "<img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/>\n",
        "</a>\n",
        "\n",
        "<!-- START doctoc generated TOC please keep comment here to allow auto update -->\n",
        "<!-- END doctoc generated TOC please keep comment here to allow auto update -->"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "jl7JQyYBcQsm"
      },
      "source": [
        "# Introduction\n"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "lD98aTMvcQso"
      },
      "source": [
        "This algorithm utilizes the normal mapping to approximate a 3D integral by means of surface integrals of vector fields. Initially, the directional gradients of the normals are determined along the x- and y-directions. These gradients are then used to compute the integrated values by employing a cumulative sum (Riemann sum). To enhance the accuracy of the estimated values, this process is repeated multiple times with the gradient mapping rotated in different orientations, and the results are averaged."
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "XuNYSidbcQsp"
      },
      "source": [
        "# Quickstart"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 1,
      "metadata": {
        "id": "ADia-Ss_cQsp",
        "outputId": "d2c7023c-411f-4ee8-9a42-9fb4e1ed8130",
        "colab": {
          "base_uri": "https://localhost:8080/"
        }
      },
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "/content/depth-from-normals\n"
          ]
        }
      ],
      "source": [
        "%pip install -q pathlib2\n",
        "import pathlib2 as pathlib\n",
        "\n",
        "if not (pathlib.Path('.git').is_dir() and pathlib.Path.cwd().name == \"depth-from-normals\"):\n",
        "    !git clone -q https://github.com/YertleTurtleGit/depth-from-normals.git\n",
        "    %cd depth-from-normals/\n",
        "    %pip install -q -r requirements.txt"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "apmGr9RmcQsr"
      },
      "outputs": [],
      "source": [
        "from height_map import (\n",
        "    estimate_height_map,\n",
        ")  # local file 'height_map.py' in this repository\n",
        "from matplotlib import pyplot as plt\n",
        "import numpy as np\n",
        "from skimage import io\n",
        "\n",
        "NORMAL_MAP_A_PATH: str = (\n",
        "    \"https://raw.githubusercontent.com/YertleTurtleGit/depth-from-normals/main/normal_mapping_a.png\"\n",
        ")\n",
        "NORMAL_MAP_B_PATH: str = (\n",
        "    \"https://raw.githubusercontent.com/YertleTurtleGit/depth-from-normals/main/normal_mapping_b.png\"\n",
        ")\n",
        "NORMAL_MAP_A_IMAGE: np.ndarray = io.imread(NORMAL_MAP_A_PATH)\n",
        "NORMAL_MAP_B_IMAGE: np.ndarray = io.imread(NORMAL_MAP_B_PATH)"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "dABgl39LcQss"
      },
      "outputs": [],
      "source": [
        "heights = estimate_height_map(NORMAL_MAP_A_IMAGE, raw_values=True)\n",
        "\n",
        "figure, axes = plt.subplots(1, 2, figsize=(7, 3))\n",
        "_ = axes[0].imshow(NORMAL_MAP_A_IMAGE)\n",
        "_ = axes[1].imshow(heights)\n",
        "\n",
        "x, y = np.meshgrid(range(heights.shape[1]), range(heights.shape[0]))\n",
        "_, axes = plt.subplots(1, 1, subplot_kw={\"projection\": \"3d\"})\n",
        "_ = axes.scatter(x, y, heights, c=heights)"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "dWgfaCfucQss"
      },
      "source": [
        "# Explanation\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "execution": {
          "iopub.execute_input": "2022-03-21T08:04:05.289303Z",
          "iopub.status.busy": "2022-03-21T08:04:05.289164Z",
          "iopub.status.idle": "2022-03-21T08:04:19.591044Z",
          "shell.execute_reply": "2022-03-21T08:04:19.590486Z"
        },
        "id": "qJM0ecFCGW7m"
      },
      "outputs": [],
      "source": [
        "import cv2 as cv\n",
        "from scipy.integrate import cumulative_trapezoid, simpson\n",
        "from multiprocessing.pool import ThreadPool as Pool\n",
        "from multiprocessing import cpu_count\n",
        "from typing import List, Tuple\n",
        "from matplotlib.colors import TwoSlopeNorm"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "KAkFdU1ucQst"
      },
      "source": [
        "## Gradients\n",
        "\n",
        "By deriving gradients in the x- and y-directions from the normal mapping, a mapping of angles is generated, which can be utilized to compute the directional gradients.\n",
        "\n",
        "Given the normal vector $\\vec{n} \\in \\mathbb{R}^{3}$ and a rotation value $r \\in \\mathbb{R}[0,2\\pi]$, the anisotropic gradients are calculated:\n",
        "\n",
        "$$\n",
        "a_h = \\arccos{\\vec{n_x}}, \\hspace{5px} g_l = (1 - \\sin{a_h}) * sgn(a_h - \\frac{\\pi}{2})\n",
        "$$\n",
        "\n",
        "$$\n",
        "a_v = \\arccos{\\vec{n_y}}, \\hspace{5px} g_t = (1 - \\sin{a_v}) * sgn(a_v - \\frac{\\pi}{2})\n",
        "$$\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "lOMgiZVEcQst"
      },
      "outputs": [],
      "source": [
        "def calculate_gradients(normals: np.ndarray) -> Tuple[np.ndarray, np.ndarray]:\n",
        "    horizontal_angle_map = np.arccos(np.clip(normals[:, :, 0], -1, 1))\n",
        "    left_gradients = (1 - np.sin(horizontal_angle_map)) * np.sign(\n",
        "        horizontal_angle_map - np.pi / 2\n",
        "    )\n",
        "\n",
        "    vertical_angle_map = np.arccos(np.clip(normals[:, :, 1], -1, 1))\n",
        "    top_gradients = -(1 - np.sin(vertical_angle_map)) * np.sign(\n",
        "        vertical_angle_map - np.pi / 2\n",
        "    )\n",
        "\n",
        "    return left_gradients, top_gradients\n",
        "\n",
        "\n",
        "normals = ((NORMAL_MAP_A_IMAGE[:, :, :3] / 255) - 0.5) * 2\n",
        "left_gradients, top_gradients = calculate_gradients(normals)\n",
        "\n",
        "\n",
        "figsize = (14, 14)\n",
        "figure, axes = plt.subplots(1, 3, figsize=figsize)\n",
        "axes[0].set_title(\"anisotropic left gradients (left to right)\")\n",
        "_ = axes[0].imshow(left_gradients, cmap=\"RdBu\", norm=TwoSlopeNorm(0))\n",
        "axes[1].set_title(\"anisotropic top gradients (top to bottom)\")\n",
        "_ = axes[1].imshow(top_gradients, cmap=\"RdBu\", norm=TwoSlopeNorm(0))\n",
        "axes[2].set_title(\"normals (clipped)\")\n",
        "_ = axes[2].imshow(np.clip(normals, 0, 255))"
      ]
    },
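    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "As a small sanity check (illustrative only, not part of the pipeline): a flat surface, whose normal $(0, 0, 1)$ points straight at the camera, should yield zero gradients, while a normal tilted towards $+x$ describes a surface that descends from left to right and should therefore yield a negative left gradient."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {},
      "outputs": [],
      "source": [
        "# Sanity check of calculate_gradients on two hand-built normals:\n",
        "# a flat normal (0, 0, 1) and a normal tilted 45 degrees towards +x.\n",
        "flat_normal = np.array([0.0, 0.0, 1.0])\n",
        "tilted_normal = np.array([np.sqrt(0.5), 0.0, np.sqrt(0.5)])\n",
        "test_normals = np.stack([flat_normal, tilted_normal]).reshape(1, 2, 3)\n",
        "\n",
        "test_left_gradients, test_top_gradients = calculate_gradients(test_normals)\n",
        "print(\"left gradients:\", test_left_gradients)  # flat: 0, tilted: negative\n",
        "print(\"top gradients:\", test_top_gradients)  # zero, no tilt in y-direction"
      ]
    },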
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "FHBdsZHpcQsu"
      },
      "source": [
        "## Heights\n",
        "\n",
        "The height values $h(x,y) \\in \\mathbb{R}^{2}, \\ \\ x,y \\in \\mathbb{N}^{0}$ can be obtained by performing a cumulative sum over the gradients, which eventually approaches an integral over $g(x,y)$:\n",
        "\n",
        "$$\n",
        "h(x_t,y_t) = \\iint g(x,y) dydx \\ \\ (x_t,y_t) \\approx \\sum_{x_i=0}^{x_t} g(x_i,y_t)\n",
        "$$\n",
        "\n",
        "The isotropic (non-directional) heights are determined by a combination of all the anisotropic heights.\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "4FNLlBUEcQsu"
      },
      "outputs": [],
      "source": [
        "def integrate_gradient_field(gradient_field: np.ndarray, axis: int) -> np.ndarray:\n",
        "    return np.cumsum(gradient_field, axis=axis)\n",
        "\n",
        "\n",
        "def calculate_heights(\n",
        "    left_gradients: np.ndarray, top_gradients: np.ndarray\n",
        ") -> Tuple[np.ndarray, np.ndarray, np.ndarray, np.ndarray]:\n",
        "    left_heights = integrate_gradient_field(left_gradients, axis=1)\n",
        "    right_heights = np.fliplr(\n",
        "        integrate_gradient_field(np.fliplr(-left_gradients), axis=1)\n",
        "    )\n",
        "    top_heights = integrate_gradient_field(top_gradients, axis=0)\n",
        "    bottom_heights = np.flipud(\n",
        "        integrate_gradient_field(np.flipud(-top_gradients), axis=0)\n",
        "    )\n",
        "    return left_heights, right_heights, top_heights, bottom_heights\n",
        "\n",
        "\n",
        "left_heights, right_heights, top_heights, bottom_heights = calculate_heights(\n",
        "    left_gradients, top_gradients\n",
        ")\n",
        "\n",
        "\n",
        "def combine_heights(*heights: np.ndarray) -> np.ndarray:\n",
        "    return np.mean(np.stack(heights, axis=0), axis=0)\n",
        "\n",
        "\n",
        "isotropic_heights = combine_heights(\n",
        "    left_heights, right_heights, top_heights, bottom_heights\n",
        ")\n",
        "\n",
        "\n",
        "def visualize_heights(heights_list: List[np.ndarray], labels: List[str]):\n",
        "    if len(heights_list) == 1:\n",
        "        heights = heights_list[0]\n",
        "        plt.title(labels[0])\n",
        "        _ = plt.imshow(heights)\n",
        "        x, y = np.meshgrid(range(heights.shape[1]), range(heights.shape[0]))\n",
        "        _, axes = plt.subplots(1, 1, subplot_kw={\"projection\": \"3d\"})\n",
        "        _ = axes.scatter(x, y, heights, c=heights)\n",
        "        return\n",
        "\n",
        "    figure, axes = plt.subplots(1, len(heights_list), figsize=(19, 5))\n",
        "    for index, heights in enumerate(heights_list):\n",
        "        axes[index].set_title(labels[index])\n",
        "        _ = axes[index].imshow(heights, norm=TwoSlopeNorm(0.5))\n",
        "\n",
        "    x, y = np.meshgrid(range(left_heights.shape[0]), range(left_heights.shape[1]))\n",
        "    figure, axes = plt.subplots(\n",
        "        1, len(heights_list), subplot_kw={\"projection\": \"3d\"}, figsize=(19, 5)\n",
        "    )\n",
        "    for index, heights in enumerate(heights_list):\n",
        "        _ = axes[index].scatter(x, y, heights, c=heights)\n",
        "\n",
        "\n",
        "visualize_heights(\n",
        "    [left_heights, right_heights, top_heights, bottom_heights, isotropic_heights],\n",
        "    [\n",
        "        \"anisotropic left heights\",\n",
        "        \"anisotropic right heights\",\n",
        "        \"anisotropic top heights\",\n",
        "        \"anisotropic bottom heights\",\n",
        "        \"isotropic heights\",\n",
        "    ],\n",
        ")"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "S9A4BziBcQsu"
      },
      "source": [
        "## Rotation\n",
        "\n",
        "While using the cumulative sum (Riemann sum) to calculate the height map is a straightforward and efficient method, it may result in errors, particularly when the gradient mapping contains abrupt changes in direction. In order to mitigate such errors, the estimate_height_map function utilizes multiple rotated versions of the gradient mapping and computes their averages to generate the height map. This approach aids in the reduction of errors and enhances the precision of the height map estimates.\n",
        "\n",
        "$$\n",
        "h(x_t,y_t) = \\sum_{r=0}^{2\\pi} \\sum_{x_i=0}^{x_t} g R_\\theta (x_i,y_t)\n",
        "$$\n",
        "\n",
        "To refer to the height maps in polar coordinates representing the left, right, top, and bottom, it would be more appropriate to name them as 180°, 0°, 90°, and 270° height maps, respectively.\n"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "2FCRN9qOcQsv"
      },
      "outputs": [],
      "source": [
        "plt.polar()\n",
        "_ = plt.yticks([1])"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "12tOr-8GcQsv"
      },
      "source": [
        "When computing an anisotropic height map for a 225° direction, it is necessary to first rotate the normal map. However, a standard image rotation technique may lead to incorrect normal vectors, hence it is also essential to perform a corresponding rotation of the normal vectors."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "nQJM21ZEcQsv"
      },
      "outputs": [],
      "source": [
        "ANGLE = 225\n",
        "\n",
        "\n",
        "def rotate(matrix: np.ndarray, angle: float) -> np.ndarray:\n",
        "    h, w = matrix.shape[:2]\n",
        "    center = (w / 2, h / 2)\n",
        "\n",
        "    rotation_matrix = cv.getRotationMatrix2D(center, angle, 1.0)\n",
        "    corners = cv.transform(\n",
        "        np.array([[[0, 0], [w, 0], [w, h], [0, h]]]), rotation_matrix\n",
        "    )[0]\n",
        "\n",
        "    _, _, w, h = cv.boundingRect(corners)\n",
        "\n",
        "    rotation_matrix[0, 2] += w / 2 - center[0]\n",
        "    rotation_matrix[1, 2] += h / 2 - center[1]\n",
        "    result = cv.warpAffine(matrix, rotation_matrix, (w, h), flags=cv.INTER_LINEAR)\n",
        "\n",
        "    return result\n",
        "\n",
        "\n",
        "rotated_normal_map_wrong = rotate(NORMAL_MAP_A_IMAGE, ANGLE)\n",
        "\n",
        "wrong_normals = (\n",
        "    (rotated_normal_map_wrong[:, :, :3].astype(np.float64) / 255) - 0.5\n",
        ") * 2\n",
        "\n",
        "\n",
        "def rotate_vector_field_normals(normals: np.ndarray, angle: float) -> np.ndarray:\n",
        "    angle = np.radians(angle)\n",
        "    cos_angle = np.cos(angle)\n",
        "    sin_angle = np.sin(angle)\n",
        "\n",
        "    rotated_normals = np.empty_like(normals)\n",
        "    rotated_normals[:, :, 0] = (\n",
        "        normals[:, :, 0] * cos_angle - normals[:, :, 1] * sin_angle\n",
        "    )\n",
        "    rotated_normals[:, :, 1] = (\n",
        "        normals[:, :, 0] * sin_angle + normals[:, :, 1] * cos_angle\n",
        "    )\n",
        "\n",
        "    return rotated_normals\n",
        "\n",
        "\n",
        "rotated_normals = rotate_vector_field_normals(wrong_normals, ANGLE)\n",
        "rotated_normal_map = (((rotated_normals + 1) / 2) * 255).astype(np.uint8)\n",
        "\n",
        "figure, axes = plt.subplots(1, 3, figsize=figsize)\n",
        "axes[0].set_title(\"normal map\")\n",
        "_ = axes[0].imshow(NORMAL_MAP_A_IMAGE)\n",
        "axes[1].set_title(\"rotated normal map (wrong)\")\n",
        "_ = axes[1].imshow(rotated_normal_map_wrong)\n",
        "axes[2].set_title(\"rotated normal map (correct)\")\n",
        "_ = axes[2].imshow(rotated_normal_map)"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "8VB9qUx_cQsv"
      },
      "outputs": [],
      "source": [
        "def centered_crop(image: np.ndarray, target_resolution: Tuple[int, int]) -> np.ndarray:\n",
        "    return image[\n",
        "        (image.shape[0] - target_resolution[0])\n",
        "        // 2 : (image.shape[0] - target_resolution[0])\n",
        "        // 2\n",
        "        + target_resolution[0],\n",
        "        (image.shape[1] - target_resolution[1])\n",
        "        // 2 : (image.shape[1] - target_resolution[1])\n",
        "        // 2\n",
        "        + target_resolution[1],\n",
        "    ]"
      ]
    },
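    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "The `centered_crop` helper undoes the canvas growth introduced by `rotate`: rotating enlarges the image so that no pixels are cut off, and cropping around the center restores the original resolution. A quick illustrative check (the 30° angle is an arbitrary choice):"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {},
      "outputs": [],
      "source": [
        "# Rotating grows the canvas; rotating back and cropping around the center\n",
        "# restores the original resolution.\n",
        "rotated_image = rotate(NORMAL_MAP_A_IMAGE, 30)\n",
        "restored_image = centered_crop(\n",
        "    rotate(rotated_image, -30), NORMAL_MAP_A_IMAGE.shape[:2]\n",
        ")\n",
        "print(NORMAL_MAP_A_IMAGE.shape[:2], rotated_image.shape[:2], restored_image.shape[:2])"
      ]
    },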
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "epBSd0x2cQsv"
      },
      "outputs": [],
      "source": [
        "def integrate_vector_field(\n",
        "    vector_field: np.ndarray,\n",
        "    target_iteration_count: int,\n",
        "    thread_count: int = cpu_count(),\n",
        ") -> np.ndarray:\n",
        "    shape = vector_field.shape[:2]\n",
        "    angles = np.linspace(0, 90, target_iteration_count, endpoint=False)\n",
        "\n",
        "    def integrate_vector_field_angles(angles: List[float]) -> np.ndarray:\n",
        "        all_combined_heights = np.zeros(shape)\n",
        "\n",
        "        for angle in angles:\n",
        "            rotated_vector_field = rotate_vector_field_normals(\n",
        "                rotate(vector_field, angle), angle\n",
        "            )\n",
        "\n",
        "            left_gradients, top_gradients = calculate_gradients(rotated_vector_field)\n",
        "            (\n",
        "                left_heights,\n",
        "                right_heights,\n",
        "                top_heights,\n",
        "                bottom_heights,\n",
        "            ) = calculate_heights(left_gradients, top_gradients)\n",
        "\n",
        "            combined_heights = combine_heights(\n",
        "                left_heights, right_heights, top_heights, bottom_heights\n",
        "            )\n",
        "            combined_heights = centered_crop(rotate(combined_heights, -angle), shape)\n",
        "            all_combined_heights += combined_heights / len(angles)\n",
        "\n",
        "        return all_combined_heights\n",
        "\n",
        "    with Pool(processes=thread_count) as pool:\n",
        "        heights = pool.map(\n",
        "            integrate_vector_field_angles,\n",
        "            np.array(\n",
        "                np.array_split(angles, thread_count),\n",
        "                dtype=object,\n",
        "            ),\n",
        "        )\n",
        "        pool.close()\n",
        "        pool.join()\n",
        "\n",
        "    isotropic_height = np.zeros(shape)\n",
        "    for height in heights:\n",
        "        isotropic_height += height / thread_count\n",
        "\n",
        "    return isotropic_height\n",
        "\n",
        "\n",
        "def estimate_height_map(\n",
        "    normal_map: np.ndarray, target_iteration_count: int = 250\n",
        ") -> np.ndarray:\n",
        "    normals = ((normal_map[:, :, :3] / 255) - 0.5) * 2\n",
        "    heights = integrate_vector_field(normals, target_iteration_count)\n",
        "    return heights\n",
        "\n",
        "\n",
        "figure, axes = plt.subplots(1, 4, figsize=(14, 6))\n",
        "\n",
        "for index in range(4):\n",
        "    target_iteration_count = max(1, index * 5)\n",
        "    heights = estimate_height_map(NORMAL_MAP_B_IMAGE, target_iteration_count)\n",
        "    x, y = np.meshgrid(range(heights.shape[0]), range(heights.shape[1]))\n",
        "\n",
        "    axes[index].set_title(f\"target iteration count: {target_iteration_count}\")\n",
        "    _ = axes[index].imshow(heights)"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "P1GY3VyKcQsw"
      },
      "source": [
        "# Discussion\n",
        "\n",
        "## Integration\n",
        "\n",
        "The cumulative sum method is a rudimentary approach for computing integrals. In the following, we have implemented the trapezoid and Simpson's method to provide additional options for integral computation."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "ii4PzLUScQsw"
      },
      "outputs": [],
      "source": [
        "from enum import auto, Enum\n",
        "\n",
        "\n",
        "class INTEGRATION_METHODS(Enum):\n",
        "    SUM = auto()\n",
        "    TRAPEZOID = auto()\n",
        "    SIMSPSON = auto()\n",
        "\n",
        "\n",
        "def integrate_gradient_field(gradient_field: np.ndarray, axis: int) -> np.ndarray:\n",
        "    if INTEGRATION_METHOD == INTEGRATION_METHOD.SUM:\n",
        "        return np.cumsum(gradient_field, axis=axis)\n",
        "\n",
        "    if INTEGRATION_METHOD == INTEGRATION_METHOD.TRAPEZOID:\n",
        "        return cumulative_trapezoid(gradient_field, axis=axis, initial=0)\n",
        "\n",
        "    if INTEGRATION_METHOD == INTEGRATION_METHOD.SIMSPSON:\n",
        "        integral = np.zeros(gradient_field.shape[:2])\n",
        "\n",
        "        if axis == 1:\n",
        "            for y in range(gradient_field.shape[0]):\n",
        "                for x in range(1, gradient_field.shape[1]):\n",
        "                    integral[y, x] = simpson(gradient_field[y, :x])\n",
        "\n",
        "        elif axis == 0:\n",
        "            for x in range(gradient_field.shape[1]):\n",
        "                for y in range(1, gradient_field.shape[0]):\n",
        "                    integral[y, x] = simpson(gradient_field[:y, x])\n",
        "\n",
        "        return integral\n",
        "\n",
        "    raise NotImplementedError(\n",
        "        f\"Integration method '{INTEGRATION_METHOD}' not implemented.\"\n",
        "    )\n",
        "\n",
        "\n",
        "target_iteration_count = 1\n",
        "\n",
        "\n",
        "INTEGRATION_METHOD = INTEGRATION_METHODS.TRAPEZOID\n",
        "trapezoid_heights = estimate_height_map(NORMAL_MAP_A_IMAGE, target_iteration_count)\n",
        "\n",
        "INTEGRATION_METHOD = INTEGRATION_METHODS.SIMSPSON\n",
        "simpson_heights = estimate_height_map(NORMAL_MAP_A_IMAGE, target_iteration_count)\n",
        "\n",
        "INTEGRATION_METHOD = INTEGRATION_METHODS.SUM\n",
        "sum_heights = estimate_height_map(NORMAL_MAP_A_IMAGE, target_iteration_count)\n",
        "\n",
        "visualize_heights(\n",
        "    [sum_heights, trapezoid_heights, simpson_heights], [\"sum\", \"trapezoid\", \"Simpson\"]\n",
        ")"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "uew0B3TYcQsw"
      },
      "source": [
        "Although they may appear similar and effectively provide equivalent results, the Simpson method demands considerably more computational time in this implementation than the sum and trapezoid methods. This outcome can be disheartening regarding the use of polynomial approximation to enhance the resulting heights in general."
      ]
    },
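    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "As a rough, hardware-dependent illustration of the runtime difference, the following cell times a single iteration of each integration method on a small crop of normal map A (the crop size is an arbitrary choice to keep the cell fast):"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {},
      "outputs": [],
      "source": [
        "# Rough wall-clock comparison of the three integration methods on a small\n",
        "# crop of normal map A (illustrative only; absolute numbers depend on the\n",
        "# hardware and the image size).\n",
        "import time\n",
        "\n",
        "patch = NORMAL_MAP_A_IMAGE[:128, :128]\n",
        "\n",
        "for method in INTEGRATION_METHODS:\n",
        "    INTEGRATION_METHOD = method\n",
        "    start = time.perf_counter()\n",
        "    estimate_height_map(patch, 1)\n",
        "    print(f\"{method.name}: {time.perf_counter() - start:.3f} s\")\n",
        "\n",
        "INTEGRATION_METHOD = INTEGRATION_METHODS.SUM  # restore the method used above"
      ]
    },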
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "esCnXOeAcQsw"
      },
      "source": [
        "## Confidence\n",
        "\n",
        "A straightforward technique for computing the confidence of a pixel is to modify the heights combination function to return the negative standard deviation rather than the mean."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "6kp-skgocQsw"
      },
      "outputs": [],
      "source": [
        "def combine_heights(*heights: np.ndarray) -> np.ndarray:\n",
        "    return -np.std(np.stack(heights, axis=0), axis=0)\n",
        "\n",
        "\n",
        "confidences = estimate_height_map(NORMAL_MAP_A_IMAGE)\n",
        "plt.title(\"confidences\")\n",
        "plt.xlabel(f\"mean: {np.mean(confidences)}\")\n",
        "_ = plt.imshow(confidences)\n",
        "_ = plt.colorbar()"
      ]
    }
  ],
  "metadata": {
    "anaconda-cloud": {},
    "colab": {
      "name": "map-test.ipynb",
      "provenance": []
    },
    "kernelspec": {
      "display_name": "Python 3.10.6 64-bit",
      "language": "python",
      "name": "python3"
    },
    "language_info": {
      "codemirror_mode": {
        "name": "ipython",
        "version": 3
      },
      "file_extension": ".py",
      "mimetype": "text/x-python",
      "name": "python",
      "nbconvert_exporter": "python",
      "pygments_lexer": "ipython3",
      "version": "3.10.12"
    },
    "vscode": {
      "interpreter": {
        "hash": "916dbcbb3f70747c44a77c7bcd40155683ae19c65e1c03b4aa3499c5328201f1"
      }
    }
  },
  "nbformat": 4,
  "nbformat_minor": 0
}