ternaus/TernausNetV2

Import error: from models.ternausnet2 import TernausNetV2

itsrrworld opened this issue · 3 comments

Hi, I was trying to use the provided demo.ipynb file, but it throws an import error.

ImportError                               Traceback (most recent call last)
<ipython-input-2-b7b37541d9f9> in <module>
----> 1 from models.ternausnet2 import TernausNetV2

~\Downloads\worldview3\TernausNetV2\models\__init__.py in <module>
----> 1 from .resnext import *
      2 from .resnet import *
      3 from .wider_resnet import *
      4 from .densenet import *

~\Downloads\worldview3\TernausNetV2\models\resnext.py in <module>
      5 import torch.nn as nn
      6 
----> 7 from modules import IdentityResidualBlock, ABN, GlobalAvgPool2d
      8 from ._util import try_index
      9 

~\Downloads\worldview3\TernausNetV2\modules\__init__.py in <module>
----> 1 from .bn import ABN, InPlaceABN, InPlaceABNSync
      2 from .functions import ACT_RELU, ACT_LEAKY_RELU, ACT_ELU, ACT_NONE
      3 from .misc import GlobalAvgPool2d, SingleGPU
      4 from .residual import IdentityResidualBlock
      5 from .dense import DenseModule

~\Downloads\worldview3\TernausNetV2\modules\bn.py in <module>
      8     from Queue import Queue
      9 
---> 10 from .functions import *
     11 
     12 

~\Downloads\worldview3\TernausNetV2\modules\functions.py in <module>
     16                     "inplace_abn_cuda_half.cu"
     17                 ]],
---> 18                 extra_cuda_cflags=["--expt-extended-lambda"])
     19 
     20 # Activation names

C:\ProgramData\Anaconda3\lib\site-packages\torch\utils\cpp_extension.py in load(name, sources, extra_cflags, extra_cuda_cflags, extra_ldflags, extra_include_paths, build_directory, verbose, with_cuda, is_python_module)
    642         verbose,
    643         with_cuda,
--> 644         is_python_module)
    645 
    646 

C:\ProgramData\Anaconda3\lib\site-packages\torch\utils\cpp_extension.py in _jit_compile(name, sources, extra_cflags, extra_cuda_cflags, extra_ldflags, extra_include_paths, build_directory, verbose, with_cuda, is_python_module)
    822     if verbose:
    823         print('Loading extension module {}...'.format(name))
--> 824     return _import_module_from_library(name, build_directory, is_python_module)
    825 
    826 

C:\ProgramData\Anaconda3\lib\site-packages\torch\utils\cpp_extension.py in _import_module_from_library(module_name, path, is_python_module)
    965 def _import_module_from_library(module_name, path, is_python_module):
    966     # https://stackoverflow.com/questions/67631/how-to-import-a-module-given-the-full-path
--> 967     file, path, description = imp.find_module(module_name, [path])
    968     # Close the .so file after load.
    969     with file:

C:\ProgramData\Anaconda3\lib\imp.py in find_module(name, path)
    294         break  # Break out of outer loop when breaking out of inner loop.
    295     else:
--> 296         raise ImportError(_ERR_MSG.format(name), name=name)
    297 
    298     encoding = None

ImportError: No module named 'inplace_abn' 

conda version: 4.6.14
python: 3.7
cuda: 10.1
pytorch: 1.0
gcc: 5.4
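
From the traceback, the failure happens while PyTorch JIT-compiles the `inplace_abn` CUDA extension in `modules/functions.py`: the compiled module never appears in the build directory, so `imp.find_module('inplace_abn', ...)` raises the ImportError, which usually means the compilation itself failed silently. A minimal sketch to re-run just that build step with verbose output and surface the underlying compiler error (using `glob` to collect sources is my own shortcut; the actual source list is hard-coded in `modules/functions.py`, and this assumes you run it from the repo root):

```python
# Sketch only: rebuild the inplace_abn extension in isolation so the real
# nvcc/compiler error is printed instead of being swallowed.
import glob
from os import path
from torch.utils.cpp_extension import load

src_dir = path.join("modules", "src")
# Collect the C++/CUDA sources, e.g. inplace_abn_cuda_half.cu from the traceback.
sources = sorted(glob.glob(path.join(src_dir, "*.cpp")) +
                 glob.glob(path.join(src_dir, "*.cu")))

backend = load(name="inplace_abn",
               sources=sources,
               extra_cuda_cflags=["--expt-extended-lambda"],
               verbose=True)  # print the compile commands and any errors
print(backend)
```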

I'm running into the same problem - did you end up finding a solution?

Have you found a solution?

Hello folks, I faced the same problem and was stuck on it for many days. The following solution applies to macOS and Linux, since those are the only platforms I tried it on.

The error arises because the code base (and the versions it was written against) is incompatible with newer versions of Python, PyTorch, and OpenCV. So:

1. Create a virtual environment with Python 3.6.8. Warning: do NOT skip this step, or you won't be able to install PyTorch 0.4.
2. pip install each dependency exactly as listed (Torch 0.4, etc.).
3. Also install the extra libraries matplotlib and skimage.

After that, you won't get this error anymore. If you want to run on CPU rather than GPU, follow the answer in #16 (comment). Comment if the problem still persists.
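
For reference, a rough command sketch of that setup (the exact version pins and package names are my assumptions based on the comment above, not an official install script):

```bash
# Sketch only: isolated environment as suggested above (macOS/Linux).
conda create -n ternausnetv2 python=3.6.8
conda activate ternausnetv2

# Pin PyTorch to the 0.4 line the repo was written against, plus the extras mentioned.
pip install torch==0.4.0 torchvision==0.2.1
pip install opencv-python matplotlib scikit-image
```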