anaconda/nb_conda_kernels

Execution starts from the beginning on changing kernel

rayadastidar opened this issue · 2 comments

I installed nb_conda_kernels in the base environment using
conda install nb_conda_kernels
I created the configuration file for Jupyter.
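For reference, that configuration essentially points Jupyter at the kernel spec manager shipped by nb_conda_kernels. A minimal sketch of what such a file can look like (the file name ~/.jupyter/jupyter_config.json and the top-level section shown here are illustrative assumptions and may differ by version):

{
  "JupyterApp": {
    "kernel_spec_manager_class": "nb_conda_kernels.CondaKernelSpecManager"
  }
}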
Running 'jupyter kernelspec list' returned this output:

[ListKernelSpecs] WARNING | Config option `kernel_spec_manager_class` not recognized by `ListKernelSpecs`.
Available kernels:
  conda-env-delight-py          /home/raya/.local/share/jupyter/kernels/conda-env-delight-py
  conda-env-pymc_env-py         /home/raya/.local/share/jupyter/kernels/conda-env-pymc_env-py
  conda-env-pystan_env-py       /home/raya/.local/share/jupyter/kernels/conda-env-pystan_env-py
  conda-env-python_3_gpss-py    /home/raya/.local/share/jupyter/kernels/conda-env-python_3_gpss-py
  conda-root-py                 /home/raya/.local/share/jupyter/kernels/conda-root-py
  python3                       /home/raya/anaconda3/share/jupyter/kernels/python3

The installation seems fine, since all the kernels show up in my Jupyter Notebook under the 'Kernel -- Change Kernel' menu.
But every time I change the kernel in the notebook, execution starts from the beginning.
For example, I started a notebook, switched to the 'delight' kernel, and imported delight in the first cell. Then, when I change the kernel to pymc_env in order to import pymc in the next cell, the notebook restarts.

What could have possibly gone wrong here?

A kernel is a reflection of a Python environment. Environments are isolated from each other (as expected for environments), so a notebook document is meant to be executed within a single kernel; switching kernels restarts execution and discards whatever state the previous kernel had built up.

To draw an analogy with a Python script: you can run

(env1) $ python my_script.py

or

(env2) $ python my_script.py

But it is impossible to run my_script.py partially with env1 and partially with env2.
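You can see this binding from inside a notebook with a quick check (a minimal sketch, not specific to nb_conda_kernels): print sys.executable in a cell. It points at the Python interpreter of whichever conda environment backs the currently selected kernel, and it changes when you switch kernels, along with all in-memory state.

import sys

# Each kernel is bound to exactly one Python interpreter (one conda env);
# switching kernels swaps the interpreter and clears all in-memory state.
print(sys.executable)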

Ok, thanks, I got the point!