When running inside Jupyter, ClearML basically stores the notebook itself inside the uncommitted changes, so that you can run it again yourself
When working outside of my venv I can see my Jupyter notebook inside "uncommitted changes" and also as an artifact. When inside a venv I don't see it as an artifact, and all I see under "uncommitted changes" is the contents of ipykernel_launcher.py
Or enqueue a clone of the task for remote execution
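e.g. a minimal sketch of that clone-and-enqueue flow (the project/task/queue names here are just placeholders):
```python
from clearml import Task

# Fetch the task that captured the notebook (names are placeholders)
original = Task.get_task(project_name="examples", task_name="notebook run")

# Clone it and enqueue the clone so an agent can execute it remotely
cloned = Task.clone(source_task=original)
Task.enqueue(cloned, queue_name="default")
```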
I should mention that reporting artifacts and everything else works. I'm just not seeing the changes inside the web UI.
Let me correct myself: when switching to a different Python kernel it breaks.
@<1535793988726951936:profile|YummyElephant76>
Whenever I create any task the "uncommitted changes" are the contents of ipykernel_launcher.py, is there a way to make ClearML recognize that I'm running inside a venv?
This sounds like a bug, it should have the entire notebook there, no?
Hi @<1535793988726951936:profile|YummyElephant76> , what changes are you not seeing?
So it sounds as if for some reason calling Task.init inside a notebook on your JupyterHub is not detecting the notebook.
Is there anything special about the JupyterHub deployment? How is it deployed? Is it password protected? Is this reproducible?
It is deployed on an on-premises, secured network that has no access to the outside world.
Is it password protected or something of that nature?
Perhaps we could find a different solution or workaround, rather than solving a technical issue.
Solving it means allowing the python code to ask the JupyterLab server for the notebook file
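Roughly speaking, the lookup would be something like this sketch (it assumes the "notebook" package is installed, and the session JSON layout can vary between notebook versions):
```python
import requests
from notebook.notebookapp import list_running_servers  # needs the "notebook" package

# Ask every running Jupyter server which notebook is attached to which kernel
for server in list_running_servers():
    sessions = requests.get(
        f"{server['url']}api/sessions",
        params={"token": server.get("token", "")},
    ).json()
    for session in sessions:
        print(session["kernel"]["id"], session["notebook"]["path"])
```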
However, when working with ClearML and using a venv (and not the default Python kernel),
Are you saying that on your specific setup (i.e. OpenShift + Helm etc.) in some occurrences everything works and in some it does not?
@<1523701205467926528:profile|AgitatedDove14> hey, we found out what was causing that issue
When a new venv is created it does not contain any Python libraries, so when ClearML tried to list the currently running Jupyter servers (using the Jupyter Notebook Python library) it failed, since that library does not exist in the venv.
Not sure why there were no warnings or errors regarding it...
We fixed it by running pip install notebook inside the venv, and it worked!
CC: @<1564422819012415488:profile|DisturbedCormorant14>
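To illustrate why it can fail silently: if the server lookup is wrapped in a broad try/except, a missing package just turns detection into a no-op (this is only a sketch, not ClearML's actual code):
```python
# Illustrative sketch (not ClearML's actual code): a broad try/except around
# the server lookup turns a missing "notebook" package into a silent no-op
try:
    from notebook.notebookapp import list_running_servers  # ImportError in a bare venv
    servers = list(list_running_servers())
except Exception:
    servers = []  # detection quietly gives up, leaving only ipykernel_launcher.py
```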
@<1535793988726951936:profile|YummyElephant76> oh you mean like the Jupyter server was running, then inside the notebook you would start a new venv, and in that venv the "notebook" package was missing, hence it failed detecting the notebook?
Hmm okay, I think the takeaway is that we should print "missing notebook package" 🙂
Hi @<1523701205467926528:profile|AgitatedDove14> , I'm RoseGold's coworker.
The issue still occurs. I will try to provide you with the necessary information.
We are using a Bitnami Helm release of JupyterHub for deployment on OpenShift.
The version is 3.1.0.
It is deployed on an on-premises, secured network that has no access to the outside world.
Perhaps we could find a different solution or workaround, rather than solving a technical issue.
The thing is that researchers in our organization sometimes need to use specific versions of specific libraries for their model to run successfully. In order to do that they use venvs with their requirements.
However, when working with ClearML and using a venv (and not the default Python kernel), the notebook will not upload to the ClearML server as an artifact and is not found in storage.
Would really appreciate it if you could help us out
When I'm using the default Python kernel for notebooks it does store it successfully; however, when switching to a venv it doesn't.
seems like an issue when using ipykernel...
steps to reproduce:
virtualenv my_env
source my_env/bin/activate
pip install ipykernel
python -m ipykernel install --user --name=my_env
then switch the kernel to the new ipykernel to reproduce
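In a notebook cell running on that kernel, a minimal call like this shows the problem (project/task names are placeholders):
```python
from clearml import Task

# With the "notebook" package missing from my_env, the notebook itself is not
# captured; "uncommitted changes" only shows the contents of ipykernel_launcher.py
task = Task.init(project_name="debug", task_name="venv notebook detection test")
```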