but one possible workaround is to try to figure out if it's running in a gateway and then find the only notebook running on that server
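Something like this could detect the gateway case (rough sketch; the env var names are just the ones that show up in the dump below, so treat them as an assumption about the Studio environment):

import os

def looks_like_kernel_gateway():
    # KERNEL_GATEWAY=1 plus the SageMaker-specific log path are a reasonable hint
    # that we're inside a Studio kernel gateway rather than a plain notebook server
    return os.environ.get("KERNEL_GATEWAY") == "1" or "SAGEMAKER_LOG_FILE" in os.environ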
and cat /var/log/studio/kernel_gateway.log | grep ipynb comes up empty
What happens when you call:
from clearml.backend_interface.task.repo import ScriptInfo
print(ScriptInfo._ScriptInfo__legacy_jupyter_notebook_server_json_parsing(None))
At the top there should be the URL of the notebook (I think)
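For reference, this is roughly how a locally running server is normally discovered (a minimal sketch; I'm assuming the legacy helper relies on something along these lines, and which import works depends on the installed Jupyter package):

try:
    from notebook.notebookapp import list_running_servers  # classic notebook package
except ImportError:
    from jupyter_server.serverapp import list_running_servers  # newer jupyter_server package

for server in list_running_servers():
    # each entry is a dict with the server URL and its root/notebook directory
    print(server.get("url"), server.get("notebook_dir") or server.get("root_dir"))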
environ{'PYTHONNOUSERSITE': '0',
'HOSTNAME': 'gfp-science-ml-t3-medium-d579233e8c4b53bc5ad626f2b385',
'AWS_CONTAINER_CREDENTIALS_RELATIVE_URI': '/_sagemaker-instance-credentials/xxx',
'JUPYTER_PATH': '/usr/share/jupyter/',
'SAGEMAKER_LOG_FILE': '/var/log/studio/kernel_gateway.log',
'PATH': '/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/tmp/miniconda3/condabin:/tmp/anaconda3/condabin:/tmp/miniconda2/condabin:/tmp/anaconda2/condabin',
'REGION_NAME': 'us-east-1',
'AWS_INTERNAL_IMAGE_OWNER': 'Custom',
'AWS_DEFAULT_REGION': 'us-east-1',
'PWD': '/home/sagemaker-user',
'AWS_REGION': 'us-east-1',
'SHLVL': '1',
'HOME': '/home/sagemaker-user',
'AWS_SAGEMAKER_PYTHONNOUSERSITE': '0',
'AWS_ACCOUNT_ID': 'xxx',
'_': '/opt/.sagemakerinternal/conda/bin/jupyter-kernelgateway',
'LC_CTYPE': 'C.UTF-8',
'KERNEL_LAUNCH_TIMEOUT': '40',
'KERNEL_WORKING_PATH': '',
'KERNEL_GATEWAY': '1',
'JPY_PARENT_PID': '9',
'PYDEVD_USE_FRAME_EVAL': 'NO',
'TERM': 'xterm-color',
'CLICOLOR': '1',
'FORCE_COLOR': '1',
'CLICOLOR_FORCE': '1',
'PAGER': 'cat',
'GIT_PAGER': 'cat',
'MPLBACKEND': 'module://matplotlib_inline.backend_inline'}
print(requests.get(url='
print(requests.get(url='
as best I can tell it'll only have one .ipynb in $HOME with this setup, which may work...
but even then the sessions endpoint is still empty
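Spelling out those requests.get calls against the gateway's REST endpoints (sketch only; the base URL is a placeholder, not the actual Studio gateway address, and no auth token is included):

import requests

# placeholder address; the real host/port for the Studio gateway would have to
# come from the environment or the server config
base_url = "http://localhost:8888"
for endpoint in ("api/sessions", "api/kernels"):
    resp = requests.get(f"{base_url}/{endpoint}")
    print(endpoint, resp.status_code, resp.json())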
I think it just ends up in /home/sagemaker-user/{notebook}.ipynb every time
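So the fallback could just be a glob over $HOME (sketch, under the assumption from the messages above that there is only ever one notebook there):

import glob
import os

# assumes exactly one .ipynb sits directly under the home directory, which seems
# to hold for this Studio setup; falls back to None if that assumption breaks
candidates = glob.glob(os.path.join(os.path.expanduser("~"), "*.ipynb"))
notebook_path = candidates[0] if len(candidates) == 1 else None
print(notebook_path)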
lots of things like {"__timestamp__": "2023-02-23T23:49:23.285946Z", "__schema__": "sagemaker.kg.request.schema", "__schema_version__": 1, "__metadata_version__": 1, "account_id": "", "duration": 0.0007679462432861328, "method": "GET", "uri": "/api/kernels/6ba227af-ff2c-4b20-89ac-86dcac95e2b2", "status": 200}
at least in 2018 it returned sessions! None
which I looked at previously to see if I could import sagemaker.kg or kernelgateway or something, but no luck
and this
server_info['url'] = f"http://{server_info['hostname']}:{server_info['port']}/{server_info['base_url']}/"
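One thing I'd double-check there: if base_url already carries its own slashes, that concatenation can end up with "//" in the URL. A sketch of a slash-safe variant (assuming base_url can be empty, "/", or "/something/"):

# strip and re-add the slashes explicitly so "", "/" and "/something/" all
# produce a clean URL; this is only an illustration, not the library's code
base = (server_info.get('base_url') or '').strip('/')
server_info['url'] = f"http://{server_info['hostname']}:{server_info['port']}/" + (f"{base}/" if base else "")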
curious whether it impacts anything besides sagemaker. I'm thinking it's generically a kernel gateway issue, but I'm not sure if other platforms are using that yet
Hmm what do you have here?
os.system("cat /var/log/studio/kernel_gateway.log")
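If shelling out is awkward, the same check can be done in Python (sketch; it assumes the log stays in the JSON-lines format shown in the sample entry above):

import json

# parse the JSON lines in the gateway log and keep anything whose request URI
# mentions a notebook (log path taken from SAGEMAKER_LOG_FILE in the env dump)
with open("/var/log/studio/kernel_gateway.log") as log_file:
    for line in log_file:
        try:
            entry = json.loads(line)
        except ValueError:
            continue
        if "ipynb" in entry.get("uri", ""):
            print(entry)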
The odd thing is that you can access the notebook, but it returns zero kernels...
I additionally tried using a SageMaker Notebook instance, to see if it was the kernel dockerization that Studio uses that was messing things up. But it seems to actually log less information from a Notebook instance than from Studio.


Hmm, and you are getting an empty list for this one:
server_info['url'] = f"http://{server_info['hostname']}:{server_info['port']}/"
As in, which tab when I'm viewing the Experiment should I see it on? Should it be code, an artifact, or something else?
Yep, I think you are correct. You should have gotten the same output as a local Jupyter notebook, and it seems that in SageMaker Studio it is not working 😞
Let me check something
What do you have in "server_info['url']" ?
This is strange, let me see if we can get around it, because I'm sure it worked 🙂
@<1532532498972545024:profile|LittleReindeer37> nice!!! 😍
Do you want to open a PR? It should be relatively easy to merge and test, and I think they might even push it into the next version (or worst case a quick RC)