Answered
What Sort Of Integration Is Possible With ClearML And SageMaker?

What sort of integration is possible with ClearML and SageMaker? On the page describing ClearML Remote it says:

Create a remote development environment (e.g. AWS SageMaker, GCP CoLab, etc.) on any on-prem machine or any cloud.

But the only mention of SageMaker I see in the docs is the release notes for 0.13 saying "Add support for SageMaker".

I have SageMaker Studio up and running with access to my ClearML server and it's successfully able to log plots and scalars from experiments, but in terms of code it just logs the code used to launch the kernel:

"""Entry point for launching an IPython kernel.
This is separate from the ipykernel package so we can avoid doing imports until
after removing the cwd from sys.path.
"""
import sys

if __name__ == '__main__':
    # Remove the CWD from sys.path while we load stuff.
    # This is added back by InteractiveShellApp.init_path()
    if sys.path[0] == '':
        del sys.path[0]
    from ipykernel import kernelapp as app
    app.launch_new_instance()

Is it possible to capture more than that while using SageMaker?
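A minimal sketch of the kind of cell involved (placeholder project/task names, not the exact notebook code): scalars and plots reported this way do show up in the ClearML UI, but the captured script is only the ipykernel launcher above.

from clearml import Task

# Minimal sketch with placeholder names, not the real notebook code.
task = Task.init(project_name="sagemaker-tests", task_name="studio-notebook-test")
logger = task.get_logger()
# scalars/plots like this are reported fine; only the source code capture is missing
logger.report_scalar(title="loss", series="train", value=0.5, iteration=0)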

  
  
Posted 7 months ago

Answers 77


sh-4.2$ cat /var/log/studio/kernel_gateway.log | head -n10
{"__timestamp__": "2023-02-23T21:48:28.036559Z", "__schema__": "sagemaker.kg.request.schema", "__schema_version__": 1, "__metadata_version__": 1, "account_id": "", "duration": 0.0012829303741455078, "method": "GET", "uri": "/api", "status": 200}
{"__timestamp__": "2023-02-23T21:48:39.111068Z", "__schema__": "sagemaker.kg.request.schema", "__schema_version__": 1, "__metadata_version__": 1, "account_id": "", "duration": 0.0012879371643066406, "method": "GET", "uri": "/api/kernels", "status": 200}
{"__timestamp__": "2023-02-23T21:48:39.116324Z", "__schema__": "sagemaker.kg.request.schema", "__schema_version__": 1, "__metadata_version__": 1, "account_id": "", "duration": 0.0007715225219726562, "method": "GET", "uri": "/api/terminals", "status": 200}
{"__timestamp__": "2023-02-23T21:48:39.272822Z", "__schema__": "sagemaker.kg.request.schema", "__schema_version__": 1, "__metadata_version__": 1, "account_id": "", "duration": 0.0007491111755371094, "method": "GET", "uri": "/api/terminals", "status": 200}
{"__timestamp__": "2023-02-23T21:48:43.000795Z", "__schema__": "sagemaker.kg.request.schema", "__schema_version__": 1, "__metadata_version__": 1, "account_id": "", "duration": 2.539133071899414, "method": "POST", "uri": "/api/kernels", "status": 201}
{"__timestamp__": "2023-02-23T21:48:43.073568Z", "__schema__": "sagemaker.kg.request.schema", "__schema_version__": 1, "__metadata_version__": 1, "account_id": "", "duration": 0.0013430118560791016, "method": "GET", "uri": "/api/kernels/6ba227af-ff2c-4b20-89ac-86dcac95e2b2", "status": 200}
{"__timestamp__": "2023-02-23T21:48:43.469751Z", "__schema__": "sagemaker.kg.request.schema", "__schema_version__": 1, "__metadata_version__": 1, "account_id": "", "duration": 0.0013761520385742188, "method": "GET", "uri": "/api/kernels/6ba227af-ff2c-4b20-89ac-86dcac95e2b2", "status": 200}
{"__timestamp__": "2023-02-23T21:48:43.702549Z", "__schema__": "sagemaker.kg.request.schema", "__schema_version__": 1, "__metadata_version__": 1, "account_id": "", "duration": 0.0013780593872070312, "method": "GET", "uri": "/api/kernels/6ba227af-ff2c-4b20-89ac-86dcac95e2b2", "status": 200}
{"__timestamp__": "2023-02-23T21:48:43.986808Z", "__schema__": "sagemaker.kg.request.schema", "__schema_version__": 1, "__metadata_version__": 1, "account_id": "", "duration": 0.0007445812225341797, "method": "GET", "uri": "/api/kernels/6ba227af-ff2c-4b20-89ac-86dcac95e2b2", "status": 200}
{"__timestamp__": "2023-02-23T21:48:43.992860Z", "__schema__": "sagemaker.kg.request.schema", "__schema_version__": 1, "__metadata_version__": 1, "account_id": "", "duration": 0.001028299331665039, "method": "GET", "uri": "/api/kernels", "status": 200}
  
  
Posted 7 months ago

nope, that's wrong

  
  
Posted 7 months ago

sounds good!

  
  
Posted 7 months ago

I think it just ends up in /home/sagemaker-user/{notebook}.ipynb every time

  
  
Posted 7 months ago

at least in 2018 it returned sessions! None

  
  
Posted 7 months ago

This is very odd ... let me check something

  
  
Posted 7 months ago

and the only calls to "uri": "/api/sessions" are the ones I made during testing - sagemaker doesn't seem to ever call that itself
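For anyone double-checking, a quick sketch for scanning that log for session requests (assuming the same path and JSON-lines format shown above):

import json

# Sketch: look for any /api/sessions hits in the Studio kernel gateway log
with open("/var/log/studio/kernel_gateway.log") as f:
    for line in f:
        try:
            entry = json.loads(line)
        except json.JSONDecodeError:
            continue
        if "/api/sessions" in entry.get("uri", ""):
            print(entry["__timestamp__"], entry["method"], entry["uri"], entry["status"])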

  
  
Posted 7 months ago

So it's seemingly not the image, but maybe something to do with how Studio runs it as a kernel.

Yeah, I think that for some reason it fails to detect that this is actually a Jupyter notebook (not really sure why). Thank you for double checking on the container!!

  
  
Posted 7 months ago

but r.json() is an empty list

  
  
Posted 7 months ago

so notebooks ends up empty

  
  
Posted 7 months ago

right now I can't figure out how to get the session in order to get the notebook path

you mean the code that fires "HTTPConnectionPool" ?

  
  
Posted 7 months ago

the problem is here: None

  
  
Posted 7 months ago

api/kernels does report back the active kernel, but doesn't give notebook paths or anything
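Roughly what that comparison looks like, as a sketch assuming plain Jupyter defaults (localhost:8888, no auth token); in Studio the server may sit behind a different port or URL prefix:

import requests

# Sketch, assuming plain Jupyter defaults; adjust base URL/token for Studio.
base = "http://localhost:8888"
print(requests.get(base + "/api/kernels").json())   # reports the active kernel(s)
print(requests.get(base + "/api/sessions").json())  # comes back as [] in Studio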

  
  
Posted 7 months ago

and cat /var/log/studio/kernel_gateway.log | grep ipynb comes up empty

  
  
Posted 7 months ago

As another test I ran Jupyter Lab locally using the same custom Docker container that we're using for SageMaker Studio, and it works great there, just like the native local Jupyter Lab. So it's seemingly not the image, but maybe something to do with how Studio runs it as a kernel.

  
  
Posted 7 months ago

yep

  
  
Posted 7 months ago

one possibility for getting the notebook filepath is finding and parsing /home/sagemaker-user/.jupyter/lab/workspaces/default-37a8.jupyterlab-workspace I think, but I don't know if I can tie that to a specific session
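Something along these lines, as a sketch; the filename glob and the "data" layout with "notebook:<path>" keys are assumptions about how JupyterLab stores workspaces, not something confirmed in this thread:

import glob
import json

# Sketch: pull notebook paths out of JupyterLab workspace files (assumed format).
for ws in glob.glob("/home/sagemaker-user/.jupyter/lab/workspaces/*.jupyterlab-workspace"):
    with open(ws) as f:
        data = json.load(f).get("data", {})
    notebooks = [key.split(":", 1)[1] for key in data if key.startswith("notebook:")]
    print(ws, notebooks)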

  
  
Posted 7 months ago

Hmm, and you are getting an empty list for this one:

server_info['url'] = f"http://{server_info['hostname']}:{server_info['port']}/"
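For context, that URL gets built from the running-server info, something like this sketch (not the exact ClearML code):

from jupyter_server import serverapp

# Sketch: enumerate running Jupyter servers and build the base URL
# that is later used to query /api/sessions.
for server_info in serverapp.list_running_servers():
    server_info["url"] = f"http://{server_info['hostname']}:{server_info['port']}/"
    print(server_info["url"])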
  
  
Posted 7 months ago

weird that it won't return that single session

  
  
Posted 7 months ago

lots of things like {"__timestamp__": "2023-02-23T23:49:23.285946Z", "__schema__": "sagemaker.kg.request.schema", "__schema_version__": 1, "__metadata_version__": 1, "account_id": "", "duration": 0.0007679462432861328, "method": "GET", "uri": "/api/kernels/6ba227af-ff2c-4b20-89ac-86dcac95e2b2", "status": 200}

  
  
Posted 7 months ago

still empty
image

  
  
Posted 7 months ago

as best I can tell it'll only have one .ipynb in $HOME with this setup, which may work...

  
  
Posted 7 months ago

but even then the sessions endpoint is still empty

  
  
Posted 7 months ago

if I use the same kernel there'll be two

  
  
Posted 7 months ago

looks like the same as in server_info

  
  
Posted 7 months ago

import os
print(os.environ)
  
  
Posted 7 months ago

Hi @LittleReindeer37 @AgitatedDove14
I got the session with a bit of "hacking".
See this script:

import boto3, requests, json
from urllib.parse import urlparse

def get_notebook_data():
    # Studio writes the domain id and user profile name to this metadata file
    log_path = "/opt/ml/metadata/resource-metadata.json"
    with open(log_path, "r") as logs:
        _logs = json.load(logs)
    return _logs

notebook_data = get_notebook_data()
client = boto3.client("sagemaker")
# create a presigned domain URL so the request is authenticated
response = client.create_presigned_domain_url(
    DomainId=notebook_data["DomainId"],
    UserProfileName=notebook_data["UserProfileName"]
)
authorized_url = response["AuthorizedUrl"]
authorized_url_parsed = urlparse(authorized_url)
unauthorized_url = authorized_url_parsed.scheme + "://" + authorized_url_parsed.netloc
with requests.Session() as s:
    # hitting the presigned URL first stores the auth cookies on the session
    s.get(authorized_url)
    # now the Jupyter sessions endpoint returns the notebook sessions
    print(s.get(unauthorized_url + "/jupyter/default/api/sessions").content)

Basically, we can get the session directly from AWS, but we need to be authenticated.
One way I found was to create a presigned URL through boto3, by getting the domain id and profile name from the resource-metadata file that is found on the machine None.
Then use that to get the session...
Maybe there are other (safer) ways to do this, but this is a good start. We know it's possible
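As a possible follow-up (untested sketch, reusing s and unauthorized_url from the script above): match the sessions response against the kernel this code is running in to recover the notebook path.

import re
from ipykernel.connect import get_connection_file

# Untested sketch: the connection file is named kernel-<kernel_id>.json,
# so the current kernel id can be matched against the sessions response.
kernel_id = re.search(r"kernel-(.+)\.json", get_connection_file()).group(1)
sessions = s.get(unauthorized_url + "/jupyter/default/api/sessions").json()
for session in sessions:
    if session.get("kernel", {}).get("id") == kernel_id:
        print(session["path"])  # notebook path relative to the server root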

  
  
Posted 6 months ago

nice! Just tested it on my end as well, looks like it works!

  
  
Posted 6 months ago

it does return kernels, just not sessions

  
  
Posted 7 months ago

sadly no

  
  
Posted 7 months ago