we didn't change a thing from the defaults that's in your GitHub 😄 so it's 500M?
Okay, thank you for the suggestions, we'll try it out
SOLVED: It was an expired service account key in a clearml config
Yes, thank you. That's exactly what I'm referring to.
The agent is deployed on our on-premise machines
By language, I meant the syntax. What is Args
and what is batch
in Args/batch
and what other values exist 😀
By commit hash, I mean the hash of the commit a task was run from. I wish to refer to that commit hash in another task (started with a TriggerScheduler) in code
trigger.add_task_trigger(name='export', schedule_task_id=SCHEDULE_ID, task_overrides={...})
I would like to override the commit hash of the SCHEDULE_ID
with task_overrides
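Concretely, something like this is what I'm trying to do (a sketch; `script.version_num` is my guess at the field path for the commit hash, and the hash value is a placeholder):

```python
# The ClearML calls are commented out since they need a live server:
#
#   from clearml.automation import TriggerScheduler
#   trigger = TriggerScheduler(pooling_frequency_minutes=3)

# My guess: the commit hash lives under "script.version_num" on the task
# object, so the override dict would look like this (placeholder hash):
task_overrides = {"script.version_num": "0a1b2c3d"}

#   trigger.add_task_trigger(
#       name="export",
#       schedule_task_id=SCHEDULE_ID,
#       task_overrides=task_overrides,
#   )
print(task_overrides)
```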
To answer myself on the first part: task.get_parameters()
returns a dict of all the parameters which can be set. The syntax seems to be Args/{argparse destination}
However, this does not return the commit hash :((
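To make the shape concrete, here's a small sketch (the actual ClearML call is commented out since it needs a live server and a real task id; the parameter values are made up):

```python
# Sketch of what get_parameters() gives back:
#
#   from clearml import Task
#   task = Task.get_task(task_id="<task-id>")
#   params = task.get_parameters()
#
# The result is a flat dict keyed "<Section>/<name>"; for argparse-based
# scripts the section is "Args". Example shape (values are illustrative):
params = {"Args/batch": "32", "Args/lr": "0.001"}

# The argparse destinations are the part after "Args/":
arg_names = [key.split("/", 1)[1] for key in params if key.startswith("Args/")]
print(arg_names)  # ['batch', 'lr']
```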
SuccessfulKoala55 sorry for the bump, what's the status of the fix?
I don't think I expressed myself well 😅
My problem is I don't know how to run a JupyterHub Task. Basically what I want is a clearml-session
but with a docker container running JupyterHub instead of JupyterLab.
Do I write a Python script? If yes, how can I approach writing it? If not, what are the alternatives?
CostlyOstrich36 JupyterHub is a multi-user server, which allows many users to log in and spawn their own JupyterLab instances (with custom dependencies, data etc.) for running notebooks
AgitatedDove14 no errors, because I don't know how to start 😅 I am just exploring if anyone did this before I get my hands dirty
Hey AgitatedDove14 ,
This sort of works, but not quite. The process runs successfully (I can attach the container, ping the JupyterHub etc.), however the port forwarding fails.
When I do the port forwarding on my own using ssh -L
it also seems to fail for JupyterLab and VS Code, which I find odd
You are not missing anything, it is what we would like to have: to allow multiple people to have their own notebook servers. We have multiple people doing different experiments, and JupyterHub would be their "playground" environment
Mostly the configurability of clearml-session
and how it was designed. JupyterHub spawns a process at :8000 which we had to port forward by hand, but spawning new docker containers using JupyterHub's DockerSpawner
and connecting them to the correct network (the hub should talk to them without --network host
) seem too difficult or even impossible.
Oh, and there was no JupyterHub stdout in the console output on the ClearML server; it shows JupyterLab's output by default
I succeeded with your instructions, so thank you!
However, we concluded that we don't want to run it through ClearML after all, so we ran it standalone.
But I'll update you if we ever run it with ClearML, so you could provide it too
That's only part of a solution.
You'd also have to allow specifying jupyterhub_config.py
, mounting it inside a container at the right place, mounting the Docker socket in a secure manner to allow spawning user containers, connecting them to the correct network ( --host
won't work), persisting the user database and user data...
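To give a rough idea, those pieces would live in something like this jupyterhub_config.py fragment (a sketch; the spawner class and db_url options are standard JupyterHub/dockerspawner settings, but the network name and paths here are made up):

```python
# jupyterhub_config.py (fragment) -- `c` is provided by JupyterHub at load time
c.JupyterHub.spawner_class = "dockerspawner.DockerSpawner"
# attach spawned user containers to a shared docker network (name is ours)
c.DockerSpawner.network_name = "jupyterhub-net"
c.DockerSpawner.remove = True  # clean up stopped user containers
# persist the user database on a mounted volume (path is ours)
c.JupyterHub.db_url = "sqlite:////srv/jupyterhub/jupyterhub.sqlite"
```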
AgitatedDove14 Well, we have gotten relatively close to the goal; I suppose you wouldn't have to do a lot of work to support it natively
We've successfully deployed it without Helm, with a custom-made docker-compose and Makefiles 😄
Haha we manage our own deployment without k8s, so no dice there
But it turns out we are using nginx as a reverse proxy, so putting a client_max_body_size
inside the nginx.conf solved it for us. Thanks :))
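For the record, roughly what we added (the 512M value is ours; nginx's default cap is 1M):

```nginx
# in the server/location block that proxies the uploads
client_max_body_size 512M;
```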
MelancholyElk85 thank you, however I am not sure where I should put that label?
It could work, but Slack demands a minimum of 512x512