While I'm wishing for things: it'd be awesome if it had a queue already set up. But if there's not a way to do that in the docker compose file, I could potentially write a script that uses the creds to create one using API calls
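Something along these lines is what I had in mind, just a rough sketch (it assumes working credentials are already available, e.g. in ~/clearml.conf or via the CLEARML_API_ACCESS_KEY / CLEARML_API_SECRET_KEY environment variables, and "sessions" is just an example queue name):
```python
# Rough sketch: create a queue through the ClearML API if it doesn't exist yet.
# Assumes valid credentials (from ~/clearml.conf or CLEARML_API_* env vars).
from clearml.backend_api.session.client import APIClient


def ensure_queue(name: str = "sessions") -> None:
    client = APIClient()
    # queues.get_all matches by name; an empty result means the queue is missing
    if not client.queues.get_all(name=name):
        client.queues.create(name=name)
        print(f"created queue '{name}'")
    else:
        print(f"queue '{name}' already exists")


if __name__ == "__main__":
    ensure_queue()
```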
Interesting. It’s actually just running locally on my laptop. It only seemed to be an issue when pointing the clearml-session CLI at my local ClearML instance. Still thinking about this one.
What you're seeing looks like the payload being stripped from the request body (which is typically sent using a GET request), and that kind of stripping is typical of GCP load balancers.
The issue went away. I'm still not sure why, but what finally made it work was creating a set of credentials manually in the UI and then setting those in my ~/clearml.conf file.
Do you happen to have a link to a docker-compose.yaml file that has a hardcoded set of credentials?
I want to seed the ClearML instance with a set of credentials and a ~/clearml.conf to run automated tests.
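Roughly what I'd want the tests to end up with is a ~/clearml.conf along these lines (a sketch with placeholder keys and the default local ports; the real values would be whatever the compose file hardcodes):
```
api {
    web_server: http://localhost:8080
    api_server: http://localhost:8008
    files_server: http://localhost:8081
    credentials {
        # placeholders, not real keys
        "access_key" = "TESTACCESSKEY123"
        "secret_key" = "TESTSECRETKEY456"
    }
}
```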
I'm trying to add a docker-compose.yaml to the repo to:
- make it more convenient for contributors to develop locally
- spin up a local ClearML instance in CI to run automated tests
Here's the docker-compose file (mostly the standard file, except that I altered the volume mounts and added MinIO)
Here's the clearml.conf file (the only custom settings are in the api section)
I'm able to enqueue and run normal tasks okay, but not clearml-sessions. When I run
CLEARML_CONFIG_FILE=path/to/clearml.conf clearml-session --queue sessions --docker python:3.9 --verbose
The only log I get is
clearml-session - CLI for launching JupyterLab / VSCode on a remote machine
Error:
I tried adding print statements to the clearml_session/__main__.py file to see what was going on. It seems to fail the first time it ever tries to make a request to the ClearML backend.
When the first request is made, these are the parameters right before it receives an HTTP status 400:
$ clearml-session --queue sessions --docker python:3.9
clearml-session - CLI for launching JupyterLab / VSCode on a remote machine
headers: {'Authorization': 'Bearer eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJpZGVudGl0eSI6eyJ1c2VyX25hbWUiOiJhcGlzZXJ2ZXIiLCJyb2xlIjoic3lzdGVtIiwiY29tcGFueSI6ImQxYmQ5MmEzYjAzOTQwMGNiYWZjNjBhN2E1YjFlNTJiIiwiY29tcGFueV9uYW1lIjoiY2xlYXJtbCIsInVzZXIiOiJfX2FwaXNlcnZlcl9fIn0sImlhdCI6MTY5OTM1MDM1NSwiZW52IjoiPHVua25vd24-IiwiYXV0aF90eXBlIjoiQmVhcmVyIiwiYXBpX3ZlcnNpb24iOiIyLjI2Iiwic2VydmVyX3ZlcnNpb24iOiIxLjEyLjEiLCJzZXJ2ZXJfYnVpbGQiOiIzOTciLCJmZWF0dXJlX3NldCI6ImJhc2ljIn0.u8ry4WrTh1kSCnJUXVqPxRy8lXw3DCTMmC1pHWWCIBI'} service: users action: get_current_user version: None method: get data: None json: None params: None
got here <Response [400]>
I think I'm down a rabbit hole 😆
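In case anyone wants to poke at the same call directly, this is roughly how I've been reproducing it outside of clearml-session (a sketch; the port is the default local apiserver port and the token is a placeholder for the one in the log above):
```python
# Sketch: hit the same endpoint clearml-session fails on and print the 400 body.
# Assumes the default local apiserver port (8008); the token is a placeholder.
import requests

API_SERVER = "http://localhost:8008"
TOKEN = "<bearer token from the log above>"

resp = requests.get(
    f"{API_SERVER}/users.get_current_user",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
print(resp.status_code)
print(resp.text)  # the response body usually says why the server returned 400
```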
You can actually use the agent's --create-queue command-line option to make it automatically create the queue for you
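For example, something along these lines (adjust the queue name and docker image to your setup):
clearml-agent daemon --queue sessions --create-queue --docker python:3.9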
@BattyCrocodile47 Any chance you're using a server hosted in Google Cloud?