CheerfulGorilla72 my guess is the Slack token does not have credentials for the private channel, could that be?
I get what you're saying. Only problem is in the case of AutoLogging, I don't have the model id, for the model being saved.
Task.models['output'] should return all the model objects the autologging created
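As a sketch, this is how you could read those models back after the run (the task id is a placeholder; `Task.get_task` and the `models` property are existing ClearML APIs, but this needs a running ClearML server to execute):

```python
from clearml import Task

# look up the finished task (id is a placeholder, replace with yours)
task = Task.get_task(task_id="<task-id>")

# "output" holds every model object the task registered,
# including the ones created by auto-logging
for model in task.models["output"]:
    print(model.name, model.id)
```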
Hi UnevenDolphin73
Is there an easy way to add a link to one of the tasks panels? (as an artifact, configuration, info, etc)?
You can add a link as an artifact, that is probably the easiest: `task.upload_artifact(name="just link", artifact_object=" ")`
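In context, a minimal sketch (the project/task names and URL are illustrative; requires a configured ClearML server):

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="link artifact")

# per the advice above, a plain URL string uploaded as an artifact
# shows up as a clickable link in the task's ARTIFACTS tab
task.upload_artifact(name="just link", artifact_object="https://example.com/report")
```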
EDIT: And follow up regarding the dataset. As discussed somewhere previously, the datasets are now automatically moved to a hidden "sub-project" prefixed with .datasets. This creates several annoyances that I...
Which would mean the error is because of a company firewall/self-signed certificate.
The easiest solution: disable the SSL certificate check for ClearML.
Create the ~/clearml.conf manually:
```
# disable SSL certificate check
api.verify_certificate: False
```
copy paste the credentials section from the UI; it should look something like:
```
api {
    # web_server on port 8080
    web_server: " "
    # Notice: 'api_server' is the api server (default port 8008), not the web server.
    api_server: ...
}
```
Hmm could it be this is on the "helper functions" ?
This is a horrible setup, it means no authentication will pass, it will literally break every JWT authentication scheme
Ohh I see, so basically the ASG should check if the agent is Idle, rather than the Task is running ?
now, I need to pass a variable to the Preprocess class
you mean for the construction ?
BTW: I suspect this is the main issue:
https://github.com/python-poetry/poetry/issues/2179
I think this is the only mount you need:
Data persisted in every Kubernetes volume by ClearML will be accessible in /tmp/clearml-kind folder on the host.
SuccessfulKoala55 is this correct ?
Funny enough I'm running into a new issue now.
Sorry, my bad, I should have known 🙂 yes, it probably should be `packages=["clearml==1.1.6"]`
BTW: do you have any imports inside the pipeline function itself ? if you do not, then no need to pass "packages" at all, it will just add clearml
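A sketch of what that looks like on a functional pipeline component (the `packages` argument exists on ClearML's pipeline decorator; the function body and package versions here are illustrative):

```python
from clearml import PipelineDecorator

# packages is only needed because numpy is imported inside the function body;
# with no imports inside, packages could be omitted entirely (clearml is added anyway)
@PipelineDecorator.component(packages=["clearml==1.1.6", "numpy"])
def step_one(value):
    import numpy as np
    return float(np.sqrt(value))
```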
`parser.add_argument("--dataset_mean", type=float, nargs="+", default=0.5)`
I think providing nargs='+' assumes the type is a list; nonetheless we should be able to support it. Could you please add a GitHub issue so we do not forget?
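The mismatch can be reproduced with plain argparse: when the flag is passed, nargs='+' yields a list, but the scalar default is returned as-is (a minimal, self-contained repro):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--dataset_mean", type=float, nargs="+", default=0.5)

# with the flag present, nargs="+" collects one or more floats into a list
args = parser.parse_args(["--dataset_mean", "0.45", "0.5", "0.55"])
print(args.dataset_mean)  # [0.45, 0.5, 0.55]

# with the flag omitted, the scalar default is NOT wrapped in a list
args = parser.parse_args([])
print(args.dataset_mean)  # 0.5
```

So the same argument can surface as either a list or a scalar, which is the type ambiguity being discussed.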
on the side note, is there any way to automatically give more meaningful names to the running docker containers?
What do you mean by that? running where? and where will you see them ?
and sometimes there are hanging containers or containers that consume too much RAM.
Hmmm yes, but can't you see it in the ClearML dashboard?
unless I explicitly add container name in container arguments, it will have a random name,
it would be great if we could set default container name for each experiment (e.g., experiment id)
Sounds like a great feature! with little implementation work 🙂 Can you add a GitHub issue on clearml-agent?
CheerfulGorilla72 sounds like a great idea, I'll pass it along to the documentation ppl 🙂
TrickySheep9
Is there a way to see a roadmap on such things?
Hmm I think we have some internal one, I have to admit these things change priority all the time (so it is hard to put an actual date on them).
Generally speaking, pipelines with functions should be out in a week or so, TaskScheduler + Task Triggers should be out at about the same time.
UI for creating pipelines directly from the web app is in the works, but I do not have a specific ETA on that
Hi FierceFly22
You called execute_remotely a bit too soon. Any manual configuration calls have to happen before it, so they are stored in the Task. This includes task.connect and task.connect_configuration.
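A sketch of the correct ordering (the queue name, parameter values, and configuration dict are illustrative; this needs a ClearML server and agent to actually run):

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="remote run")

# connect anything manual BEFORE execute_remotely, so it is stored in the Task
params = {"lr": 0.001, "batch_size": 32}
task.connect(params)
task.connect_configuration({"backbone": "resnet50"})

# only then hand execution off to an agent queue;
# the local process stops here and the Task is enqueued
task.execute_remotely(queue_name="default")
```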
CrookedWalrus33 this is odd I tested the exact same code.
I suspect something with the environment maybe?
What's the python version / OS? Also, can you send a full pip freeze?
2022-07-17 07:59:40,339 - clearml.storage - ERROR - Failed uploading: Parameter validation failed: Invalid type for parameter ContentType, value: None, type: <class 'NoneType'>, valid types: <class 'str'>
Yes this is odd, it should add the content-type of the file (for example "application/x-tar") but you are getting N...
Basically run the agent in virtual environment mode. JumpyDragonfly13 try this one (notice no --docker flag):
`clearml-agent daemon --queue interactive --create-queue`
Then from the "laptop" try to get a remote session with:
`clearml-session`
Like get the tasks that uses the most metrics API?
Hi JitteryCoyote63
I would like to switch to using a single auth token.
What is the rationale behind to that ?
Hi CheekyAnt38
However now I would like to evaluate my machine learning model directly via API requests, directly over ClearML. Is it possible?
This basically means serving the model, is this what you mean?
Oh sure that makes sense, clone the experiment in the UI (right click, clone) then everything is editable :) both uncommitted changes, and branch / commit
Regarding the agent - No particular reason. Can you point me on how to do it?
This is a good place to start
https://clear.ml/docs/latest/docs/getting_started/mlops/mlops_first_steps
We need the automagic...
This is one of the great benefits of using clearml 🙂
Sure, try this one:
`Task.debug_simulate_remote_task('reused_task_id')`
`task = Task.init(...)`
Notice it will take the arguments from the clearml-task itself (e.g. override argparse arguments with what ...
By default the agent will add the root of the git repository into the pythonpath , so that you can import...
The -m src.train is just the entry script for the execution; all the rest is taken care of by the Configuration section (whatever you pass after it will be ignored if you are using argparse, as it auto-connects with ClearML)
Make sense ?