
I couldn't find it directly in the SDK at least (in the APIClient)... 🤔
Yup! Seems to have been some brief unavailability for some reason
We’d be happy if ClearML captured that (since it uses e.g. pip, we'd then have the git + commit hash for reproducibility), as it claims it would 😅
Any thoughts CostlyOstrich36 ?
Sounds like incorrect parsing on the ClearML side then, doesn't it? At least it doesn't fully support MinIO, then.
I don't imagine AWS users get a new folder named `aws-key-region-xyz-bucket-hostname` when they `download_folder(...)` from an AWS S3 bucket, or do they? 🤔
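For reference, a minimal sketch of the kind of call meant here, assuming a clearml version that provides `StorageManager.download_folder` (the MinIO-style endpoint, bucket, and local path are made-up placeholders):
```python
from clearml import StorageManager

# MinIO-style URL: s3://<host>:<port>/<bucket>/<path> -- all placeholders
local_copy = StorageManager.download_folder(
    remote_url="s3://minio.example.com:9000/my-bucket/datasets/train",
    local_folder="/tmp/train",
)
# expectation: files land under /tmp/train, not under an extra host/bucket-named folder
print(local_copy)
```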
Yes that's what I thought, thanks for confirming.
I'd like to remove the hidden system tag from a project.
Any simple ways around this for now? CostlyOstrich36
```python
# test_clearml.py
import pytest
import shutil
import clearml


@pytest.fixture
def clearml_task():
    clearml.Task.set_offline_mode(True)
    task = clearml.Task.init(project_name="test", task_name="test")
    yield task
    shutil.rmtree(task.get_offline_mode_folder())
    clearml.Task.set_offline_mode(False)


class ClearMLTests:
    def test_something(self, clearml_task):
        assert True
```
run with `pytest test_clearml.py`
I'm not sure I follow, what would that solution look like?
Maybe they shouldn't be placed under /tmp
if they're mission critical, but rather the clearml cache folder? 🤔
Thanks, that's what I thought - so I'm missing something else in the installation. I'll dig further 🙂
Well the individual tasks do not seem to have the expected environment.
Okay trying again without detached
For example, we have a complicated YAML file with built-in `!include` instructions, so we upload all the included files too. This then clogs up the artifacts sidebar, and it would be nice to be able to say "these are all artifacts from this one file, you can collapse it by clicking here".
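Roughly what that upload looks like on our side, as a sketch (file names, project/task names, and the upload-per-include pattern are illustrative):
```python
from pathlib import Path
from clearml import Task

task = Task.init(project_name="example", task_name="yaml-includes")

main_cfg = Path("config/main.yaml")  # top-level YAML with !include directives
includes = [Path("config/model.yaml"), Path("config/data.yaml")]  # hypothetical included files

task.upload_artifact(name=main_cfg.name, artifact_object=main_cfg)
for f in includes:
    # every included file becomes its own entry in the artifacts sidebar
    task.upload_artifact(name=f"{main_cfg.name}/{f.name}", artifact_object=f)
```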
I mean, if I search for "model", will it automatically search for tasks containing "model" in their name?
Oh nono, more like:
1. Create a pipeline
2. Add N steps to it
3. Run the pipeline
4. It fails/succeeds, the user does something with the output
5. The user would like to add/modify some steps based on the results now (after closer inspection).
I wonder: at (5), do I have to recreate the pipeline every time? 🤔
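For illustration, a minimal sketch of steps 1–3 using `PipelineController` (project, step names, and functions are placeholders); whether the controller can then be modified and re-run for step 5, rather than recreated, is exactly the question:
```python
from clearml import PipelineController

def preprocess(raw_path: str) -> str:
    return raw_path  # placeholder step body

def train(data_path: str) -> str:
    return "model-id"  # placeholder step body

pipe = PipelineController(name="my-pipeline", project="example", version="1.0.0")
pipe.add_function_step(
    name="preprocess",
    function=preprocess,
    function_kwargs=dict(raw_path="s3://bucket/raw"),
    function_return=["data_path"],
)
pipe.add_function_step(
    name="train",
    function=train,
    function_kwargs=dict(data_path="${preprocess.data_path}"),
    function_return=["model_id"],
    parents=["preprocess"],
)
pipe.start_locally(run_pipeline_steps_locally=True)
```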
I see that the GUI AutoScaler is only in the paid version, wonder why the GCP driver is not open source?
It failed on some missing files in my remote_execution, but otherwise seems fine now
Yes exactly, but I guess I could've googled for that 😅
Copy the uncommitted changes captured by ClearML using the UI, write them to `changes.patch`, then run `git apply changes.patch`
👍
We just inherit from `logging.Handler` and use that in our `logging.config.dictConfig`; the weird thing is that it still logs most of the tasks, just not the last one?
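A rough sketch of the kind of setup meant here, assuming the custom handler forwards records to the current task's logger (the class name and dictConfig layout are illustrative):
```python
import logging
import logging.config
from clearml import Task

class ClearMLHandler(logging.Handler):
    """Forward stdlib log records to the current ClearML task's console log."""
    def emit(self, record: logging.LogRecord) -> None:
        task = Task.current_task()
        if task is not None:
            task.get_logger().report_text(self.format(record))

logging.config.dictConfig({
    "version": 1,
    "handlers": {"clearml": {"()": ClearMLHandler, "level": "INFO"}},
    "root": {"handlers": ["clearml"], "level": "INFO"},
})
```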
Or is it just integrated in the ClearML Slack space, and for some reason it's showing the clearml address then?
I can navigate through the projects, but selecting one task in one project, then navigating to another project and selecting a different task -> there is no suggestion to compare the tasks.
In the projects page if I show all - I just see the projects. If I search for a task of similar name, I get results, but I can't compare them via the UI.
The only way I managed so far was to create a pseudo-comparison between unrelated tasks in the same project, then remove one task from the comparison, and u...
Well, you could start by setting the `output_uri` to `True` in `Task.init`.
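i.e. something like this (project/task names are placeholders); with `output_uri=True`, output models get uploaded to the default files server:
```python
from clearml import Task

# output_uri=True tells ClearML to upload output models/checkpoints
# to the default files server instead of only recording local paths
task = Task.init(project_name="example", task_name="train", output_uri=True)
```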
FYI SuccessfulKoala55 (or I might be doing something wrong), but it seems the Python migration code comes with carriage returns, so it fails on Linux by default (one has to `tr -d '\r'` to use it).
EDIT: It also defaults to /opt/allegro/data rather than the recommended /opt/clearml/data, which is suggested when installing the server 🤔
Why not give ClearML read-only access credentials to the repository?
We load the endpoint (and S3 credentials) from a `.env` file, so they're not immediately available at the time of `from clearml import Task`.
It's a convenience thing, rather than exporting many environment variables that are tied together.
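A sketch of the pattern described, assuming python-dotenv (file path and names are placeholders):
```python
from dotenv import load_dotenv

# Populate the endpoint / S3 credentials before ClearML reads the environment
load_dotenv(".env")

from clearml import Task  # imported only after the variables are set

task = Task.init(project_name="example", task_name="example")
```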
I'm aware, but it would be much cleaner to define them in the worker's `clearml.conf` and let ClearML expose them locally to running tasks.
EDIT: Also the above is specifically about serving, which is not the target here 🤔 At least not yet 😄
What's new in 1.1.6rc0?