I think this is maybe about the credential.helper used.
Hm, just a small update - I just verified and it does indeed work on Linux:
```python
import clearml
import dotenv

if __name__ == "__main__":
    dotenv.load_dotenv()
    config = clearml.backend_api.Config.load()  # Success, parsed with environment variables
```
Maybe this is part of the paid version, but would be cool if each user (in the web UI) could define their own secrets, and a task could then be assigned to some user and use those secrets during boot?
Thanks AgitatedDove14 , I'll first have to prove viability with the free version :)
That's what I found as well, but it did not like it after all (boto is fine with it, but the underlying urllib and requests were not?)
It's fine -- I see the added benefit in making sure users set up their clearml.conf, and I've made a script to edit it to our needs as part of the installation process 🙂 Thanks Martin!
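For reference, the install-time edit is roughly along these lines (a minimal sketch; the default value being replaced and the MinIO endpoint are assumptions based on our setup):
```python
from pathlib import Path

# Illustrative only: swap the default files_server for our MinIO endpoint
# (the endpoint mirrors the s3://ip:9000/clearml value mentioned below).
conf_path = Path.home() / "clearml.conf"
text = conf_path.read_text()
text = text.replace("https://files.clear.ml", "s3://ip:9000/clearml")
conf_path.write_text(text)
```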
The screenshot is small since the data is private anyway, but it's enough to see:
"Metric: untitled 00" "plot image" as the image title The attached histogram has a title ("histogram of ...")
The logs are on the bucket, yes.
The default file server is also set to s3://ip:9000/clearml
Yes that's what I thought, thanks for confirming.
I'm not sure about the intended use of connect_configuration now.
I was under the assumption that in connect_configuration(configuration, name=None, description=None), the configuration is only used in local execution.
But when I run config = task.connect_configuration({}, name='General') in remote execution, the configuration is set to the empty dictionary instead of the values stored on the task.
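To illustrate (a minimal sketch; project/task names are made up):
```python
from clearml import Task

task = Task.init(project_name="demo", task_name="config-check")

# Local run: this dict is recorded on the task as the "General" config.
# Remote run: I expected to get the stored configuration back, but the
# empty dict I pass here is what the task ends up with.
config = task.connect_configuration({}, name="General")
print(config)
```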
Here's an example where poetry.lock is removed, and still the console reads:
```
url: ....
branch: HEAD
commit: 22fffaf8d5f377b7f10140e642a7f6f26b72ffaa
root: /.../.clearml/venvs-builds/3.10/task_repository/...
Applying uncommitted changes
Poetry Enabled: Ignoring requested python packages, using repository poetry lock file!
Creating virtualenv ds-platform in /.../.clearml/venvs-builds/3.10/task_repository/.../.venv
Updating dependencies
Resolving dependencies...
```
I'll give it a shot. Honestly, the SDK documentation for both InputModel and OutputModel is (sorry) horrible ...
Can't wait for the documentation revamping.
Yes, I've found that too (as mentioned, I'm familiar with the repository). My issue is still that there is no documentation as to what this actually offers.
Is this simply a helm chart to run an agent on a single pod? Does it scale in any way? Basically - is it a simple agent (similar to on-premise agents, running in the background, but here on K8s), or is it a more advanced one that offers scaling features? What is it intended for, and how does it work?
The official documentation is very sparse...
Generally, really. I've struggled recently (and in the past), because the documentation seems:
- Very complete wrt the available SDK (though the formatting is sometimes off)
- Very lacking wrt how things interact with one another
A lot of what I need I actually find by digging into the source code.
I think ClearML would benefit a lot if it adopted a documentation structure similar to the numpy ecosystem (numpy, pandas, scipy, scikit-image, scikit-bio, scikit-learn, etc.)
SweetBadger76 TimelyPenguin76
We're finally tackling this (since it has kept us back at 1.3.2 even though 1.6.2 is out...), and noticed that now the bucket name is also part of the folder?
So following up from David's latest example:
```python
StorageManager.download_folder(remote_url='s3://****-bucket/david/', local_folder='./')
```
actually creates a new folder ./****-bucket/david/ and puts its contents there.
EDIT: This is with us using internal MinIO, so I believe ClearML parses that end...
Say I upload each of these yamls as a configuration object (as with the above). Once I try to load bar.yaml remotely it will crash, since foo.yaml is missing (and is instead a clearml configuration object).
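Concretely, something like this (a sketch; project/task names are made up):
```python
from pathlib import Path

from clearml import Task

task = Task.init(project_name="demo", task_name="yaml-configs")

# Each yaml is stored as its own configuration object on the task.
foo_path = task.connect_configuration(Path("foo.yaml"), name="foo")
bar_path = task.connect_configuration(Path("bar.yaml"), name="bar")

# Remotely, bar_path points at a temporary copy of the stored "bar"
# object; if bar.yaml references foo.yaml by relative path, loading it
# crashes because foo.yaml is not a file on disk there.
bar_text = Path(bar_path).read_text()
```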
Does that make sense?
UPDATE: Apparently the quotation type matters for furl? I switched the ' to " and it seems to work now.
This is with:
```python
Task.set_offline_mode(True)
task = Task.init(..., auto_connect_streams=False)
```
Maybe it's better to approach this the other way: if one uses Task.force_requirements_env_freeze(), then the locally updated packages aren't reflected in poetry 🤔
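For context, this is the pattern I mean (a sketch; project/task names are made up):
```python
from clearml import Task

# Freeze the currently installed packages (pip-freeze style) instead of
# letting the agent rebuild the env from the repository's poetry.lock;
# as far as I know this has to be called before Task.init.
Task.force_requirements_env_freeze(force=True)
task = Task.init(project_name="demo", task_name="frozen-env")
```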
Interesting, why won't it be possible? Quite easy to get the source code using e.g. dill.
I'll try it out, but I would not like to rewrite that code myself and maintain it, that's my point 🙂
Or are you suggesting I use Task.import_offline_session?
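If so, the round-trip I understand would be (a minimal sketch; project/task names and the zip path are illustrative):
```python
from clearml import Task

# On the machine without server access:
Task.set_offline_mode(True)
task = Task.init(project_name="demo", task_name="offline-run")
# ... run and log as usual ...
task.close()

# Later, on a machine that can reach the server:
Task.import_offline_session("./offline-run.zip")
```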
QuaintPelican38 did you have a workaround for this then? Some cleanup service or similar?
I guess I'll have to rerun the experiment without tags for this?
Hm, I'm not sure I follow 🤔 How does the API server config relate to the file server?
Yes it would be 🙂
Visualization is always a difficult topic... I'm not sure about that, but a callback would be nice.
One idea that comes to mind (this is of course limited to DataFrames): think of the git diff, where I imagine 3 independent sections:
- Removed columns (+ truncated preview of removed values) (see below)
- Added columns (+ truncated preview of added values)
The middle column is then a bit complicated, but I would see some kind of "shared columns" dataframe, where each ...
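To make the idea concrete, a rough sketch (function name and output structure are made up):
```python
import pandas as pd

def dataframe_diff(old: pd.DataFrame, new: pd.DataFrame, preview: int = 3):
    """Rough sketch of the 'git diff for DataFrames' idea above."""
    removed = old.columns.difference(new.columns)
    added = new.columns.difference(old.columns)
    shared = old.columns.intersection(new.columns)
    return {
        "removed": {col: old[col].head(preview).tolist() for col in removed},
        "added": {col: new[col].head(preview).tolist() for col in added},
        "shared": list(shared),  # the complicated middle section
    }
```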
The bucket is not a folder, it's just a container. Whether it's implemented as a folder in MinIO should be transparent, shouldn't it?
Since the "fix" in 1.4.0 onwards, we now have to download the folder, and then move all the downloaded files/folders to the correct level.
This now entails we also have to check which storage is used, so we can check if the downloaded folder will contain the bucket name or not, which seems very inconsistent?
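For reference, the workaround currently looks roughly like this (a sketch; the bucket/prefix mirror the masked example above):
```python
import shutil
from pathlib import Path

from clearml import StorageManager

local_root = Path("./")
StorageManager.download_folder(
    remote_url="s3://****-bucket/david/", local_folder=str(local_root))

# Since 1.4.0 the contents may land under <local>/<bucket>/<prefix>/
# instead of <local>/, so flatten the extra level when it exists.
nested = local_root / "****-bucket" / "david"
if nested.is_dir():
    for item in nested.iterdir():
        shutil.move(str(item), str(local_root / item.name))
```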
Yes, you're correct, I misread the exception.
Maybe it hasn't completed uploading? At least for Datasets one needs to explicitly wait IIRC
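e.g. the explicit sequence I have in mind for Datasets (a sketch; project/dataset names are made up):
```python
from clearml import Dataset

ds = Dataset.create(dataset_project="demo", dataset_name="my-data")
ds.add_files("data/")
ds.upload()    # I believe this blocks until the chunks are uploaded
ds.finalize()  # only mark the dataset complete afterwards
```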
Does it make sense to you to run several such glue instances, to manage multiple resource requirements?