SuccessfulKoala55 That at least did not work, unless one has to specify wildcard patterns perhaps...?
I think so, it was just missing from the official documentation 🙂 Thanks!
Then that did not work, but I'll look into it again soon!
Oh and clearml-agent==1.1.2
My current workaround is to use poetry and tell users to delete poetry.lock if they want their environment copied verbatim
That's enabled; I was asking whether there are flags to add to the pip install CLI, such as --no-use-pep517
Not that I recall
I'm running tests with pytest; it consumes/owns the stream
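For context, a minimal sketch of what "owns the stream" means here (the test below is hypothetical): pytest captures stdout/stderr by default, so printed output only surfaces through its capture fixtures, or stays uncaptured only when running pytest -s.

    # Hypothetical test: pytest captures stdout/stderr by default,
    # so printed output is only reachable via the capsys fixture
    # (or left uncaptured when running `pytest -s`).
    def test_console_output(capsys):
        print("hello from inside the test")
        captured = capsys.readouterr()
        assert "hello" in captured.out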
That could work, given that:
Could we add a preview section? One reason I don't like using the configuration section is that it makes debugging much, much harder. Will the clearml-agent download and unzip the files, placing them into the same local folder as needed for execution? What if we want to include non-configuration objects (e.g. the model case I listed)?
There's a specific fig[1].set_title(title) call.
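For reference, a minimal sketch of that kind of call (variable names and the title are illustrative, not the actual code; in my code the indexable object happens to be called fig):

    import matplotlib.pyplot as plt

    # Hypothetical reconstruction: a row of Axes where only the second one gets a title.
    fig, axes = plt.subplots(1, 2)
    axes[1].set_title("my title")  # the set_title(title) call mentioned above
    plt.show()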
I realized it might work too, but looking for a more definitive answer 😄 Has no-one attempted this? 🤔
I commented on your suggestion about this on GH. Uploading the artifacts would happen via some SDK call before switching to remote execution.
When cloning a task (via the WebUI or SDK), a user should have the option to also clone these input artifacts or simply link to the original. If linking to the original and the original task is later deleted, that's the user's mistake.
Alternatively, this potentially suggests "Input Datasets" (as we're imitating now), such that they are not tied to the original t...
In the Profile section, yes, they are well defined (bucket, secret, key, and endpoint)
i.e.
ERROR Fetching experiments failed. Reason: Backend timeout (600s)
ERROR Fetching experiments failed. Reason: Invalid project ID
That's up and running and is perfectly fine.
Yes 😅 I want ClearML to load and parse the config before that. But now I'm not sure those settings in the config are even exposed as environment variables?
It's also sufficient to see that StorageManager.list("/data/clear") takes a really long time to return no results
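For reference, a minimal way to reproduce that observation (same placeholder path as above):

    from clearml import StorageManager

    # Listing a prefix that ends up empty; the call itself is what takes so long.
    results = StorageManager.list("/data/clear")
    print(results)  # eventually comes back with no results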
Sorry, found it on my end!
Any simple ways around this for now? @<1523701070390366208:profile|CostlyOstrich36>
Yes, I’ve found that too (as mentioned, I’m familiar with the repository). My issue is still that there is no documentation as to what this actually offers.
Is this simply a helm chart to run an agent on a single pod? Does it scale in any way? Basically, is it a simple agent (similar to on-premise agents, running in the background, but here on K8s), or is it a more advanced one that offers scaling features? What is it intended for, and how does it work?
The official documentation is very spa...
If everything is managed with a git repo, does this also mean PRs will have a messy metadata file attached to them?
The key/secret is also shared internally so that sounds like a nice mitigation actually!
Which environment variable am I looking for? I couldn't spot anything specifically in that environment variables page
If I add the bucket to that (so CLEARML_FILES_HOST=s3://minio_ip:9000/minio/bucket), I then get the following error instead:
2021-12-21 22:14:55,518 - clearml.storage - ERROR - Failed uploading: SSL validation failed for ... [SSL: WRONG_VERSION_NUMBER] wrong version number (_ssl.c:1076)
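For anyone hitting the same thing: WRONG_VERSION_NUMBER is the usual symptom of the client speaking HTTPS to an HTTP-only endpoint. A hedged sketch of the per-bucket MinIO credentials in clearml.conf (host, bucket, key, and secret are placeholders; secure: false keeps the connection plain HTTP):

    sdk {
        aws {
            s3 {
                credentials: [
                    {
                        # placeholder values for a local MinIO endpoint
                        host: "minio_ip:9000"
                        bucket: "bucket"
                        key: "minio_access_key"
                        secret: "minio_secret_key"
                        multipart: false
                        secure: false
                    }
                ]
            }
        }
    }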
And yes, our flow would break anyway with the internal references within the yaml file. It would be much simpler if we could specify the additional files
It does, but I don't want to guess the JSON structure (what if ClearML changes it, or the folder structure it uses for offline execution?). If I do this, I'm writing a test that relies on ClearML's implementation of offline mode, which is tangential to the unit test
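For context, a rough sketch of the offline flow this refers to (assuming the current ClearML API; the point is that asserting on anything beyond these calls means depending on ClearML's on-disk session layout):

    from clearml import Task

    # Record everything locally instead of talking to a server.
    Task.set_offline(offline_mode=True)
    task = Task.init(project_name="tests", task_name="unit-test")
    # ... run the code under test ...
    task.close()

    # The offline session can later be re-imported with Task.import_offline_session(...),
    # but asserting on the raw JSON files inside it ties the test to ClearML internals.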
But to be fair, I've also tried with python3.X -m pip install poetry etc. I get the same error.
I couldn't find it directly in the SDK at least (in the APIClient)... 🤔