My current workaround is to use poetry and tell users to delete poetry.lock if they want their environment copied verbatim
That's enabled; I was asking whether there are flags to add to the pip install CLI, such as --no-use-pep517
Not that I recall
I'm running tests with pytest; it consumes/owns the stream
That could work, given that:
Could we add a preview section? One reason I don't like using the configuration section is that it makes debugging much much harder. Will the clearml-agent download and unzip the files, placing them into the same local folder as needed for execution? What if we want to include non-configuration objects? (i.e. the model case I listed)
There's a specific fig[1].set_title(title) call.
I realized it might work too, but I'm looking for a more definitive answer 🙂 Has no one attempted this? 🤔
I commented on your suggestion to this on GH. Uploading the artifacts would happen via some SDK before switching to remote execution.
When cloning a task (via WebUI or SDK), a user should have the option to also clone these input artifacts or simply link to the originals. If linking, and the original task is later deleted - that's the user's mistake.
Alternatively, this potentially suggests "Input Datasets" (as we're imitating now), such that they are not tied to the original t...
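The flow described above (upload input artifacts via the SDK, then switch to remote execution) could be sketched roughly like this. This is a hedged sketch, not the definitive API flow; the project, queue, and artifact names are illustrative placeholders, and it assumes a configured ClearML setup:

```python
# Hedged sketch: upload input artifacts via the SDK before switching to
# remote execution. Names below are illustrative placeholders.
def run_with_input_artifacts(queue_name, artifact_paths):
    from clearml import Task  # imported lazily; requires a configured ClearML setup

    task = Task.init(project_name="examples", task_name="with-input-artifacts")
    for name, path in artifact_paths.items():
        # Uploading happens here, before the task leaves the local machine
        task.upload_artifact(name=name, artifact_object=path)
    # Everything after this call runs on an agent pulling from queue_name
    task.execute_remotely(queue_name=queue_name, exit_process=True)
```

Cloning could then either re-reference or re-upload those artifacts, depending on which linking semantics end up being chosen.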
In the Profile section, yes, they are well defined (bucket, secret, key, and endpoint)
That's up and running and is perfectly fine.
Yes 🙂 I want ClearML to load and parse the config before that. But now I'm not even sure those settings in the config are even exposed as environment variables?
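One quick way to see which settings actually surface as environment variables is to just inspect the current shell. This is generic Python; the CLEARML_ prefix is an assumption based on ClearML's documented variable naming, and whether any given clearml.conf setting has an env-var counterpart still needs verifying:

```python
import os

# Illustrative check: list whichever CLEARML_* variables are actually set.
def clearml_env_vars(environ=None):
    environ = os.environ if environ is None else environ
    return {k: v for k, v in environ.items() if k.startswith("CLEARML_")}

print(sorted(clearml_env_vars()))  # whatever is set in the current shell
```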
It's also sufficient to see that StorageManager.list("/data/clear") takes a really long time to return no results
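To put a number on that, a small generic timing helper (pure Python, not ClearML-specific) can wrap the call:

```python
import time

def timed(fn, *args, **kwargs):
    """Return (result, seconds) for a single call -- useful for confirming
    that e.g. StorageManager.list("/data/clear") is the slow part."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, time.perf_counter() - start

# e.g.: files, secs = timed(StorageManager.list, "/data/clear")
result, secs = timed(sorted, [3, 1, 2])
```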
Sorry, found it on my end!
Any simple ways around this for now? @<1523701070390366208:profile|CostlyOstrich36>
Yes, I've found that too (as mentioned, I'm familiar with the repository). My issue is still that there is no documentation as to what this actually offers.
Is this simply a helm chart to run an agent on a single pod? Does it scale in any way? Basically - is it a simple agent (similar to on-premise agents, running in the background, but here on K8s), or is it a more advanced one that offers scaling features? What is it intended for, and how does it work?
The official documentation is very spa...
If everything is managed with a git repo, does this also mean PRs will have a messy metadata file attached to them?
The key/secret is also shared internally so that sounds like a nice mitigation actually!
Which environment variable am I looking for? I couldn't spot anything specifically in that environment variables page
If I add the bucket to that (so CLEARML_FILES_HOST=s3://minio_ip:9000/minio/bucket), I then get the following error instead --
2021-12-21 22:14:55,518 - clearml.storage - ERROR - Failed uploading: SSL validation failed for ... [SSL: WRONG_VERSION_NUMBER] wrong version number (_ssl.c:1076)
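The WRONG_VERSION_NUMBER error typically means an HTTPS client is talking to a plain-HTTP endpoint. If the MinIO server is HTTP-only, a hedged clearml.conf sketch (host, bucket, and keys below are placeholders) would mark the endpoint as non-secure so the SDK skips the TLS handshake:

```
sdk {
  aws {
    s3 {
      credentials: [
        {
          # placeholders -- substitute your MinIO endpoint and keys
          host: "minio_ip:9000"
          bucket: "bucket"
          key: "ACCESS_KEY"
          secret: "SECRET_KEY"
          multipart: false
          secure: false  # plain-HTTP MinIO; avoids the SSL mismatch entirely
        }
      ]
    }
  }
}
```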
And yes, our flow would break anyway with the internal references within the yaml file. It would be much simpler if we could specify the additional files
It does, but I don't want to guess the json structure (what if ClearML changes it, or the folder structure it uses for offline execution?). If I do this, I'm planning a test that's reliant on ClearML's implementation of offline mode, which is tangential to the unit test
But to be fair, I've also tried with python3.X -m pip install poetry etc. I get the same error.
I couldn't find it directly in the SDK at least (in the APIClient)... 🤔
Something like this, SuccessfulKoala55 ?
1. Open a bash session on the docker container: docker exec -it <docker id> /bin/bash
2. Open a mongo shell: mongo
3. Switch to the backend db: use backend
4. Get the relevant project IDs: db.project.find({"name": "ClearML Examples"}) and db.project.find({"name": "ClearML - Nvidia Framework Examples/Clara"})
5. Remove the relevant tasks: db.task.remove({"project": "<project_id>"})
6. Remove the project IDs: db.project.remove({"name": ...})
But it does work on Linux 🤔 I'm using it right now and the environment variables are not defined in the terminal, only in the .env 🤔
Hm, I did not specify any specific versions previously. What was the previous default?
AgitatedDove14
Hmmm... they are important, but only when starting the process. Any specific suggestion?
(and they are deleted after the Task is done, so they are temp)
Ah, then no, sounds temporary. If they're only relevant when starting the process, though, I would suggest deleting them immediately once they're no longer needed, rather than waiting for the end of the task (if possible, of course)
Maybe they shouldn't be placed under /tmp if they're mission critical, but rather in the clearml cache folder? 🤔
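The delete-when-done suggestion can be expressed with a context manager, so files needed only at startup disappear as soon as startup finishes rather than at the end of the task. This is a generic Python sketch, not ClearML's actual mechanism; bootstrap.cfg is a hypothetical name:

```python
import tempfile
from pathlib import Path

def start_process():
    # Files needed only during startup live inside this context and are
    # removed the moment the block exits -- no waiting for task teardown.
    with tempfile.TemporaryDirectory(prefix="startup-") as tmp:
        cfg = Path(tmp) / "bootstrap.cfg"  # hypothetical startup-only file
        cfg.write_text("bootstrap=1")
        # ... use cfg while bringing the process up ...
    return cfg  # the path object survives, the file does not

leftover = start_process()
print(leftover.exists())  # -> False
```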
- in the second scenario, I might not have changed the results of the step, but my refactoring changed the speed considerably, and this is something I measure.
- in the third scenario, I might not have changed the results of the step and my refactoring just cleaned the code; nothing substantial changed, so I do not want a rerun.
Well, I would say then that in the second scenario it's just rerunning the pipeline, and in the third it's not running it at all 🙂
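One hedged way to encode that distinction: make the step's cache key depend on its inputs plus an explicit performance-version tag, so a speed-relevant refactor (scenario 2) bumps the tag and forces a rerun, while a pure cleanup (scenario 3) changes nothing. This is an illustrative sketch, not ClearML's actual caching logic:

```python
import hashlib
import json

def step_cache_key(inputs, perf_version=1):
    """Hash the step's inputs plus an explicit version bump. A refactor
    that changes speed bumps perf_version and forces a rerun; a pure
    cleanup leaves the key -- and the cache -- intact."""
    payload = json.dumps({"inputs": inputs, "perf_version": perf_version},
                         sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

key_a = step_cache_key({"x": 1})
key_b = step_cache_key({"x": 1})                  # cleanup only: same key
key_c = step_cache_key({"x": 1}, perf_version=2)  # speed-relevant refactor
```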
(I ...
Is it currently broken? 🤔