I already found the source code and modified it as needed.
How can I now get this info from the Task that is created when the Dataset is created?
Couldn't find anything like clearml.Dataset(id=id).get_size()
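For what it's worth, one way to get a total size is to sum the per-file sizes the dataset tracks. A minimal sketch, assuming the SDK exposes `Dataset.get` and a `file_entries_dict` mapping to entries with a `size` attribute (verify both against your installed clearml version):

```python
def total_size_bytes(entry_sizes) -> int:
    """Sum per-file sizes in bytes; treat unknown (None) sizes as 0."""
    return sum(size or 0 for size in entry_sizes)

# Hedged usage with the ClearML SDK -- the attribute names below are
# assumptions, check them against your clearml version:
# from clearml import Dataset
# ds = Dataset.get(dataset_id="<dataset-id>")
# print(total_size_bytes(e.size for e in ds.file_entries_dict.values()))
```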
What you want is to have a service script that cleans up archived tasks, here is what we used: None
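A rough sketch of what such a cleanup service can look like: select archived tasks, keep only those that have been idle past a threshold, and delete them. The staleness check is plain Python; the ClearML calls in the comments (`Task.get_tasks`, `task.delete`) are modeled on the public cleanup-service example and should be verified against your clearml version:

```python
from datetime import datetime, timedelta

def is_stale(last_update: datetime, now: datetime, threshold_days: int) -> bool:
    """True if a task has not been updated within the threshold window."""
    return (now - last_update) > timedelta(days=threshold_days)

# Hedged usage against the ClearML SDK (names assumed, not guaranteed):
# from clearml import Task
# for t in Task.get_tasks(task_filter={"system_tags": ["archived"]}):
#     if is_stale(t.data.last_update, datetime.utcnow(), threshold_days=30):
#         t.delete(delete_artifacts_and_models=True)
```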
WebApp: 1.16.0-494 • Server: 1.16.0-494 • API: 2.30
But be careful, upgrading is extremely dangerous
@<1523701601770934272:profile|GiganticMole91> That's rookie numbers. We are at 228 GB for Elastic now
I'm doing all of this because there isn't (or I'm not aware of) any good way to understand which datasets are on which workers
7 out of 30 GB is currently used and is quite stable
It has 8 cores, so nothing fancy even
We fixed the issue, thanks, had to update everything to latest.
Hey, I see that 1.14.2 dropped
I tried it, but the issue is still there; maybe the hotfix is in the next patch?
Here is the setup so you can reproduce it (we don't have a region field):
clearml.conf:
```
s3 {
    use_credentials_chain: false
    credentials: [
        {
            host: "s3.somehost.com"
            key: "XXXXXXXXXXXXXXXXXXXX"
            secret: "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX..."
        }
    ]
}
```
No, I specify where to upload
I see the data is being uploaded to the S3 bucket. Just the log messages are really confusing