I can try, but it will then damage the download speeds. Anyhow, not reasonable behavior in my opinion.
which configuration are you passing? are you using any framework for configuration?
no, I tried either with very small files or with 20GB as the parent
What I'd like is to do Dataset.get('b', to='a') and have the download land the files directly there
AgitatedDove14
What was important for me was that the user can define the entire workflow and that I can see its status as one "pipeline" in the UI (vs. disparate tasks).
- perform query
- process records into a labeling assignment
- call labeling system API
- wait for an external hook when labels are ready
- clean the labels
- upload them to a dataset
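The steps above could be sketched as plain functions chained by a driver. Everything here is a hypothetical stand-in (fake records, a fake job handle, a fake hook result), not the real labeling system or ClearML API:

```python
# Minimal sketch of the workflow as chained steps; all functions and data
# below are hypothetical stubs standing in for the real systems.

def perform_query():
    return [{"id": 1}, {"id": 2}]          # records to be labeled

def create_labeling_assignment(records):
    return {"assignment": [r["id"] for r in records]}

def call_labeling_api(assignment):
    return "job-123"                        # fake job handle

def wait_for_external_hook(job_id):
    return {"1": " Cat", "2": "dog "}      # labels arriving via the hook

def clean_labels(labels):
    return {k: v.strip().lower() for k, v in labels.items()}

def upload_to_dataset(labels):
    return f"uploaded {len(labels)} labels"

def run_pipeline():
    records = perform_query()
    assignment = create_labeling_assignment(records)
    job_id = call_labeling_api(assignment)
    labels = wait_for_external_hook(job_id)
    labels = clean_labels(labels)
    return upload_to_dataset(labels)
```

In a real pipeline each function would become its own step/task so the UI shows the whole flow, with `wait_for_external_hook` blocking until the labeling system calls back.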
Do you know which specific API I need to call to signal "resume" after "abort"?
not "reset", I presume?
I think it works.
small correction - use a slash and not a dot in `configuration/OmegaConf`: `parameter_override={'configuration/OmegaConf': dict(...)}`
AgitatedDove14 looks like service-writing-time for me!
PS can you point me to some official example/doc for how to persist/restore state so that tasks are restartable?
@ https://app.slack.com/team/UT8T0V3NE does the non-free version support preempting lower-priority tasks so that a higher-priority task can come in?
SmugHippopotamus96 how did this setup work for you? are you using an autoscaling node group for the jobs?
with or without GPU?
Any additional tips on usage?
yeah, it's a tradeoff that depends on parameters that lie outside the realm of human comprehension.
Let's call it voodoo.
Yes, the manual selection can be done via tagging a model.
The main thing is that I want the selection to be part of the overall flow.
I want the task of a human tagging a model to be "just another step in the pipeline"
could work! is there a way to visualize the pipeline such that this step is "stuck" in executing?
AgitatedDove14 thanks, it was late and I wasn't sure if I needed to use one of the ClearML "certified" AMIs or just a vanilla one.
CostlyOstrich36 I've tried the pipeline_from_decorator.py example and it works.
Could it be a sensitivity to some components being in a different Python .py file relative to the controller itself?
I suppose so, yes; and I want this task to be labeled such that it's clear it's the "production" task.
CostlyOstrich36 not that I am aware of deleting etc.
I didn't set up the env though…
the above only passes the overrides if I am not mistaken
AgitatedDove14 I haven't done a full design for this 🙂
Just referring to how DVC claims it can detect and invalidate changes in large remote files.
So I take it there is no such feature in http://clear.ml 🙁
if the state is:

Dataset A:
```
a
a/.DS_Store
a/1.txt
a/b
a/b/.DS_Store
a/b/1.txt
a/b/c
a/b/c/1.txt
```
Dataset B:
```
b
b/2.txt
b/c
b/c/2.txt
```
Then the command `mv b a/` returns an error since `a/` is not empty.
That's exactly the issue…
As a result, I need to do something which copies the files (e.g. `cp -r` or `StorageManager.upload_folder('b', 'a')`)
but this is expensive
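One way to avoid copying when both trees live on the same filesystem is to merge with per-file moves instead of a single `mv b a/`. A minimal sketch; `merge_move` is a made-up helper, not a ClearML API:

```python
import os
import shutil

def merge_move(src, dst):
    """Move the contents of src into dst, merging directory trees
    instead of failing like `mv src dst/` does when dst already
    contains a directory of the same name."""
    for root, _dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target = dst if rel == "." else os.path.join(dst, rel)
        os.makedirs(target, exist_ok=True)
        for name in files:
            # on the same filesystem shutil.move is a cheap rename,
            # not a byte-for-byte copy
            shutil.move(os.path.join(root, name), os.path.join(target, name))
    shutil.rmtree(src)  # remove the now-empty source tree
```

Across filesystems `shutil.move` falls back to copy-and-delete, so this only saves the copy cost when source and destination share a filesystem.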
As far as I know, storage can be accessed directly: https://clear.ml/docs/latest/docs/integrations/storage/#direct-access
typical EBS is limited to being mounted to 1 machine at a time.
so in this sense, it won't be too easy to create a solution where multiple machines consume datasets from this storage type
PS https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ebs-volumes-multi.html is possible under some limitations
python 3.8
I've worked around the issue by doing: `sys.modules['model'] = local_model_package`
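For anyone hitting the same thing, the workaround relies on standard `sys.modules` aliasing: register a local package under the name the deserialized code expects before it is imported. A self-contained sketch with a hypothetical stand-in module:

```python
import sys
import types

# Hypothetical stand-in for the locally available package that the
# remote/deserialized code expects to find under the name 'model'
local_model_package = types.ModuleType('model')
local_model_package.VERSION = '0.1'

# Registering it in sys.modules makes any later `import model`
# resolve to this object instead of searching sys.path
sys.modules['model'] = local_model_package

import model  # resolves via sys.modules, no file lookup happens
```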
Yes, but this is not the use-case.
The use-case is that I have a local folder and I want to merge a dataset into it without re-fetching the local folder…
Tried with 1.6.0, doesn't work
```
# this is the parent
clearml-data create --project xxx --name yyy --output-uri
clearml-data add folder1
clearml-data close

# this is the child, where XYZ is the parent's id
clearml-data create --project xxx --name yyy1 --parents XYZ --output-uri
clearml-data add folder2
clearml-data close

# now I get the error above
```
I mean, if it's not tracked, I think it would be a good feature!