Wait, suddenly the UI changed to 0.16.1, seems like I was shown a cached page
Saving part from task A:
```python
pipeline = trials.trials[index]['result']['pipeline']
output_prefix = 'best_iter_' if i == 0 else 'iter_'
task.upload_artifact(name=output_prefix + str(index), artifact_object=pipeline)
```
Version 1.1.1
Snippet of which part exactly?
Yeah, the logs say "file not found", here is an example
Loading part from task B:
```python
def get_models_from_task(task: clearml.Task, model_artifact_substring: str = 'iter_') -> dict:
    """
    Extract all models saved as artifacts with the specified substring
    :param task: Task to fetch from
    :param model_artifact_substring: Substring for recognizing models among artifacts
    :return: Mapping between iter number and model instance
    """
    # Extract models from task (models are named iter-XXX where XXX is the iteration number)
    model_...
```
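Not the actual body (it's truncated above), but a sketch of how such a helper might map iteration numbers to artifacts; the `filter_models` name and the plain-dict artifact shape are my assumptions, not ClearML API:

```python
def filter_models(artifacts: dict, substring: str = 'iter_') -> dict:
    # Map iteration number -> artifact for names like 'iter_3' or 'best_iter_0',
    # skipping artifacts whose names don't contain the substring.
    out = {}
    for name, artifact in artifacts.items():
        if substring in name:
            iter_num = int(name.rsplit('_', 1)[-1])
            out[iter_num] = artifact
    return out
```

With ClearML you would feed it something like `task.artifacts` (a name-to-artifact mapping) and then call each artifact's getter to materialize the model.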
I am noticing that the files are saved locally, is there any chance that the files are over-written during the run or get deleted at some point and then replaced?
Yes, they are local - I don't think there is a possibility they are getting overwritten... But that depends on how ClearML names them. I showed you the code that saves the artifacts, but this code runs multiple times from a given template with different values - essentially it creates the same task ~10 times with different param...
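One defensive option against collisions when the same template spawns many tasks: fold a per-run identifier (e.g. the task ID) into the artifact name. A sketch; `make_artifact_name` is a hypothetical helper, not part of ClearML:

```python
def make_artifact_name(prefix: str, index: int, run_id: str) -> str:
    # Combine the iteration prefix with a per-run identifier so that
    # artifacts from different runs of the same template cannot collide.
    return f"{prefix}{index}_{run_id}"

# e.g. task.upload_artifact(name=make_artifact_name('iter_', 3, task.id),
#                           artifact_object=pipeline)
```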
is this already available or only on github?
is it possible to access the children tasks of the pipeline from the pipeline object?
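On the children question: one approach is to fetch candidate tasks and filter them by parent ID. A sketch; that each task object exposes a `parent` attribute, and how you fetch the list (e.g. via `Task.get_tasks`), are assumptions to verify against your installed clearml version:

```python
def children_of(tasks, pipeline_task_id: str) -> list:
    # Filter a list of task-like objects down to the children of a pipeline,
    # i.e. those whose parent ID matches the pipeline task's ID.
    return [t for t in tasks if getattr(t, 'parent', None) == pipeline_task_id]
```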
okay but still I want to take only a row of each artifact
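Assuming each artifact deserializes to a pandas DataFrame (the artifact names below are illustrative), taking one row per artifact and stacking them could look like:

```python
import pandas as pd

def first_rows(artifacts: dict) -> pd.DataFrame:
    # Take only the first row of each artifact's DataFrame and
    # concatenate them into a single summary DataFrame.
    rows = [df.iloc[[0]] for df in artifacts.values()]
    return pd.concat(rows, ignore_index=True)

# e.g. first_rows({name: a.get() for name, a in task.artifacts.items()})
```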
Yep what 😄
the ability to execute without an agent - I was just talking about this functionality the other day in the community channel
I don't know, I'm the one asking the question 😄
I also ran it without $(pwd) in the Create Clearml task templates section; I added it because of CostlyOstrich36's comments but it didn't help
actually I was thinking about models that weren't trained using ClearML, like pretrained models etc
`pgrep -af trains`
shows that there is nothing running with that name
Maybe the case is that after `start` / `start_locally` the reference to the pipeline task disappears somehow? O_O
Sorry I meant this link
https://azuremarketplace.microsoft.com/en-us/marketplace/apps/apps-4-rent.clearml-on-centos8
No I don't have trains anywhere in my code
Gotcha, didn't think of an external server, as Service Containers are part of GitHub's offering; I'll consider that