Or can I just set it to Task.current_task() ?
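Something like this, I mean (just a rough sketch of the idea, the paths/names are placeholders):

```python
from clearml import Task, OutputModel

# Reuse the task the component is already running under instead of creating a new one
task = Task.current_task()

output_model = OutputModel(task=task, framework='yolov5', name='VINZ Model')
output_model.update_weights(weights_filename='/path/to/best.onnx')
```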
This would be my only improvement, otherwise awesome!!!
output_model.update_weights(weights_filename=os.path.join(training_data_path, 'runs', 'train', 'yolov5s6_results', 'weights', 'best.onnx'))
FierceHamster54 are you sure you have write permissions ?
Ah no, I can't, since the pipeline lives in its own dummy project and you cannot reattach pipelines to real projects, so I must instantiate a dummy task just to attach the output model to the correct project
My bad, the specified file did not exist since I forgot to raise an exception if the export command failed >< Well I guess this is the reason, will test that on Monday
do I still need to specify an OutputModel ?
No need, only if you want to upload a local model file (but I assume in this case, no new model is created)
BTW: what happens if you pass the same s3://bucket to Task.init output_uri ? I assume you are getting the same access issue ?
I have to specify the full URI path ?
No, it should be something like "s3://bucket"
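For example (a sketch; the bucket name is a placeholder):

```python
from clearml import Task

# Anything the task registers (artifacts / output models) will be uploaded under this bucket
task = Task.init(
    project_name='VINZ',
    task_name='VINZ Retraining',
    output_uri='s3://bucket',
)
```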
the model files management is not fully managed like for the datasets ?
They are 🙂
Oh, that's nice, if I import a model using InputModel do I still need to specify an OutputModel ?
FierceHamster54 I saw you saying the YOLOv5 project and name are hardcoded in there. Fixed that for ya 😉 https://github.com/ultralytics/yolov5/pull/10100
Hi FierceHamster54
Do I need to instantiate a task inside my component ? Seems a bit redundant....
Yes, so the idea is that the Task (along with the code) will be automatically linked with the output model, for better traceability.
That said, you can "import" a model into the system (i.e. it was created somewhere else and you want to register it) with InputModel.import_model
https://clear.ml/docs/latest/docs/clearml_sdk/model_sdk#importing-models
I guess "Input" from that perspective is the code is Using it so it is considered the code's input model rather then the code output, aka it created it
Oh okay, my initial implementation was not far off:
```python
import os
from datetime import datetime

from clearml import Task, OutputModel

task = Task.init(project_name='VINZ', task_name=f'VINZ Retraining {datetime.now().strftime("%Y-%m-%d %H:%M:%S")}')
task.set_progress(0)

print("Training model...")
os.system(train_cmd)
print("✔️ Model trained!")
task.set_progress(75)

print("Converting model to ONNX...")
os.system(f"python export.py --weights {os.path.join(training_data_path, 'runs', 'train', 'yolov5s6_results', 'weights', 'best.pt')} --img 1280 --include onnx")
print("✔️ Model converted to ONNX!")
task.set_progress(100)

print("Exporting model to ClearML...")
output_model = OutputModel(
    task=task,
    framework='yolov5',
    name='VINZ Model'
)
output_model.set_upload_destination('...')
output_model.update_weights(os.path.join(training_data_path, 'runs', 'train', 'yolov5s6_results', 'weights', 'best.onnx'))
output_model.tags = [f'train-dataset-{continuous_learning_dataset_id[:8]}']
```
Do I need to instantiate a task inside my component ? Seems a bit redundant....
while I'm looking to upload local weights
Oh, so this is not "importing an uploaded (existing) model" but manually creating a Model.
The easiest way to do that is actually to create a Task for the model uploading, because the model itself will be uploaded to a unique destination path, and this is built on top of the Task.
Does that make sense ?
Well, the credentials are scoped to the entire bucket, but do I have to specify the full URI path ? Is the model files management not fully managed like for the datasets ?
Could it be it checks the root target folder and you do not have permissions there only on subfolders?
Okay! Though I only see a param to specify a weights URL while I'm looking to upload local weights
while I want to upload converted .onnx weights with custom tags to my custom project
Oh I see, sure, see this one?
https://github.com/allegroai/clearml/blob/master/examples/reporting/model_reporting.py
Or:
output_model.update_weights(weights_filename="/path/to/file.onnx")
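So for your case it would look roughly like this (a sketch; project name, tags, destination and paths are placeholders):

```python
from clearml import Task, OutputModel

task = Task.init(project_name='VINZ', task_name='register converted model')

output_model = OutputModel(
    task=task,
    framework='yolov5',
    name='VINZ Model',
    tags=['train-dataset-12345678'],
)
# Where the weights file itself will be uploaded (otherwise the default files server is used)
output_model.set_upload_destination('s3://bucket/models')
# Upload the local .onnx file and register it as the task's output model
output_model.update_weights(weights_filename='/path/to/best.onnx')
```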
Is there an example of this somewhere ? Cause I'm training a YOLOv5 model which already has ClearML integration built-in, but it seems to be hardcoded to attach its task to a Yolov5 project and upload the .pt file as an artifact, while I want to upload converted .onnx weights with custom tags to my custom project
AgitatedDove14 Got that invalid region error on the set_upload_destination()
while the region (aws-global) I specified in my agent config worked fine to retrieve a dataset from the same bucket:

2022-11-04 15:05:40,784 - clearml.storage - ERROR - Failed testing access to bucket XXXXX: incorrect region specified for bucket XXXX (detected region eu-central-1)
Traceback (most recent call last):
  File "/root/.clearml/venvs-builds/3.8/lib/python3.8/site-packages/clearml/model.py", line 1396, in set_upload_destination
    uri = storage.verify_upload(folder_uri=uri)
  File "/root/.clearml/venvs-builds/3.8/lib/python3.8/site-packages/clearml/storage/helper.py", line 640, in verify_upload
    _Boto3Driver._test_bucket_config(
  File "/root/.clearml/venvs-builds/3.8/lib/python3.8/site-packages/clearml/storage/helper.py", line 1698, in _test_bucket_config
    raise StorageError(msg)
clearml.storage.helper.StorageError: Failed testing access to bucket ml-clearml.xhr.fr: incorrect region specified for bucket ml-clearml.xhr.fr (detected region eu-central-1)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/root/.clearml/venvs-builds/3.8/task_repository/yolov5.git/train_model.py", line 64, in <module>
    results = train_model(**kwargs)
  File "/root/.clearml/venvs-builds/3.8/task_repository/yolov5.git/train_model.py", line 36, in train_model
    output_model.set_upload_destination('
')
  File "/root/.clearml/venvs-builds/3.8/lib/python3.8/site-packages/clearml/model.py", line 1398, in set_upload_destination
    raise ValueError("Could not set destination uri to: %s [Check write permissions]" % uri)
ValueError: Could not set destination uri to:
[Check write permissions]
Well I uploaded datasets in the previous steps with the same credentials
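One way to isolate it: try the same destination with StorageManager from the agent's environment, it should go through the same storage helper, so a region/permission problem would surface there as well (a sketch; bucket and paths are placeholders):

```python
from clearml import StorageManager

# Attempt a test upload to the same prefix used for the model weights;
# a region mismatch or missing write permission should raise the same storage error here
StorageManager.upload_file(
    local_file='/tmp/probe.txt',
    remote_url='s3://bucket/models/probe.txt',
)
```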