Hi DizzyPelican17
> I’d like to configure the requirements file, docker image, and docker command for my pipeline controller, but it seems I cannot set it up. Am I missing something?
The decorator itself accepts those as arguments:
https://clear.ml/docs/latest/docs/references/sdk/automation_controller_pipelinecontroller#pipelinedecoratorcomponent
https://github.com/allegroai/clearml/blob/90f30e8d9a5ca9a1afa6b2e5ffccb96b0afe9c78/examples/pipeline/pipeline_from_decorator.py#L8
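For example, a minimal sketch along the lines of the example above, assuming a clearml version where the step decorator exposes `packages`, `docker`, and `docker_args` (the package list, image, and docker arguments below are just placeholders):
```python
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(
    return_values=["processed"],
    packages=["pandas>=1.3"],      # placeholder: per-step requirements
    docker="python:3.9-slim",      # placeholder: docker image to run the step in
    docker_args="--ipc=host",      # placeholder: extra docker command arguments
)
def preprocess(data_path):
    # imports go inside the component, since each step runs as a standalone task
    import pandas as pd
    return pd.read_csv(data_path)

@PipelineDecorator.pipeline(name="example pipeline", project="examples", version="0.1")
def run_pipeline(data_path="data.csv"):
    processed = preprocess(data_path)
    return processed

if __name__ == "__main__":
    PipelineDecorator.run_locally()  # for a quick local test; drop this to enqueue remotely
    run_pipeline()
```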
> I’d like to set up uploading pipeline artifacts / outputs of pipeline steps to a GCP bucket. By default they are uploaded to the file server, which seems suboptimal, but there seems to be no option to make a GCP bucket the default. Am I missing something?
Sure, you can configure the files_server so every artifact is uploaded to GCP instead of the default file server:
https://github.com/allegroai/clearml/blob/90f30e8d9a5ca9a1afa6b2e5ffccb96b0afe9c78/docs/clearml.conf#L10
Just put gs://bucket/folder there, and do not forget to configure your GCP credentials:
https://github.com/allegroai/clearml/blob/90f30e8d9a5ca9a1afa6b2e5ffccb96b0afe9c78/docs/clearml.conf#L126
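For example, the relevant parts of clearml.conf could look roughly like this (bucket name, project, and credentials path are placeholders, so adjust to your setup):
```
api {
    # upload artifacts / step outputs to GCS instead of the default file server
    files_server: "gs://my-bucket/clearml"
}

sdk {
    google.storage {
        # placeholder credentials used for gs:// buckets
        project: "my-gcp-project"
        credentials_json: "/path/to/credentials.json"
    }
}
```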