@<1523701435869433856:profile|SmugDolphin23> then the issue is that config is not set. I also tried with:
import yaml
import argparse
from my_pipeline.pipeline import run_pipeline
from clearml import Task
parser = argparse.ArgumentParser()
parser.add_argument('--config', type=str, required=True)
if __name__ == '__main__':
    if Task.running_locally():
        args = parser.parse_args()
        with open(args.config) as f:
            config = yaml.load(f, yaml.FullLoader)
    else:
        config = None
    run_pipeline(config)
But then it prints None, so the pipeline parameters are completely ignored.
Hi @<1523701435869433856:profile|SmugDolphin23> , I just tried it but Task.current_task() returns None even when running remotely.
Basically, I think that the pipeline run starts from __main__ and not from the pipeline function, which causes the file to be read.
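For illustration, this is the kind of check I mean (just a minimal sketch, nothing project-specific):
from clearml import Task
# At the top of my_pipeline/__main__.py: this prints None locally,
# and in my case it also prints None in the remote run.
print('current task at entry:', Task.current_task())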
Hi @<1570220858075516928:profile|SlipperySheep79> ! What happens if you do this:
import yaml
import argparse
from my_pipeline.pipeline import run_pipeline
from clearml import Task
parser = argparse.ArgumentParser()
parser.add_argument('--config', type=str, required=True)
if __name__ == '__main__':
    if not Task.current_task():
        args = parser.parse_args()
        with open(args.config) as f:
            config = yaml.load(f, yaml.FullLoader)
    run_pipeline(config)
Also: what's the purpose of storing the pipeline arguments as artifacts then? When it runs remotely it still runs the main script as the entry point, and not the pipeline function directly, so all the arguments will be replaced by whatever is passed to the function during the remote execution, right?
Hi @<1523701087100473344:profile|SuccessfulKoala55> , I think the issue is where to put the connect_configuration call. I can't put it inside run_pipeline because that only runs remotely and doesn't have access to the file, and I can't put it in the script before the call to run_pipeline since the task has not been initialized yet.
@<1570220858075516928:profile|SlipperySheep79> depending on a local file is always an issue - I would try to connect a configuration based on this file, so that it will be loaded when running locally and then retrieved from the backend when running remotely
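Something along these lines (a minimal sketch; the project/task names and the configuration name are placeholders, and it assumes a Task object is available at that point):
import yaml
from clearml import Task

task = Task.init(project_name='Main', task_name='entry')  # placeholder names
# Running locally: uploads the local file to the backend.
# Running remotely: returns a path to a copy fetched from the backend.
config_path = task.connect_configuration('config.yaml', name='pipeline_config')
with open(config_path) as f:
    config = yaml.load(f, yaml.FullLoader)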
I've uploaded an example here for simplicity: None
For instance, I have in my_pipeline/__main__.py:
import yaml
import argparse
from my_pipeline.pipeline import run_pipeline
parser = argparse.ArgumentParser()
parser.add_argument('--config', type=str, required=True)
if __name__ == '__main__':
    args = parser.parse_args()
    with open(args.config) as f:
        config = yaml.load(f, yaml.FullLoader)
    run_pipeline(config)
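which I run with something like (the config path is just an example):
python -m my_pipeline --config config.yaml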
and in my_pipeline/pipeline.py:
from typing import Dict
from clearml import PipelineDecorator

@PipelineDecorator.pipeline(
    name='Main',
    project=None,
    default_queue='default',
    pipeline_execution_queue='default',
    start_controller_locally=False,
    repo='',
    add_run_number=False)
def run_pipeline(config: Dict):
    print(config)
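Steps would be added with the component decorator, e.g. (purely illustrative, not part of this repro):
@PipelineDecorator.component(return_values=['config'])
def print_config(config: Dict):
    # Runs as its own task on the default queue
    print(config)
    return config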
I'm running this on an agent in docker mode
Hi @<1570220858075516928:profile|SlipperySheep79> , I think it depends on your code. Can you provide a self-contained code snippet that reproduces this?