
Now, in another task I want to fetch the uploaded artifact using get_local_copy():
dataset_upload_task = Task.get_task(task_id=args['dataset_task_id'])
local_json = dataset_upload_task.artifacts['dataset'].get_local_copy()
But the path I got back contains both '\' and '/' (i.e. a mix of Unix and Windows formats)
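One way to work around the mixed separators is to normalize the path after download; a minimal sketch in Python, assuming local_json is the path returned by get_local_copy() above:

    import os

    # Replace any Windows-style separators, then normalize the result
    # so the path uses the current OS's convention throughout
    local_json = local_json.replace('\\', os.sep)
    local_json = os.path.normpath(local_json)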
CostlyOstrich36
The pipeline demo is still stuck on running,
The first step is still pending, and belongs to the services queue
SuccessfulKoala55
Hi,
Using task.upload_artifact
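For reference, a minimal sketch of that call (the project/task names and the file path are assumptions; the artifact name 'dataset' matches the retrieval code above):

    from clearml import Task

    task = Task.init(project_name='my_project', task_name='upload_task')  # assumed names
    # Upload a local JSON file as an artifact named 'dataset'
    task.upload_artifact(name='dataset', artifact_object='dataset.json')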
No, it's stuck here:
Collecting botocore<1.23.0,>=1.22.9
Using cached botocore-1.22.12-py3-none-any.whl (8.1 MB)
SuccessfulKoala55
The pipeline demo is still stuck on running,
The first step is still pending, and belongs to the services queue
CostlyOstrich36 thanks.
Maybe I should pass something in extra_docker_arguments in the config file?
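If it helps, the agent section of clearml.conf has an extra_docker_arguments setting; a sketch with placeholder values (which arguments to actually pass is an assumption):

    agent {
        # extra command-line arguments appended to 'docker run'
        extra_docker_arguments: ["--env", "MY_VAR=<value>"]
    }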
Hi SweetBadger76
It worked great yesterday and also a week ago..
I'm trying to figure out what happened, because I didn't make any changes to the code
Haa,
How can I change it?
The pipeline is on services
and the first task is on default
SweetBadger76
Is there anything I can do to help you check? (task ID, project name, etc.)
In the init I passed output_uri=Folder
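For context, a minimal sketch of passing output_uri to Task.init (the folder path and names are placeholders, not the actual values used):

    from clearml import Task

    task = Task.init(
        project_name='my_project',     # assumed name
        task_name='my_task',           # assumed name
        output_uri='/path/to/folder',  # artifacts and models are uploaded here
    )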
Do I need to change it to pipe.set_default_execution_queue('services')?
Or leave it at the default?
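A common arrangement is to run the controller on the services queue and the steps on default; a minimal sketch (the pipeline/project names are assumptions):

    from clearml import PipelineController

    pipe = PipelineController(name='pipeline demo', project='examples', version='0.1')
    pipe.set_default_execution_queue('default')  # steps run on the default queue
    # ... add steps with pipe.add_step(...) or pipe.add_function_step(...) ...
    pipe.start(queue='services')  # the controller itself runs on services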
I found the problem I think, thanks!
The sdk.aws.s3.credentials.0.host and sdk.aws.s3.credentials.0.key, yes
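For reference, that corresponds to an entry like this in clearml.conf (host/key/secret are placeholders; the extra flags are assumptions for a non-AWS, S3-compatible endpoint):

    sdk {
        aws {
            s3 {
                credentials: [
                    {
                        host: "my-storage-host:9000"  # non-AWS S3-compatible endpoint
                        key: "ACCESS_KEY"
                        secret: "SECRET_KEY"
                        multipart: false
                        secure: false
                    }
                ]
            }
        }
    }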
Maybe I should pass something in extra_docker_arguments?
CostlyOstrich36
Hi CostlyOstrich36 CrookedWalrus33 AgitatedDove14
When I init an agent and run a task it works, but when I run the same task again it does not map the keys..