Hello, is there a config option to force all steps in a Pipeline to run in the same Docker container?
Solved my issue by adding this before Task.init():
Task.force_requirements_env_freeze(requirements_file='./requirements.txt')
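In case the ordering isn't obvious: force_requirements_env_freeze() is a class-level switch that Task.init() reads when it creates the task, so it has to come first. A minimal sketch of the pattern (the project/task names are placeholders, and the helper name is mine, not a ClearML API):

```python
# Hedged sketch: the freeze call is a class-level switch, so it must run
# *before* Task.init() creates the task. Names below are placeholders.
def make_pinned_task(Task, requirements_file='./requirements.txt'):
    # 1) tell the agent to install from the pinned file, not the local env
    Task.force_requirements_env_freeze(requirements_file=requirements_file)
    # 2) only then create the task, which records that policy
    return Task.init(project_name='demo', task_name='pinned-env')
```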
I have a question regarding docker mode here: I want to run my task remotely in a Docker container, but when I execute the script containing task.execute_remotely(), it analyzes the packages in the current environment, which are different from the ones the task needs.
I also tried task.set_packages('./requirements.txt'),
but it didn't work:
there was an incompatibility issue because the agent was trying to set up a version of numpy that isn't supported.
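One thing that has helped me with resolver issues like this is pinning exact versions in the requirements file, so the agent can't pick an unsupported release. The package versions below are only placeholders:

```
# requirements.txt — pin exact versions so the agent installs these
numpy==1.23.5
pandas==1.5.3
```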
I am defining a pipeline step as follows:
pipe.add_function_step(
    name='Split Data',
    function=step_one,
    cache_executed_step=True,
)
From the docs I can only specify the Docker image using the docker kwarg.
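If the per-step docker kwarg is the only control, one workaround for running every step in the same image is simply to pass the same value to every add_function_step() call, e.g. through a small helper. This is a sketch, not a ClearML API: the helper name and image are my own placeholders, and `pipe` is an already-constructed PipelineController.

```python
# Hedged sketch: force one image across all steps by repeating the same
# `docker` kwarg on every add_function_step() call. The image name is a
# placeholder; `pipe` is an already-constructed PipelineController.
PIPELINE_IMAGE = 'python:3.10-slim'

def add_step(pipe, name, function, **kwargs):
    # funnel every step through one helper so the image can't drift
    kwargs.setdefault('docker', PIPELINE_IMAGE)
    pipe.add_function_step(name=name, function=function, **kwargs)
```

add_step(pipe, 'Split Data', step_one, cache_executed_step=True) would then behave like the snippet above, but pinned to PIPELINE_IMAGE.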