I believe this also impacts Python 3.8 + Windows, btw
I believe pipeline_from_tasks.py
used to be called pipeline_controller.py
Windows 10, Python 3.9, using conda, but even pip list should output something
Basically, the file URI might be different on a different machine (out of my control), but they point to the same artifact storage location
ah that is perfect, thank you
I think I'll just switch to docker and mount
No just a dictionary from the pipeline examples:
https://github.com/allegroai/clearml/blob/b010f775bdd72ba6729f5e1e569626692d7b18af/examples/pipeline/step3_train_model.py#L13-L17
I am using Task.force_requirements_env_freeze()
(before Task.init) with no requirements file specified (because I do not have one), and my experiments are showing this under packages:
Yes I just did pip list
in the env and it worked fine
Okay, definitely feel better knowing I am not crazy 👍
I'm trying to accomplish that, but instead of a pip freeze I just get "Packages will be installed from the project's requirements file", even though there is no project requirements file. Do I need to do something with development.detect_with_pip_freeze?
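For reference, a sketch of what that setting looks like in clearml.conf (assuming the standard `sdk.development` section; the surrounding keys are illustrative):

```
# clearml.conf (sketch)
sdk {
  development {
    # Force a full pip-freeze of the active environment instead of
    # requirements-file / import-scan based package detection.
    detect_with_pip_freeze: true
  }
}
```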
Okay, thanks for looking into it
Dang that's not good news for me
I'll be your n00b w1nd0ws beta tester lol
```
from clearml import Task

args = {'hello': 'world'}

Task.force_requirements_env_freeze()
task = Task.init(project_name='example', task_name="Test")
task.connect(args)
task.execute_remotely()
```
You can see installed packages with that script?
I do see that once a task fails the installed package list is updated with a full list of the environment... I don't know if that helps at all
Could it be because some of the packages then point to files?

```
pyarrow==5.0.0
pandas @ file:///D:/bld/pandas_1624391191530/work
```
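Those `@ file:///...` entries come from conda-built packages and are only valid on the machine that produced them, so a freeze containing them won't install elsewhere. A minimal sketch of filtering them out of a pip-freeze style listing (the helper name and sample entries are mine, not from ClearML):

```python
# Sketch: drop local-build direct references (e.g. "pandas @ file:///...")
# from a pip-freeze style listing so the remaining requirements are portable.
# Sample entries are illustrative.
freeze_lines = [
    "pyarrow==5.0.0",
    "pandas @ file:///D:/bld/pandas_1624391191530/work",
]

def portable_requirements(lines):
    """Keep only requirement lines that do not point at local file paths."""
    return [line for line in lines if "@ file://" not in line]

print(portable_requirements(freeze_lines))  # ['pyarrow==5.0.0']
```

You'd still need to pin those filtered packages by version some other way (e.g. from `pip list`), since the file reference carries no usable version for another machine.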