@<1523701205467926528:profile|AgitatedDove14> I could test it, but I just recently fixed this issue by caching the previous step that this artifact comes from. Now I'm getting the dataframe itself instead of a link to the artifact.
I don't know if it's worth spending our time on this. However, it's very interesting why the ability to cache the step impacts artifact behavior
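(For reference, this is roughly how the caching is enabled; a minimal sketch assuming a decorator-based pipeline, with illustrative names rather than my actual code:)
from clearml import PipelineDecorator

# cache=True lets ClearML reuse the previous execution of this step
# (and its stored artifacts) when the inputs have not changed
@PipelineDecorator.component(return_values=['videos_df'], cache=True)
def detect_frames(videos_path):
    import pandas as pd
    # placeholder contents; the real step builds the dataframe from the videos
    return pd.DataFrame({'video': [videos_path], 'frames': [0]})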
@<1544853721739956224:profile|QuizzicalFox36> , yes 🙂
http://<host>:8081/lp_veh_detect_train_pipeline/.pipelines/vids_pipe/detect_frames.4a80b274007d4e969b71dd03c69d504c/artifacts/videos_df/videos_df.csv.gz
(the <host> contains the correct hostname)
@<1523701205467926528:profile|AgitatedDove14> it is as expected - a dataframe
I didn't understand how to use it. Everything I've tried failed. Could you give me an example?
What do you have in the artifacts of this task id: 4a80b274007d4e969b71dd03c69d504c
Hi @<1544853721739956224:profile|QuizzicalFox36>
http:/34.67.35.46:8081/...
Notice there is a / missing in the link; how is that possible? It should be http://
You can take a look at the pipeline examples here:
None
Transferring artifacts between tasks is exactly what they do.
@<1523701070390366208:profile|CostlyOstrich36> Hi! Sorry for not responding for a long time; I couldn't reproduce my issue until today. Usually I don't need to use StorageManager, as I get the contents of the parameter directly. However, for some reason, once again I got a PosixPath to the artifact, or a string describing a dict with the URL to the artifact, instead of the contents of the dataframe. I've implemented an if statement to catch such cases, but I still can't access the artifact. I tried StorageManager.download_file(str(artifacts_PosixPath)) and got the same path back as the response. When accessing it as a string with StorageManager.download_file(artifact_as_string_dict) (that's what I get if I pass .artifacts.df), I got the error ValueError: Requested path does not exist: /home/monika_kazlauskaite/.clearml/venvs-builds.1/3.8/code/{'name': 'videos_df', 'size': 1357, 'type': 'pandas', 'mode': <ArtifactModeEnum.output: 'output'>, 'url': 'http:/11.11.11.11:8081/lp_veh_detect_train_pipeline/.pipelines/vids_pipe/detect_frames.4a80b274007d4e969b71dd03c69d504c/artifacts/videos_df/videos_df.csv.gz',
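(For context, a rough sketch of the kind of fallback I mean; the names are illustrative, not my actual code:)
import re
import pandas as pd
from clearml import StorageManager

def load_videos_df(videos_df):
    # normal case: the step already received the DataFrame itself
    if isinstance(videos_df, pd.DataFrame):
        return videos_df
    text = str(videos_df)
    # stringified artifact dict: pull out the 'url' field and download the file
    match = re.search(r"'url':\s*'([^']+)'", text)
    if match:
        local_path = StorageManager.download_file(match.group(1))
        return pd.read_csv(local_path)
    # otherwise assume it is already a local path (e.g. a PosixPath) to the csv.gz
    return pd.read_csv(text)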
@<1523701205467926528:profile|AgitatedDove14> I've checked my configs and it's all good, no / is missing
I think your "files_server" is misconfigured somewhere; I cannot explain how you ended up with this broken link...
Check the clearml.conf on the machines, or the env vars?
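The relevant entry in clearml.conf should look roughly like this (the host is a placeholder):
api {
    web_server: http://<host>:8080
    api_server: http://<host>:8008
    files_server: http://<host>:8081
}
or the matching env var, e.g. CLEARML_FILES_HOST=http://<host>:8081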
(you can find it in the pipeline component page)
@<1523701205467926528:profile|AgitatedDove14> Hi, I have no idea, as I don't upload the file to the artifacts myself. I return the df from a function in the previous pipeline step and then pass it as a parameter to this step
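(Schematically the flow is something like this; a minimal sketch with illustrative names, not my actual pipeline code:)
from clearml import PipelineDecorator

@PipelineDecorator.component(return_values=['videos_df'], cache=True)
def detect_frames():
    import pandas as pd
    # the real step builds this dataframe from the videos; placeholder contents here
    return pd.DataFrame({'video': ['clip.mp4'], 'frames': [0]})

@PipelineDecorator.component()
def train_step(videos_df):
    # videos_df is expected to arrive here as a DataFrame, not as a link/path
    print(videos_df.head())

@PipelineDecorator.pipeline(name='vids_pipe', project='lp_veh_detect_train_pipeline', version='1.0')
def run_pipeline():
    videos_df = detect_frames()
    train_step(videos_df)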
@<1544853721739956224:profile|QuizzicalFox36> , are you running the steps from the machine whose config you checked?
Hi @<1544853721739956224:profile|QuizzicalFox36> ,
You can use StorageManager.download_file()
to easily fetch files.
None
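Roughly something like this (the URL is just a placeholder for your artifact's URL):
from clearml import StorageManager
import pandas as pd

# downloads the remote file into the local cache and returns the local path
local_path = StorageManager.download_file('http://<files_server>/<path-to-artifact>/videos_df.csv.gz')
videos_df = pd.read_csv(local_path)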
However, it's very interesting why the ability to cache the step impacts artifact behavior
From your log:
videos_df = StorageManager.download_file(videos_df)
It seems like "videos_df" is already the DataFrame, so why are you trying to download it? I would expect you to download the pandas file, not a DataFrame object
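If you do want to fetch the artifact yourself, something along these lines should work (a sketch, using the task id from above):
from clearml import Task

task = Task.get_task(task_id='4a80b274007d4e969b71dd03c69d504c')
# .get() deserializes the stored artifact, so for a pandas artifact you get the DataFrame back
videos_df = task.artifacts['videos_df'].get()
# or, if you prefer a local file path to the csv.gz:
local_path = task.artifacts['videos_df'].get_local_copy()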
I suggest reading all of them, starting with the pipeline-from-tasks example 🙂
Hmm, can you send the full log of the pipeline component that failed, because this should have worked
Also, could you test it with the latest clearml Python version (i.e. 1.10.2)?
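For example (assuming a pip-based install):
pip install -U clearml   # should pull the latest release (1.10.2 at the time of writing)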
Yes. I've also checked the configs on all of my machines, just in case