@<1523701083040387072:profile|UnevenDolphin73> : If I do, what should I configure, and how?
@<1523701070390366208:profile|CostlyOstrich36> I read those, but did not understand.
@<1523701070390366208:profile|CostlyOstrich36> : After more playing around, it seems that the ClearML Server does not store the models or artifacts itself. These are stored somewhere else (e.g., an AWS S3 bucket) or on my local machine, and the ClearML Server only stores configuration parameters and previews (e.g., when the artifact is a pandas DataFrame). Is that right? Is there a way to save the models completely on the ClearML Server?
@<1523701087100473344:profile|SuccessfulKoala55> I think I might have made a mistake earlier - but not in the code I posted before. Now, I have the following situation:
- In my training Python process on my notebook I train the custom-made model and save it to my hard drive as a zip file. Then I run the code
output_model = OutputModel(task=task, config_dict={...}, name=f"...")
output_model.update_weights(weights_filename=r"C:\path\to\mymodel.zip", is_package=True)
- I delete the "...
@<1523701083040387072:profile|UnevenDolphin73> , you wrote
Well, I would say then that in the second scenario it’s just rerunning the pipeline, and in the third it’s not running it at all
(I mean, in both the code may have changed, the only difference is that in the second case you’re interested in measuring it, and in the third,
you’re not, so it sounds like a user-specific decision).
Well, I would hope that in the second scenario step A is not rerun. Yes, in the third scen...
It is documented at None ... buried super deep in the code. If you don't know that output_uri
in the Task's (!) init is the relevant setting, you would never find it...
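To make the point concrete, here is a minimal sketch of what I mean - passing output_uri to Task.init so that models and artifacts are uploaded instead of only referenced. The resolve_output_uri helper and the "server"/"local" choice mapping are illustrative names of mine, not ClearML API:

```python
def resolve_output_uri(choice):
    # Illustrative helper (not ClearML API): map a simple choice to an
    # output_uri value. True -> upload to the ClearML fileserver;
    # None -> keep files where they are and only register their location.
    return {"server": True, "local": None}.get(choice, choice)

def init_task_with_uploads(project, name, uri_choice="server"):
    # Deferred import: requires the clearml package and a configured clearml.conf
    from clearml import Task
    return Task.init(
        project_name=project,
        task_name=name,
        output_uri=resolve_output_uri(uri_choice),  # or e.g. "s3://bucket/path"
    )
```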
@<1523701070390366208:profile|CostlyOstrich36> : Thanks, where can I find more information on ClearML's model repository? I can hardly find anything about it in the documentation.
Also, that leaves open the question of what Model
is for. I described how I understand the workflow should look, but my question remains open...
Thank you, I found the error.
myPar = task.connect(myPar, name='from TaskParameters')
is required.
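For context, a sketch of the pattern that fixed it for me - each connect() call gets its own name, so each dict lands in its own section of the task's CONFIGURATION tab (the section names here are illustrative):

```python
def connect_param_sections(task, model_params, data_params):
    # Each connect() call needs its own `name`; the dicts then show up as
    # separate sections in the task's CONFIGURATION tab.
    model_params = task.connect(model_params, name='from TaskParameters')
    data_params = task.connect(data_params, name='data')
    return model_params, data_params
```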
@<1523701087100473344:profile|SuccessfulKoala55> Also, I think that - in this case, but also in other cases - the issue is not just the documentation, but also the design of the SDK.
Secondly, I do not understand this:
None says
Manually mark a Task as completed. This will close the running process and will change the Task’s status to Completed (Use this function to close and change status of remotely executed tasks). To simply change the Task’s status to completed, use task.close()
None says
Closes the current Task and cha...
@<1523701083040387072:profile|UnevenDolphin73> : A big point for me is to reuse/cache those artifacts/datasets/models that need to be passed between the steps, but have been produced by colleagues' executions at some earlier point. So for example, let the pipeline be A(a) -> B(b) -> C(c), where A,B,C are steps and their code, excluding configurations/parameters, and a,b,c are the configurations/parameters. Then I might have the situation, that my colleague ran the pipeline A(a) -> B(b) -> C(c...
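The reuse idea I have in mind can be sketched in plain Python (this is only a model of the caching behavior I would like, not ClearML's implementation): a step's output is reused when both its code version and its parameters are unchanged, regardless of who originally produced it.

```python
import hashlib
import json

_cache = {}  # stands in for the shared artifact store

def cached_step(name, code_version, params, fn):
    # Key the cached result on the step identity, its code version, and its
    # parameters; a hit means a colleague's earlier run can be reused.
    key = hashlib.sha256(
        json.dumps([name, code_version, params], sort_keys=True).encode()
    ).hexdigest()
    if key not in _cache:
        _cache[key] = fn(params)  # only executed on a cache miss
    return _cache[key]
```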
What does
- The component code still needs to be self-composed (or, function component can also be quite complex)
Well it can address the additional repo (it will be automatically added to the PYTHONPATH), and you can add auxiliary functions (as long as they are part of the initial pipeline script), by passing them to
helper_functions
mean? Is it not possible that I call code that is somewhere else on my local computer and/or in my code base? That makes thi...
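As I understand the quoted answer, an auxiliary function defined in the same pipeline script can be shipped along with a component via helper_functions. A sketch (assumes the clearml package is installed; the function names are mine):

```python
def normalize(values):
    # Auxiliary function defined in the same pipeline script
    peak = max(values)
    return [v / peak for v in values]

def build_component():
    # Deferred import: requires the clearml package
    from clearml.automation.controller import PipelineDecorator

    @PipelineDecorator.component(helper_functions=[normalize])
    def preprocess(data):
        # helper_functions makes `normalize` available when the component
        # later runs as a standalone task
        return normalize(data)

    return preprocess
```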
Ok, I checked: A is terminated. This is not what I thought would happen and not what I intended with my documentation. I should clarify that.
KindChimpanzee37 : I ensured that the dataset_name is the same in get_data.py and preprocessing.py, and that seemed to help. Then I got the error RuntimeError: No audio I/O backend is available.
, because of which I installed PySoundFile
with pip; that helped. Weirdly enough, the old id error then came back. So I re-ran get_data.py and then preprocessing.py - this time the id error was gone again. Instead, I got `raise TypeError("Invalid file: {0!r}".format(self.name))
TypeError:...
KindChimpanzee37 : First I went to the dataset and clicked on "Task information ->" in the bottom-right corner of the "VERSION INFO". I suppose that is the same as what you meant by "right click on more information"? I did not find any option to "right click on more information". The "Task information ->" leads me to a view in the experiment manager. I posted the two screenshots.
PS: It is weird to me that the data manager leads me to the experiment manager, specifically an experi...
@<1523701083040387072:profile|UnevenDolphin73> : From which URL is your most recent screenshot?
But, I guess @<1523701070390366208:profile|CostlyOstrich36> wrote that in a different chat, right?
>pip show clearml
WARNING: Ignoring invalid distribution -upyterlab (c:\users\...\lib\site-packages)
WARNING: Ignoring invalid distribution -illow (c:\users\...\lib\site-packages)
Name: clearml
Version: 1.6.4
Summary: ClearML - Auto-Magical Experiment Manager, Version Control, and MLOps for AI
Home-page:
None
`Auth...
@<1523701205467926528:profile|AgitatedDove14> In the documentation it warns about .close()
"Only call Task.close if you are certain the Task is not needed."
What does the documentation refer to? My understanding is that if I close the task within a program, I can no longer use the task object as before and need to retrieve it via query_tasks
to get it again. Is that correct?
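This is the retrieval I mean - a sketch (assumes clearml is installed and a server is configured): after close(), fetch a fresh handle by id rather than reusing the closed object.

```python
def reopen_task(task_id):
    # Deferred import: requires the clearml package
    from clearml import Task
    # After task.close(), the local object should not be reused; instead
    # fetch a new handle from the server by id.
    return Task.get_task(task_id=task_id)
```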
How would you compare those to ClearML?
Ah... if I run the same script not from PyCharm but from the terminal, then it gets completed... phew...
I mean those that you see in the screenshot. The difference in code is - at least for me - to write
- parameters_data = {'custom1': 'no', 'custom2': False}; parameters_data = task.connect(parameters_data, name='data')
- task.set_user_properties(custom1='no', custom2=False)
It means "The syntax for the file name, folder name, or volume label / disk is wrong", or something along those lines. The [...] is the directory path to my project, which I opened in PyCharm and from which I run the commands in the Python Console.