
WackyRabbit7 just making sure I understand: MedianPredictionCollector.process_results
Is called after the pipeline is completed.
Then inside the function, Task.current_task() returns None.
Is this correct?
SmarmySeaurchin8
args = parse.parse()
task = Task.init(project_name=args.project or None, task_name=args.task or None)
You should probably look at the docstring 🙂
:param str project_name: The name of the project in which the experiment will be created. If the project does not exist, it is created. If project_name is None, the repository name is used. (Optional)
:param str task_name: The name of Task (experiment). If task_name is None, the Python experiment ...
my experiment logic
you mean the actual code doing the training ?
so that it gets lazily executed and not at task definition time
Task definition time -> when creating the Pipeline Task? Remember, the base_task_factory at the end creates a Task object (it does not run the code itself).
BTW: if you have simple training logic you can use pipeline decorators, it might be a better fit?
https://clear.ml/docs/latest/docs/fundamentals/pipelines#pipeline-from-function-decorator
Wow, thank you very much. And how would I bind my code to task?
you mean the code that creates pipeline Tasks ?
(remember the pipeline itself is a Task in the system, basically if your pipeline code is a single script it will pack the entire thing )
ReassuredTiger98 are you saying you want to be able to run the pipeline as a standalone and as "remote pipeline",
Or is this for a specific step in the pipeline that you want to be able to run standalone/pipelined ?
LOL I keep typing clara without noticing (maybe it's the nvidia thing I keep thinking about)
Carla makes much more sense 🙂
Really stoked to start using it and introduce a more sane ML ops workflow at my workplace lol.
Totally with you 🙂
... would that be a Model Registry Store plugin?
YES please ❤
So we actually just introduced "Applications" into the clearml free tier, https://app.community.clear.ml/applications
Allowing you to take any Task in the system and make it an "application" (a python script running on one of the service agents), with the ability to configu...
Hi ReassuredTiger98
I think you should have something like:
@PipelineDecorator.component(task_type=TaskTypes.application, docker='clara_docker_container_if_we_need')
def step_one(param):
    print('step_one')
    import os
    os.system('run me clara')
    # I'm assuming we should wait?
    return

@PipelineDecorator.component(task_type=TaskTypes.training)
def step_two(param):
    print('step_two')
    import something
    something.to_do()
    return

@PipelineDecorator.pipeline(name='c...
Hi ReassuredTiger98
but I would rather just define a function that returns the task directly
🙂
Check it out:
https://github.com/allegroai/clearml/blob/36ee3d61209e413a917d8a718fb25f389143cfa1/clearml/automation/controller.py#L205
:param base_task_factory: Optional, instead of providing a pre-existing Task, provide a Callable function to create the Task (returns Task object)
No worries, I'll see what I can do 🙂
Could it be the credentials are actually incorrect? because it seems like you can access the server? (I assume you were able to browse to it and generate credentials. right?)
BTW: see if this works:
$ CLEARML_API_HOST_VERIFY_CERT=0 clearml-init
What are you seeing?
UnevenDolphin73 if you have the time to help fix / make it work it will be greatly appreciated 🙂
Could it be you have two entries of "console_cr_flush_period" ?
ClearML maintains a github action that sets up a dummy clearml-server,
You have one, it's the http://app.clear.ml (not a dummy one, but for this purpose it will work)
thoughts ?
To auto upload the model you have to tell clearml to upload it somewhere, usually by passing output_uri to Task.init or setting the default_output_uri in the clearml.conf
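Either option looks roughly like this; the bucket path is a placeholder. Passing `output_uri` to `Task.init` sets it per-task, while the clearml.conf entry makes it the default for all tasks:

```
# clearml.conf fragment (sketch; "s3://my-bucket/clearml-models" is a placeholder)
sdk {
    development {
        # models/artifacts are uploaded here unless a Task passes its own output_uri
        default_output_uri: "s3://my-bucket/clearml-models"
    }
}
```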
then will clearml associate that image with my experiment and always use that image with it,
when you say "agent to use my docker image," I'm assuming you mean the configuration file or the --docker argument; in both cases this means the default container.
This means that if the Task does Not specify a docker, the agent will use the one you set in the conf / argument, But Tasks can always specify a different docker to use, and the agent will pull the requested docker based on the Task's entry.
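That precedence can be sketched as a tiny pure function (illustrative only, not the agent's actual code):

```python
from typing import Optional

def resolve_container(task_docker: Optional[str], agent_default: Optional[str]) -> Optional[str]:
    """Toy model of the precedence described above:
    if the Task specifies a docker image, the agent uses it;
    otherwise it falls back to the default from clearml.conf / --docker."""
    return task_docker if task_docker else agent_default

print(resolve_container(None, "nvidia/cuda:11.8.0-runtime"))              # default wins
print(resolve_container("my/custom:latest", "nvidia/cuda:11.8.0-runtime"))  # Task entry wins
```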
Eve...
Sure thing, thanks FlutteringWorm14 !
But in credentials creation it still shows 8008. Are there any other places in docker-compose.yml where port from 8008 to 8011 should be replaced?
I think there is a way to "tell" it what to put there, not sure:
https://clear.ml/docs/latest/docs/deploying_clearml/clearml_server_config#configuration-files
WackyRabbit7 if this is a single script running without git repo, you will actually get the entire code in the uncommitted changes section.
Do you mean get the code from the git repo itself ?
Hi @<1618780810947596288:profile|ExuberantLion50>
I'm trying to containerize a task using clearml-agent build, following instructions from the docs online.
Do you mean to create a container with the Task's environment for debugging ?
If this is for running the Task there is no need to create a specific container for it, both code and python env are cached.
UpsetCrocodile10
Does this method expect my_train_func to be in the same file as
As long as you import it and you can pass it, it should work.
Child exp gets aborted immediately ...
It seems it cannot find the file "main.py" , it assumes all code is part of a single repository, is that the case ? What do you have under the "Execution" tab for the experiment ?
Hi ShinyWhale52
This is just a suggestion, but this is what I would do:
1. Use clearml-data and create a dataset from the local CSV file:
clearml-data create ...
clearml-data sync --folder (where the csv file is)
2. Write a python code that takes the csv file from the dataset and creates a new dataset of the preprocessed data
from clearml import Dataset
original_csv_folder = Dataset.get(dataset_id=args.dataset).get_local_copy()
# process csv file -> generate a new csv
preproces...
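A runnable sketch of step 2: the actual preprocessing here (dropping rows with an empty "label" column) is purely hypothetical, and the clearml Dataset calls are left as comments since they need a server.

```python
import csv

# from clearml import Dataset
# original_csv_folder = Dataset.get(dataset_id=args.dataset).get_local_copy()

def preprocess_csv(src_path, dst_path):
    """Hypothetical preprocessing: keep only rows with a non-empty 'label' column."""
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            if row.get("label"):
                writer.writerow(row)

# Then register the result as a new (child) dataset:
# new_ds = Dataset.create(dataset_name="preprocessed-csv", dataset_project="my-project",
#                         parent_datasets=[args.dataset])
# new_ds.add_files(dst_path)
# new_ds.upload()
# new_ds.finalize()
```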
Hi @<1597399925723762688:profile|IrritableStork32>
I think that if you have clearml installed and configured on your machine it should just work:
Hi @<1603198134261911552:profile|ColossalReindeer77>
Hello! Does anyone know how to do HPO when your parameters are in a Hydra config?
Basically hydra parameters are overridden with "Hydra/param"
(this is equivalent to the "override" option of hydra in CLI)
Thanks CynicalBee90 I appreciate the discussion! since I'm assuming you will actually amend the misrepresentation in your table, let me followup here.
1.
SSPL license may be a significant consideration for some, and so we thought it was important to point this out clearly.
SSPL is fully open-source compliant unless you have the intention of selling it as a service; I hardly think this is any user's consideration, just like anyone would be using MongoDB or Elasticsearch without think...
MuddySquid7 I might have found something, and this is very very odd, it seems it will Not upload any new images post the history size, which is very odd considering the number of users actively using this feature...
Do you want to try a hack to see if it solved your issue ?