Hi OutrageousSheep60 ! The fix for Dataset.list_datasets()
will be out in the next release of ClearML SDK. Sorry for the inconvenience!
Hi @<1587615463670550528:profile|DepravedDolphin12> ! get()
should indeed return a python object. What clearml version are you using? Also, can you share the code?
Hi @<1576381444509405184:profile|ManiacalLizard2> ! Can you please share a code snippet that I could run to investigate the issue?
Haha, should not be too complicated to add one. We will consider it. Thanks for reporting the issue
We used to have "<=20" as the default pip version in the agent. Looks like this default value still exists on your machine. But that version of pip doesn't know how to install your version of pytorch...
Hi @<1571308003204796416:profile|HollowPeacock58> ! The changes should be reflected. Do you have a small example that could help us reproduce the issue?
Yeah, that's always the case with complex systems 😕
After you do s['Function']['random_number'] = random.random() you still need to call set_parameters_as_dict(s) for the change to actually be stored on the task
FierceHamster54 As long as you are not forking, you need to use Task.init such that the libraries you are using get patched in the child process. You don't need to specify the project_name, task_name or output_uri. You could try locally as well with a minimal example to check that everything works after calling Task.init.
Hi DilapidatedDucks58 ! Browsers display double spaces as a single space by default. This is a common problem. What we could do is add a copy to clipboard
button (it would copy the text properly). What do you think?
Your object is likely holding a file descriptor or something similar. The pipeline steps all run in separate processes (they can even run on different machines when running remotely), so you need to make sure that the objects you return are picklable and can be passed between these processes. You can check that the logger you are passing around is indeed picklable by calling pickle.dumps on it and then loading it in another run.
The best practice would ...
what do you get when you run this code?
```python
from clearml.backend_api import Session
print(Session.check_min_api_server_version("2.17"))
```
Hi ObedientDolphin41 ! Python allows you to decorate functions dynamically. See this example:
```python
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(repo=" ", repo_branch="master")
def step_one():
    print("step_one")
    return 1

def step_two_dynamic_decorator(repo=" ", repo_branch="master"):
    @PipelineDecorator.component(repo=repo, repo_branch=repo_branch)
    def step_two(arg):
        print("step_two")
        return arg
    return step_two
```
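The same pattern in plain Python, independent of ClearML, in case it helps to see it in isolation (tagged and make_step are illustrative names, not ClearML APIs):

```python
import functools

def tagged(tag):
    """A decorator factory: the decorator's argument is decided at call time."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            return f"[{tag}] {fn(*args, **kwargs)}"
        return wrapper
    return decorator

def make_step(tag):
    # The decorator arguments are plain function parameters here,
    # exactly like repo/repo_branch in the ClearML example above.
    @tagged(tag)
    def step(x):
        return f"step({x})"
    return step

print(make_step("dev")(1))  # [dev] step(1)
```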
DangerousDragonfly8 Yes, this is correct, we mixed up the places we call these functions
DangerousDragonfly8 you can try to start the pipeline like this:
```python
pipe.start(step_task_completed_callback=callback)
```
where callback has the signature:
```python
def callback(pipeline, node, parameters):
    print(pipeline, node, parameters)
```
Note that even though the parameter name is step_task_completed_callback, it is actually run before the task is started. This is actually a bug...
We will need to review the callbacks, but I think you can work with this for now...
Hi DangerousDragonfly8 ! Sorry for the late reply. I'm taking a look and will come back to you shortly
MammothParrot39 try to set this https://github.com/allegroai/clearml-agent/blob/ebb955187dea384f574a52d059c02e16a49aeead/docs/clearml.conf#L82 in your clearml.conf
to "22.3.1"
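In clearml.conf that would look roughly like this (the key path is taken from the linked default config; the exact version pin is up to you):

```
agent {
    package_manager {
        # pip version the agent installs into task virtualenvs
        pip_version: "22.3.1"
    }
}
```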
Hi @<1570583237065969664:profile|AdorableCrocodile14> ! get_local_copy will always copy/download external files to a folder. To get the external files themselves, there is a property on the dataset called link_entries, which returns a list of LinkEntry objects. Each object has a link attribute, and each such link should point to an external file (in this case, your local paths prefixed with file://).
FiercePenguin76 Are you changing the model by pressing the circled button in the first photo? Are you prompted with a menu like in the second photo?
MotionlessCoral18 If you provide the model as a hyperparam, then I believe you should query its value by calling https://clear.ml/docs/latest/docs/references/sdk/task/#get_parameters or https://clear.ml/docs/latest/docs/references/sdk/task/#get_parameter
You need to specify it. Or you could specify this in your config: https://github.com/allegroai/clearml/blob/54c601eea2f9981bb8e360a8203bc36696a55cfd/clearml/config/default/sdk.conf#L164
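For reference, in clearml.conf that setting would look roughly like the following (the key path follows the linked sdk.conf; the bucket URI is a placeholder):

```
sdk {
    development {
        # default destination for uploaded task artifacts and models
        default_output_uri: "s3://my-bucket/clearml"
    }
}
```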
Hi! Can you please provide us with code that would help us reproduce this issue? Is it just downloading from gcp?
Hi RoundMosquito25 ! What clearml version are you using? Do you get any error messages when you are setting floats instead of strings?
great, glad you found a work-around
Hi @<1626028578648887296:profile|FreshFly37> ! Indeed, the pipeline gets tagged once it is running. Actually, it just tags itself. That is why you are encountering this issue. The version is derived in 2 ways: either you manually add the version using the version
argument in the PipelineController
, or the pipeline fetches the latest version out of all the pipelines that have run, and auto-bumps that.
Please reference this function: [None](https://github.com/allegroai/clearml/blob/05...
Hi @<1524560082761682944:profile|MammothParrot39> ! A few thoughts:
You likely know this, but the files may be downloaded to something like /home/user/.clearml/cache/storage_manager/datasets/ds_e0833955ded140a69b4c9c9d8e84986c
. .clearml may be hidden, and if you are browsing with a file explorer you may not be able to see the directory.
If that is not the issue: are you able to download some other datasets, such as our example one: UrbanSounds example ? I'm wondering if the problem only happens fo...
Hi @<1555000557775622144:profile|CharmingSealion31> ! When creating the HyperParameterOptimizer
, pass the argument optuna_sampler=YOUR_SAMPLER
.