JuicyFox94
Hi WittyOwl57,
The function is:
task.get_configuration_object_as_dict(name="name")
with task being your Task object.
You can find a bunch of pretty similar functions in the docs. Have a look here: https://clear.ml/docs/latest/docs/references/sdk/task#get_configuration_object_as_dict
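A minimal sketch of how the call fits together (the task id and the configuration section name are placeholders; "General" is just a common default name, and actually running this needs the clearml package installed plus a configured server, so the import is kept local):

```python
def fetch_config(task_id, config_name="General"):
    # Requires the clearml package and a reachable server,
    # so the import is kept local to this sketch.
    from clearml import Task

    task = Task.get_task(task_id=task_id)
    # Returns the named configuration object as a plain dict
    return task.get_configuration_object_as_dict(name=config_name)
```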
Can you also check that you can access the servers?
Try curl http://<my server>:port for your different servers and share the results 🙂
I have created the task with Task.create and pulled it with an agent, to change its status. The error now occurs when the agent builds the env: it tries to pull jb_pytest_runner from the repo.
We are working on a solution; I will keep you updated.
Do you mean from within a pipeline? Do you manually report the model? It might point to a local file, especially if it has been auto-logged. That is what happens when you save your model from your script (thus to the local file system).
hey RoughTiger69
Can you describe how you are setting up the environment variable, please?
Setting that flag will skip the virtual env installation: the agent will use your environment and the packages installed in it.
Using Task.add_requirements("requirements.txt") lets you add specific packages at will. Note that this function will be executed even with the flag CLEARML_AGENT_SKIP_PIP_VENV_INSTALL set.
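For example (a sketch with placeholder project/package names; note that add_requirements has to be called before Task.init, and running this needs the clearml package plus a configured server, so the import is kept local):

```python
def start_task_with_requirements():
    # Requires the clearml package and a configured server;
    # the import is kept local to this sketch.
    from clearml import Task

    # Must be called BEFORE Task.init(), otherwise it has no effect
    Task.add_requirements("requirements.txt")  # a whole requirements file
    Task.add_requirements("torch", "1.11.0")   # or a single pinned package

    # "examples" / "demo" are placeholder names
    return Task.init(project_name="examples", task_name="demo")
```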
From a pipeline, you can use PipelineController.upload_model( name , path ) and specify in path the path you used to save your model from your script.
hi OutrageousSheep60
Sounds like the agent is in reality... dead. That sounds logical, because you cannot see it using ps.
However, it would be worth checking whether you can still see it in the UI.
hey
You have 2 options to retrieve a dataset: by its id, or by the project_name AND the dataset_name - those work together, you need to pass both of them!
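In code, the two options look like this (a sketch; the helper function is mine, not part of the SDK, and actually fetching a dataset needs the clearml package plus a reachable server, so the import is kept local):

```python
def get_dataset(dataset_id=None, dataset_project=None, dataset_name=None):
    # Enforce the two retrieval modes: either an id, or both names together
    if dataset_id is None and not (dataset_project and dataset_name):
        raise ValueError("pass either dataset_id, or both dataset_project and dataset_name")

    # Requires the clearml package and a reachable server
    from clearml import Dataset

    if dataset_id is not None:
        return Dataset.get(dataset_id=dataset_id)  # option 1: by id
    return Dataset.get(                            # option 2: both names together
        dataset_project=dataset_project,
        dataset_name=dataset_name,
    )
```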
Hi HandsomeGiraffe70
There is a way: the API. You can use it like this:
1. retrieve the task the model belongs to
2. retrieve the model you want (from a list of input and output models)
3. create the metadata
4. inject it into the model
Here is an example :
` from clearml import Task
from clearml.backend_api import Session
from clearml.backend_api.services import models
from clearml.backend_api.services.v2_13.models import MetadataItem
task = Task.get_task(project_name=project_name, task_name=... `
Hi GentleSwallow91 ,
I can't manage to reproduce the issue; it is working fine for me. I use a local minio docker-based image. The conf file has to be precisely configured, but it seems you did that correctly, because you don't get a denied access here. It is strange that it is waiting for the upload to finish. We have a flag for upload_artifact: wait_on_upload. Its default value should be False, but I would try to add it...
Also I don't understand what you mean by "I can see files in ...
Hello Sergios,
We are working on reproducing your issue. We will update you asap
Hi MotionlessCoral18
You need to run some scripts when migrating, to update your old experiments. I am going to try to find you some examples.
Hey
I'll play a bit with what you sent, because reproducing the issues helps a lot in solving them. I'll keep you updated 😊
I am not sure I get you here.
When pip installing clearml-agent, it doesn't fire up any agent. The procedure is that after having installed the package, if there isn't any config file, you run clearml-agent init and enter the credentials, which are stored in clearml.conf. If there is a conf file, you simply edit it and manually enter the credentials. So I don't understand what you mean by "remove it".
We have released a lot of versions since that one 🙂
Can you please try to upgrade to the latest clearml (1.6.2) and try again?
You need to use the API to export experiments to csv/excel. I am preparing an example for you.
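In the meantime, here is a rough sketch of the idea (field and project names are placeholders; fetching the tasks needs the clearml package and a reachable server, so that part is kept behind a local import, while the CSV part runs standalone):

```python
import csv

def tasks_to_csv(rows, fileobj):
    # rows: list of dicts with the fields to export
    writer = csv.DictWriter(fileobj, fieldnames=["id", "name", "status"])
    writer.writeheader()
    writer.writerows(rows)

def fetch_rows(project_name):
    # Requires the clearml package and a reachable server
    from clearml import Task
    tasks = Task.get_tasks(project_name=project_name)
    return [
        {"id": t.id, "name": t.name, "status": str(t.get_status())}
        for t in tasks
    ]
```

Usage would then be something like tasks_to_csv(fetch_rows("my_project"), open("tasks.csv", "w", newline="")).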
I have found some threads that deal with your issue and propose interesting solutions. Can you have a look at this?
hi TenderCoyote78
Can you please give some more details about what you intend to achieve? I am afraid I don't fully understand your question.
hi SteepDeer88
Did you manage to get rid of your issue?
hi NervousFrog58
Can you share some more details with us, please?
Do you mean that when you have a failing experiment, you would like a snippet that resets and relaunches it, the way you do through the UI?
Your ClearML packages version and your logs would be very useful too 🙂
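If that is what you mean, the rough shape would be something like this (a sketch under that assumption; the queue name is a placeholder, and it needs the clearml package and a reachable server, so the import is kept local):

```python
def reset_and_relaunch(task_id, queue_name="default"):
    # Requires the clearml package and a reachable server;
    # the import is kept local to this sketch.
    from clearml import Task

    task = Task.get_task(task_id=task_id)
    if str(task.get_status()) == "failed":
        task.reset()  # same effect as "Reset" in the UI
        Task.enqueue(task, queue_name=queue_name)  # send it back to an agent queue
    return task
```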
can you please provide the apiserver log and the elasticsearch log?
hi RattyLouse61
Here is a code example; I hope it will help you better understand the backend_api.
` from clearml import Task
from clearml.backend_api import Session
from clearml.backend_api.services import events

# retrieve the task that logged the debug sample
task = Task.get_task(task_id='xxxxx', project_name='xxxx')

session = Session()
res = session.send(events.GetDebugImageSampleRequest(
    task=task.id,
    metric=title,   # the title of the debug sample
    variant=series  # the series of the debug sample
))
print(res.response_data) `
Hey Igor
I am not the expert on this topic, but someone who knows it better will come back to you right after his meeting 🙂