I mean, it depends on what you want to report... If you want to stick to a table, I suggest first gathering your stats in table format...
Otherwise, matplotlib seems to be the most user-friendly way.
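To illustrate the matplotlib route, here is a minimal sketch (the stats and column names are made up, not from the thread): gather the numbers in a table-like structure first, then plot them.

```python
# Minimal sketch with made-up stats (not from the thread):
# gather your stats in a table-like structure first, then plot.
import matplotlib
matplotlib.use("Agg")  # headless backend; drop this for an interactive window

import matplotlib.pyplot as plt

stats = {"epoch": [1, 2, 3, 4], "loss": [0.9, 0.5, 0.35, 0.3]}

fig, ax = plt.subplots()
ax.plot(stats["epoch"], stats["loss"], marker="o")
ax.set_xlabel("epoch")
ax.set_ylabel("loss")
fig.savefig("loss.png")  # or report the figure to your experiment tracker
```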
In the web UI, in the queue/worker tab, you should see a services queue and a worker available in that queue. Otherwise the services agent is not running. Refer to John C above.
Normally, you should have an agent running behind a "services" queue as part of your docker-compose. You just need to make sure you populate the appropriate configuration on the server (i.e., set the right environment variables for the Docker services).
That agent will run as long as your self-hosted server is running.
while the other may need to be 1 instead of true
CLEARML_AGENT_SKIP_PIP_VENV_INSTALL=/path/to/my/venv/bin/python3.12 clearml-agent bla
Set that env var in the terminal before running the agent?
Inside the script that launches the agent, I set all the env vars needed (i.e., disable installation with the var above).
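A sketch of such a launch script (the interpreter path, the second env var name, and the queue name are illustrative assumptions; verify the exact variable names against your clearml-agent version's docs):

```shell
#!/usr/bin/env bash
# Sketch of an agent launch script; paths, var names, and queue are illustrative.

# Point the agent at an existing interpreter instead of letting it build a venv:
export CLEARML_AGENT_SKIP_PIP_VENV_INSTALL=/path/to/my/venv/bin/python3.12

# Some of these flags expect "1" rather than "true":
export CLEARML_AGENT_SKIP_PYTHON_ENV_INSTALL=1

# Then start the agent against your queue (uncomment to actually run it):
# clearml-agent daemon --queue default
echo "agent env configured"
```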
I don't use submodules, so I don't really know how they behave with ClearML.
Without clearml-session, how could one set this up? I cannot find any documentation/guide on how to do this... The official doc seems to say: you start a code server that then connects to vscode.dev. Then, from your laptop, you go to vscode.dev to access your code server. Is there any way to do this without going through vscode.dev?
If you care about the local destination then you may want to use this None
We are using this: WebApp: 2.2.0-690 • Server: 2.2.0-690 • API: 2.33
You will need to provide more context than that if you don't want the answer: have you tried turning it off and back on again?
Once you manually install your package inside the Docker container, check that your file module_b/templates/my_template.yml is where it should be.
Should I put that in the clearml.conf file?
Do you want to use https or ssh for git clone? Setting up both at the same time is confusing.
So the issue is that, for some reason, the pip install done by the agent doesn't behave the same way as your local pip install?
Have you tried manually installing your module_b with pip install on the machine that is running clearml-agent? From your example, it looks like you are even running inside Docker?
Not sure how that works with Docker and machines that are not set up with an ssh public key... We will go down that path sometime in the future, so I am also quite interested in how people do it without an ssh public key.
I mean, what happens if I import and use a function from another .py file, and that function's code changes?
Or are you expecting the code to be frozen, with only parameters changing between runs?
So I tried:
import livsdk.livbatch
import clearml
clearml.Task.add_requirements("livsdk", "")
task = clearml.Task.init(project_name="hieu-test", task_name='base_config')
print("Done")
Which gives me this list of Installed Packages:
# Python 3.10.10 (main, Mar 05 2023, 19:07:49) [GCC]
# Local modules found - skipping:
# livsdk == ../[REDACTED]/livsdk/__init__.py
Augmentor == 0.2.10
Pillow == 9.2.0
PyYAML == 6.0
albumentations == 1.2.1
azure_storage_blob == 12.1...
What should I put in there? What is the syntax for a git package?
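For what it's worth, pip's VCS requirement syntax looks like the fragment below (the repo URL, branch, and package name are placeholders); I believe `Task.add_requirements` passes such a string through to the requirements list, but verify against your ClearML version.

```
# pip VCS requirement syntax (placeholder URL/branch/package name):
git+https://github.com/your-org/livsdk.git@main#egg=livsdk
# or over ssh, pinned to a tag:
git+ssh://git@github.com/your-org/livsdk.git@v1.2.3#egg=livsdk
```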
@<1523701070390366208:profile|CostlyOstrich36> Thanks !! That looks much cleaner than task.export_task()['runtime']['gpu_type'] :D
Thanks @<1523701087100473344:profile|SuccessfulKoala55> I missed that one.
I have been playing with exporting a task, modifying the "diff" part, and importing it back as a new task. It seems to work as desired, but set_script seems cleaner.
Love how flexible ClearML is!!!
@<1523701205467926528:profile|AgitatedDove14>
What is the env var name for Azure Blob Storage? That's the one we use for our artifacts.
Also, is there a function call rather than an env var?
It would be simpler in our case to call a function to set credentials for ClearML rather than fetch secrets and set env vars prior to running the Python code.
If env vars are the only option, I am thinking of fetching the secrets and setting the env vars from Python, e.g.: os.environ["MY_VARIABLE"] = "hello" ...
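A sketch of that idea (the helper name is mine, and the two env var names are assumptions about what ClearML's Azure driver reads; double-check them against your clearml.conf and the storage docs for your version):

```python
import os

def set_azure_credentials(account: str, key: str) -> None:
    """Set Azure Blob credentials programmatically, before ClearML needs them.

    AZURE_STORAGE_ACCOUNT / AZURE_STORAGE_KEY are assumed names; verify the
    exact variables against your ClearML version's storage documentation.
    """
    os.environ["AZURE_STORAGE_ACCOUNT"] = account
    os.environ["AZURE_STORAGE_KEY"] = key

# e.g. after fetching the secret from your vault:
set_azure_credentials("myaccount", "secret-from-vault")
```

The point is simply that setting the variables from Python before the SDK initializes avoids exporting secrets in the shell.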
But then it is still missing a bunch of libraries in the Task (that succeeded) > Execution > INSTALLED PACKAGES.
So when I clone that task and try to run the clone, it fails because it is missing Python packages 😞
So in your case, the clearml-agent conf contains multiple credentials, one for each cloud storage you potentially use?
Right, in which case you want to change it dynamically from your code, not with the config file. This is where Logger.set_default_upload_destination comes in.
I don't think ClearML is designed to handle secrets other than git and storage credentials...
Just to confirm: is "output_uri to log everything to S3" set in the server config or the client config (the clearml.conf on the machine where the code actually runs)?
Where the model will be saved/uploaded is defined by the client and not the server.
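Concretely, it lives in the client-side clearml.conf. A sketch of the relevant fragment (the bucket path is a placeholder; check the configuration reference for your version):

```
# client-side clearml.conf (placeholder bucket path):
sdk {
  development {
    # default destination for uploaded models/artifacts
    default_output_uri: "s3://my-bucket/clearml"
  }
}
```

The same thing can also be set per run via the output_uri argument of Task.init, which overrides the file for that task.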