The upload itself is in the background.
It should not take long to prepare the plot for sending. Are you experiencing a major delay?
Hi JitteryCoyote63,
When you shut down the task (manually with close(), or when the process finishes), it waits for the uploads...
Why do you need to specifically wait for all the artifacts upload? (currently you can stop the artifacts upload thread and wait for all the artifacts, but that seems like a bad hack)
Hi @<1801424298548662272:profile|ConvolutedOctopus27>
I am getting errors related to invalid git credentials. How do I make sure that it's using credentials from local machine?
configure the git_user/git_pass (app key) inside your clearml.conf on the machine with the agent:
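For reference, a minimal sketch of that section in clearml.conf (the values are placeholders; agent.git_user / agent.git_pass are the relevant keys):

```
agent {
    # Git credentials the agent uses when cloning repositories.
    # Use a personal access token / app key as the password, not your real password.
    git_user: "your-git-username"
    git_pass: "your-app-key-or-token"
}
```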
none of my pipeline tasks are reporting these graphs, regardless of runtime. I guess this line would also fix that?
Same issue. That said, good point, maybe with pipelines we should somehow make that a default?
PanickyMoth78 RC is out: pip install clearml==1.6.3rc1
The difference is that running the agent in daemon mode, means the "daemon" itself is a job in SLURM.
What I was saying is pulling jobs from the clearml queue and then pushing them as individual SLURM jobs, does that make sense ?
You can always specify different clearml.conf files with --config-file 🙂
EnviousStarfish54 you can use Task.set_credentials
Notice that OS environment or trains.conf will override the programmatic credentials
https://allegro.ai/docs/task.html#trains.task.Task.set_credentials
I appended python path with /code/app/flair in my base image and execute
the python path is changing since it installs a new venv into the system.
Let me check what's going on with the PYTHONPATH, because it definitely is changed when running the code (the code base root folder is added to it). Maybe we need to make sure that if you had PYTHONPATH pre-defined, we restore it.
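A stdlib-only sketch of that restore/merge behaviour (the /code/app/flair path is taken from the message above; purely illustrative):

```python
import os

# Prepend the code root to any pre-defined PYTHONPATH instead of replacing it
code_root = "/code/app/flair"
existing = os.environ.get("PYTHONPATH", "")
merged = os.pathsep.join(p for p in (code_root, existing) if p)
os.environ["PYTHONPATH"] = merged
```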
YummyMoth34
It tried to upload all events and then killed the experiment
Could you send a log?
Also, what's the trains package version?
Thanks! @<1792364603552829440:profile|TestyBeetle31> I'll pass it to the maintainers
(Just a thought, maybe we just need to combine it with Kedro-Viz?)
I'm checking the possibility of our firewall interfering between the clearml-agent machine and the local computer running the session
Maybe... the thing is, how come the session creates a Task and pushes it into the queue, but the Task itself is empty?
Hence my request for the clearml-session console log, like actual copy paste of what you have in the terminal, not the Task log from the UI
What's interesting to me (as a ClearML newbie) is it's clearly compiling that wheel using my host machine (macOS).
Hmm kind of, and kind of not.
If you take a look at the Tasks created (regardless of how they are created: pipeline, manually, etc.), you have a list of python packages required by the code, as detected at runtime (i.e. when the code was first executed on the development machine). When creating a Pipeline controller (runner), the pipeline Tasks are just lists, ...
Hmm we might need more detailed logs ...
When you say there is a lag, what exactly does that mean? If you have enough apiserver instances answering the requests, the bottleneck might be MongoDB or Elasticsearch?
BTW: I think we had a better example, I'll try to look for one
Also, don't be shy, we love questions 🙂
Interesting...
We could follow up on the .env configuration and allow clearml-task to add configuration files from the command line; this will be relatively easy to add. We could also expand the Environment support (that somewhat exists), and add the ability to read variables from .env and add them to a "hyperparameter" section named Environment. Wdyt?
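A minimal sketch of the .env side of that idea (assuming simple KEY=VALUE lines; the function name is illustrative, not an existing clearml API):

```python
def parse_dotenv(text):
    """Parse simple KEY=VALUE lines into a dict (comments and blanks skipped)."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip("'\"")
    return env
```

The resulting dict could then, for example, be attached to the task via task.connect(env_dict, name="Environment"), which is the kind of "Environment section" described above.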
And how is the endpoint registered?
We are here if you need further help 🙂
Could it be in a python at_exit event?
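For reference, Python's stdlib atexit runs handlers during normal interpreter shutdown, which is typically where flush/wait-for-upload logic hooks in (a generic sketch, not clearml's actual internals):

```python
import atexit

def flush_pending_uploads():
    # Placeholder for "wait for pending artifact uploads" logic
    print("flushing pending uploads")

# Runs on normal interpreter exit (not on os._exit or a hard kill)
atexit.register(flush_pending_uploads)
```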
Would love to just cap it at a fixed amount for a month for API calls.
Try the timeout configuration; I think this should solve all your issues, and it will be fairly easy to set for everyone
Hi LovelyHamster1 ,
you mean totally ignore the "installed packages" section and only use the requirements.txt?
This is exactly what I did here, and it is working 🙂
https://demoapp.demo.clear.ml/projects/0e919ea1cc5c499b99e1ab85004b6e97/experiments/887edef09d4549e88b829a34c87d4d5b/output/execution
The release was supposed to be out this week, got delayed by some py2 support issue, anyhow the release will be almost exactly like the latest we now have on the GitHub repo (and I'm assuming it will be out just after the weekend)
Hi CostlyElephant1
What do you mean by "delete raw data"? Data is always fetched to cached folders and clearml takes care of cache cleanup
That said, notice that the mutable copy goes to a target folder you specify; in this case you should definitely delete it after usage. Wdyt?
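For the "delete after usage" part, a generic stdlib sketch (the target here is just a temp folder for illustration; in practice it would be whatever folder you passed for the mutable copy):

```python
import shutil
import tempfile

# Treat the mutable-copy target folder as disposable: remove it when done
target = tempfile.mkdtemp(prefix="dataset_copy_")
try:
    # ... work with the files under `target` ...
    pass
finally:
    shutil.rmtree(target, ignore_errors=True)
```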
Wait I might be completely off.
Does this line "hang"?
task.execute_remotely(..., exit_process=True)