Thanks JuicyFox94 for letting us know.
I'm checking what the status is with it
but not as a component (using the decorator)
Hmm yes, I think a component calling another component as an external component is not supported yet
(basically the difference is whether it is actually running as a function, or running on a different machine as another pipeline component)
I noticed that when a pipeline step returns an instance of a class, it tries to pickle.
Yes, this is how the serialization works when we pass data from one node to another (by design it supports multiple mach...
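As a quick sanity check (a minimal, non-ClearML-specific sketch; the class here is just a placeholder), you can verify that whatever a step returns survives a pickle round-trip:
```
import pickle


class Prediction:
    """Hypothetical return type for a pipeline step."""
    def __init__(self, label, score):
        self.label = label
        self.score = score


# Objects passed between pipeline nodes are serialized (pickled),
# so a quick round-trip check catches non-picklable members early.
obj = Prediction(label="cat", score=0.92)
restored = pickle.loads(pickle.dumps(obj))
assert restored.label == obj.label and restored.score == obj.score
```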
How can I find the queue name?
You can create as many as you like. The default one is called "default", but you can add new queues in the UI (go to the Workers & Queues page, then Queues, and click "+ New Queue").
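If you prefer listing the existing queues from code rather than the UI, something along these lines should work (a rough sketch using the APIClient; the exact fields on the returned objects may differ slightly):
```
from clearml.backend_api.session.client import APIClient

# List all queues registered on the server and print their names
client = APIClient()
for queue in client.queues.get_all():
    print(queue.name, queue.id)
```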
RipeWhale0 are you taking them from here?
https://artifacthub.io/packages/helm/allegroai/clearml
True, this is exactly the reason. That said, you can always add it manually. You can see the default values here: https://github.com/allegroai/trains-agent/blob/master/docs/trains.conf
Agent works when I am running it from a virtual environment, but it gets stuck in the same place every time when I use Docker
Can you please provide a log? I'm not sure what "stuck" means here
First I would check the CLI command, it will basically prefill it for you:
https://clear.ml/docs/latest/docs/apps/clearml_task
Specifically to your question, the working directory "." is the root of the git repo
But I would avoid adding it manually; use the CLI, it will either ask you to provide the info or take the git repo details from the local copy
Hi SubstantialElk6
Generally speaking, the idea is that actual code creates a Dataset (i.e. a Dataset class created from code). You can also add some metric reporting (like table reporting) to create a preview of the stored data for better visibility, or maybe create some statistics as part of the data-ingest script. Then this ingest code can be relaunched / automated. The created Dataset itself can be tagged, renamed, or have key/value pairs added for better cataloging. wdyt?
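For example, a minimal ingest sketch along these lines (project/dataset names, paths and the preview CSV are placeholders):
```
import pandas as pd
from clearml import Dataset

# Create a new dataset version from code and add the raw files
dataset = Dataset.create(dataset_name="customer-data", dataset_project="data-catalog")
dataset.add_files(path="./data/raw")

# Attach a small statistics table as a preview for better visibility in the UI
preview = pd.read_csv("./data/raw/sample.csv").describe()
dataset.get_logger().report_table(
    title="Preview", series="statistics", iteration=0, table_plot=preview
)

dataset.upload()
dataset.finalize()
```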
Hi DilapidatedDucks58 ,
I'm not aware of anything of this nature, but I'd like to get a bit more information so we could check it.
Could you send the web-server logs? Either from the docker or the browser itself.
Basically it is the same as "report_scatter2d"
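For reference, a small sketch of report_scatter2d (the title/series/axis names are placeholders):
```
import numpy as np
from clearml import Task

task = Task.init(project_name="examples", task_name="scatter demo")

# scatter is an Nx2 array of (x, y) points
scatter = np.random.rand(50, 2)
task.get_logger().report_scatter2d(
    title="example_scatter",
    series="series_a",
    iteration=0,
    scatter=scatter,
    xaxis="x",
    yaxis="y",
    mode="markers",
)
```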
Shouldn't this be a real value and not a template
You mean the value being pulled to the pod that failed?
Hmm should be pushed later today, meanwhile:
```
from clearml import Task
from clearml.automation.trigger import TriggerScheduler


def func(*args, **kwargs):
    print('test', args, kwargs)


if __name__ == '__main__':
    s = TriggerScheduler(pooling_frequency_minutes=1.0)
    s.add_model_trigger(
        name='trigger 1', schedule_function=func,
        trigger_project='examples', trigger_on_tags=['deploy']
    )
    s.add_model_trigger(
        name='trigger 2',
        schedule_task_id='3f7...
```
GiganticTurtle0 My apologies, I made a mistake, this will not work 😞
In the example above "step_two" is executed "instantaneously", meaning it is just launching the remote task; it is not actually waiting for it.
This means an exception will not be raised in the "correct" context (actually it will be raised in a background thread).
That means I think we have to have a callback function, otherwise there is no actual way to catch the failed pipeline task.
Maybe the only re...
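For illustration only, a rough sketch of what a per-step callback could look like with PipelineController, assuming a recent clearml version where add_function_step accepts post_execute_callback (step name, queue and step body here are placeholders, not the final recommended pattern):
```
from clearml import PipelineController


def step_two():
    # placeholder body for the remote step
    return 42


def on_step_done(pipeline, node):
    # post_execute_callback runs once the step's remote task has finished,
    # so a failed step can be detected and handled here
    if node.job and node.job.is_failed():
        print(f"Step '{node.name}' failed (task id: {node.executed})")


pipe = PipelineController(name="pipeline demo", project="examples", version="0.0.1")
pipe.add_function_step(
    name="step_two",
    function=step_two,
    execution_queue="default",
    post_execute_callback=on_step_done,
)
pipe.start()
```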
Hmm, so the concept of "company"-wide configuration is supported in the enterprise version.
I'm trying to think of a "hack" to just pass these env/conf ...
How are you spinning up the agent machines?
There was an issue in some versions where seaborn plots were blank. Is that the case?
They could; the problem is that by the time you set them, they have already been read into the variables.
Maybe we should make it lazily loaded, it will also speed up the import.
Hi @<1687653458951278592:profile|StrangeStork48>
- Agreed,
- Notice this user/pass is only used for the initial authentication, after that all authentication is done via a signed JWT token.

How about a GitHub issue with the feature request? If there is enough interest (or someone jumps in offering an implementation) we can push it forward. What do you think?
Hi JitteryCoyote63 a few implementation details on the services-mode, because I'm not certain I understand the issue.
The docker-agent (running in services mode) will pick a Task from the services queue, then it will set up the docker for it, spin it up, and make sure the Task starts running inside the docker (once it is running inside the docker you will see the service Task registered as an additional node in the system, until the Task ends). Once that happens, the trains-agent will try to fetch the...
Is there code that can reproduce it?
Hi MinuteGiraffe30
Thank you so much for your awesome product!
😍 !
s address 10.68.167.10. I am able to send requests from all other virtual machines on the server to the address 10.68.167.10:8008. However, when I try to do this from my own computer connected to the corporate network via VPN, it fails to connect to 8008.
I'm assuming there is a firewall on the VPN connection itself (i.e. the VPN gateway) that blocks 8008 port, as you already tried curl to 8008 is...
No -- that section is blank,
This is the main issue, it should be filled with the requirement being auto detected.
The entire script was executed from within VS Code, and the Task was created but it was not prefilled with anything?
Just making sure, you called Task.init inside your code?
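i.e. something along these lines near the top of the script (the project/task names are just examples):
```
from clearml import Task

# Task.init should be called at the start of the script so the packages
# and git repo details can be auto-detected and logged
task = Task.init(project_name="examples", task_name="my experiment")
```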
FrothyShark37 what was different in your script ?
Hi CleanPigeon16
Put the specific git into the "installed packages" section
It should look like:... git+
...
(No need for the specific commit, you can just take the latest)
TrickyRaccoon92 the title provided by write.scalars also serves as the identifying string for the specific metric; this is more than just a title on the plot itself.
It means that this will be the name of the scalar metric (title/series combination).
Is that your intention, or is it for viewing purposes only?
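In ClearML terms it ends up as the title/series of a scalar report, e.g. (the metric and series names here are placeholders):
```
from clearml import Task

task = Task.init(project_name="examples", task_name="scalar naming demo")

# "loss" is the metric title and "train" is the series; together they
# identify the scalar in the UI, not just the heading of the plot
task.get_logger().report_scalar(title="loss", series="train", value=0.42, iteration=1)
```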
Hi DeliciousKoala34
I am using PyCharm and I have set up the ClearML plugin, but it still doesn't work.
Did you provide the key/secret to the plugin? I think this is a must for it to actually work