Python3.8 I can quickly check, give me a minute
JitteryCoyote63 good news
not trains-server error, but trains validation error, this is easily fixed and deployed
Hi @<1523701601770934272:profile|GiganticMole91>
Do you mean something like a git ops triggered by PR / tag etc ?
Hi @<1541954607595393024:profile|BattyCrocodile47>
Can you help me make the case for ClearML pipelines/tasks vs Metaflow?
Based on my understanding
- Metaflow cannot have custom containers per step (at least I could not find where to push them)
- DAG only execution. I.e. you cannot have logic driven flows
- cannot connect git repositories to different component in the pipeline
- Visualization of results / artifacts is rather limited
- Only Kubernetes is supported as underlying prov...
Hmmm maybe
I thought that was expected behavior from the poetry side actually
I think this is the expected behavior, hence not a bug?!
Just call Task.init before you create the subprocess, that's it 🙂 they will all automatically log to the same Task. You can also call Task.init again from within the subprocess, it will not create a new experiment but reuse the main process experiment.
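A minimal sketch of the pattern described above (assuming clearml is installed and configured; the project and task names are placeholders):

```python
from multiprocessing import Process

from clearml import Task


def worker(idx):
    # Calling Task.init inside the subprocess does NOT create a new
    # experiment; it reuses the main process's Task, so all logging
    # ends up in the same experiment.
    task = Task.init(project_name="examples", task_name="subprocess-demo")
    task.get_logger().report_scalar("worker", "idx", value=idx, iteration=0)


if __name__ == "__main__":
    # Create the Task first, then spawn the subprocesses.
    Task.init(project_name="examples", task_name="subprocess-demo")
    procs = [Process(target=worker, args=(i,)) for i in range(2)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```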
Hmm that should have worked ...
I'm assuming the Task itself is running on a remote agent, correct ?
Can you see the changes in the OmegaConf section ?
what happens when you pass --args overrides="['dataset.path=abcd']" ?
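For context, a hedged sketch of how such a Hydra override might be passed on the command line (the project, task name, and script are placeholders):

```shell
# Hypothetical invocation: pass a Hydra override through --args
clearml-task --project examples --name hydra-demo \
    --script train.py \
    --args overrides="['dataset.path=abcd']"
```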
Maybe different API version...
What's the trains-server version?
What I'm trying to do is to filter is between two datetimes...Is that possible?
could you expand ?
RC should be out later today (I hope), this will already be there, I'll ping here when it is out
JitteryCoyote63 you mean from code?
DeliciousSeal67
are we talking about the agent failing to install the package ?
basically use the template 🙂 we will deprecate the override option soon
I see them running reliably (not killed), are they running in service mode?
How do you deploy agents, with the clearml k8s glue ?
Actually it hasn't changed ...
You need trains-server support, so if trains v0.15 is working with an older backend it will revert to the "training" type
WackyRabbit7 this is funny, it is not ClearML providing this offering
some generic company grabbed the open-source version and put it there, which they should not 🙂
JitteryCoyote63 There is a basic elastic license that should always be there. If for some reason it was deleted/expired then the following command should fix it:
curl -XPOST ' http://localhost:9200/_xpack/license/start_basic '
Closing the dataset doesn't work: dataset.close() raises AttributeError: 'Dataset' object has no attribute 'close'
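For reference, Dataset indeed has no close() method; a dataset version is typically closed with finalize(). A minimal sketch (names and paths are placeholders):

```python
from clearml import Dataset

# Create a new dataset version, add files, then finalize it.
# finalize() closes the version; there is no Dataset.close().
ds = Dataset.create(dataset_name="my-dataset", dataset_project="examples")
ds.add_files(path="data/")
ds.upload()
ds.finalize()
```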
Hi @<1523714677488488448:profile|NastyOtter17> could you send the full exception ?
Hi @<1600661423610925056:profile|StrongMouse81>
using the serving base URL and also another model endpoint we added using:
clearml-serving model add
we get the attached response:
And other model endpoints are working for you?
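For reference, a hedged sketch of registering a model endpoint with the serving CLI (the service ID, model ID, engine, and endpoint name are placeholders):

```shell
# Hypothetical example: add a model endpoint to an existing serving service
clearml-serving --id <service-id> model add \
    --engine sklearn \
    --endpoint "my_model" \
    --model-id <model-id>
```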
Hi MortifiedCrow63
Sorry, getting GS credentials is taking longer than expected 🙂
Nonetheless it should not be an issue (model upload is essentially using the same StorageManager internally)
feature value distribution over time
You mean how to create this chart?
SharpDove45 FYI:
if you set the environment variable CLEARML_NO_DEFAULT_SERVER=1, it will make sure never to default to the demo server
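In practice that could look like this (shell sketch; the training script name is a placeholder):

```shell
# Prevent the SDK from falling back to the demo server
export CLEARML_NO_DEFAULT_SERVER=1
# then run your script, e.g.:  python train.py
```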
at the end of the manual execution
currently I'm doing it by fetching the latest dataset, incrementing the version and creating a new dataset version
This seems like a very good approach, how would you improve ?
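A hedged sketch of that flow (assuming clearml is installed and configured; project, dataset names, and paths are placeholders):

```python
from clearml import Dataset

# Fetch the latest finalized dataset version
latest = Dataset.get(
    dataset_project="examples",
    dataset_name="my-dataset",
    only_completed=True,
)

# Create the new version as a child of the latest one
new_version = Dataset.create(
    dataset_name="my-dataset",
    dataset_project="examples",
    parent_datasets=[latest.id],
)
new_version.add_files(path="new_data/")
new_version.upload()
new_version.finalize()
```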
I guess I would need to put this in the extra_vm_bash_script param of the auto-scaler, but it will reboot in a loop, right? Isn't there an easier way to achieve that?
You can edit the extra_vm_bash_script, which means the next time the instance is booted, the bash script will be executed.
In the meantime, you can ssh to the running instance and change the ulimit manually, wdyt?
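For example, the extra_vm_bash_script could raise the limit at boot (a sketch; the exact limit value is an arbitrary assumption):

```shell
# Example extra_vm_bash_script content: raise the open-file limit
# for the session before the agent starts (65535 is an arbitrary choice)
ulimit -n 65535
```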
agentservice...
Not related, the agent-services job is to run control jobs, such as pipelines and HPO control processes.