AgitatedDove14
Moderator
49 Questions, 8126 Answers
Active since 10 January 2023
Last activity one year ago
Reputation: 0
Badges (1): 25 × Eureka!

0 Votes, 0 Answers, 2K Views (5 years ago): apparently everyone can ...
0 Votes, 0 Answers, 2K Views (5 years ago): Slack security ... Go figure πŸ˜‰
0 Votes, 0 Answers, 2K Views (5 years ago): Finally
0 Votes, 0 Answers, 2K Views (5 years ago): Gals, Guys & :robot_face: If you want to get some inspiration on building DL Continuous Integration pipelines, I suggest this post (obviously built on top of...
0 Votes, 1 Answer, 2K Views (5 years ago): Gals, Guys & :robot_face: , if you want to checkout the Hyper-Parameters automation (Using Bayesian Optimization Hyper-Band) We have an example on the demo s...
0 Votes, 0 Answers, 2K Views (5 years ago): YummyWhale40 awesome thanks!
0 Votes, 6 Answers, 2K Views (2 years ago): Hi! ClearML Server + SDK v1.9.0 is out! πŸŽ‰ πŸš€ 🎊 Happy Holidays and Happy New Year! ❇️ πŸŽ‡ πŸŽ„
0 Votes, 0 Answers, 2K Views (5 years ago): YummyWhale40 you are saying the example code is not working when running with the demo server? Also I think I was able to view your experiment on the demo se...
0 Votes, 0 Answers, 2K Views (3 years ago): YEY!!!! Download as CSV 🀯
0 Votes, 0 Answers, 2K Views (3 years ago): New video is out πŸ™‚ Cloud Autoscalers are awesome https://www.youtube.com/watch?v=j4XVMAaUt3E
0 Votes, 0 Answers, 2K Views (5 years ago): Is it a one time thing? or recurring?
0 Votes, 0 Answers, 2K Views (5 years ago)
0 Votes, 3 Answers, 2K Views (4 years ago): This will close it: Task.current_task().close() I think we should rename completed() because it just marks the Task as completed on the backend but does not ac...
0 Votes, 0 Answers, 2K Views (5 years ago)
0 Votes, 0 Answers, 2K Views (5 years ago): Hi Guys! I have great news, we finally fully implemented support for continuing previously trained models πŸŽ‰ Here is a quick example (this is torch, but any ...
0 Votes, 9 Answers, 2K Views (5 years ago): Hi https://github.com/allegroai/trains/releases/tag/0.15.1 / https://github.com/allegroai/trains-server/releases/tag/0.15.1 / https://github.com/allegroai/tr...
0 Votes, 2 Answers, 1K Views (5 years ago): OMG Look who just joined the PyTorch EcoSystem. Yes! it is TRAINS πŸš† πŸŽ‰ 🎈
0 Votes, 1 Answer, 1K Views (4 years ago): πŸ™ Please skip clearml python package v1.0.1 and just move on to v1.0.2 😊 apologies for the inconvenience πŸ™‚ pip install clearml==1.0.2
0 Votes, 6 Answers, 1K Views (4 years ago): Hi :robot_face: , humans! We have the new documentation site up and running πŸŽ‰ 🎊 This is still a work in progress, so we keep the previous version alive...
0 Votes, 1 Answer, 1K Views (4 years ago): πŸ™ There is no v1.0 release without a prompt v1.0.1 following it, and we are no different 😊 pip install clearml==1.0.1
0 Votes, 0 Answers, 2K Views (5 years ago)
0 Votes, 0 Answers, 2K Views (5 years ago): Lol, I wonder what the adblock rule was ;)
0 Votes, 10 Answers, 2K Views (one year ago): Happy Friday everyone! We have a new repo release we would love to get your feedback on πŸš€ πŸŽ‰ Finally easy FRACTIONAL GPU on any NVIDIA GPU 🎊 Run our nvidi...
0 Votes, 0 Answers, 2K Views (5 years ago): docs are up
0 Votes, 0 Answers, 2K Views (5 years ago): I would guess connectivity issues, the TLS is probably python inaccurate response (I mean in a way, it is also a TLS error, but I would imagine this has more...
0 Votes, 2 Answers, 2K Views (5 years ago): Hi! trains 0.16.2 is finally out with the new pipelines interface! Check out the new example https://github.com/allegroai/trains/blob/master/examples/pipeli...
0 Votes, 0 Answers, 2K Views (5 years ago)
0 Votes, 3 Answers, 2K Views (3 years ago): @<1523703325881536512:profile|ConvolutedSealion94> these are xgboost internal metrics that are automatically picked up by clearml
0 Votes, 4 Answers, 930 Views (10 months ago): Happy new year everyone! πŸ₯‚ πŸŽ† Last minute 🎁 v2.0 is now out, with a new UI design! now finally supporting light & dark mode 🀩 Lots more to come this year...
0 Votes, 0 Answers, 2K Views (5 years ago): Hello Everyone!
0 And One More Question. How Can I Get Loaded Model In Preporcess Class In Clearml Serving?

ohh AbruptHedgehog21 if this is the case, why don't you store the model with torch.jit.save and use Triton to run the model ?
See example:
https://github.com/allegroai/clearml-serving/tree/main/examples/pytorch
(BTW: if you want fully custom model serving, in that case you would need to add torch to the list of python packages)
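As a minimal sketch of that export (the model, input shape, and file name below are illustrative, not taken from the thread):

import torch
import torch.nn as nn

# Placeholder model; substitute your own trained module.
model = nn.Sequential(nn.Linear(16, 8), nn.ReLU(), nn.Linear(8, 2)).eval()

# Trace with a representative input and save as TorchScript,
# which Triton's PyTorch backend can load.
example_input = torch.randn(1, 16)
scripted = torch.jit.trace(model, example_input)
torch.jit.save(scripted, "model.pt")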

3 years ago
0 And One More Question. How Can I Get Loaded Model In Preporcess Class In Clearml Serving?

we will try to use Triton, but it’s a bit hard with transformer model.

Yes ...

All extra packages we add in serving)

So it should work. You can also run your preprocess class manually from your own machine (for debugging): if you pass it a local file (basically the model file downloaded from the UI), it should work.

it. But it’s maybe not the best solution

Yes... it is not. Separating the pre/post to a CPU instance and letting Triton do the GPU serving is a lot more effici...
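A rough sketch of that local debugging flow; the Preprocess class and the load()/preprocess() method names below follow the clearml-serving examples and are assumptions here, so adjust them to whatever your own preprocess.py actually defines:

# Hypothetical local run of the serving preprocess code, outside Triton/clearml-serving.
from preprocess import Preprocess  # your own preprocess.py from the serving setup

p = Preprocess()
# Point it at a local copy of the model file (e.g. the one downloaded from the UI).
p.load("/path/to/local_model_file")
# Feed it a sample request body and inspect what would be sent to the model.
sample_request = {"text": "an example input"}
model_input = p.preprocess(sample_request, state={}, collect_custom_statistics_fn=None)
print(model_input)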

3 years ago
0 Assuming I Have A

WackyRabbit7 I guess we are discussing this one on a diff thread πŸ™‚ but yes, should totally work, that's the idea

5 years ago
0 Assuming I Have A

(without having to execute it first on Machine C)

Someone somewhere has to create the definition of the environment...
The easiest way to go about it is to execute it once.
You can add the following line to your code:
task.execute_remotely(queue_name='default')
This will cause your code to stop running and enqueue itself on a specific queue.
Quite useful if you want to make sure everything works (like running a single step) and then continue on another machine.
Notice that switching between cpu...
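A minimal sketch of that pattern (project, task, and queue names are placeholders):

from clearml import Task

task = Task.init(project_name="examples", task_name="remote execution demo")

# Everything above this line runs locally and captures the environment;
# this call stops the local run and enqueues the task on the 'default' queue.
task.execute_remotely(queue_name='default')

# From here on, the code only runs on the agent that pulled the task.
print("running on the agent")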

5 years ago
0 And One More Question. How Can I Get Loaded Model In Preporcess Class In Clearml Serving?

How can i get loaded model in Preporcess class in ClearML Serving?

ComfortableShark77
You mean your preprocess class needs a python package or is it your own module ?

3 years ago
0 I Am Trying To Plot Values That Are Either 0 Or 1 (With Tensorboardx.Add_Scalar). However, It Doesn'T Show Correctly. Any Idea Why? (Smoothing Is 0)

So currently there is a limit (from Elasticsearch) of about 10k (anything above that is subsampled).
In the new version we are adding a "maximize" button; then, in the full screen, you will have the raw data including all ???k samples. Sounds good?

4 years ago
0 Hi Everyone, Is There A Way To Avoid The Environment Setup When Running A Task Using A Worker? I Am Currently Using A Custom Docker Image That Already Has All The Require Packages Installed. I Tried Setting The Env Var

SteepDeer88
Try the following:
Task.add_requirements("pycocotools-windows", '; platform_system == "Windows"')
Task.add_requirements("pycocotools", '; platform_system != "Windows"')
Task.init(...)

You should see in your "installed packages" something like:

pycocotools-windows ; platform_system == "Windows"
pycocotools ; platform_system != "Windows"

3 years ago
0 I Uncommented The Line

HurtWoodpecker30

The agent uses the requirements.txt

What do you mean by that? Aren't the packages listed in the "Installed packages" section of the Task?
(or is it empty when starting, i.e. it uses the requirements.txt from the github, and then the agent lists them back into the Task)

3 years ago
0 Downloading Output Artifacts From S3 By Clicking On The Download Button Next To Model Url Was Great, But Since We Moved From Aws To Yandex.Cloud, This Feature Doesn'T Work. Any Chance You Could Support Other Cloud Providers?

I wonder if this hack would work
Assume you upload an artifact/model to 's3://storage.yandexcloud.net:443/clearml-models' (notice the port is added). Would that trigger a popup in the UI?
Also, what happens if you add the credential manually in the profile page?
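On the upload side, a sketch of pointing the SDK at that endpoint via the output_uri parameter of Task.init (project/task names are placeholders, and credentials for that host still have to be configured separately):

from clearml import Task

# Non-AWS S3 endpoint: note the explicit host:port form of the URI.
task = Task.init(
    project_name="examples",
    task_name="yandex s3 upload",
    output_uri="s3://storage.yandexcloud.net:443/clearml-models",
)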

3 years ago
0 Our Mac Users Are Having Some Issues. They Have Their Respective ~/Clearml.Conf, And Yet They Get: Clearml 1.1.5

A quick fix would be:

import dotenv
dotenv.load_dotenv('~/.env')
from clearml import Task  # now we can load it
import argparse

if __name__ == "__main__":
    # do stuff

wdyt?

3 years ago
0 Anybody Tried To Integrate Clearml With Ray Framework (

Hi LudicrousDeer3
I have to admit I cannot remember one in the wild (I might be wrong though).
What's the specific use case you had in mind ?

4 years ago
0 Any Idea Why I Get This Error In All My Agents

in the docker-compose file. Still strange...

hmm yes it is... If you have an idea on what went wrong let me know, we would love to fix it

4 years ago
0 Hello All , Good Morning ! Can You Help Better Understand The Distinction Of Cleargpt? How Is It Different From Chatgpt And What Gpt Model Are We Using In Clearml ? Thank You In Advance !

That is correct. Unfortunately, though, this is not part of the open source offering, which means that with the open source version it might be a bit more hands-on to deploy an LLM model.

2 years ago
0 Another Issue Is The Agent Uses Python 2 For Some Reason Even Though Locally I’M Using Python 3 And The Agent Is Supposed To Use A Python 3 Venv.

If this doesn't help:
Go to your ~/clearml.conf file; at the bottom of the file you can add agent.python_binary and set it to the location of python3.6 (you can run which python3.6 to get the full path):
agent.python_binary: /full/path/to/python3.6

4 years ago
0 Clearml (Remote Execution) Sometimes Doesn'T "Pick-Up" Gpu. After I Rerun The Task It Picks It Up. Seems Random, Doesn'T Happen Too Often (Maybe Once In 30-40 Times) And I Cannot Seem To Detect Any Pattern. Did Anyone Else Notice This? Agents Are Vms On G

Is there an easy way to add a docker argument in the python script?

On the task itself in the UI you can edit the docker arguments and add any missing flags
(task.set_base_docker will do the same from code).
You can also edit the configuration to always add this flag.
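A small sketch of the code route, assuming a reasonably recent clearml SDK (the image and flag are just examples):

from clearml import Task

task = Task.init(project_name="examples", task_name="docker args demo")

# Ask the agent to run this task in a specific image with extra docker flags.
task.set_base_docker(
    docker_image="nvidia/cuda:11.8.0-runtime-ubuntu22.04",
    docker_arguments="--gpus all",
)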

one year ago
0 Question About Pipeline And Long-Waiting Tasks: Say I Want To Generate A Dataset. The Workflow I Have Requires

My main issue with this approach is that it breaks the workflow into β€œa-sync” set of tasks:

This is kind of the way you depicted it, meaning there is an initial dataset, an "offline process" (i.e. external labeling), and then an ingest process.

I was wondering if the β€œwaiting” operator can actually be a part of the pipeline.
This way it will look more clear what is the workflow we are executing.

Hmm, so the pipeline is "aborted", then the trigger relaunches the pipeline, and the pipeli...

3 years ago
0 Is There A Way To Limit The Number Of Jobs/Tasks That Can Run Concurrently On The

Hi ElegantCoyote26 , in theory there is no limit, but that depends on how you spun up the services queue agent:
https://clear.ml/docs/latest/docs/clearml_agent/clearml_agent_daemon
See services mode:

To limit the number of simultaneous tasks run in services mode, pass the maximum number immediately after the --services-mode option (e.g. --services-mode 5).
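For example (the queue name and flags here are illustrative, not taken from this thread), starting the agent with clearml-agent daemon --queue services --services-mode 5 --docker would cap it at 5 concurrent service tasks.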

3 years ago
0 If I Do 

Hi ElegantCoyote26
Try:
task = Task.create(....)
task.output_uri = " ..."

3 years ago
0 Hi, I'M Having Trouble Using Task.Clone And Task.Create- I'M Running Two Experiments One After The Other, And I Would Like To Report The Second Experiment To A New Task (New Experiment On The Server) But It Doesn'T Work. The Flow Is Task.Init -> Experimen

Hi HappyLion37
It seems that you are "reusing" the Tasks, which means the second time you open them you are essentially resetting the old run and starting all over.
Try to do:
task1 = Task.init('examples', 'step one', reuse_last_task_id=False)
print('do stuff')
task1.close()
task2 = Task.init('examples', 'step two', reuse_last_task_id=False)
print('do some more stuff')
task2.close()

5 years ago
0 Hi! I'M Using Func

ExcitedSeaurchin87 I took a quick look, dude this is awesome!!! Thank you 🀩

3 years ago