CostlyOstrich36
Moderator
0 Questions, 3791 Answers
  Active since 10 January 2023
  Last activity one year ago
Has anybody encountered:

Hi @<1584716373181861888:profile|ResponsiveSquid49> , what optimization method are you using?

one year ago
Hello! I'm trying to figure out how to deploy a scheduled pipeline. I have a sample pipeline here

It would work from your machine as well, but the machine needs to be turned on - like an EC2 instance that is running.

one year ago
Proper way to upload artifacts

Hi GentleSwallow91 ,

  1. When using Jupyter notebooks it's best to call task.close() - it will have the same effect you're interested in
  2. If you would like to upload to the server, you need to add the output_uri parameter to your Task.init(). You can read more here - https://clear.ml/docs/latest/docs/references/sdk/task#taskinit
    You can either set it to True or provide a path to a bucket. The simplest usage would be ` Task.init(..., output_uri...
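A minimal sketch of the Task.init() usage described above (the project, task, and bucket names are placeholders, not from this thread):

```python
def init_task_with_uploads():
    # Import deferred so the sketch stays self-contained
    from clearml import Task

    task = Task.init(
        project_name="examples",            # hypothetical project name
        task_name="artifact-upload-demo",   # hypothetical task name
        # output_uri=True uses the default fileserver;
        # a bucket path like this uploads artifacts there instead
        output_uri="s3://my-bucket/artifacts",
    )
    return task
```

With output_uri set, model checkpoints and artifacts registered on the task are uploaded to that destination instead of staying on the local machine.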
2 years ago
Hi, I'm trying to upload data to ClearML in parallel. Is it impossible to use

Hi MagnificentWorm7 ,

I'm not sure I understand. You're trying to upload files to a dataset from different concurrent processes?

2 years ago
Hi, when running a ClearML agent with services mode, is there a way to limit the number of concurrent services running?

Is it the services docker that comes with the docker compose or did you run your own agent?

2 years ago
Hi all! Quick question: can clearml-agent

Hi @<1556450111259676672:profile|PlainSeaurchin97> , I think this is what you are looking for:
None

one year ago
Can I change the clearml-serving inference port? 8080 is already used for my self-hosted server.. I guess I can just change it in the docker-compose, but I find it a little weird that you are using this port if the self-hosted server web is hosted on it..

Hi ElegantCoyote26 ,

It doesn't seem that using port 8080 is mandatory, and you can simply change it when you run ClearML-Serving - e.g. docker run -v ~/clearml.conf:/root/clearml.conf -p 8085:8085

My guess is that the example uses port 8080 because usually the ClearML backend and the serving would run on different machines.

2 years ago
Before I write an issue: does someone else have the problem that with the latest clearml-server, if you go to detail view -> results -> debug samples -> change metric to anything, and then press the refresh button/wait for auto refresh, you get a blank debu

ReassuredTiger98 , I played with it myself a little bit - it looks like this happens for me when an experiment is running and reporting images, and changing the metric does the trick - i.e. it reproduces the issue. Maybe open a GitHub issue to follow this 🙂 ?

2 years ago
Hi all! I'm trying to create a dataset with output_uri pointing to Google Storage and got a weird error:

You're totally right - if you managed to upload to a bucket, then the folder failure should be unrelated to permissions

one year ago
Hey! When reviewing an experiment in ClearML, in the "Plots" tab, I want to display multiple graphs on the same row, for an experience similar to "Debug Samples". Can anyone help me configure it? Perhaps my best option is to convert the graph to a 'debug

Hi LethalCentipede31 , I don't think there is an out of the box solution for this but saving them as debug samples sounds like a good idea. You can simply report them as debug samples and that should also work 🙂

one year ago
Hi everyone, and apologies for the incredibly 'basic' question. I'm trying to deploy ClearML Server on an OVH-hosted VM. What do you think is a suitable disk size? Would 1 TB be sufficient?

Hi @<1572032783335821312:profile|DelightfulBee62> , I think 1 TB should be enough. I would suggest maybe even having 2 TB, just to be on the safe side

one year ago
My ClearML fileserver is super slow. An scp command between the same source and destination machines sends 10 MB/s, but downloading a ClearML artifact using the API transfers at 0.06 MB/s. What could be the cause? The ClearML fileserver is hosted as a Docker cont

Hi @<1523701122311655424:profile|VexedElephant56> , can you please elaborate a bit more on how you set up the server? Is it on top of a VPN? Is there a firewall? Is it a simple docker compose or on top of K8s?

one year ago
Hey everyone, I'm setting up ClearML agents and workers with the open source version within my org. I was wondering what is the best way to handle different Python version requirements for different projects?

Hi @<1535069219354316800:profile|PerplexedRaccoon19> , the agent will try to use the relevant python version according to what the experiment ran on originally. In general, it's best to run inside dockers with a docker image specified per experiment 🙂

one year ago
What is the difference between Model and InputModel?

@<1523704157695905792:profile|VivaciousBadger56> , ClearML's model repository is exactly for that purpose. You can basically use InputModel and OutputModel for handling models in relation to tasks
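A hedged sketch of how InputModel and OutputModel relate to a task (the model id, framework, and file name are placeholders, not from this thread):

```python
def attach_models(task, pretrained_model_id):
    # Imports deferred so the sketch stays self-contained
    from clearml import InputModel, OutputModel

    # Register an existing model from the repository as this task's input
    input_model = InputModel(model_id=pretrained_model_id)
    task.connect(input_model)

    # Track a newly produced checkpoint as this task's output model
    output_model = OutputModel(task=task, framework="PyTorch")
    output_model.update_weights(weights_filename="model.pt")  # hypothetical file
    return input_model, output_model
```

The idea is that InputModel records which pre-existing model an experiment consumed, while OutputModel registers the weights the experiment produced back into the model repository.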

one year ago
Also, do you guys support user authentication? In the documentation -

Hi @<1580367711848894464:profile|ApprehensiveRaven81> , I'm afraid this is the only option for the open source version. In the Scale/Enterprise licenses there are SSO/LDAP integrations

one year ago
Um, is there a way to delete an artifact from a task that is running?

I think it depends on your implementation. How are you currently implementing top X checkpoints logic?

2 years ago
I realize I'm asking many niche questions - my apologies

Then I think users.get_all would be right up your alley 🙂
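A minimal sketch of calling users.get_all through the SDK's APIClient, assuming valid credentials in clearml.conf (field names on the returned objects are as I recall them, so treat them as assumptions):

```python
def list_workspace_users():
    # Import deferred so the sketch stays self-contained
    from clearml.backend_api.session.client import APIClient

    client = APIClient()
    # users.get_all returns the users visible to the credentials in clearml.conf
    users = client.users.get_all()
    return [(u.id, u.name) for u in users]
```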

3 years ago