YummyGrasshopper29
Moderator
9 Questions, 21 Answers
  Active since 18 May 2024
  Last activity one month ago

Reputation: 0
Badges: 19 × Eureka!
0 Votes 2 Answers 398 Views
4 months ago
0 Votes 5 Answers 65 Views
Hello guys, can you tell me where the console outputs are stored? For some reason, all outputs have disappeared from all my pipelines. Any explanation, does ...
one month ago
0 Votes 8 Answers 665 Views
I tried to get data from a dataset, but the agent always looks at localhost:8081. I changed the host in clearml.conf but get the same error. How can I change the host of the clearm...
6 months ago
0 Votes 2 Answers 346 Views
Hey, I am trying to clone an existing pipeline using the ClearML API, but I didn't manage to override the pipeline parameters. The task was cloned successfully with parame...
4 months ago
0 Votes 1 Answer 226 Views
Is there any way to get the value of a step parameter/function kwarg? This is from the documentation, but I didn't manage to get the value. pipe.add_function_step( name='ste...
3 months ago
0 Votes 3 Answers 151 Views
Is there a way to have dynamic pipeline parameters and use them inside the pipeline controller? pipe.add_parameter(name="P", default="yes") print(pipe.get_parameters...
2 months ago
0 Votes 3 Answers 301 Views
Hey, I have a pipeline from code, but have a problem with caching. ClearML didn't cache already-executed steps (I tried to re-run the pipeline from the web UI). D...
3 months ago
0 Votes 0 Answers 430 Views
Here is a screenshot from the pipeline section (web UI)
6 months ago
0 Votes 9 Answers 498 Views
Hey, I am new to ClearML and need some help 🙂 I am trying to run a very simple pipeline inside a docker container. I followed the documentation, created a new q...
6 months ago
0 Hey, I Am New With Clearml And Need Some Help

Ok, thanks for the explanation. So the pipeline controller is in the Running state while task 1 is in the Pending state. Would the solution be to add one more agent?

6 months ago
0 Hey, I Am New With Clearml And Need Some Help

One more question 🙂
How can I force ClearML not to install requirements before running a task? (I already have everything installed on the docker machine)
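(For anyone searching later: one way to do this, assuming a reasonably recent clearml-agent, is to skip the agent's Python environment installation via an environment variable on the worker — verify the exact variable name against your agent version's documentation.)

```shell
# Make clearml-agent reuse the Python environment already present in the
# docker image instead of creating a venv and installing requirements.
# (Variable name assumes a recent clearml-agent -- check your version's docs.)
export CLEARML_AGENT_SKIP_PYTHON_ENV_INSTALL=1
```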

6 months ago
0 Hey, I Am New With Clearml And Need Some Help

It works! Thanks man, you saved my day!!

6 months ago
0 Hey, I Am New With Clearml And Need Some Help

Yes, there is one agent. As I said, I am able to execute a task, but I have a problem with the pipeline.

6 months ago
0 I Tried To Get Data From Dataset, But Agent Always Look On Localhost:8081. I Change Host In Clearml.Conf But Have Same Error. How Can I Change Host Of Clearml Fileserver?

I have a GCP instance with the official ClearML image.

from clearml import StorageManager, Dataset

dataset = Dataset.create(
    dataset_project="Project", dataset_name="Dataset_name"
)

files = [
    'file.csv',
    'file1.csv',
]

for file in files:
    csv_file = StorageManager.get_local_copy(remote_url=file)
    dataset.add_files(path=csv_file)


# Upload dataset to ClearML server (customizable)
dataset.upload()
# commit dataset changes
dataset.finalize()
6 months ago
0 I Tried To Get Data From Dataset, But Agent Always Look On Localhost:8081. I Change Host In Clearml.Conf But Have Same Error. How Can I Change Host Of Clearml Fileserver?

I am running the ClearML server on GCP, but I didn't expose the ports; instead I SSH into the machine and do port forwarding to localhost. The problem is that localhost on my machine is not the same as localhost inside docker on the worker. If I check the dataset, the files are stored under localhost, but actually it is not localhost. I haven't found a solution yet for how to properly set the hostname for the fileserver. Any ideas?
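(For reference, a minimal sketch of the relevant `api` section of clearml.conf — the hostname below is a placeholder; point it at an address that is reachable from inside the workers' docker containers, not at localhost. Ports 8080/8008/8081 are the ClearML server defaults.)

```
api {
    # Placeholder hostname -- replace with your server's externally
    # reachable address
    web_server: http://clearml-server.example.com:8080
    api_server: http://clearml-server.example.com:8008
    files_server: http://clearml-server.example.com:8081
}
```

Datasets registered while `files_server` pointed at localhost will keep the old URLs, so already-uploaded data may need to be re-registered after changing this.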

6 months ago
0 Hey, I Am New With Clearml And Need Some Help

Thanks for the very fast reply!

6 months ago
0 Hello Guys, Can You Tell Me Where The Console Outputs Are Stored? For Some Reason, All Outputs Have Disappeared From All My Pipelines. Any Explanation, Does Anyone Have An Idea What Might Have Happened?

Is there a possibility that it was using Elastic before (through some logging driver) but that it has now fallen back to the json.log (default) logging driver?

one month ago
0 Hello Guys, Can You Tell Me Where The Console Outputs Are Stored? For Some Reason, All Outputs Have Disappeared From All My Pipelines. Any Explanation, Does Anyone Have An Idea What Might Have Happened?

It looks like docker-compose down && docker-compose up flushes the console output. I upgraded the server to 1.16.2-502; I didn't have that problem before. Any idea?

one month ago
0 Hello Guys, Can You Tell Me Where The Console Outputs Are Stored? For Some Reason, All Outputs Have Disappeared From All My Pipelines. Any Explanation, Does Anyone Have An Idea What Might Have Happened?

I was digging around a bit; it seems that the worker containers use default logging, that is, they use json.log files stored in the /var/lib/docker/containers/<hash> folders. When I do up/down of docker compose, these container folders are purged, and with them my console log is gone.
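(For context, this is docker's default json-file driver behaviour; spelled out explicitly in a compose file it looks roughly like the fragment below. The service name is a placeholder. Note that these log files live inside the container's directory and are deleted with it on `docker-compose down`, so this does not by itself preserve logs — it only shows the mechanism being described.)

```yaml
services:
  some-service:          # placeholder service name
    logging:
      driver: json-file  # docker's default driver; files live under
      options:           # /var/lib/docker/containers/<id>/ and are
        max-size: "10m"  # removed together with the container
        max-file: "3"
```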

one month ago
0 Is There Way To Have Dynamic Pipeline Parameters And Use It Inside Pipeline Controller?

Here is a basic example:

from clearml import PipelineController

def step_function(param):
    print("Hello from function!")
    print("Param:", param)

if __name__ == '__main__':
    repo = ''
    repo_branch = 'main'
    working_dir = 'pipelines'

    pipe = PipelineController(
        name='Test',
        project='Test',
        version='0.0.1',
        add_pipeline_tags=False,
        repo=repo,
        repo_branch=repo_branch,
        working_dir=working_dir
    )
    p...
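(A fuller sketch of the pattern being attempted — untested here, and it assumes a reachable ClearML server; the step and parameter names are from the snippets above. The key point is that a pipeline parameter is referenced inside step kwargs with the `${pipeline.<name>}` substitution syntax, which is resolved at run time rather than at controller-definition time.)

```python
from clearml import PipelineController

def step_function(param):
    # Runs as its own task; 'param' arrives already resolved
    print("Param:", param)

if __name__ == '__main__':
    pipe = PipelineController(
        name='Test',
        project='Test',
        version='0.0.1',
    )
    # Declare a pipeline-level parameter (overridable from the web UI)
    pipe.add_parameter(name='P', default='yes')

    pipe.add_function_step(
        name='step_one',
        function=step_function,
        # '${pipeline.P}' is substituted with the parameter value at run time
        function_kwargs=dict(param='${pipeline.P}'),
    )
    # Run everything in the local process for quick testing
    pipe.start_locally(run_pipeline_steps_locally=True)
```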
2 months ago
0 Is There Way To Have Dynamic Pipeline Parameters And Use It Inside Pipeline Controller?

@<1523701070390366208:profile|CostlyOstrich36> any idea? 🙂

2 months ago