FancyOtter74
Moderator
4 Questions, 17 Answers
Active since 15 September 2023
Last activity 19 days ago

Reputation: 0
Badges: 17 × Eureka!
0 Votes 5 Answers 983 Views
My data processing scripts are run in the cloud with the help of ClearML autoscaler. The cloud doesn't (and won't) have access to Git, which is in our intern...
7 months ago
0 Votes 4 Answers 94 Views
Dataset uploading failed, but task finished successfully. As a result - dataset is in inconsistent state, where it thinks that there's a file inside, but the...
21 days ago
0 Votes 3 Answers 109 Views
I'm setting task.publish_on_completion(True) right after initializing the task, and this works as expected if I run the task locally. But when executed on a ...
one month ago
0 Votes 13 Answers 1K Views
one year ago
0 Dataset Uploading Failed, But Task Finished Successfully. As A Result - Dataset Is In Inconsistent State, Where It Thinks That There's A File Inside, But There Isn't:

my code:

    # Create a new dataset version under the project, stored in the bucket.
    dataset = Dataset.create(
        dataset_project=PROJECT_NAME,
        dataset_name=f"processed_{mode}",
        dataset_tags=task.get_tags(),
        parent_datasets=None,
        use_current_task=False,
        output_uri=BUCKET,
    )

    # Register the local files, upload them, then close the version.
    dataset.add_files(path, verbose=True)
    dataset.upload(verbose=True)
    dataset.finalize(verbose=True)
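
A hedged sketch, not the asker's code: guard finalize() so a failed upload does not leave a half-registered dataset behind in the inconsistent state described above. Dataset.delete is the standard SDK call; the rest mirrors the snippet above.

    try:
        dataset.add_files(path, verbose=True)
        dataset.upload(verbose=True)
        dataset.finalize(verbose=True)
    except Exception:
        # Drop the incomplete version so no dataset claims files it never got.
        Dataset.delete(dataset_id=dataset.id, force=True)
        raise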
21 days ago
20 days ago
0 I Am Using ClearML Free SaaS. I Have A Task "MyTask" Of Type "data_processing" In Project "MyProject" Which Uploads A Dataset In The End Of Its Execution. For Some Reason, After Uploading The Dataset, My Task Appears In UI Not Under "MyProject", But Under
from clearml import Task, Dataset

task = Task.init(
    project_name="MyProject",
    task_name="MyTask",
    task_type=Task.TaskTypes.data_processing,
    reuse_last_task_id=False,
    output_uri="
"
)

with open("new_file.txt", "w") as file:
    file.write("Hello, world!")

dataset = Dataset.create(parent_datasets=None, use_current_task=True)
dataset.add_files(".", wildcard="new_file.txt", verbose=True)
dataset.upload(verbose=True)
dataset.finalize(verbose=True)
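
A hedged workaround sketch (not from the thread), mirroring the pattern in the dataset-upload question above: give the dataset its own task by passing an explicit project and name and leaving use_current_task=False, so "MyTask" stays under "MyProject". The dataset name is illustrative.

    dataset = Dataset.create(
        dataset_project="MyProject",
        dataset_name="my_dataset",  # illustrative name
        parent_datasets=None,
        use_current_task=False,     # dataset gets a separate task
    )
    dataset.add_files(".", wildcard="new_file.txt", verbose=True)
    dataset.upload(verbose=True)
    dataset.finalize(verbose=True)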
one year ago
0 I Am Using ClearML Free SaaS. I Have A Task "MyTask" Of Type "data_processing" In Project "MyProject" Which Uploads A Dataset In The End Of Its Execution. For Some Reason, After Uploading The Dataset, My Task Appears In UI Not Under "MyProject", But Under

I did something similar at my previous job (we had open-source ClearML deployed). The problem I described here was not present there. I liked this approach; it was convenient that the dataset_id and task_id are the same.

one year ago
0 My Data Processing Scripts Are Run In The Cloud With The Help Of ClearML Autoscaler. The Cloud Doesn't (And Won't) Have Access To Git, Which Is In Our Internal Network. So I'm Left With Using
    # connect_configuration stores the file with the task and returns a path
    # to a local copy (the task's stored copy when running via an agent).
    common_module = task.connect_configuration("../common.py", "common.py")
    if not task.running_locally():
        import shutil
        # On the agent, place the stored copy where it can be imported.
        shutil.copy(common_module, "common.py")

    from common import test_common
    test_common()
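
A hedged sketch extending the same pattern to several shared modules; the module names are illustrative.

    import shutil

    for name in ("common.py", "utils.py"):  # illustrative module names
        local_copy = task.connect_configuration(f"../{name}", name=name)
        if not task.running_locally():
            # On the agent, materialize the stored copy as an importable file.
            shutil.copy(local_copy, name)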
7 months ago
7 months ago
0 My Data Processing Scripts Are Run In The Cloud With The Help Of ClearML Autoscaler. The Cloud Doesn't (And Won't) Have Access To Git, Which Is In Our Internal Network. So I'm Left With Using

It seems that connecting it as a configuration is more convenient than uploading an artifact, because artifacts are deleted when a task is cloned. The code is very simple:
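
For contrast, a hedged sketch of the artifact route this answer argues against (the artifact name is illustrative); a clone of the task would not carry the artifact along.

    # Producer: attach the module to the task as an artifact.
    task.upload_artifact("common_py", artifact_object="../common.py")

    # Consumer: fetch a local copy before importing.
    local_path = task.artifacts["common_py"].get_local_copy()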

7 months ago
0 I'm Setting

The task completes normally. I'm using ClearML's AWS autoscaler.
The task is started the following way: an Airflow job finds an older task, clones it, changes some params, and enqueues it.
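
A hedged sketch of that clone-and-enqueue flow (project, task, parameter, and queue names are illustrative):

    from clearml import Task

    # Find the older task to use as a template.
    base = Task.get_task(project_name="MyProject", task_name="MyTask")

    # Clone it, tweak a parameter, and enqueue it for the autoscaler.
    cloned = Task.clone(source_task=base, name="MyTask (rerun)")
    cloned.set_parameter("General/mode", "prod")  # illustrative parameter
    Task.enqueue(cloned, queue_name="aws_autoscaler")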

one month ago