ObedientDolphin41
Moderator
4 Questions, 37 Answers
  Active since 10 January 2023
  Last activity 10 months ago

Reputation: 0
Badges: 1 (27 × Eureka!)
0 Votes
8 Answers
609 Views
Hey all, quick question about pipeline execution queues. I set the execution_queue='queuename' on my individual pipeline components using the decorator, and ...
one year ago
0 Votes
7 Answers
475 Views
10 months ago
0 Votes
20 Answers
557 Views
Hello all, thanks for this really cool software and community! I have a question on importing local modules when using the Pipelines from Decorators. Startin...
one year ago
0 Votes
10 Answers
685 Views
Hi all! Question about pipelines using decorators. The first step of my pipeline uses a specific repo, specified using PipelineDecorator.component(repo='repo...
one year ago
0 Hey All, Quick Question About Pipeline Execution Queues. I Set The

Oh yup, that seems very possible since I run it with the run_locally() and then clone this task in the UI

one year ago
0 Hey All, Quick Question About Pipeline Execution Queues. I Set The

This is run by using the UI’s ‘Run’ button without the ‘Advanced configuration’

one year ago
0 Hey All, Quick Question About Pipeline Execution Queues. I Set The

Hi AgitatedDove14
My bad, I see I worded my question wrong: I meant the tasks of the pipeline’s components. (it shows that I’m a newbie 😅 )
This does make perfect sense though! The problem seems to be that the components themselves are run on the same queue as the pipeline logic, even though I configured it differently

one year ago
0 Hey All, Quick Question About Pipeline Execution Queues. I Set The

This workflow, however, is the only way I have found to easily fix my previous ‘Module not found’ errors

one year ago
0 Hi Everyone. I Have A Problem With Pipeline. I Use

Hmm that sounds okay to me, could you send the clearml log with the ‘No module named ..’ error?

one year ago
0 Hey All, Quick Question About Pipeline Execution Queues. I Set The

Not yet, working on running the autoscaler for now, and picking this up again later 🙂

one year ago
0 Hi Everyone. I Have A Problem With Pipeline. I Use

Additionally, I have found it helpful to take a look into the agent’s working directory. The Python error should point at the location of the script, and browsing that directory may tell you a bit more

one year ago
0 Hi Everyone. I Have A Problem With Pipeline. I Use

It seems like the actual import statement worked, since there is no ‘ImportError: no module named x’

one year ago
0 Hi Everyone. I Have A Problem With Pipeline. I Use

Just to be sure, you could download the repo and put this script in the root, and use the PipelineDecorator.debug_pipeline() option to run it locally and see if the code works like you wanted 🙂

one year ago
0 Hello All, Thanks For This Really Cool Software And Community! I Have A Question On

Thanks to both! Unfortunately the same error occurs with the following code snippet. (Jean, instead of the component parameter you mean packages, right? I could not find the former 🙂 )

@PipelineDecorator.component(..., packages=['someutils'])
def step_one():
    from someutils import someutilfunc
    someutilfunc(32)

one year ago
0 Hello All, Thanks For This Really Cool Software And Community! I Have A Question On

It seems to be working now, by running the pipeline locally with PipelineDecorator.run_locally() and launching the script using the following command:
PYTHONPATH="fill_in_your_current_dir" python pipeline.py
Cloning this in the UI and enqueueing now also allows remote execution.
Calling the script without PipelineDecorator.run_locally() (i.e. running the pipeline remotely) still gives the ModuleNotFoundError: No module named
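For what it’s worth, the PYTHONPATH fix makes sense in isolation: Python only finds a local module when that module’s parent directory is on sys.path, and PYTHONPATH seeds sys.path at interpreter startup. A small self-contained sketch (someutils here is just a stand-in name for the local module in this thread):

```python
import os
import sys
import tempfile

# Create a throwaway directory containing a local module, standing in for
# the repo folder that holds the pipeline's helper code.
workdir = tempfile.mkdtemp()
with open(os.path.join(workdir, 'someutils.py'), 'w') as f:
    f.write("def someutilfunc(x):\n    return x * 2\n")

# PYTHONPATH="some_dir" does exactly this at interpreter startup:
sys.path.insert(0, workdir)

# Now the import that previously raised ModuleNotFoundError succeeds.
from someutils import someutilfunc
print(someutilfunc(32))  # prints 64
```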

one year ago
0 Hello All, Thanks For This Really Cool Software And Community! I Have A Question On

Just checked and it’s not there, even for the successfully remotely-run pipeline. Do note that the needed module is just a local folder with scripts. The differences between the successful pipeline (run locally and cloned in the UI) and the errored pipeline (run remotely) are also very hard for me to spot; they have the exact same Installed Packages and execution details

one year ago
0 Hello All, Thanks For This Really Cool Software And Community! I Have A Question On

Thanks a lot! I’m still in the process of setting up, so running on a remote worker has not been successful yet, but I’ll report back on this issue if that fixes it!

one year ago
0 Hello All, Thanks For This Really Cool Software And Community! I Have A Question On

In any case, I’m happy it’s fully running now 🙂

one year ago
0 Hello All, Thanks For This Really Cool Software And Community! I Have A Question On

Happens to all! Importing local packages in these decorated pipelines hasn’t really worked yet (except when running via PyCharm, which seems to make sure that the location of the original code is always on the path)

one year ago
0 Hello All, Thanks For This Really Cool Software And Community! I Have A Question On

Yes, also present in the git repo (hosted on gitlab and seemed to correctly retrieve it, couldn’t find any errors about this in the logs)

one year ago
0 Hi All! Question About Pipelines Using Decorators. The First Step Of My Pipeline Uses A Specific Repo, Specified Using

Hi Mark! Do you set any of the decorator parameters using variables? That was my issue; instead of using Python variables, I hardcoded one potential value, and then used the get and set methods to change them when cloning programmatically, which should be the same as changing them in the configuration tab when cloning with the UI. Hope this helps 🙂

one year ago
0 Hi All! Question About Pipelines Using Decorators. The First Step Of My Pipeline Uses A Specific Repo, Specified Using

Would this then be possible by cloning the task (which is a pipeline) and accessing the right subtask (the component which should be changed)?

one year ago
0 Hi All! Question About Pipelines Using Decorators. The First Step Of My Pipeline Uses A Specific Repo, Specified Using

I think I got it! I found that the branch for the component is specified in the UI in the component’s configuration object under the pipeline’s configuration tab. In theory I should be able to clone the pipeline task, use the get_configuration_object method, change the branch, set it using set_configuration_object, and finally enqueue! Going to test this out
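That clone-and-edit flow could be sketched roughly as below. The task ID, component name, branch names, and queue are all placeholders, and the clearml calls (Task.clone, get_configuration_object, set_configuration_object, Task.enqueue) need a configured ClearML installation and a reachable server to actually run, so treat this as an untested sketch:

```python
def switch_branch(config_text, old_branch, new_branch):
    # Pure text edit on the component's configuration object content
    return config_text.replace(old_branch, new_branch)

def clone_and_retarget(pipeline_task_id, component_name, old_branch, new_branch, queue):
    # Hypothetical flow: clone the pipeline controller task, rewrite the
    # branch inside the component's configuration object, and enqueue.
    from clearml import Task
    cloned = Task.clone(source_task=pipeline_task_id, name='pipeline on new branch')
    cfg = cloned.get_configuration_object(component_name)
    cloned.set_configuration_object(component_name,
                                    switch_branch(cfg, old_branch, new_branch))
    Task.enqueue(cloned, queue_name=queue)
    return cloned
```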

one year ago
0 Hi All! Question About Pipelines Using Decorators. The First Step Of My Pipeline Uses A Specific Repo, Specified Using

I’m also not sure but it seems like the slack trial renews from time to time in this workspace, which eventually gives access to those older threads

one year ago
0 Hi All! I Tried Out Clearml And Loved It And Wanted To Get My Own Instance Up And Running To Play Around With It But I Am Having Some Issues. Hopefully, It's Just Me Being Stupid And Someone Can Help. I Followed The Docs And Used The Image To Install It O

Have you opened ports 8080, 8008, and 8081? I think I had the same thing when setting up, and still had to add some inbound rules to open these ports via the cloud platform

one year ago
0 After Presenting Clearml To My Team, I Got The Question "We're Already On Aws, Why Not Use Sagemaker?" Tbh, I've Never Gone Through The Ml Workflow With Sagemaker. The Only Advantage I Could Think Of Is That We Can Use Our On-Prem Machines For Training,

I’m curious what the opinions are on this! I asked myself the same question. In my limited experience, going through a workflow with SageMaker was a painful process, and one that required a ton of AWS-specific code and configuration. Compared to this, ClearML was easy and quick to set up, and provides a dashboard where everything from experiments to models to output is organised, queryable and comparable. Way less hassle for way more benefits.

11 months ago
0 Hi, Is There A Way To Clone An Experiment That Uses Two Distinct Repositories?

Personally I’ve found this (sort-of hacky) approach to work: pass your git credentials as environment variables to the agent’s docker container and clone the repo in the code. You’ll have to make sure you have the right packages installed though.

import os
from subprocess import call

if 'GIT_USER' in os.environ:
    git_name, git_pass = os.environ['GIT_USER'], os.environ['GIT_PASS']
    call(f'git clone https://{git_name}:{git_pass}@gitlab.com/myuser/myrepo', shell=True)
    global myrepo
    from myrepo import func
elif local_re...

one year ago
0 Hey There! Setting Up Clearml On A New Coworker's Windows Laptop And Running Into Issues. Here Is The Stacktrace When Running A Test Script, Which Simply Initiates A Task. The Clearml.Conf Only Consists Of The Api {} Code Snippet That Is Given When Adding

Hi Jake! The clearml.conf file content is exactly the api section given by our clearml server, copied using the copy button, something like:

api { 
    web_server: http:// .. :8080
    api_server: http:// .. :8008
    files_server: http:// .. :8081
    credentials {
        "access_key" = "KEY"
        "secret_key"  = "SECRET"
    }
}

clearml version 1.9.0
The strange thing is that the configuration works perfectly on my machine. My coworker’s machine does have a different p...

10 months ago