Ok, thanks! Going to try this now. I included an entry point, based on some other messages I read here on Slack, when trying to figure out how to use Docker for running remotely.
Yes, it does. And I can clone this repo and branch to the server outside of ClearML just using  git clone ... , as I've added SSH keys and authenticated.
I found this issue, but I'm not sure if it's the same thing as it's from a while back:  None . Also, the link I'm trying to clone doesn't start with  https  but with  git@bitbucket .
I just cleared the cache, but still getting the error:
Python executable with version '3.10' requested by the Task, not found in path, using '/usr/local/bin/python3' (v3.9.9) instead
created virtual environment CPython3.9.9.final.0-64 in 661ms
  creator CPython3Posix(dest=/root/.clearml/venvs-builds/3.9, clear=False, no_vcs_ignore=False, global=True)
  seeder FromAppData(download=False, pip=bundle, setuptools=bundle, wheel=bundle, via=copy, app_data_dir=/root/.local/share/virtualenv)
 ...
For reference for anyone else, this is what I did:
from clearml import Task

# get the currently running task and build the URL of its page in the web UI
current_task = Task.current_task()
model_run_url = Task.get_task_output_log_web_page(
    task_id=current_task.id,
    project_id="abc")
Hi @<1523701205467926528:profile|AgitatedDove14> , I'm still having issues with this setup. See my latest comment here: None
I created a new queue  megan-testing  and have an agent running on my machine that I assigned to it. It works when I just use a simple task and schedule it, but when I try to run the pipeline, it says it can't find the queue.
@<1523701070390366208:profile|CostlyOstrich36> , a quick follow-up: I've been looking at the ClearML API documentation to see how to trigger a pipeline via the API. Do you use  queues  and  add_task , as specified here:  None ?
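For anyone finding this later, a rough sketch of triggering a previously run pipeline through the Python SDK, by cloning its controller task and enqueuing the clone (the task ID placeholder and the  services  queue name below are assumptions, not from the real setup):
from clearml import Task

# ID of an earlier pipeline run (the controller task), e.g. copied from the web UI
TEMPLATE_PIPELINE_ID = "<pipeline-controller-task-id>"

template = Task.get_task(task_id=TEMPLATE_PIPELINE_ID)
# clone the controller and push the clone onto a queue an agent is watching
new_run = Task.clone(source_task=template, name="triggered pipeline run")
Task.enqueue(task=new_run, queue_name="services")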
Here is an example of the pipeline code, simplified:
"""Forecasting Pipeline"""
from clearml.automation.controller import PipelineDecorator
from clearml import TaskTypes
@Pipe...
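Since the snippet above is cut off, here is a minimal self-contained sketch of a decorator-based pipeline in the same shape (the step functions, the  examples  project and the  default  queue are made-up placeholders, not the actual code):
from clearml.automation.controller import PipelineDecorator


@PipelineDecorator.component(return_values=["data"], cache=False)
def load_data():
    # step 1: executed as its own Task on an agent
    return [1, 2, 3]


@PipelineDecorator.component(return_values=["total"], cache=False)
def aggregate(data):
    # step 2: receives the output of step 1
    return sum(data)


@PipelineDecorator.pipeline(name="Forecasting Pipeline", project="examples", version="0.0.1")
def executing_pipeline():
    data = load_data()
    total = aggregate(data)
    print("total:", total)


if __name__ == "__main__":
    # steps are sent to this queue unless a component specifies its own execution queue
    PipelineDecorator.set_default_execution_queue("default")
    executing_pipeline()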
@<1523701205467926528:profile|AgitatedDove14> Hi! Could you give any feedback on the above? We are trying to figure out if/how we can run pipelines on a schedule and also trigger them with an external event.
Hi  @<1523701205467926528:profile|AgitatedDove14> , thanks for your reply. We want to use the  Scheduler  but to run a pipeline. Looking at this example here, it looks like it only works with tasks:  None
So, in my code example above, where I have  executing_pipeline  as the pipeline function created with the decorator, can this be scheduled to run with the  TaskScheduler , i.e. used as the function...
Ah, ok. So if I have two steps, this means 3 queues with 3 agents (one for the controller and one dedicated to each step), just to confirm?
I am using the pipeline ID from when I last ran the pipeline, which I got through the ClearML UI.
Thanks, @<1523701070390366208:profile|CostlyOstrich36> .
If I use  Task.current_task() , I get back a task object that looks like  <clearml.task.Task object at 0x7f7723b90a00> . How does one get the URL from this?
@<1523701070390366208:profile|CostlyOstrich36>  here is the INFO section:
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.25.8) or chardet (5.1.0) doesn't match a supported version!
  warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
Current configuration (clearml_agent v1.7.0, location: /tmp/.clearml_agent.r_86l8jq.cfg):
----------------------
agent.worker_id = megan-vm
agent.worker_name = vm
agent.force_git_ssh_protocol = true
agent.python_binary = 
agent.package_manager.type = poetry
agent...
Thanks @<1523701070390366208:profile|CostlyOstrich36> , I've figured this out and have now specified the correct Python version when building the ClearML agent.
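For reference, one way to point the agent at a specific interpreter is the  agent.python_binary  setting in  clearml.conf  (the path below is an assumption; it should be wherever Python 3.10 actually lives inside the agent's image):
agent {
    # interpreter used when the agent builds the task's virtualenv
    # (assumes python3.10 is installed at this path in the image)
    python_binary: "/usr/bin/python3.10"
}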
I've figured out that this is because I have a  config.yaml  file with secrets in it in the repository. This file is not committed to git, so when running remotely it is not present. Is the recommendation to put it in the Docker image, and do I then have to specify an entry point in the Dockerfile? Previously, I was hoping to get away with creating a Docker image with just the installed packages for the agent, not with the repository code as well. Is this not the recommended approach?
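For what it's worth, an alternative sketch to baking the file into the image is mounting it into the containers the agent starts, via  extra_docker_arguments  in  clearml.conf  (both paths below are made up for illustration):
agent {
    # mount the local secrets file read-only into every task container
    extra_docker_arguments: ["-v", "/opt/secrets/config.yaml:/app/config.yaml:ro"]
}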
Thanks, I'll have a look at the YouTube videos. I've been going over the documentation a lot and haven't found much about actually running and deploying pipelines; a lot of details seem to be missing.
Hi there, I am getting this exact same error. I have set  force_git_ssh_protocol: true  and have commented out  git_user  and  git_pass , but I'm still getting the error. Is the only way to fix this to clear the cache? And do you have to do that every time?
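For context, the settings in question live in  clearml.conf , and the cache being referred to is (I believe) the agent's VCS cache, which by default sits under  ~/.clearml/vcs-cache . A sketch of the relevant section:
agent {
    # rewrite https:// repository links to ssh (git@...) before cloning
    force_git_ssh_protocol: true
    # git_user: ""   # left unset so the ssh key is used instead
    # git_pass: ""
}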
This is  schedule.py , where  schedule_task_id  is the ID of a pipeline that has previously been run and that we now want to schedule:
from clearml import Task
from clearml.automation import TaskScheduler
if __name__ == "__main__":
    scheduler = TaskScheduler(force_create_task_name='megan-test-remote-pipeline')
    scheduler.add_task(
        name='Simple Pipeline Schedule Run Test',
        schedule_task_id='94a7e898bb3f4494a31828187710f5bd',
        queue='megan_testing',
...
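Since the file is cut off above, a full minimal version would look roughly like this (the daily 09:00 timing and the  services  queue for the scheduler itself are assumptions, not from the real script):
from clearml.automation import TaskScheduler

if __name__ == "__main__":
    scheduler = TaskScheduler(force_create_task_name='megan-test-remote-pipeline')
    scheduler.add_task(
        name='Simple Pipeline Schedule Run Test',
        schedule_task_id='94a7e898bb3f4494a31828187710f5bd',
        queue='megan_testing',
        # intended as: re-launch the task every day at 09:00
        minute=0,
        hour=9,
        day=1,
        recurring=True,
    )
    # run the scheduler itself as a Task on an agent listening to the 'services' queue
    scheduler.start_remotely(queue='services')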
Following up, I found this in the code on GitHub: None
A task ID is required though. To get it, would we run the pipeline manually (so that it shows up in the Web UI) and then just use the ID of the task, which we can get from the UI by clicking on the pipeline run info?
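For reference, instead of copying the ID from the UI, it can also be looked up programmatically; a sketch (the project and task names are placeholders, and the  .pipelines  project path is a guess at where pipeline runs are stored, so check the UI for the actual values):
from clearml import Task

# fetch the most recent run of the pipeline controller by project/name
pipeline_task = Task.get_task(
    project_name="Forecasting/.pipelines/Forecasting Pipeline",
    task_name="Forecasting Pipeline",
)
print(pipeline_task.id)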
I have an agent running and assigned to the queue  megan-testing :
Schedule testing code:
# Schedule for running the pipeline daily
from clearml import Task
from clearml.automation import TaskScheduler
def simple_function():
    print('This code is executed in a background thread, '
          'on the same machine as the TaskScheduler process')
    # add some logic here
    print('done')
if __name__ == "__main__":
    scheduler = TaskScheduler(force_create_task_name='megan-test-remote-pipeline')
    scheduler.add_task(
        name='Simple Pipeli...
Hi @<1523701070390366208:profile|CostlyOstrich36> , another quick question: can you run a pipeline on a  schedule , or are schedules only for Tasks? We are battling to figure out how to automate the pipelines.
Also, we use Dagster for orchestration.
Here is a sample from the end of the logs where it failed:
/home/megan/code/direct-relief-forecasting/forecast.py:277: FutureWarning:
The behavior of DatetimeProperties.to_pydatetime is deprecated, in a future version this will return a Series containing python datetime objects instead of an ndarray. To retain the old behavior, call `np.array` on the result
2024-07-04 12:14:04.589 | INFO     | forecast:infer:281 - Inferring for dates: ['2024-06-13']
2024-07-04 12:14:04.734 | INFO     |...
Thanks @<1523701070390366208:profile|CostlyOstrich36> , that does sound like an option. Can you point me to the documentation for this API call, as I haven't been able to find anything?
I think I found it: None
Will this work for pipelines too? They are also Tasks, right?