Current configuration (clearml_agent v1.9.3, location: /tmp/clearml.conf):
```
@PipelineDecorator.pipeline(
    name="Sub Pipeline",
    project="Pipelines",
    version="1.0",
    multi_instance_support="parallel",
)
def sub_pipeline(parameter):
    print(f"Running sub-pipeline with parameter={parameter}")
    return parameter * 2


@PipelineDecorator.pipeline(
    name="Main Pipeline",
    project="Pipelines",
    version="1.0",
)
def main_pipeline():
    refs = []
    for p in [1, 2, 3]:
        ref ...
```
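The loop body is cut off above; a minimal sketch of what it presumably does (the `ref = sub_pipeline(p)` call and the `refs.append` are my assumptions, not the poster's exact code):

```python
from clearml import PipelineDecorator

# Hypothetical completion of the truncated loop: launch the "parallel"
# sub-pipeline once per parameter and collect whatever it returns.
@PipelineDecorator.pipeline(
    name="Main Pipeline",
    project="Pipelines",
    version="1.0",
)
def main_pipeline():
    refs = []
    for p in [1, 2, 3]:
        ref = sub_pipeline(p)  # assumed call into the multi-instance sub-pipeline
        refs.append(ref)
    return refs
```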
@<1523701070390366208:profile|CostlyOstrich36> Hi,
I have a question related to ClearML’s indexing mechanism for cached datasets. We noticed that when storing the dataset cache folder on an NFS (Network File System), running the command clearml-data get triggers a cache indexing process, which takes a significant amount of time. However, if we remove the NFS cache folder, the command runs almost instantly.
Could you explain how caching works in ClearML? Specifically:
- Why does ClearML p...
@<1523701070390366208:profile|CostlyOstrich36> Fixed: it was a cache issue in NFS. However, we discovered an important detail: there were two folders in the cache, datasets and global. When we started the ClearML script, it began indexing the entire global folder, which was why the script got stuck. After mounting only the datasets folder, there was no delay anymore.
Do you know how to disable indexing? If we mount the global folder on all instances, it grows very f...
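For anyone hitting the same thing, the cache location can be pointed at a specific (fast, local) folder in clearml.conf instead of the NFS mount. A minimal sketch; the key name is taken from the default clearml.conf shipped with the SDK, while the path itself is only illustrative:

```
# clearml.conf (sketch): keep the SDK cache off the slow NFS mount.
# "default_base_dir" is the key from the default clearml.conf; the path
# below is a placeholder for your local disk.
sdk {
    storage {
        cache {
            default_base_dir: "/local/ssd/clearml_cache"
        }
    }
}
```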
There is no version_num in the dict; I think it is an optional argument, but in the UI I can see the commit id. How can I query that value from the SDK, or is there another method?
@<1523701435869433856:profile|SmugDolphin23> Thank you
@<1523701070390366208:profile|CostlyOstrich36> Hi, yes, but how can I get the commit id in the SDK? I checked task.script, but there is only the branch name.
OK, yes, I used the task's get_scripts method, without "data", and now everything works. Thanks
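For reference, a minimal sketch of reading the recorded commit back via the SDK; the "script"/"version_num" keys reflect the exported task structure and should be checked against your clearml version, and the task id is a placeholder:

```python
from clearml import Task

# Sketch: read the git info ClearML recorded for a task.
task = Task.get_task(task_id="<your-task-id>")

exported = task.export_task()            # full task definition as a dict
script_info = exported.get("script", {})
print("branch:", script_info.get("branch"))
print("commit:", script_info.get("version_num"))  # the commit hash shown in the UI
```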
@<1523701435869433856:profile|SmugDolphin23> Hi, I want to ask a related question. How can I find out the hostname of one component from another component? We have tasks running on different machines in AWS, and for the client SDK we need to know where to send the inference request. I thought about a config server, to which Triton sends pipelineID: hostname, and the client then gets that information from it by knowing the pipelineID. But maybe there is a simpler solution? Also thi...
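One simpler pattern (a sketch, not necessarily the recommended ClearML way) is to have each component publish its own hostname on its task and let the client look it up by task id; the parameter name and task id below are made up for illustration:

```python
import socket
from clearml import Task

# --- inside the component running on the serving machine ---
# Publish this machine's hostname on the component's own task.
# "General/serving_hostname" is an arbitrary parameter name for this sketch.
component_task = Task.current_task()
component_task.set_parameter("General/serving_hostname", socket.gethostname())

# --- inside the client that needs to send inference requests ---
# Look up the component task and read the parameter back.
serving_task = Task.get_task(task_id="<component-task-id>")
hostname = serving_task.get_parameters().get("General/serving_hostname")
print("send inference requests to:", hostname)
```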
- Werkzeug==2.2.3
- xdoctest==1.0.2
- xgboost @ file:///rapids/xgboost-1.7.1-cp38-cp38-linux_x86_64.whl
- yarl @ file:///rapids/yarl-1.8.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
- zict @ file:///rapids/zict-2.2.0-py2.py3-none-any.whl
- zipp==3.15.0
Environment setup completed successfully
Starting Task Execution:
2025-01-27 13:22:37
ClearML results page:
files_server: gs://path_to_bucket/projects/56898367b0b44f06a2679cd9e05b3a70/...
@<1523701070390366208:profile|CostlyOstrich36> self-hosted, and:
```
class ClearmlLogger:
    def __init__(self, task):
        self.task = task
        self.task_logger = task.get_logger()
        self.task_logger.set_default_upload_destination(' None ')
        self.writer = SummaryWriter('runs')

    def log_training(self, reconstruction_loss, learning_rate, iteration):
        self.task.get_logger().report_scalar(
            ...
```
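A minimal sketch of how a logger like this would be wired up (the project/task names and scalar values are placeholders, not from the original snippet; the report_scalar body above is still truncated):

```python
from clearml import Task
from torch.utils.tensorboard import SummaryWriter  # needed for SummaryWriter('runs') above

# Illustrative wiring only: project/task names are placeholders.
task = Task.init(project_name="Examples", task_name="reconstruction-training")
clearml_logger = ClearmlLogger(task)

# report_scalar(title, series, value, iteration) is the ClearML Logger API
clearml_logger.log_training(reconstruction_loss=0.42, learning_rate=1e-3, iteration=10)
```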