There has been a restart of my machine in the meantime :man-shrugging:
So only the matrix knows now I guess..
Yup.
I really don't know what it's about.
It doesn't affect the process. Everything seems to run fine.
If the warnings provided a bit more info I could maybe pinpoint it better, but that's really all I've got.
Is there some verbose mode I could run it with?
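Otherwise I could try to surface it myself by escalating the warning to an error so I get a full traceback of whatever triggers the compile; something like this is what I have in mind (just my own idea, not a ClearML switch, and the task name is a placeholder):
# Sketch: turn SyntaxWarning into an exception before ClearML initializes,
# so the traceback shows which call actually triggers the compile
import warnings
warnings.filterwarnings("error", category=SyntaxWarning)

from clearml import Task
task = Task.init(project_name="LVGL UI Detector", task_name="train")  # placeholder task name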
Figured it out: I had installed clearml[gs], but since I don't need that I removed it. It's gone now.
I noticed that it's actually independent of the pipelines
Well... I guess I'll do the workaround then of putting the main code into a submodule and having everything run from there.
It happens on all of my pipeline run attempts and there's nothing more that gives insight.
As an example:
python src/train.py
ClearML Task: created new task id=102a4f25c5ac4972abd41f1d0b6b9708
ClearML results page:
<unknown>:1: SyntaxWarning: invalid decimal literal
<unknown>:1: SyntaxWarning: invalid decimal literal
<unknown>:1: SyntaxWarning: invalid decimal literal
<unknown>:1: SyntaxWarning: invalid decimal literal
<unknown>:1: SyntaxWarning: invalid decimal...
Here are the codefiles for my pipelines.
They don't work yet; I'm struggling with the pipeline stuff quite a bit.
But both pipelines always give these warnings.
The ClearML-related part of the code is really small.
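Stripped of the training logic, both pipelines boil down to roughly this shape (a simplified sketch, not the actual files; the function and pipeline names here are placeholders):
from clearml import PipelineDecorator

@PipelineDecorator.component(return_values=["dataset_dir"], cache=False)
def prepare_data(dataset_id: str):
    # pull the dataset registered in ClearML and return its local path
    from clearml import Dataset
    return Dataset.get(dataset_id=dataset_id).get_local_copy()

@PipelineDecorator.component(return_values=["results"])
def train_model(dataset_dir: str, model_variant: str, args: dict):
    # stand-in for the actual ultralytics training call
    print(f"Training {model_variant} on {dataset_dir} with {args}")
    return {}

@PipelineDecorator.pipeline(name="Training pipeline", project="LVGL UI Detector", version="0.1.0")
def run_pipeline(dataset_id: str, model_variant: str = "yolov8n"):
    dataset_dir = prepare_data(dataset_id)
    return train_model(dataset_dir, model_variant, {"epochs": 3, "imgsz": 480})

if __name__ == "__main__":
    PipelineDecorator.run_locally()
    run_pipeline(dataset_id="50e10f640d7548458d9c38ab9aac571b")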
If there's no mechanism on the ClearML side, I might consider just putting that codebase into its own submodule, making it a different repo without knowledge of the others.
The installed packages of the task say this:
# Python 3.11.2 (main, Mar 13 2023, 12:18:29) [GCC 12.2.0]
PyYAML == 6.0.1
clearml == 1.15.1
google
google_api_core
google_cloud_storage == 2.16.0
ultralytics == 8.2.2
I do not know where the google_api_core comes from and I'd like to remove it.
No idea what's going on now, but I can't reproduce the behaviour either. I also tried my old code posted here, but the warning doesn't pop up anymore.
I'll let you know once it pops up again and will use the provided traceback function then.
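For my own notes, this is roughly what I'd drop in once it reappears (my own sketch, possibly different from the snippet you shared):
import traceback
import warnings

_original_showwarning = warnings.showwarning

def showwarning_with_stack(message, category, filename, lineno, file=None, line=None):
    # print the call stack at the point the warning is emitted,
    # since "<unknown>:1" alone doesn't say who triggered the compile
    traceback.print_stack()
    _original_showwarning(message, category, filename, lineno, file, line)

warnings.showwarning = showwarning_with_stack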
I have a slight suspicion that it was indeed environment-related on my local machine, but I have no idea what triggers it.
It may or may not be related to this
2024-04-29 23:38:25,932 - clearml.Task - WARNING - Parameters must be of builtin ty...
Yeah, I get that... But it's really hard to tell what's causing it because of the "<unknown>".
It comes from PipelineDecorator.pipeline, I assume, or from PipelineDecorator.component.
Here is the latest version with all issues ironed out.
This here.. I know how to get the source code info, but it doesn't include the commit ID. And I also cannot access the uncommitted changes.
I have a strong suspicion it's somehow related to the parameters of the function, or generally the hyperparameters gathered automatically by the task.
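To rule that out I'd probably cast everything to builtins before it gets connected, something like this (a sketch, the task name is made up):
from pathlib import Path
from clearml import Task

task = Task.init(project_name="LVGL UI Detector", task_name="train")  # placeholder task name

dataset_yaml = Path("/home/rini-debian/git-stash/lvgl-ui-detector/datasets/ui_randoms.yaml")
args = {
    "epochs": 3,
    "imgsz": 480,
    "data": str(dataset_yaml),  # cast Path -> str so only builtin types are connected
}
task.connect(args)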
This is the full log of the task.
I am trying to run HPO.
You mean a separate branch to work in, without the submodules linked?
Not really sure how I'd go about doing that.
I'd be happier with an option like 'pull_submodules=False'.
Sure can do
If it were possible to override the checkout behaviour, I would ignore all submodules anyway, but in the documentation of clearml.conf, as well as for the pipeline decorator, I couldn't find anything that would allow me to do that.
Alright cool!
I will check it out and let you know what it was.
When developing I use the Poetry environment, but in the queues I let ClearML handle the installation via pip.
There is no need to use poetry if the task is a one-time thing
This function shows the same behaviour once the task gets initialized:
# Training helper functions
def prepare_training(env: dict, model_variant: str, dataset_id: str, args: dict, project: str = "LVGL UI Detector"):
    from clearml import Task, Dataset
    import os
    print(f"Training {model_variant} on dataset: {dataset_id}")
    # Fetch dataset YAML
    env['FILES'][dataset_id] = Dataset.get(dataset_id).list_files("*.yaml")
    # Download & modify dataset
    env['DIRS']['target'] ...
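For reference, I call it roughly like this, with the values from the args dump further down (sketch only):
env = {"FILES": {}, "DIRS": {}}
prepare_training(
    env,
    model_variant="yolov8n",
    dataset_id="50e10f640d7548458d9c38ab9aac571b",
    args={"epochs": 3, "imgsz": 480, "data": "/home/rini-debian/git-stash/lvgl-ui-detector/datasets/ui_randoms.yaml"},
)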
Maybe it has something to do with my general environment? I am running Debian on WSL2.
Not that I know of...
I attached the possible problematic argument.
The strings are just regular strings, nothing fancy there.
args: {'epochs': 3, 'imgsz': 480, 'data': '/home/rini-debian/git-stash/lvgl-ui-detector/datasets/ui_randoms.yaml'}
model_variant: yolov8n
dataset_id: 50e10f640d7548458d9c38ab9aac571b