Okay, trying again without detached
These are per-user. Essentially we log user DB access as well (for backtracking afterwards), so it's beneficial for us to pass the user DB secrets to the task rather than configure them once on the agent.
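For illustration, something along these lines (a rough sketch assuming the task-parameters API; names and values are placeholders):
```
from clearml import Task

# Hypothetical illustration of the idea above: attach per-user DB secrets
# to the task itself instead of baking them into the agent's configuration.
task = Task.current_task()
task.set_parameters({"db/user": "alice", "db/secret": "***"})  # placeholder values
```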
I believe that happens natively thanks to pyhocon? No idea why it fails on Mac
True, and we plan to migrate to pipelines once we have some time for it :) but anyway, I believe that condition is flawed
Actually, it appears some elements (scalars, plots, etc.) were not migrated by moving the MongoDB data.
Where are these stored? Any idea @<1523701827080556544:profile|JuicyFox94> ?
Yes, you're correct, I misread the exception.
Maybe it hasn't completed uploading? At least for Datasets one needs to explicitly wait IIRC
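For what it's worth, a minimal sketch of what I mean by explicitly waiting (assuming the task-artifact API; `obj` is a placeholder):
```
from clearml import Task

task = Task.current_task()
obj = {"some": "payload"}  # placeholder artifact content

# Block until the artifact is actually stored instead of returning immediately
task.upload_artifact("my_artifact", artifact_object=obj, wait_on_upload=True)

# Or flush all pending uploads before trying to read anything back
task.flush(wait_for_uploads=True)
```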
How or why is this the issue? I guess something is getting lost in translation :D
On the local machine, we have all the packages needed. The code gets sent for remote execution, and all the local packages are frozen correctly with pip.
The pipeline controller task is then generated and executed remotely, and it has all the relevant packages.
Each component it launches, however, is missing the internal packages available earlier :(
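If it helps clarify, a possible workaround sketch (assuming the decorator-based pipeline API; the package names are placeholders): declare each component's requirements explicitly, since every component runs as its own task and doesn't inherit the controller's frozen packages.
```
from clearml.automation.controller import PipelineDecorator

# Sketch: each component's requirements must be declared explicitly
# (or via a requirements.txt) instead of being inherited from the controller.
@PipelineDecorator.component(
    return_values=["result"],
    packages=["my_internal_pkg==1.2.3", "pandas>=1.3"],  # placeholder package pins
)
def preprocess(raw_path: str):
    import pandas as pd  # imports inside the component run on the remote worker
    return pd.read_csv(raw_path).dropna()
```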
I'm trying to build an easy SDK that would fit DS work and the concept of ClearML pipelines.
In doing so, I'm planning to define various Step classes, which the user can then experiment with, providing Steps as inputs to other Steps, etc.
Then I'd like the user to be able to run any such step, either locally or remotely. Locally is trivial; remotely is the issue. I understand I'll need to upload additional data to the remote instance, and pull a specific artifact back to the notebo...
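Roughly what I have in mind, as a very rough sketch (all names hypothetical; `execute_remotely` is the ClearML call I'm assuming for the remote path):
```
from clearml import Task

class Step:
    def __init__(self, name, inputs=()):
        self.name = name
        self.inputs = inputs  # upstream Step instances feeding this one

    def run(self, *inputs):
        raise NotImplementedError  # the user's business logic, per subclass

    def run_locally(self):
        # Run all upstream steps in-process, then this one
        return self.run(*(s.run_locally() for s in self.inputs))

    def run_remotely(self, queue="default"):
        # Assumption: clone the current task and enqueue it on an agent
        # queue via Task.execute_remotely() instead of running in-process.
        task = Task.init(project_name="steps", task_name=self.name)
        task.execute_remotely(queue_name=queue, clone=True, exit_process=False)
```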
Does it make sense to you to run several such glue instances, to manage multiple resource requirements?
TimelyPenguin76 that would have been nice but I'd like to upload files as artifacts (rather than parameters).
AgitatedDove14 I mean like a grouping in the artifact. If I add e.g. foo/bar to my artifact name, it will be uploaded as foo/bar.
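To illustrate (hypothetical values):
```
from clearml import Task

task = Task.current_task()
# The "/" is kept verbatim in the artifact name, i.e. it shows up as a single
# artifact literally named "foo/bar" rather than a "foo" group containing "bar".
task.upload_artifact(name="foo/bar", artifact_object={"answer": 42})
```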
Is there a preferred way to stop the agent?
The error seems to come from this line: self._driver = _FileStorageDriver(str(path_driver_uri.root)) (line #353 in clearml/storage/helper.py).
Where, if the path_driver is a local path, the _FileStorageDriver starts with base_path = '/', and then takes an extremely long time iterating over the entire file system (e.g. in _get_objects, line #1931 in helper.py).
The S3 bucket credentials are defined on the agent, as the bucket is also running locally on the same machine - but I would love for the code to download and apply the file automatically!
I'm also getting the following warning, I guess it's some ClearML dependency?
IPython could not be loaded!
Any leads TimelyPenguin76 ? I've also tried setting up a minio s3 bucket, but I'm not sure if the remote agent has copied the credentials and host 🤔
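For reference, this is roughly the clearml.conf section I'd expect to be needed on the agent machine as well (all values are placeholders):
```
sdk {
    aws {
        s3 {
            credentials: [
                {
                    # MinIO-style endpoint; placeholder host/key/secret
                    host: "localhost:9000"
                    key: "minio-access-key"
                    secret: "minio-secret-key"
                    multipart: false
                    secure: false
                }
            ]
        }
    }
}
```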
It's also sufficient to see that StorageManager.list("/data/clear") takes a really long time to return no results
Sounds like incorrect parsing on ClearML's side then, doesn't it? At least, it does not fully support MinIO then
I don't imagine AWS users get a new folder named aws-key-region-xyz-bucket-hostname when they download_folder(...) from an AWS S3 bucket, or do they? 🤔
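i.e. with something like this (bucket and prefix are made up), I'd expect the files to land directly under ./data:
```
from clearml import StorageManager

# Expectation: contents land under ./data, not under an extra folder derived
# from the key/region/bucket/hostname as described above.
StorageManager.download_folder("s3://my-bucket/some/prefix", local_folder="./data")
```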
I... did not, ashamed to admit. The documentation says only boolean values.
So where should I install the latest clearml version? On the client that's running a task, or on the worker machine?
Example configuration -
```
version: 1
disable_existing_loggers: true
formatters:
  simple:
    format: '%(asctime)s %(levelname)-9s %(name)-24s: %(message)s'
filters:
  brackets:
    (): ccutils.logger.BracketFilter
handlers:
  console:
    class: ccmlp.utils.TqdmStreamHandler
    level: INFO
    formatter: simple
    filters: [brackets]
loggers:  # Set logging levels for specific packages
  urllib3:
    level: WARNING
  matplotlib:
    level: WARNING
...
```
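(For context, this file is consumed with the standard dictConfig mechanism, roughly like this; the filename is a placeholder:)
```
import logging.config
import yaml  # pyyaml

with open("logging.yml") as f:
    logging.config.dictConfig(yaml.safe_load(f))
```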
Eek. Is there a way to merge a backup from Elastic into the currently running server?
The screenshot is small since the data is private anyway, but it's enough to see:
"Metric: untitled 00" "plot image" as the image title The attached histogram has a title ("histogram of ...")
That gives us the benefit of creating "local datasets" (confined to the scope of the project; they do not appear in the Datasets tab, but do appear as normal tasks within the project)
That's probably in the newer ClearML server pages then, I'll have to wait still 😅
I just ran into this too recently. Are you passing these also in the extra_clearml_conf for the autoscaler?
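i.e. something along these lines in the autoscaler configuration, assuming extra_clearml_conf is injected verbatim into the instances' clearml.conf (values are placeholders):
```
extra_clearml_conf: """
    sdk.aws.s3.credentials: [
        {
            host: "my-minio-host:9000"
            key: "minio-access-key"
            secret: "minio-secret-key"
            secure: false
        }
    ]
"""
```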