SmarmySeaurchin8 Following up on ColossalDeer61's hint, notice this not-too-old thread on reusing globally installed packages: https://allegroai-trains.slack.com/archives/CTK20V944/p1597248476076700?thread_ts=1597248298.075500&cid=CTK20V944
WittyOwl57 No worries 🙂 happens to the best!
KindGiraffe71 Have you checked out the https://github.com/allegroai/clearml/blob/master/examples/frameworks/pytorch-lightning/pytorch_lightning_example.py example? This previous discussion provides some insight into how it works under the hood: https://clearml.slack.com/archives/CTK20V944/p1616070536033700
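If it helps, the gist of the integration is that Task.init() is the only ClearML-specific line - Lightning's own logging is picked up automatically. A minimal runnable sketch (the tiny model and random data below are placeholders, not the linked example):
```
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl
from clearml import Task

# The Task.init() call is the only ClearML-specific line - everything
# Lightning logs (via self.log / TensorBoard) is captured automatically
task = Task.init(project_name='examples', task_name='pytorch lightning example')

class TinyRegressor(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(8, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = nn.functional.mse_loss(self.layer(x), y)
        self.log('train_loss', loss)  # shows up in the ClearML scalars tab
        return loss

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.01)

data = DataLoader(TensorDataset(torch.randn(64, 8), torch.randn(64, 1)), batch_size=16)
trainer = pl.Trainer(max_epochs=2)
trainer.fit(TinyRegressor(), data)
```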
OutrageousSheep60 You can see a discussion on the topic in https://github.com/allegroai/clearml/issues/724 .
TL;DR:
Currently the containing project is available in the UI as a tooltip on the dataset name.
An alternate "Project view" for the datasets page is in the works.
TightElk12 This makes a lot of sense - should make it into one of the coming releases
GentleSwallow91 For more information, look at what ClearML logs for your experiments: https://docs-testing.allegro.ai/docs/latest/docs/fundamentals/task#logging-task-information
MelancholyElk85 Thanks for calling attention to this. What do you think would have made it easier for you to notice the available extended list content?
I would assume that a "type to match" option would also have helped?
Appreciate it if you could open a GitHub issue ( https://github.com/allegroai/clearml/issues/new/choose ) so this can be pushed forward.
Hi JuicyOtter4
The GUI search returns all experiments in the project that have your search string in their task id, name, description or any of their models' names.
You can use regex with the '.*' button in the search bar.
RotundHedgehog76 Have you tried clearml-data add --files . ? (Probably best to try on a smaller subset first)
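If the CLI is unwieldy for your case, the equivalent through the Python SDK would be along these lines (a minimal sketch; the dataset name and project are placeholders):
```
from clearml import Dataset

# Create a new dataset version and add the current directory's files to it
ds = Dataset.create(dataset_name='my dataset', dataset_project='datasets')  # placeholder names
ds.add_files(path='.')
ds.upload()
ds.finalize()
```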
@<1580367723722969088:profile|SmoothDuck83> Not every plot can trivially be formed as a table (i.e. CSV); that's why the JSON export is available for all plots.
What were you considering?
@<1580367723722969088:profile|SmoothDuck83> CSV export is only available for table plots
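That said, if you do need tabular data out of a non-table plot, the exported JSON is a Plotly-style figure, so flattening it yourself is straightforward (a sketch assuming line/scatter traces that carry 'x'/'y' arrays):
```
import json
import csv

# Flatten a plot exported from the UI as JSON into rows of (trace, x, y).
# Assumes Plotly-style traces carrying 'x'/'y' arrays (e.g. line/scatter plots).
with open('plot.json') as f:
    figure = json.load(f)

with open('plot.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(['trace', 'x', 'y'])
    for trace in figure.get('data', []):
        name = trace.get('name', '')
        for x, y in zip(trace.get('x', []), trace.get('y', [])):
            writer.writerow([name, x, y])
```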
RotundHedgehog76 Thanks for the catch - seems like the docs are wrong and the CLI help is correct: '--skip-docker-network' will NOT pass '--network host' to the docker.
The easy way to do that is to add the desired metrics/params as custom columns, then use the column filters: https://clear.ml/docs/latest/docs/webapp/webapp_exp_table#customizing-the-experiments-table
CooperativeSealion8 For future reference, notice there's a configuration reference available at https://allegro.ai/docs/references/trains_ref/
HappyDove3 Notice that in https://github.com/allegroai/clearml/issues/400 the goal is to see a table plot in the UI scalars tab for a specific experiment (with additional discussions on how these will be addressed when comparing experiments).
Note that once you take the approach you suggested of logging your metrics as single values, you can configure your experiment comparison scalars view to show single values instead of the time-series graph, which I think will provide you with the matrix c...
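For reference, reporting a single-value metric looks something like this (a minimal sketch, assuming a recent clearml version that provides Logger.report_single_value; the project/task names are placeholders):
```
from clearml import Task

task = Task.init(project_name='examples', task_name='single value metrics')
logger = task.get_logger()

# Final metrics reported as single values rather than time-series scalars -
# these can be shown side by side in the comparison's single-value view
logger.report_single_value(name='test accuracy', value=0.92)
logger.report_single_value(name='test loss', value=0.31)
```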
HappyDove3 You can get some more insight on the different configuration methods and how to use them at https://clear.ml/docs/latest/docs/fundamentals/hyperparameters
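For instance, the most common pattern is connecting a parameter dictionary with task.connect() (a minimal sketch; the parameter names below are placeholders):
```
from clearml import Task

task = Task.init(project_name='examples', task_name='hyperparameters example')

# connect() both logs the values and lets the UI/agent override them on a clone
params = {'learning_rate': 0.001, 'batch_size': 64}  # placeholder parameters
params = task.connect(params)
print(params['learning_rate'])  # reflects any override when run by an agent
```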
DefeatedCrab47 Happy you're finding Trains useful 🙂
"but it definitely has its advantages if TRAINS would support it (early stage Data Science infrastructure)."
No doubt, and I definitely see such a usable example in the cards for Trains' upcoming versions...
DepressedChimpanzee34 Always appreciated
DefeatedCrab47 For the most part, mlflow can serve basic ML models using scikit-learn. In contrast, Trains was designed with more general-purpose ML/DL workflows in mind, for which there's no "generic" way to serve models, as different scenarios can use different input encodings, model results can be represented in a variety of forms, etc.
Consider also, that creating an HTTP endpoint for model inference is quite a breeze: there are multiple examples of Flask on top of any DL/ML framework w...
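To illustrate, a bare-bones Flask endpoint wrapping an arbitrary predict() can look like this (a sketch only - the predict() stand-in is a placeholder for whatever framework-specific call you'd use):
```
from flask import Flask, jsonify, request

app = Flask(__name__)

# Placeholder "model" - swap in any scikit-learn / PyTorch / TF predict call
def predict(features):
    return sum(features)  # stand-in for model.predict(...)

@app.route('/predict', methods=['POST'])
def serve():
    payload = request.get_json(force=True)
    return jsonify({'prediction': predict(payload['features'])})

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=8080)
```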
BattyLion34 Adding to AgitatedDove14's hint. See the following docs page: https://allegro.ai/clearml/docs/docs/deploying_clearml/clearml_config_for_clearml_server.html
GreasyPenguin14 That's an annoying bug indeed - thanks for spotting it. If you need to circumvent it before a fix comes out in one of the near releases, you can programmatically use the https://clear.ml/docs/latest/docs/references/api/endpoints#post-projectsupdate endpoint, e.g.:
```
from clearml.backend_api.session.client import APIClient

client = APIClient()
client.projects.update(project='<project ID>', description='My new description')
```
Note you can get your project's ID either from the webapp URL...
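...or programmatically - a sketch using the same APIClient (note projects.get_all matches the name as a pattern, so it's anchored here for an exact match; the project name is a placeholder):
```
from clearml.backend_api.session.client import APIClient

client = APIClient()
# projects.get_all matches the name as a pattern - anchor it for an exact match
projects = client.projects.get_all(name='^My Project$')  # placeholder project name
print([(p.name, p.id) for p in projects])
```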
DepressedChimpanzee34 "a filter similar to the one in the scalars page where you can display a subset of the reported debug images can be useful"
The scalars page provides a metric hide/show control - Is this the one you mean? The debug images page also provides a filter by metric - Depending on your naming policy this can easily be used to focus on more sparsely appearing images.
Else, an example of the filter you were thinking of would be appreciated.
Regardless, direct iteration access cou...
UnevenDolphin73 Well... not right now... Currently the ClearML UI only partitions internal artifact types.
That said, having user-defined artifact groups sure sounds worth looking into - care to open a GitHub issue ( https://github.com/allegroai/clearml/issues/new/choose )?
If the credentials don't provide access, the calls should fail (there's no fallback - just default values in place of empty configuration).
Make sure you explicitly configure all host values, so you don't end up using a specific server for API access and the default demo server for file server access...
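For completeness, one way to set all hosts explicitly from code (a sketch; clearml.conf or environment variables work just as well, and the host values below are placeholders):
```
from clearml import Task

# Explicitly set all three hosts so nothing falls back to a default server
Task.set_credentials(
    api_host='https://api.my-server.com',      # placeholder host values
    web_host='https://app.my-server.com',
    files_host='https://files.my-server.com',
    key='<access key>',
    secret='<secret key>',
)
task = Task.init(project_name='examples', task_name='explicit credentials')
```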
WittyOwl57 I just used a couple of the experiments in https://app.community.clear.ml/projects/764d8edf41474d77ad671db74583528d/ on the free tier server.
WittyOwl57 The UI shows repo and package detailed comparison under the "Details"/"Execution" (See sample screenshot), whereas auto-logged environment variables are shown under the "HyperParameters" comparison tab.
What do you find missing beyond those?