QuaintPelican38
Moderator
4 Questions, 25 Answers
  Active since 10 January 2023
  Last activity one year ago

Reputation: 0
Badges: 25 × Eureka!
0 Votes
20 Answers
1K Views
2 years ago
0 Votes
4 Answers
1K Views
Hi crew! A bit stuck on something basic again: I’m running a ClearML server on AWS EC2 using the latest Community AMI (ami-01edf47969e2515dd - allegroai-clea...
3 years ago
0 Votes
8 Answers
1K Views
Hi fam! I’m trying to get clearml-session working but getting stuck on something probably basic. I’ve got an EC2 instance configured and running clearml-agen...
3 years ago
0 Votes
21 Answers
959 Views
Hi people! I think the ClearML Dataset.finalize() method might be broken, not sure if here is a good place to report it or I should open an issue? (Or I’m do...
one year ago
0 Hi people! I think the ClearML

Oops sorry just saw this message now!

one year ago
0 Hi fam! Sorry for the potential dumb question, but I couldn’t find anything on the interwebs about it. I’m hosting a ClearML server on AWS, using S3 as a backend for artifact storage. I find that whenever I delete archived artifacts in the web app, I get

Hi UnevenDolphin73 sorry for the slow reply, been on leave!

We don’t have a solution right now, but if there’s no fix to the frontend in the near future we’ll probably try to write a script that queries the ClearML API for all artefacts, queries S3 for all artefacts, and figures out orphaned artefacts to delete.
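
The cleanup script described above can be sketched as a simple set difference. This is a sketch of the idea only: in practice the two input sets would come from paging through the ClearML API for artifact URIs and from an S3 listing (e.g. boto3 `list_objects_v2`); those calls, the bucket name, and the URI layout here are assumptions, not verified details.

```python
# Sketch: orphaned artifacts are the S3 keys that no ClearML artifact URI
# references. Inputs are plain collections so the idea is testable offline;
# real inputs would come from the ClearML API and an S3 listing (assumption).

def find_orphaned_keys(clearml_uris, s3_keys, bucket):
    """Return S3 keys under `bucket` that no ClearML artifact URI references."""
    prefix = "s3://" + bucket + "/"
    referenced = {uri[len(prefix):] for uri in clearml_uris if uri.startswith(prefix)}
    return sorted(set(s3_keys) - referenced)

# Example with made-up data:
uris = ["s3://my-bucket/proj/task1/model.pkl"]
keys = ["proj/task1/model.pkl", "proj/old-task/stale.pkl"]
print(find_orphaned_keys(uris, keys, "my-bucket"))  # ['proj/old-task/stale.pkl']
```

The resulting list could then be fed to a (careful, dry-run-first) S3 delete step.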

2 years ago
0 Hi people! I think the ClearML

The image is allegroai/clearml:1.0.2-108

one year ago
0 Hi people! I think the ClearML

And thanks for the consistently speedy responses with getting onto issues when they pop up!

one year ago
0 Hi fam! Sorry for the potential dumb question, but I couldn’t find anything on the interwebs about it. I’m hosting a ClearML server on AWS, using S3 as a backend for artifact storage. I find that whenever I delete archived artifacts in the web app, I get

Thanks AgitatedDove14 ! Just to make sure I’m understanding correctly, do you mean that the ClearML Web server in https://clear.ml/docs/latest/docs/deploying_clearml/clearml_server issues a delete command to the ClearML API server, which is then responsible for trying to delete the files in S3? And that I need to enter an AWS key/secret in the profile page of the web app here?

2 years ago
0 Hi fam! I’m trying to get

Totally! Thanks so much AgitatedDove14 , I’ll try that out now

3 years ago
0 Hi fam! Sorry for the potential dumb question, but I couldn’t find anything on the interwebs about it. I’m hosting a ClearML server on AWS, using S3 as a backend for artifact storage. I find that whenever I delete archived artifacts in the web app, I get

Ok great! I’ve actually provided a key and secret so I guess it should be working. Would you have any suggestions about where I could look to debug? Maybe the docker logs of the web server?

2 years ago
0 Hi people! I think the ClearML

Great to know it’s a server backwards compatibility issue!

one year ago
0 Hi crew! A bit stuck on something basic again: I’m running a ClearML server on AWS EC2 using the latest Community AMI (ami-01edf47969e2515dd - allegroai-clearml-server-1.0.2-108-21). The only thing I’ve done is copy the

EDIT: Turns out in that AMI, the docker-compose file has:

    agent-services:
      networks:
        - backend
      container_name: clearml-agent-services
      image: allegroai/clearml-agent-services:latest
      restart: unless-stopped
      privileged: true
      environment:
        CLEARML_HOST_IP: ${CLEARML_HOST_IP}
        CLEARML_WEB_HOST: ${CLEARML_WEB_HOST:-}
        CLEARML_API_HOST:
        CLEARML_FILES_HOST: ${CLEARML_FILES_HOST:-}

So I changed

    # CLEARML_API_HOST:

to

    CLEARML_API_HOST: ${CLEARML_A...

3 years ago
0 Hi fam! I’m trying to get

Update: I see that by default it uses 10022 as the remote SSH port, so I’ve opened that as well (still getting the “tunneling failed” message though).

I’ve also noticed this log in the agent machine:
2021-07-09 05:38:37,766 - clearml - WARNING - Could not retrieve remote configuration named 'SSH' Using default configuration: {'ssh_host_ecdsa_key': '-----BEGIN EC PRIVATE KEY-----\{private key here}
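
When debugging the tunnel, it can help to first confirm that port 10022 on the instance is reachable at all from the client machine. A generic TCP reachability check (a sketch using only the standard library, not a clearml tool) looks like:

```python
import socket

def port_open(host, port, timeout=3.0):
    """Return True if a TCP connection to (host, port) succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. port_open("<instance public IP>", 10022) before retrying clearml-session
```

If this returns False, the problem is the security group / network path rather than the session setup itself.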

3 years ago
0 Hi people! I think the ClearML
  File "/Users/david/dataset_builder.py", line 619, in save
    clearml_ds.finalize()
  File "/Users/david/miniconda3/envs/ml/lib/python3.9/site-packages/clearml/datasets/dataset.py", line 796, in finalize
    self._task.mark_completed()
  File "/Users/david/miniconda3/envs/ml/lib/python3.9/site-packages/clearml/backend_interface/task/task.py", line 688, in mark_completed
    tasks.CompletedRequest(
  File "/Users/david/miniconda3/envs/ml/lib/python3.9/site-packages/clearml/backend_api/...
one year ago
0 Hi fam! I’m trying to get

Oh that’s cool, I assumed the DevOps project was just examples!

There’s a jupyter_url property there that is http://{instance's_private_ip_address}:8888?token={jupyter_token}

There’s also:

    external_address: {instance_public_ip_address}
    internal_ssh_port: 10022
    internal_stable_ssh_port: 10023
    jupyter_port: 8888
    jupyter_token: {jupyter_token}
    vscode_port: 9000

Maybe this is something stupid to do with VPCs that I should understand bet...
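
Since the jupyter_url above embeds the instance’s *private* IP, one client-side workaround (my assumption, not a clearml feature) is to rebuild the URL against the public external_address before opening it from outside the VPC:

```python
# Sketch: swap the private host in a Jupyter URL for the instance's public IP,
# keeping the port, path, and token query string intact.
from urllib.parse import urlsplit, urlunsplit

def with_public_host(jupyter_url, public_ip):
    """Return jupyter_url with its host replaced by public_ip."""
    parts = urlsplit(jupyter_url)
    port = parts.port or 8888  # default Jupyter port if none present
    return urlunsplit((parts.scheme, f"{public_ip}:{port}",
                       parts.path, parts.query, parts.fragment))

print(with_public_host("http://10.0.1.5:8888?token=abc", "3.7.1.9"))
# http://3.7.1.9:8888?token=abc
```

This only helps if the security group actually exposes the Jupyter port publicly, of course.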

3 years ago
0 Hi fam! Sorry for the potential dumb question, but I couldn’t find anything on the interwebs about it. I’m hosting a ClearML server on AWS, using S3 as a backend for artifact storage. I find that whenever I delete archived artifacts in the web app, I get

(It seems like the web server doesn’t log the call to AWS, I just see this:
{SERVER IP} - - [22/Dec/2021:23:58:37 +0000] "POST /api/v2.13/models.delete_many HTTP/1.1" 200 348 " ID}/models/{MODEL ID}/general?{QUERY STRING PARAMS THAT DETERMINE TABLE APPEARANCE} {BROWSER INFO} "-"

2 years ago
0 Hi people! I think the ClearML

Not sure if that gives you the answer? Otherwise if you can tell me which of the 7 containers to exec into and how to check, happy to do that

one year ago
0 Hi people! I think the ClearML

I’ll make a venv and test then let you know soon 👍

one year ago
0 Hi people! I think the ClearML

Nope, still broken on 1.10.0rc

one year ago
0 Hi people! I think the ClearML

No errors getting an existing dataset EnthusiasticShrimp49

one year ago
0 Hi people! I think the ClearML

it looks like this line will always fail, because it passes a publish argument to Request but doesn’t pass the _allow_extra_fields_ argument, which is required there.
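
The failure mode described above can be illustrated generically. The class and argument names below are illustrative stand-ins, not the actual clearml classes: a request base class that rejects unknown keyword arguments unless the caller explicitly opts in.

```python
# Illustrative sketch (not clearml's real code): a request type that raises
# when given extra fields, unless _allow_extra_fields_ is passed.
class Request:
    def __init__(self, _allow_extra_fields_=False, **kwargs):
        if kwargs and not _allow_extra_fields_:
            raise TypeError(f"unexpected fields: {sorted(kwargs)}")
        self.__dict__.update(kwargs)

# Caller passes an extra field without opting in -> always raises:
try:
    Request(publish=True)
except TypeError as e:
    print(e)  # unexpected fields: ['publish']

# Opting in works:
r = Request(publish=True, _allow_extra_fields_=True)
print(r.publish)  # True
```

Which matches the report: a call site that hard-codes the extra field but never forwards the opt-in flag will fail on every invocation.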

one year ago
0 Hi people! I think the ClearML

AgitatedDove14 head of master branch

one year ago
0 Hi fam! I’m trying to get

Unfortunately no dice 😕 I’ve opened every port from 0-11000, and am using the command clearml-session --public-ip true on the client, but still getting the timeout message, only now it says:
Setting up connection to remote session
Starting SSH tunnel
Warning: Permanently added '[<IP address>]:10022' (ECDSA) to the list of known hosts.

SSH tunneling failed, retrying in 3 seconds
Starting SSH tunnel
Warning: Permanently added '[<IP address>]:10022' (ECDSA) to the list of kn...

3 years ago
0 Hi fam! I’m trying to get

Hi AgitatedDove14 thanks for your help and sorry I missed this! I’ve had this on hold for the last few days, but I’m going to try firing up a new ClearML server running version 1.0.2 (I’ve been using the slightly older Allegro Trains image from the AWS marketplace) and have another try from there. Thanks for your help on GitHub too ❤ I’m so blown away by the quality of everything you folks are doing, and have been championing it hard at my workplace.

3 years ago