Hello, I Was Asked To Bring Up A ClearML Server On Our System. We Do Not Use Any Of The Cloud Options, Nor Do We Use Docker


I was asked to bring up a ClearML server on our system. We don't use any of the cloud options, and we don't use Docker either 🙂. Most of the services I've had to bring up came with instructions for manual installation, but ClearML doesn't have any.
Before I start digging through all the Docker images to figure out what I have to run, I'd like to know: are there any instructions, or has anyone done this before?
Since we already have an Elasticsearch server and a Redis server, I guess I only need instructions for the web UI and the API. Am I right?


Posted one year ago

Answers 11

Hi CheekyToad28 ,

None of the options https://clear.ml/docs/latest/docs/deploying_clearml/clearml_server#deployment works for you?

Posted one year ago

Hi TimelyPenguin76

The only option for Linux is a prebuilt Docker image, which I cannot use on our system.

Posted one year ago

Well, installing it manually would require a lot of work, including building the Angular app and installing the various requirements for the different servers (apiserver, fileserver, etc.). Without Docker, each should run in its own Python virtual environment.

Posted one year ago

I'm back to report that I was able to bring up ClearML without Docker. I've asked the lab to test it to see whether it functions properly.
The process was:

- Building the web interface by cloning clearml-web and compiling it. During the build I had to fix some paths, so I did and opened a pull request against the original repository. I set apiBaseUrl and fileBaseUrl according to the locations I use in nginx.conf.
- Running the API server and the file server using uwsgi. To use a path other than /opt/clearml, I made a little hack in apiserver/config/basic.py and added another path for configs.
- Running an nginx server with uwsgi_pass pointing to the socket files I used for the API and file servers (I prefer file sockets over high ports).

Running Redis, MongoDB, and Elasticsearch locally made things easier, since I didn't have to change their configuration. I guess I could use some environment variables in the uwsgi files.
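For reference, the nginx wiring described above could look roughly like this; the socket paths, port, and URL prefixes are assumptions for illustration, not taken from the thread:

```nginx
# Proxy API and file-server requests to the uwsgi apps over unix sockets
# (all paths and prefixes here are illustrative)
server {
    listen 8080;

    location /api/ {
        include uwsgi_params;
        uwsgi_pass unix:/run/clearml/apiserver.sock;
    }

    location /files/ {
        include uwsgi_params;
        uwsgi_pass unix:/run/clearml/fileserver.sock;
    }
}
```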

Thank you all

Posted one year ago

Hi SuccessfulKoala55
In case the code reads environment variables at runtime (such as CLEARML_CONFIG_DIR), I have to pass them at runtime. I can use export if I run things manually, set the environment in the systemd unit file, or set them in the uwsgi configuration file.
Since I want everything to start automatically with the system, I have to set them in one of the services' configuration files.
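As a sketch, setting the variable in a systemd unit could look like this; the unit name, paths, and ExecStart line are hypothetical, not from the thread:

```ini
# /etc/systemd/system/clearml-apiserver.service (hypothetical unit)
[Unit]
Description=ClearML API server under uwsgi
After=network.target

[Service]
# Environment variables set here are visible to the launched process
Environment=CLEARML_CONFIG_DIR=/srv/clearml/config
ExecStart=/srv/clearml/venv/bin/uwsgi --ini /srv/clearml/apiserver.ini
Restart=on-failure

[Install]
WantedBy=multi-user.target
```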

Posted one year ago

Couldn't you simply prefix the command running at system startup with the env var, i.e. CLEARML_CONFIG_DIR=/my/path the-command-to-run?
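To illustrate the prefix form: the variable is set only for the launched process, not for the surrounding shell. The path is the placeholder from the suggestion above, and the child command is a stand-in for the real server command:

```shell
# The prefix makes CLEARML_CONFIG_DIR visible only to the child process
CLEARML_CONFIG_DIR=/my/path sh -c 'echo "$CLEARML_CONFIG_DIR"'
# → /my/path

# The variable is not set in the current shell afterwards
echo "${CLEARML_CONFIG_DIR:-unset}"
# → unset
```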

Posted one year ago

Hi CheekyToad28 , any specific reason you can't use the prebuilt docker images?

Posted one year ago

You can try using the GCP VMDK, perhaps

Posted one year ago

Hi CheekyToad28 , nice job 🙂

To use a path other than /opt/clearml, I made a little hack in apiserver/config/basic.py and added another path for configs.

This can be achieved using the CLEARML_CONFIG_DIR env var.

I guess I could use some environment variable on the uwsgi files.

Out of curiosity, what would you need that var for, exactly, and what would it control?

Posted one year ago

Also, you could simply set up a VM and run the Docker image inside it...

Posted one year ago

SuccessfulKoala55 we are not using Docker for security reasons. I'm trying to find a way to run the service on our system (it should run on our network), not as a stand-alone server on a VM, which would need extra maintenance.

Posted one year ago