I'm back here to say that I was able to bring up ClearML without Docker. I asked the lab to test it to make sure it functions properly.
The process was:
- Creating the web interface by cloning clearml-web and building it. During the build I had to fix some paths, so I did and created a pull request to the original repository. I set up the apiBaseUrl and fileBaseUrl according to the locations I use in the nginx.conf.
- Running the api server and the files server using uwsgi. In order to use a path other than /opt/clearml I made a little hack in apiserver/config/basic.py and added another path for configs.
- Running an nginx server with uwsgi_pass pointing to the socket files I used for the api and files servers (I prefer file sockets over high ports) - see the sketch below.
Running redis, mongodb and elasticsearch locally made it easier since I didn't have to change their configuration. I guess I could use some environment variables in the uwsgi files.
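Roughly, the nginx side looks like the sketch below - a minimal sketch only, where the server name, socket paths and webapp root are placeholders rather than my exact values:

```nginx
# Minimal sketch - server_name, socket paths and the webapp root below
# are placeholders, not the exact values from my setup.
server {
    listen 80;
    server_name clearml.example.com;

    # Angular app built from clearml-web
    root /srv/clearml/webapp;

    # apiserver running under uwsgi, bound to a unix socket
    location /api/ {
        include uwsgi_params;
        uwsgi_pass unix:/run/clearml/apiserver.sock;
    }

    # fileserver running under uwsgi, bound to a unix socket
    location /files/ {
        include uwsgi_params;
        uwsgi_pass unix:/run/clearml/fileserver.sock;
    }
}
```

The webapp's apiBaseUrl and fileBaseUrl then just point at whatever locations you expose here.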
Thank you all
Hi TimelyPenguin76
The only option for Linux is a prebuilt Docker image, which I cannot use on our system.
Hi CheekyToad28 , any specific reason you can't use the prebuilt docker images?
You can try using the GCP VMDK, perhaps
Couldn't you simply prefix the command that runs at system startup with the env var, e.g. CLEARML_CONFIG_DIR=/my/path the-command-to-run?
Hi CheekyToad28 , nice job 🙂
In order to use a path other than /opt/clearml I made a little hack in apiserver/config/basic.py and added another path for configs.
This can be achieved using the CLEARML_CONFIG_DIR env var
I guess I could use some environment variables in the uwsgi files.
Out of curiosity, what would you need that var for, exactly, and what would it control?
Hi SuccessfulKoala55
If the code uses environment variables at runtime (such as CLEARML_CONFIG_DIR), I have to pass them at runtime. I can use "export" if I run it manually, or set the environment in the systemd unit file, or set them in the uwsgi configuration file.
Since I want it to run automatically when the system starts, I have to set them in one of the service's configuration files.
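For example, on the systemd side it would be something like this rough sketch - the unit name, paths and venv location are illustrative, not my actual layout:

```ini
# Rough sketch of a systemd unit for the apiserver - the unit name,
# paths and venv location are illustrative, not my actual layout.
[Unit]
Description=ClearML API server (uwsgi)
After=network.target

[Service]
# Pass the config location to the process at runtime
Environment=CLEARML_CONFIG_DIR=/srv/clearml/config
ExecStart=/srv/clearml/venv-apiserver/bin/uwsgi --ini /srv/clearml/apiserver-uwsgi.ini
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

The uwsgi-side equivalent would be an env = CLEARML_CONFIG_DIR=/srv/clearml/config line in the uwsgi ini file.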
Well, installing it manually would require a lot of work, including building the Angular app and installing the various requirements for the different servers (apiserver, fileserver etc.) - without Docker, each should be run in its own virtual Python environment
SuccessfulKoala55, we are not using Docker for security reasons. I'm trying to find a way to run the service on our system (it should run on our network) and not as a stand-alone server on a VM, which would need extra maintenance.
Hi CheekyToad28 ,
None of the options in https://clear.ml/docs/latest/docs/deploying_clearml/clearml_server#deployment work for you?
Also, you can simply set up a VM and run the Docker image inside it...