It might be because of cached data.
Please delete the cache data and try again
each time you start the server.
My requirement is as follows:
- The NAS is connected to the server.
- I am pushing data from a remote machine to the server.
- By default, the server stores the data in /opt/clearml/data/fileserver. However, I want to store the data on the NAS.

Additionally, by making changes in the docker-compose file, I can adjust the storage path as desired. However, when I set the path to the NAS, it results in errors.
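For reference, redirecting the file server's storage is usually done by changing the host side of the fileserver bind mount in the ClearML docker-compose file. A minimal sketch, assuming the NAS is already mounted on the host at `/mnt/nas/clearml-fileserver` (that path is an assumption, not from the thread):

```yaml
# Hedged sketch of the relevant part of ClearML's docker-compose.yml.
# Only the host-side path changes; the container-side path stays the same.
services:
  fileserver:
    volumes:
      # was: /opt/clearml/data/fileserver:/mnt/fileserver
      - /mnt/nas/clearml-fileserver:/mnt/fileserver
```

If the NAS path causes errors, a common culprit is the container user lacking write permission on the NFS export, so checking ownership/permissions on the mount point is worth doing first.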
@<1537605940121964544:profile|EnthusiasticShrimp49>
I have configured it correctly.
I am able to send data from the container to the ClearML server.
If clearml-agent is the only way, can you provide any documentation?
Can I configure the server with a public IP?
Right now it is running at port 8080,
but I want it to run at something like 12.12.12.11:8080.
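Clients are typically pointed at a specific server IP through the `api` section of `clearml.conf` on the machine that runs the experiments. A sketch using the example IP from the thread and ClearML's default ports (8080 web, 8008 API, 8081 file server):

```
# Hedged clearml.conf fragment; the IP is the example from this thread.
api {
    web_server: http://12.12.12.11:8080
    api_server: http://12.12.12.11:8008
    files_server: http://12.12.12.11:8081
    credentials {
        "access_key" = "<your-access-key>"
        "secret_key" = "<your-secret-key>"
    }
}
```

The server itself binds to all interfaces by default, so reaching it on a public IP is mostly a matter of the clients using that IP and the ports being open in the firewall.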
I have run an Ubuntu 20.04 container and cloned YOLOv5 inside it. Within the container, I configured ClearML (self-hosted server) with access keys and credentials.
I am launching YOLOv5 training with project and name tags. However, the experiment results are not being logged to the ClearML server; instead, they are saved inside the container's root directory under the <project/name> folder.
Interestingly, when I tried running the process directly on the host machine, the experiment results ...
@<1537605940121964544:profile|EnthusiasticShrimp49>
Sorry for the very late reply. We have mounted an NFS-accessed NAS folder on the machine. Subsequently, we used that folder to store all the files of the file server, and now all the files are stored inside the NAS without explicitly specifying the NAS every time we upload the dataset to the server.
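For anyone following the same route: the NFS share can be mounted persistently on the host so the file server path lands on the NAS across reboots. A hedged `/etc/fstab` sketch, where the NAS address and export path are assumptions for illustration:

```
# Hedged /etc/fstab entry: mount the NAS export over the default
# ClearML fileserver data directory (address/export are placeholders).
192.168.1.50:/export/clearml  /opt/clearml/data/fileserver  nfs  defaults  0  0
```

With this in place, nothing in the docker-compose file needs to change, since the default path now transparently resolves to the NAS.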