
Hi! What the error is saying is that it is looking for the ctbc/image_classification_CIFAR10.py
file in your repo.
So when you created the task you were inside a git repo, and ClearML assumed that all your files in it were committed and pushed. However, your repo https://github.com/gradient-ai/PyTorch.git doesn't contain these files.
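If you want to double-check, something like this (run from inside the repo; I'm assuming your default branch is main) shows whether the script is tracked and pushed:
```
# is the script tracked by git at all?
git ls-files | grep image_classification_CIFAR10.py

# anything uncommitted or unpushed?
git status
git log origin/main..HEAD --oneline
```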
mmm, can you try the following (see the sketch below):
- create a new folder with no git repo, and copy those two notebooks into it
- launch the notebook with the base task and copy the task id
- launch the notebook with the hyperopt task, modifying the TEMPLATE_TASK_ID variable accordingly
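Roughly like this; the notebook file names here are just placeholders for your two notebooks:
```
# work from a fresh folder that is not a git repo
mkdir ~/clearml-sandbox && cd ~/clearml-sandbox
cp /path/to/base_task.ipynb /path/to/hyperopt_task.ipynb .

# run the base notebook first and note its task id, then set
# TEMPLATE_TASK_ID in the hyperopt notebook and run it
jupyter notebook
```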
Hi ExasperatedCrocodile76, I guess you were able to install Scikit-learn and run it locally, and now you want to try it with an agent on the same machine.
The error is that it can’t find OpenBLAS:
```
Run-time dependency openblas found: NO (tried pkgconfig and cmake)
Run-time dependency openblas found: NO (tried pkgconfig)
../../scipy/meson.build:130:0: ERROR: Dependency "OpenBLAS" not found, tried pkgconfig
```
My question is: did you export some env variable...
For example, I had to do `OPENBLAS="$(brew --prefix openblas)" pip install pandas` to be able to install pandas on my M1 Mac.
If I were to run an agent that would need to install pandas at some point, I'd run it as: `OPENBLAS="$(brew --prefix openblas)" clearml-agent daemon --queue default`
Right, I'm saying I had to do that on my Mac. In your case you would have to point it somewhere else. Please check where OpenBLAS is installed on your Ubuntu machine.
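A hedged sketch for Ubuntu (the package name assumes Debian/Ubuntu repositories):
```
# is an OpenBLAS shared library already registered?
ldconfig -p | grep -i openblas

# is the dev package (headers + pkg-config file) installed?
dpkg -l | grep -i openblas

# if not, install it
sudo apt-get install libopenblas-dev
```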
Where is the dataset stored? Maybe you deleted the credentials by mistake? Or maybe you are not installing the libraries needed (for example, if using AWS you need boto3; if GCP, google-cloud-storage).
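e.g., install whichever client matches where the dataset actually lives:
```
pip install boto3                 # dataset on S3
pip install google-cloud-storage  # dataset on GCS
```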
but that would only affect that session in the terminal… so you would want to add it to your .bashrc
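Something like this, where the path is a placeholder for wherever OpenBLAS actually lives on your machine:
```
# persist the variable for every new shell, not just this session
echo 'export OPENBLAS=/path/to/openblas' >> ~/.bashrc
source ~/.bashrc
```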
but would installing git+<user>/rmdatasets install rmdatasets == 0.0.1? Aren't they redundant?
Right, you don't want ClearML to track that package, but there isn't much you can do there, I believe. I was trying to tackle how to run your code with an agent given those dependencies.
I think that if you clone the experiment and remove that line in the dependencies section in the UI, you should be able to launch it correctly (as long as your clearml-agent has the correct permissions).
and btw:
```
if "@" in line:
    line = line.replace("@", "https://")
```
should be the same as ...
would the same experiment be called in either ClearML server?
there is no /usr/share/elasticsearch/logs/clearml.log file (neither inside the container nor on my server)
Currently I'm replacing /opt/ with my home folder
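Roughly like this; the exact folder names depend on what your docker-compose.yml mounts:
```
# recreate the data folders under $HOME instead of /opt
mkdir -p ~/clearml/data ~/clearml/logs ~/clearml/config
# then replace /opt/clearml with $HOME/clearml in every
# volume entry of docker-compose.yml
```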
oh but `docker ps` shows me 8081 ports for the webserver, apiserver and fileserver containers:
```
CONTAINER ID   IMAGE                      COMMAND                  CREATED         STATUS         PORTS                                                            NAMES
0b3f563d04af   allegroai/clearml:latest   "/opt/clearml/wrappe…"   7 minutes ago   Up 7 minutes   8008/tcp, 8080-8081/tcp, 0.0.0.0:8080->80/tcp, :::8080->80/tcp   clear...
```
I also changed the permissions of /usr/share/elasticsearch according to this post: https://techoverflow.net/2020/04/18/how-to-fix-elasticsearch-docker-accessdeniedexception-usr-share-elasticsearch-data-nodes/, but I'm getting the same error
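For reference, the fix in that post boils down to this; the path is an example, use whatever directory your compose file mounts for the Elasticsearch data:
```
# the elasticsearch image runs as uid 1000, so the mounted
# data dir must be owned by that uid
sudo chown -R 1000:1000 /opt/clearml/data/elastic_7
```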
ok, I entered the container, replaced every occurrence of 8081 with 8085, committed the container, and changed the docker-compose.yml to use that image instead of allegroai/clearml:latest, and now it works 🙂
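In other words, something like this, with a made-up image tag:
```
# snapshot the edited container as a new image
docker commit 0b3f563d04af clearml-custom:8085
# point docker-compose.yml at clearml-custom:8085 instead of
# allegroai/clearml:latest, then restart
docker-compose up -d
```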
another thing: I had to change 8081 to 8085, since that port was already in use
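If it helps, a quick way to see what was holding the port on most Linux machines:
```
# which process is already bound to 8081?
sudo ss -tlnp | grep 8081
# or
sudo lsof -i :8081
```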
I'm suggesting that MagnificentWorm7 do that, yes, instead of adding the files to a ClearML dataset in each step.
That's why I'm suggesting it 🙂
Hey! When you say it wasn’t enough, what do you mean? Can you launch the web UI?
just do:
```
import os.path as op
from clearml import Dataset

dataset_folder = Dataset.get(dataset_id="...").get_local_copy()
csv_file = op.join(dataset_folder, 'salary.csv')
```
that depends… would that only keep the latest version of each file?
Thanks for the answer. You're right. I forgot to add that this task runs inside a docker container, and I'm currently only mapping $PWD (the ml folder) into the /app folder in the container.
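i.e. roughly this, with the image name as a placeholder:
```
# only the current folder (ml) is visible inside the container
docker run -v "$PWD":/app my-image
# anything outside ml/ needs its own mount, e.g.
docker run -v "$PWD":/app -v /path/to/data:/data my-image
```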