
if I were to run an agent that would require installing pandas at some point, I'd run it like this:
OPENBLAS="$(brew --prefix openblas)" clearml-agent daemon --queue default
for example I had to run OPENBLAS="$(brew --prefix openblas)" pip install pandas
to be able to install pandas on my M1 Mac
right, I'm saying I had to do that on my Mac. In your case you would have to point it somewhere else. Please check where openblas is installed on your Ubuntu machine
where is the dataset stored? maybe you deleted the credentials by mistake? or maybe you are not installing the libraries needed (for example, AWS needs boto3 and GCP needs google-cloud-storage)
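That dependency check could be scripted up front. A hypothetical helper (the scheme-to-module mapping below is an illustration, not a ClearML API) that reports which storage client library is missing before the download is attempted:

```python
import importlib.util

# Illustrative mapping from storage URI scheme to the client library it needs
REQUIRED = {"s3": "boto3", "gs": "google.cloud.storage"}

def missing_storage_lib(uri_scheme: str):
    """Return the missing module name for a storage scheme, or None if all good."""
    mod = REQUIRED.get(uri_scheme)
    if mod is None:
        return None  # local/file storage needs no extra library
    # Check only the top-level package (e.g. "google" for google.cloud.storage)
    return None if importlib.util.find_spec(mod.split(".")[0]) else mod
```

Calling `missing_storage_lib("s3")` before the task runs would surface a missing boto3 early instead of failing mid-download.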
line 120 says to uncomment to enable venv caching (it comes commented out by default, but since I'm copying my conf it isn't commented there)
also, I suggested changing the TMPDIR env variable, since /tmp/ didn't have a lot of space.
agent.environment.TMPDIR = ****
is it ok to see **** instead of the actual path?
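For reference, the masked setting above corresponds to an entry like this in clearml.conf (the path below is a placeholder; point it at any directory on a disk with enough free space):

```hocon
agent {
    environment {
        # placeholder path – replace with a directory on a large disk
        TMPDIR: "/path/with/space/tmp"
    }
}
```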
just do:
from clearml import Dataset  # import needed for Dataset.get
import os.path as op

dataset_folder = Dataset.get(dataset_id="...").get_local_copy()
csv_file = op.join(dataset_folder, 'salary.csv')
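The same join-then-read pattern can be sketched with only the standard library (the temp folder below stands in for the folder returned by `get_local_copy()`, and the salary.csv contents are made-up demo data):

```python
import csv
import os.path as op
import tempfile

# Simulate the folder that Dataset.get(...).get_local_copy() would return
dataset_folder = tempfile.mkdtemp()
with open(op.join(dataset_folder, "salary.csv"), "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["name", "salary"])   # demo header, not the real dataset
    writer.writerow(["alice", "100"])

# The pattern from the message: join the folder with the known file name
csv_file = op.join(dataset_folder, "salary.csv")
with open(csv_file, newline="") as f:
    rows = list(csv.reader(f))
```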
so I can run the experiments, I can see them, but no plots are saved because there is an upload problem when uploading to localhost:8085
great! and I saw that there were some system packages needed for opencv that were installed automatically, which could be turned off. Now I'm just wondering if I could remove the pip install at the very beginning, so it starts straight away
oh but docker ps
shows me 8081 ports for the webserver, apiserver and fileserver containers
` CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
0b3f563d04af allegroai/clearml:latest "/opt/clearml/wrappe…" 7 minutes ago Up 7 minutes 8008/tcp, 8080-8081/tcp, 0.0.0.0:8080->80/tcp, :::8080->80/tcp clear...`
mmm, can you try the following:
- create a new folder with no git repo, and copy those two notebooks
- launch the notebook with the base task and copy the task id
- launch the notebook with the hyperopt task, modifying the TEMPLATE_TASK_ID variable accordingly
would it be possible to change the dataset.add_files to some function that moves your files to a common folder (local or cloud), and then use the last step in the DAG to create the dataset from that folder?
Hi ExasperatedCrocodile76 , I guess that you were able to install Scikit-learn and run it locally, and now you want to try it with an agent on the same machine.
The error is that it can't find OpenBLAS:
` Run-time dependency openblas found: NO (tried pkgconfig and cmake)
Run-time dependency openblas found: NO (tried pkgconfig)
../../scipy/meson.build:130:0: ERROR: Dependency "OpenBLAS" not found, tried pkgconfig `
My question is: did you export some env variabl...
Hi AgitatedDove14 , I'm talking about the following pip install.
After that pip install, it displays the agent's conf, shows the installed packages, and launches the task (no installation)
` Running in Docker mode (v19.03 and above) - using default docker image: spoter ['-e CLEARML_AGENT_SKIP_PYTHON_ENV_INSTALL=1', '-e CLEARML_AGENT_SKIP_PIP_VENV_INSTALL=1']
Running task '3ebb680b17874cda8dc7878ddf6fa735'
Storing stdout and stderr log to '/tmp/.clearml_agent_out.tsu2tddl.txt', '/tmp/.clearml_agent_o...
how do I mount my local ssh folder into /root/.ssh/ in the docker when running clearml-agent?
also, is there a way for it to not install the requirements, and simply run the task?
Thanks TimelyPenguin76 for your answer! So indeed it was mounting it, and how do I check that 'CLEARML_AGENT_SKIP_PYTHON_ENV_INSTALL' is working in my agent in docker?
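One quick way to confirm is from inside the task itself: print the variables the agent passes to the container (a hypothetical snippet to drop near the top of the task script; the variable names come from the agent startup line quoted earlier):

```python
import os

# If the agent's skip flags reached the process, these print "1";
# otherwise they print "<not set>".
for var in ("CLEARML_AGENT_SKIP_PYTHON_ENV_INSTALL",
            "CLEARML_AGENT_SKIP_PIP_VENV_INSTALL"):
    print(var, "=", os.environ.get(var, "<not set>"))
```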
Thanks for the answer. You're right, I forgot to add that this task runs inside a docker container and I'm currently only mapping the $PWD ( ml
folder) into the /app folder in the container.
there is no /usr/share/elasticsearch/logs/clearml.log
file (neither inside the container nor on my server)
I'm suggesting MagnificentWorm7 to do that, yes, instead of adding the files to a ClearML dataset in each step
I could map the root folder of the repo into the container, but that would mean everything ends up in there
another thing: I had to change 8081
to 8085
since it was already in use
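Assuming the stock ClearML docker-compose layout, that remap would live under the fileserver service in docker-compose.yml; a sketch of the changed lines (verify against your own compose file):

```yaml
fileserver:
  ports:
    - "8085:8081"   # host 8085 -> container 8081 (8081 was already in use)
```

Note the clients (and clearml.conf files_server entries) then need to point at port 8085 on the host, which matches the localhost:8085 upload target mentioned above.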
so when inside the docker, I don't see the git repo and that's why ClearML doesn't see it
but would installing git+
<user>/rmdatasets
install rmdatasets == 0.0.1 ?
Arenāt they redundant?
please remove rmdatasets == 0.0.1
that depends… would that only keep the latest version of each file?