Yes, could you send the full log? Or a screen grab?
Yep, found it, the --name is marked as required and the argparser throws an error ...
I'll make sure this is fixed as well
A quick fix will be:
import os
import dotenv
# load the .env before importing clearml, so the SDK picks up the variables
dotenv.load_dotenv(os.path.expanduser('~/.env'))
from clearml import Task  # Now we can load it.
import argparse

if __name__ == "__main__":
    # do stuff
wdyt?
hm ReassuredTiger98 can you send the full log? I think it should have worked (but as you mentioned it might be conda/pip mix?!)
SharpDove45 FYI:
if you set the environment variable CLEARML_NO_DEFAULT_SERVER=1, it will make sure never to default to the demo server
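For example, a minimal sketch, assuming the variable is set before the first clearml import (you could just as well export it in the shell):
```python
import os

# must happen before importing clearml, otherwise the SDK may already
# have resolved its default server configuration
os.environ["CLEARML_NO_DEFAULT_SERVER"] = "1"

from clearml import Task

task = Task.init(project_name="examples", task_name="no demo fallback")
```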
AstonishingSeaturtle47 that's awesome! Could you explain the hack, it might be helpful for others (I assume :))
If the only issue is this line: task.execute_remotely(..., exit_process=True)
It has to finish the static analysis of the entire repository (which usually happens in the background, but now we have to wait for it). If the repo is large this could actually take ~20 seconds (depending on the CPU/drive of the machine itself)
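For reference, a minimal sketch of the call in context (the queue name is just an example):
```python
from clearml import Task

task = Task.init(project_name="examples", task_name="remote run")

# enqueue the task for an agent and stop the local process;
# this is the point where the repository analysis has to complete first
task.execute_remotely(queue_name="default", exit_process=True)
```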
Hi @<1559711593736966144:profile|SoggyCow20>
I would first like to say how amazing clearml is!
Thank you!
Running in Docker mode (v19.03 and above) - using default docker image: nvidia/cuda:10.2-cudnn7-runtime-ubuntu18.04
yes, sdk.agent.default_docker.image = python:3.10.0-alpine
should be agent.default_docker.image = python:3.10.0-alpine
Notice the scope is agent, not sdk
Thanks GrievingTurkey78 !
It seems that under the hood they use argparse
See here:
https://github.com/google/python-fire/blob/c507c093fa6622ab5efee21709ffbf25974e4cf7/fire/parser.py
Which means it might just work?!
What do you think?
Well, that depends on you. What did you write there to know it is the best one? The file name? Did you add some metric?
Hi @<1801424298548662272:profile|ConvolutedOctopus27>
I am getting errors related to invalid git credentials. How do I make sure that it's using credentials from local machine?
configure the git_user/git_pass (app key) inside your clearml.conf on the machine with the agent.
I want to be able to install the venv on multiple servers and start the "simple" agents on each one of them. You can think of it as some kind of one-off agent for a specific (distributed) hyperparameter search task
ExcitedFish86 Oh if this is the case:
in your clearml.conf:
agent.package_manager.type: conda
agent.package_manager.conda_env_as_base_docker: true
https://github.com/allegroai/clearml-agent/blob/36073ad488fc141353a077a48651ab3fabb3d794/docs/clearml.conf#L60
https://git...
Thanks StrongHorse8
Where do you think would be a good place to put a more advanced setup? Maybe we should add an entry for DevOps? Wdyt?
I can share some code
Please do
Each user creates a .env file for their needs or exports them in the shell running the python code. Currently I copy the environment variables to an S3 bucket and download it from there
That is a great hack, but who carries the credentials for the S3 bucket? The reason for asking is that I'm thinking maybe the code could directly do that (meaning download the .env file and apply it?!)
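Something along these lines, as a rough sketch (the bucket/key names are placeholders, and it assumes boto3 can resolve credentials from an IAM role or the environment):
```python
import os
import tempfile

import boto3
import dotenv

# hypothetical bucket/key, just for illustration
BUCKET = "my-team-bucket"
KEY = "shared/.env"

def load_env_from_s3(bucket: str = BUCKET, key: str = KEY) -> None:
    # download the shared .env file to a temporary location
    local_path = os.path.join(tempfile.gettempdir(), ".env")
    boto3.client("s3").download_file(bucket, key, local_path)
    # apply the variables to the current process before importing clearml
    dotenv.load_dotenv(local_path)

load_env_from_s3()
from clearml import Task  # noqa: E402  (import after the env is populated)
```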
- At its simplest, this could just mean checking that all of the steps and the pipeline itself have completed successfully (by checking their "Task status").
If a pipeline step ends with a "failed" status, an exception will be raised in the pipeline execution function; if the exception is not caught, the pipeline itself will also fail
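A rough sketch of what catching a failed step could look like with the decorator syntax (the step, project, and pipeline names here are made up):
```python
from clearml.automation.controller import PipelineDecorator

@PipelineDecorator.component(return_values=["result"])
def flaky_step():
    # if this step fails, an exception is raised back in the pipeline logic function
    raise RuntimeError("something went wrong")

@PipelineDecorator.pipeline(name="example pipeline", project="examples", version="1.0")
def pipeline_logic():
    try:
        result = flaky_step()
        # using the result waits for the step to complete, surfacing any failure
        print(result)
    except Exception:
        # handle the failed step here; if we do not catch (or we re-raise),
        # the pipeline itself will also end up as failed
        print("step failed, continuing with a fallback")

if __name__ == "__main__":
    PipelineDecorator.run_locally()
    pipeline_logic()
```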
run pipeline_script.py which contains the pipeline code as decorators.
So in theory the following should actually work.
Let's assume you ...
Hmm, as a quick solution you can use the custom example and load everything manually:
https://github.com/allegroai/clearml-serving/blob/219fa308df2b12732d6fe2c73eea31b72171b342/examples/custom/preprocess.py
But you have a very good point, I'm not sure how one could know what the correct xgboost class is. Do you?
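For reference, a rough sketch of loading an xgboost model manually in the custom Preprocess class (the method names follow the linked example but the exact signatures may differ between clearml-serving versions; it assumes the stored model file is an xgboost Booster):
```python
from typing import Any, Optional

import numpy as np
import xgboost as xgb

class Preprocess(object):
    def load(self, local_file_name: str) -> Optional[Any]:
        # we know the artifact is an xgboost Booster, so load it explicitly
        self._model = xgb.Booster()
        self._model.load_model(local_file_name)
        return self._model

    def preprocess(self, body: dict, state: dict, collect_custom_statistics_fn=None) -> Any:
        # turn the request payload into the DMatrix xgboost expects
        return xgb.DMatrix(np.atleast_2d(body.get("features")))

    def process(self, data: Any, state: dict, collect_custom_statistics_fn=None) -> Any:
        # run the actual inference with the loaded Booster
        return self._model.predict(data)

    def postprocess(self, data: Any, state: dict, collect_custom_statistics_fn=None) -> dict:
        return {"prediction": data.tolist()}
```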
Hi @<1631102016807768064:profile|ZanySealion18>
sorry missed that one
The cache doesn't work, it attempts to download the dataset every time.
just making sure the dataset itself contains all the files?
Once I used the clearml-data add --folder * CLI everything works correctly (though all files recursively ended up in the root; luckily all were named differently).
Not sure I follow here, is the problem the creation of the dataset or fetching it? Is this a single version or multi...
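For reference, fetching usually looks something like the sketch below, and get_local_copy() should hit the local cache on subsequent calls (the project/dataset names are placeholders):
```python
from clearml import Dataset

# the first call downloads the dataset; repeated calls should reuse the local cache
dataset = Dataset.get(dataset_project="my_project", dataset_name="my_dataset")
local_path = dataset.get_local_copy()
print(local_path)
```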
Correct. Notice you need two agents, one for the pipeline (logic) and one for the pipeline components.
That said, you can run two agents on the same machine
Hi ShinyPuppy47
getting this error pretty sporadically
What do you mean by "sporadically"? This should be consistent: either there is access to the clearml.conf file or not, no?!
What is your setup? Is this coming from the agent or manual execution ?
Could not install packages due to an EnvironmentError: [Errno 2] No such file or directory: '/tmp/build/80754af9/attrs_1604765588209/work'
Seems like pip failed creating a folder
Could it be you are out of disk space?
but it is not possible to write to a private channel in which the bot is added.
Is this a Slack limitation?
@<1787653555927126016:profile|SoggyDuck67> notice the binary field in the Task "execution" tab; if for some reason it says "python3.10" it will try to use python 3.10 when running it.
That said, if it does not find the requested python version, it should output a warning and default to the python installed.
If you can provide the full log it will be helpful to see what happened there
DisturbedWorm66 it does, I think there is an example here:
https://github.com/allegroai/nvidia-clearml-integration/tree/main/tlt
Or did you mean I can couple a short "mini config" with the package and redirect clearml to use this local one (instead of the one at ~/clearml.conf)?
Actually yes, you can set a "fixed" config and point to it with an ENV variable, then set up just the access/secret key per user.
wdyt?
(I was also pointing to the fact that you do not have to use clearml-init; you can create a simple partial config template and let users just fill in the missing "key"/"secret")
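As a rough sketch (the config path is just an example; CLEARML_CONFIG_FILE points the SDK at the shared template, and the per-user credentials can come from environment variables):
```python
import os

# point the SDK to a shared, partial configuration file (hypothetical path)
os.environ["CLEARML_CONFIG_FILE"] = "/opt/shared/clearml.conf"

# per-user credentials, e.g. injected by the shell or a secrets manager
os.environ["CLEARML_API_ACCESS_KEY"] = "<user access key>"
os.environ["CLEARML_API_SECRET_KEY"] = "<user secret key>"

from clearml import Task  # noqa: E402  (import after the environment is set)
```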
@<1687643893996195840:profile|RoundCat60> I'm assuming we are still talking about the S3 credentials, sadly no
Are you familiar with boto and IAM roles?
Notice you have in the Path: /home/npuser/.clearml/venvs-builds/3.7/task_repository/commons-imagery-models-py/sfi
But you should have: /home/npuser/.clearml/venvs-builds/3.7/task_repository/commons-imagery-models-py/