Not really 😞
Everyone can do everything; the idea is shareability and accessibility.
I do know that in the paid tier they have full access control, roles, SSO, etc., but unfortunately it's way too complicated for the open-source version.
Basically what I'm saying is trust your fellow colleagues 🙂
That speed depends on model sizes, right?
in general yes
Hope that makes sense. This would not work under heavy loads, but e.g. we have models that are used only once a week. They would just stay unloaded until use, and could be offloaded afterwards.
but then you still might encounter a timeout the first time you access them, no?
ReassuredTiger98 if this user passes the following as docker args to the task, it might work:
'-e CLEARML_AGENT_SKIP_PYTHON_ENV_INSTALL=1'
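For illustration, a minimal sketch of setting that docker argument from code, assuming the standard Task.set_base_docker API (the project/task names and image are just placeholders):
```
from clearml import Task

task = Task.init(project_name="examples", task_name="skip env install")

# extra docker arguments passed to the agent when it runs this task;
# CLEARML_AGENT_SKIP_PYTHON_ENV_INSTALL=1 tells the agent to skip
# creating a python virtual environment inside the container
task.set_base_docker(
    docker_image="python:3.9",  # placeholder image, adjust to your setup
    docker_arguments=["-e", "CLEARML_AGENT_SKIP_PYTHON_ENV_INSTALL=1"],
)
```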
Actually it doesn't matter (systemd and init.d are different ways to spin up services on different linux distros), you can pick whichever seems more convenient for you, and whichever is supported by the linux you are running (in most cases both are) 🙂
Hi GrittyHawk31
this one?
https://clear.ml/docs/latest/docs/deploying_clearml/clearml_server
That should spin up an instance, right? (it currently doesn't, and I'm not sure where to debug)
Do you see the AWS scaler Task running?
(This is the code/process that actually spins a new EC2 instance)
SubstantialElk6
The CA is taken automatically by urllib; check which OS environment variables you need to configure (SSL_CERT_FILE, REQUESTS_CA_BUNDLE):
https://stackoverflow.com/questions/27835619/urllib-and-ssl-certificate-verify-failed-error
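A minimal sketch of what configuring those could look like (the certificate path is a placeholder; set these before any HTTPS call is made):
```
import os

# point both urllib/ssl and requests at the custom CA bundle
# (placeholder path, replace with your actual CA file)
os.environ["SSL_CERT_FILE"] = "/etc/ssl/certs/my-company-ca.pem"
os.environ["REQUESTS_CA_BUNDLE"] = "/etc/ssl/certs/my-company-ca.pem"
```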
@<1671689437261598720:profile|FranticWhale40> could you test the fix? just pull & run
allegroai/clearml-serving-triton:1.3.1
allegroai/clearml-serving-inference:1.3.1
Hi @<1784754456546512896:profile|ConfusedSealion46>
Is this reproducible ?
Could it be out of storage?
Hmmm are you saying the Dataset Tasks do not have the "dataset" system_tag as well as the type?
VivaciousPenguin66 I have the feeling it is the first space in the URI that breaks the credentials lookup.
Let's test it:
```
from clearml import StorageManager

uri = 'Birds%2FTraining/TRAIN [Network%3A resnet34, Library%3A torchvision] Ignite Train PyTorch CNN on CUB200.8611ada5be6f4bb6ba09cf730ecd2253/models/cub200_resnet34_ignite_best_model_0.pt'

# original
StorageManager.get_local_copy(uri)

# quoted
StorageManager.get_local_copy(uri.replace(' ', '%20'))
```
see the docker_setup_bash_script argument here:
None
It will be executed (no need for the #!/bin/bash btw) before starting to set up the env inside the container, so apt-get and the like can be executed if needed. Notice that if this is something that always needs to be executed, you can put the same list of commands here: [None](https://github.com/allegroai/clearml-agen...
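As an illustration, a minimal sketch of passing such a setup script from code, assuming the Task.set_base_docker API (the package names are just examples):
```
from clearml import Task

task = Task.init(project_name="examples", task_name="docker setup script")

# these commands run inside the container before the env setup starts,
# so system packages can be installed with apt-get (example packages only)
task.set_base_docker(
    docker_setup_bash_script=[
        "apt-get update",
        "apt-get install -y libsm6 libxext6",
    ]
)
```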
TenseOstrich47 FYI:
This might be what you are looking for 🙂
https://github.com/allegroai/clearml-agent/blob/822984301889327ae1a703ffdc56470ad006a951/docs/clearml.conf#L61
Hi ConvolutedSealion94
Yes 🙂
```
Task.set_random_seed(123)  # disable setting the random number generators by passing None
task = Task.init(...)
```
Hi @<1523704152130064384:profile|SmallGiraffe94>
Yes it is possible!
You can set the User Properties of a Dataset when creating it. A bit hackish, but it should work:
```
from clearml import Dataset, Task

dataset = Dataset.create(dataset_project="project", dataset_name="name")
dataset._task.set_user_properties(key="value")

dataset_ids = Task.query_tasks(
    project_name=["project/.datasets/name"],
    task_filter=dict(
        type=[str(Task.TaskTypes.data_processing)],
        exact_match_regex_flag=False,
        ...
```
Correct:
```
extra_docker_shell_script: [
    "apt-get install -y awscli",
    "aws codeartifact login --tool pip --repository my-repo --domain my-domain --domain-owner 111122223333"
]
```
I think poetry should somehow return an error if the toml is "empty", then we could detect it...
Hmm I tested on chromium and it seemed to work, let me see if I can reproduce it...
I solved the issue by implementing my own ClearML logger
This is awesome! any chance you want to PR it to transformers ?
I think that by default the zipped package files are 0.5GB
(you can control it, see None, look for --chunk-size)
I think the missing part of the api is understanding which chunk your specific file is stored in.
You can do something like:
```
ds = Dataset.get(...)
the_artifact_chunk_I_need = ds.file_entries_dict["my/file/here"].artifact_name
```
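and then, presumably, fetch just that chunk (this part is my assumption: that each chunk lives as a regular artifact on the dataset's backing task):
```
# assumption: chunks are stored as artifacts on the dataset's backing task,
# so the artifact name from file_entries_dict can be used to pull that chunk's zip
chunk_zip_path = ds._task.artifacts[the_artifact_chunk_I_need].get_local_copy()
```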
wdyt?
maybe worth adding an interface?
Hmm @<1523701279472226304:profile|SoreHorse95> this is a good point, I think you are correct, we need to fix that:
- Could you open a GitHub issue so this is not forgotten ?
- As a workaround I would use clone=True, then after the call I would call task.close() on the original task, wdyt?
The thing I don't understand is how come this DOES work on our linux setups
I do not think it actually works... I could not find any code that converts the ENV in the config string ...
I'll be happy to test it out if there's any commit available?
Please do, and feel free to PR it 😍
https://github.com/allegroai/clearml/blob/d3e986393ac8d1a1ea48302224962570ab8e6f9e/clearml/backend_api/session/session.py#L576
https://github.com/allegroai/clearml/blob/d3e98639...
@<1577468638728818688:profile|DelightfulArcticwolf22>
How can I tell clearml-agent not to run pip install unless my requirements.txt file has changed?
the agent has a built-in cache, it will reuse the previous venv if nothing changed (the cache is local on the agent's machine).
Make sure this line is not commented out:
None
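For reference, this is roughly the relevant venv-cache section in the default clearml.conf (values shown are the defaults as I recall them, so double-check against your own file):
```
agent {
    venvs_cache: {
        # maximum number of cached venvs
        max_entries: 10
        # minimum required free space to allow for cache entry, disable by passing 0 or null
        free_space_threshold_gb: 2.0
        # uncommenting this line is what enables virtual environment caching
        path: ~/.clearml/venvs-cache
    }
}
```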
for example, one notebook will be dedicated to exploring columns, spotting outliers, and creating transformations for specific column values.
This actually implies each notebook is a standalone "process", which makes a ton of sense. But this is where notebooks and proper SW design break: in traditional SW, the notebooks are actually python files, and then of course you can import one from another; unfortunately this does not work in notebooks...
If you are really keen on using notebooks I wou...
task = Task.current_task()
Will get me the task object. (right?)
PanickyMoth78 yes, always, from anywhere, this is a singleton object 🙂
${PWD} works!
This will be resolved on every call to Task.init (so I would recommend against it); how about "$HOME/"?