Here are my extra_docker_arguments that make the thing work:
GentleSwallow91 Nice!
BTW: in theory there should be no need to add the specific "-v","/home/nino/.ssh:/home/testuser/.ssh"; the agent should do that automatically
I think the main issue is running with python -m module.name --args
Which is a bit different when trying to "understand" what the actual repository is.
Can you try to run it from the repository folder (same command, just to see if it has any effect on the detected packages)?
Try to upload something to the file server ?
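For reference, a quick way to sanity-check the upload path from Python is something like this sketch (the target URL is a placeholder, point it at the files_server from your own clearml.conf):
```python
from clearml import StorageManager

# Placeholder target - replace with the files_server URL from your clearml.conf
remote_url = "https://files.clear.ml/debug/upload_test.txt"

# Upload a small local file and wait for the transfer to complete
uploaded_url = StorageManager.upload_file(
    local_file="upload_test.txt",
    remote_url=remote_url,
    wait_for_upload=True,
)
print("Uploaded to:", uploaded_url)
```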
Notice there is a scroll_id there; you might need to call the API multiple times until you scroll over all the events.
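For example, a minimal pagination sketch using the APIClient (events.get_task_events is just an illustration here; substitute whichever events call you are actually using, and the real task ID):
```python
from clearml.backend_api.session.client import APIClient

client = APIClient()
task_id = "<your-task-id>"  # placeholder

scroll_id = None
all_events = []
while True:
    # Each call returns one page of events plus a scroll_id for the next page
    res = client.events.get_task_events(
        task=task_id,
        batch_size=1000,
        scroll_id=scroll_id,
    )
    if not res.events:
        break
    all_events.extend(res.events)
    scroll_id = res.scroll_id

print("Collected", len(all_events), "events")
```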
could that be it?
Hi VirtuousFish83
Hello, is it possible to disable lazy loading ?
You mean in the UI for loading the console ?
The logs can be huge, 10s and 100s of MB...
We have the same issue for hyperparameters even with only ~100 keys,
100+ parameters, that is quite a lot.
So are you saying the search in the UI only filter the lazily loaded elements and not the entire param list?
Hi ArrogantBlackbird16
but it returns a task handle even after the Task has been closed.
It should not ... That is a good point!
Let's fix that 🙂
Well I guess you could say this is definitely not a self-explanatory line 🙂
but it is actually asking whether we should extract the code; think of it as:
if extract_archive and cached_file:
    return cls._extract_to_cache(cached_file, name)
Hi MuddySquid7
Hmmm what would be the use case ? (I mean how are we using Vertex ?)
so you have a repo with poetry that some users update and some do not?
All working on the same branch ?
My question is, which docker compose version do you need?
Ohh sorry, there is no real restriction, we just wanted easy copy-paste for the installation process.
Are you aware of any other way then (other than the secure: false flag)?
Actually, self-signing and providing a certificate file are already supported with boto (and thus clearml):
AWS_CA_BUNDLE
https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html
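For example (a sketch; the certificate path is a placeholder, set the variable before clearml/boto is first used):
```python
import os

# Placeholder path - point this at your own CA bundle / self-signed certificate
os.environ["AWS_CA_BUNDLE"] = "/etc/ssl/certs/my-internal-ca.pem"

# Import clearml only after the environment variable is set,
# so boto3 picks up the custom certificate bundle
from clearml import Task

task = Task.init(project_name="examples", task_name="s3 with self-signed cert")
```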
Seems like a credentials error
Do you have everything setup correctly in your ~/clearml.conf ?
UnsightlySeagull42 the assumption is that the agent has a read-only, all-access user.
At the moment there is no way to configure a different user/pass per repository in the clearml.conf
You can however:
- embed the user/pass in the repository link (not very secure)
- use an ssh-key and have it under .ssh on the host machine
- use .git-credentials and configure them (with per-project user/pass)
I guess only if autoscaling is used (one worker one machine)?
yes, basically depending on how you set autoscaling / k8s integration 🙂
--docker, or set it in clearml.conf: https://github.com/allegroai/clearml-agent/blob/21c4857795e6392a848b296ceb5480aca5f98e4b/docs/clearml.conf#L153
Many thanks LazyLeopard18! 🙂
Thanks MagnificentPig49 !
but it is not optimal if one of the agents is only able to handle tasks of a single queue (e.g. if the second agent can only work on tasks of type B).
How so?
SoreDragonfly16
btw: the difference between the two graphs is the ratio of the graph display, that's it 🙂
Hi WickedGoat98
but is there also a way to delete them, or wipe complete projects?
https://github.com/allegroai/trains/issues/16
Auto cleanup service here:
https://github.com/allegroai/trains/blob/master/examples/services/cleanup/cleanup_service.py
Yes, but does add_external_files make chunked zips as add_files does?
No, it references them (i.e. meta-data, not actually doing anything with the files themselves)
I need the zipping, chunking to manage millions of files
That makes sense; if that's the case you will have to download those files anyway, and then add them with add_files
You can use the StorageManager to download them, and then add them from the local copy (this will zip/chunk them)
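Something along these lines (a sketch; the remote URL and dataset names are placeholders):
```python
from clearml import Dataset, StorageManager

# Download a local copy of the remote data first
# (placeholder URL - replace with your actual storage location)
local_copy = StorageManager.get_local_copy(
    remote_url="s3://my-bucket/my-data.zip",
)

# Adding the local copy lets clearml-data zip/chunk the files for you
dataset = Dataset.create(dataset_name="my_dataset", dataset_project="datasets")
dataset.add_files(path=local_copy)
dataset.upload()
dataset.finalize()
```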
You are correct, it is currently not supported in venv mode. We could not find a good use case for it. What is yours?
I'm thinking it's generically a kernel gateway issue, but I'm not sure if other platforms are using that yet
The odd thing is that you can access the notebook, but it returns zero kernels ..
UnevenDolphin73 FYI: clearml-data is documented, unfortunately only on GitHub:
https://github.com/allegroai/clearml/blob/master/docs/datasets.md
Hi LazyLeopard18 ,
See details below, are you using the win10 docker-compose yaml?
https://github.com/allegroai/trains-server/blob/master/docs/install_win.md
We're lucky that they let the developers see their code...
LOL 🙂
and it is also set in the /clearml-agent/.ssh/config and it still can't clone it. So it must be some security issue internally.
Wait, are you using docker mode or venv mode? In both cases your SSH credentials should be at the default ~/.ssh
PleasantGiraffe85 can you send examples of the different git repo links (one internal, one public)?