Okay yes, that's exactly the reason!! Cross origin blocks the file link
Hi CluelessFlamingo93
I think the latest clearml-agent 1.5.1 fixed that issue (this is basically pip trying to "protect" you from mismatched packages)
can you upgrade your clearml-agent and test?
pip3 install clearml-agent==1.5.1
CleanPigeon16 Coming very soon, we are adding a few features to the pipeline, this one will also be included :)
I do not think this is the upload timeout, it makes no sense to me for the GCP package (we do not pass any timeout, it's their internal default for the argument) to include a 60sec timeout for upload...
I'm also not sure where the timeout originates (I'm assuming the initial GCP handshake connection is not what's actually timing out, as the response should be relatively quick, so 60sec is more than enough)
Hi @<1687643893996195840:profile|RoundCat60>
Are you running on AWS ?
Once a model is saved and published, it should be downloadable, right?
Well, that depends on whether you configured ClearML to auto-upload it (by default it will just log the "local location").
To auto-upload, add output_uri=True to Task.init
(or specify a destination with output_uri="s3://bucket/" )
You can also configure it as default here:
https://github.com/allegroai/clearml/blob/65f1c0baa124efb05fb7894a5386f0dd52c0536b/docs/clearml.conf#L163
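For example, a minimal sketch (project/task names are placeholders):
```
from clearml import Task

# auto-upload model snapshots instead of only logging their local path
task = Task.init(
    project_name="examples",           # placeholder
    task_name="auto upload demo",      # placeholder
    output_uri=True,                   # upload to the default files server
    # output_uri="s3://bucket/folder"  # or point at your own bucket
)
```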
We used subprocess for it, ...
Popen? os.system? fork?
No need, it should auto close it if you started it with Task.init (or the agent executed it)
Hi RipeGoose2
What exactly is being uploaded ? Are those the actual model weights or intermediate files ?
Ok no, it only helps as long as I don't log the figure.
you mean if you create the matplotlib figure without the automagic connect you still see the mem leak ?
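To isolate it, you could disable just the matplotlib binding, something along these lines (project/task names are placeholders):
```
from clearml import Task

# keep all automagic bindings except matplotlib, to check whether
# the figure logging is what's holding on to the memory
task = Task.init(
    project_name="examples",   # placeholder
    task_name="leak test",     # placeholder
    auto_connect_frameworks={"matplotlib": False},
)
```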
Are you saying that in the UI you do not see "confusion matrix" at all, only on the GS bucket ?
BTW
/home/local/user/.clearml/venvs-builds/3.7/bin/python: can't open file 'train.py': [Errno 2] No such file or directory
This error is from the agent, correct? It seems it did not clone the correct code. Is train.py committed to the repository ?
Is it across the board for any Task ?
What would you expect to happen if you clone a Task that used the requirements.txt? Would you ignore the full "pip freeze" and use the requirements.txt again, or would we want to use the "installed packages" this time ?
I would like to force the usage of those requirements when running any script
How would you force it? Will you just ignore the "Installed Packages" section ?
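If that's the behavior you're after, one way (if I remember the call correctly, treat this as a sketch) is to force the requirements file before Task.init:
```
from clearml import Task

# force the recorded requirements to come from the file, so the agent
# installs from it instead of the auto-detected "installed packages"
Task.force_requirements_env_freeze(force=True, requirements_file="requirements.txt")
task = Task.init(project_name="examples", task_name="pinned run")  # placeholders
```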
Thanks VexedCat68 !
This is a great example, maybe PR it to the clearml-serving repo ? wdyt?
While if I just download the right packages from the requirements.txt then I don't need to think about that
I see your point, the only question is how come these packages are not automatically detected ?
@<1710827340621156352:profile|HungryFrog27> the venv-build folder is supposed to be deleted after each task is done. How did you end up with leftovers? Could it be windows was failing to delete it for some reason? That actually connects with your initial issue, no?
Hi JitteryCoyote63
Show running experiments
It doesn't?
Have the legend clickable, to hide/show experiments based on their status
:+1:
Have a line connecting points that are SOTA (example in https://paperswithcode.com/sota/image-generation-on-cifar-10 )
I like that, how is that selected? (I know FE are thinking of replacing this entire graph library, so maybe good timing in terms of what to look at)
Thanks PompousBaldeagle18 !
Which software did you use to create the graphics?
Our designer, should I send your compliments 😉 ?
You should add which tech is being replaced by each product.
Good point! we are also missing a few products from the website, they will be there soon, hence the "soft launch"
Hi @<1547390415320125440:profile|SilkySparrow85>
because it is trying to send a debug-sample to fileserver!
Yes, you should always configure the "files server" to point to your minio S3, basically:
files_server: "s3://<minio-host>:<port>/<bucket>"
But do not forget to also configure the credentials here:
https://github.com/allegroai/clearml/blob/40c6db9d95016382c721546d42...
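Roughly, the relevant clearml.conf sections would look like this (host/keys are placeholders):
```
api {
    files_server: "s3://my-minio-host:9000/clearml"
}
sdk {
    aws {
        s3 {
            credentials: [
                {
                    # placeholder endpoint and keys, replace with your minio values
                    host: "my-minio-host:9000"
                    key: "minio-access-key"
                    secret: "minio-secret-key"
                    multipart: false
                    secure: false
                }
            ]
        }
    }
}
```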
LovelyHamster1 from the top, we have two steps:
1. We run the code "manually" (i.e. without the agent). This step creates the experiment (Task) and automatically fills in the "installed packages" (which are in the same format as a regular requirements.txt).
2. An agent runs a cloned copy of the experiment (Task). The agent creates a new venv on the agent's machine, then uses the "Installed packages" section as a replacement for the regular "requirements.txt" and installs everything fro...
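In code, step 1 is just the usual Task.init, and step 2 can also be triggered programmatically, a rough sketch (names are placeholders):
```
from clearml import Task

# step 1: running this "manually" registers the Task and auto-fills
# "installed packages" from the current environment
task = Task.init(project_name="examples", task_name="my experiment")  # placeholders
# ... training code ...

# step 2 (usually done from the UI): clone the Task and enqueue it,
# an agent picks it up and rebuilds the venv from "installed packages"
cloned = Task.clone(source_task=task)
Task.enqueue(cloned, queue_name="default")  # placeholder queue name
```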
You mean, is one solution better than combining and maintaining 3+ solutions (dvc/lakefs + mlflow + kubeflow/airflow)?
Yes, I'd say it is. BTW if you have airflow running for other automations you can very easily combine the automation with clearml and have a single airflow automation for everything, but the main difference is that now airflow only launches logic, never actual compute/data (which are launched and scaled via clearml)
Does that make sense?
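For example, the airflow side could stay as thin as this, just enqueuing work for the clearml agents (dag/task/queue names are placeholders, a sketch rather than a full setup):
```
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator
from clearml import Task

def launch_training():
    # airflow only triggers the logic; the clearml agent does the compute
    template = Task.get_task(project_name="examples", task_name="train template")
    cloned = Task.clone(source_task=template)
    Task.enqueue(cloned, queue_name="gpu")  # placeholder queue

with DAG(dag_id="clearml_launcher", start_date=datetime(2023, 1, 1),
         schedule_interval=None) as dag:
    PythonOperator(task_id="launch_training", python_callable=launch_training)
```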
What's the Windows version, python version, clearml version, you are using ?
SmugOx94
after having installed numpy==1.16 in the first case or numpy==1.19 in the second case. Is it correct?
Correct
the reason is simply that I'd like to setup an MLOps system where
I see the rationale here (obviously one would have to maintain their requirements.txt)
The current way trains-agent works is that if there is a list of "installed packages" it will use it, and if it is empty it will default to the requirements.txt
We cou...
SmugOx94 could you please open a GitHub issue with this request, otherwise we might forget 🙂
We might also get some feedback from other users
Hi UnsightlySeagull42
does anyone know how this works with git ssh credentials?
These will be taken from the host ~/.ssh folder
if in the "installed packages" I have all the packages installed from the requirements.txt then I guess I can clone it and use "installed packages"
After the agent finished installing the "requirements.txt" it will put back the entire "pip freeze" into the "installed packages", this means that later we will be able to fully reproduce the working environment, even if packages change (which will eventually happen as we cannot expect everyone to constantly freeze versions)
My problem...