A big part of how Datasets work is turning the data into a parameter rather than part of the code. That way you can easily reproduce experiments 🙂
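As an illustrative sketch (the project/dataset names and ID below are placeholders, not real values), fetching a dataset by ID instead of hard-coding file paths might look like:

```python
from clearml import Dataset

# Hypothetical dataset ID, passed in as a parameter
# (e.g. via argparse or the task's configuration) instead of baked into the code
dataset_id = "abc123"  # placeholder

# Resolve the dataset and get a local, read-only copy of its files
ds = Dataset.get(dataset_id=dataset_id)
local_path = ds.get_local_copy()
```

Since the dataset ID is just a parameter, rerunning the experiment with a different dataset version doesn't require touching the code.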
Hi @<1639799308809146368:profile|TritePigeon86> , do you mean that in order to initialize machines on EC2 you need to provide some external IP, or that you need to pass the external IP as a parameter for the job to run?
Hi HurtWoodpecker30
Did you change the clearml version? What version are you using?
Although I'm not sure it's connected
Are you using a self hosted server or the SaaS solution?
Is `output_uri` defined for both steps? Just making sure.
So even if you abort it at the start of the experiment, it will keep running and reporting logs?
Hi AttractiveCockroach17 , regarding the first question - `clearml` captures the packages used during the run. What does your script use, and what does `clearml` capture when running locally on your machine?
You can also configure `clearml` to capture your entire environment.
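If I remember correctly, this is controlled from `clearml.conf` - a hedged sketch of the relevant section:

```
sdk {
    development {
        # Capture the full environment with `pip freeze`
        # instead of only the packages the script actually imports
        detect_with_pip_freeze: true
    }
}
```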
Regarding 2:
Can you please expand on the entire process?
Also, in the Network section of the developer tools - what is returned in one of the 400 responses?
I'm curious as to why you weren't redirected to the login page
Feels like a cookie issue to me
Hi JumpyDragonfly13 , can you try going to http://localhost:8080/login ? What happens when you open the developer tools (F12) while browsing?
I'm accessing both using SSH tunneling & the same domain
I guess we found the culprit 🙂
Browser thinks it's the same backend because of the domain
Now try logging in
Hi DepravedCoyote18 , as long as you have everything backed up (configurations and data) under /opt/clearml/ (I think this is the default folder for storing ClearML-related files), the server migration should work (the data itself is a different issue).
However, ClearML holds links internally for datasets/debug samples/artifacts and maybe a few other outputs. Everything currently logged in the system to a certain MinIO server will still point to that MinIO server.
Does that make sense?
No problem 🙂
Imagine an artifact that is saved at some address my-minio-host:9000/FILE
Internally, ClearML keeps the link to the artifact as-is. It doesn't matter where the ClearML backend is located/deployed, since the link will always point to the same address. You could hack around it by making changes in Mongo directly, but I would strongly advise against that unless you're sure of what you're doing
Are you sure you migrated all the data correctly?
Because that seems to be connected to data
It really depends on how you want to work. The `--docker` flag will make the agent run in docker mode, allowing it to spin up docker containers to run the jobs inside
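For example (the queue name and default image here are placeholders), starting an agent in docker mode might look like:

```
# Run the agent as a daemon in docker mode; jobs pulled from the
# "default" queue will each run inside a docker container
clearml-agent daemon --queue default --docker nvidia/cuda:11.8.0-runtime-ubuntu22.04
```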
Can you elaborate on this? You mean package dependencies when experiments are executed via the agent?
Hi BroadSeaturtle49 , can you please elaborate on what the issue is?
Hi BroadSeaturtle49 , what versions of `clearml-agent` & `clearml` are you using? What OS is this?
Hi DullPeacock33 , I think what you're looking for is this:
https://clear.ml/docs/latest/docs/references/sdk/task#execute_remotely
This will initialize all the automagical stuff but won't require running the script locally.
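A minimal sketch of that (the project, task, and queue names are placeholders):

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="remote run")
# Stops local execution at this point and enqueues the task
# for an agent to pick up and run remotely
task.execute_remotely(queue_name="default", exit_process=True)
```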
What do you think?
Regarding this one, there is actually a way. If you work on http://app.clear.ml you can share an experiment for other users to see. However, people getting the link would need to sign up in order to view the experiment. Making shared experiments completely public and open could also be a pretty cool feature - maybe open a feature request for it.
Are you still having these issues? Did you check if it's maybe a connectivity issue?
Hi RotundSquirrel78 , can you try clearing the local cache? Everything is showing properly for me
Hi GloriousPenguin2 , how did you try to modify it? From the code it looks like it expects a configuration and samples it once every few minutes