Hi FierceHamster54 , can you please elaborate on the process with a more specific example?
However, now when I go to the Results -> Debug Samples tab, the S3 credential window pops up every time I refresh the page
RattyLouse61 , what version of ClearML are you running? I think this issue was solved in the 1.3.0 release
SubstantialElk6 , the agent is designed to re-run in an environment as close as possible to the original. Can you please provide logs of the two experiments so we can compare? I'm not sure what the issue is. Do both computers have the same python versions?
And regarding model deployment you mean serving the model through a serving engine such as triton?
Hi MoodyCentipede68 ,
What version of ClearML / ClearML-Agent are you using? Is it a self hosted server or the SaaS?
Also, can you explain what step 7 was trying to do? Is it running locally or distributed?
@<1556812486840160256:profile|SuccessfulRaven86> , I think this is because you don't have the proper permissions 🙂
I'm not entirely sure which steps you took and if you missed something. Elastic is complaining about permissions - Maybe you missed one of the steps?
I think for this you would need to report this manually. You can extract all of this data using the API and then create custom plots/scalars that you can push into reports for custom dashboards 🙂
Can you paste the output up to the point where it gets stuck? Sounds very strange. Does it work when it's not enqueued? Also, what versions of clearml-agent & server are you on?
Hi StraightParrot3 , as SuccessfulKoala55 suggested you could maybe use tags for this as well.
Regarding creating views - if you predefine a certain view locally in your browser (with the extra column), I think you can just copy-paste the URL and it should include the custom column for anyone using that URL
You can use the CLEARML_LOG_LEVEL env variable for this - None
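For example, a minimal sketch (assuming the standard Python logging level names, such as DEBUG, are accepted values):

```shell
# Hypothetical sketch: raise the ClearML SDK log verbosity before launching your script.
# DEBUG is assumed here to be a valid value (standard Python logging level names).
export CLEARML_LOG_LEVEL=DEBUG
```

Then run your training script in the same shell so the SDK picks up the variable.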
Can you add the full log & the dependencies detected in original code? How are you building the pipeline?
Hmmmmm do you have a specific use case in mind? I think pipelines are created only through the SDK, but I might be wrong
VexedCat68 , what if you simply add pip.stop()? Does it not stop the pipeline? Can you maybe add a print to verify that during the run the value is indeed -1? Also, looking at your code, it seems you're comparing 'merged_dataset_id' to -1
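As a side note, parameters passed through a pipeline often arrive as strings, so a plain comparison against the integer -1 can silently fail. A minimal standalone sketch (the variable name merged_dataset_id is taken from your code; the value is illustrative):

```python
# Illustrative value: pipeline parameters frequently come back as strings
merged_dataset_id = "-1"

# A naive integer comparison silently fails for the string value
print(merged_dataset_id == -1)         # False: "-1" is a string, -1 is an int

# Print the repr to see the actual type during the run
print(repr(merged_dataset_id))         # '-1'

# Normalizing both sides to strings makes the check robust
print(str(merged_dataset_id) == "-1")  # True
```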
Hi JumpyDragonfly13 , can you try going to http://localhost:8080/login ? What happens when you open the developer tools (F12) while browsing?
Looks decent, give it a try and let us know if it's working 🙂
Pipelines assume that different steps run on different machines. How would you pass those files between the different machines? If the steps run on the same machine then why have them as different steps?
Hi @<1535069219354316800:profile|PerplexedRaccoon19> , you can set api.files_server in clearml.conf to point to your S3 bucket
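For example, a minimal clearml.conf fragment (the bucket path is a placeholder):

```
api {
    # files_server controls where debug samples / artifacts are uploaded
    files_server: "s3://my-bucket/clearml"
}
```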
What OS are you using?
Again, I'm telling you, please look at the documentation and what it says specifically about MinIO-like solutions.
The host should be host: "our-host.com:<PORT>"
And NOT host: "s3.our-host.com"
Maybe you don't require a port (I don't know your setup), but as I said, in the host settings you need to remove the "s3." prefix, as this is reserved only for AWS S3.
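For reference, a sketch of what the credentials section in clearml.conf might look like for a MinIO-like endpoint (host, port, and keys are placeholders; secure: false assumes plain HTTP):

```
aws {
    s3 {
        credentials: [
            {
                # non-AWS endpoint: host includes the port, no "s3." prefix
                host: "our-host.com:9000"
                key: "<access-key>"
                secret: "<secret-key>"
                multipart: false
                secure: false
            }
        ]
    }
}
```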
Hi @<1523702932069945344:profile|CheerfulGorilla72> , in Task.init specify output_uri= - None
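Alternatively, the same default can be set once in clearml.conf instead of in code (the bucket path is a placeholder):

```
sdk {
    development {
        # default destination for model/artifact uploads
        default_output_uri: "s3://my-bucket/models"
    }
}
```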
Note that you used an env variable, I want to try the config directly first 🙂
Hi EnviousPanda91 , Yes that is the purpose of the docker bash setup script.
Is there a reason it doesn't look nice?
Are you getting some error?
@<1590514584836378624:profile|AmiableSeaturtle81> , I would suggest opening a github feature request then 🙂
Hi,
From the looks of it, it always returns a string. What is your use case for this? Do you have some conditionality on the type of variable the parameters are?
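If you do need typed values back, one common workaround is a best-effort cast with ast.literal_eval. A sketch (parse_param is a hypothetical helper, not part of the ClearML API):

```python
import ast

def parse_param(value):
    """Best-effort cast of a string parameter back to a Python literal."""
    try:
        return ast.literal_eval(value)
    except (ValueError, SyntaxError):
        return value  # plain strings stay strings

print(parse_param("3"))       # 3
print(parse_param("[1, 2]"))  # [1, 2]
print(parse_param("hello"))   # hello
```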
Hi @<1643060818490691584:profile|MagnificentHedgehong41> , did you specify a project name? You can go into settings and enable showing hidden projects/experiments and then you will be able to see the pipeline steps in projects as well
Hi @<1638349756755349504:profile|MistakenTurtle88> , it simply looks like a new server without any data registered to it yet
You're totally right - if you managed to upload to a bucket, then the folder failure should be unrelated to permissions
Hi @<1595587997728772096:profile|MuddyRobin9> , does the step fail or just prints this error?