Hi @<1639799308809146368:profile|TritePigeon86> , I think the 1.16 refers to the version of the SDK. I'd suggest upgrading your server regardless 🙂
@<1590514584836378624:profile|AmiableSeaturtle81> , its best to open a GitHub issue in that case to follow up on this 🙂
DistressedGoat23 , how are you running this hyperparameter tuning? Ideally you need to have
`from clearml import Task
task = Task.init()`
in your running code; from that point onwards you should have tracking
Hi @<1523701181375844352:profile|ExasperatedCrocodile76> , and now the worker clones the repo correctly?
UnevenDolphin73 , I think this might be right up your alley:
https://clear.ml/docs/latest/docs/references/sdk/task/#connect_configuration
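A minimal sketch of how `connect_configuration` is typically used (the project/task names and config values below are just examples, and this needs a reachable ClearML server to run):

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="config-demo")

# Connect a plain dict as a configuration object. When the task is later
# executed remotely, values edited in the UI are injected back into it.
config = {"batch_size": 32, "learning_rate": 1e-3}
config = task.connect_configuration(config, name="training_config")

print(config["batch_size"])
```

It also accepts a path to a configuration file instead of a dict.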
GreasyLeopard35 , What versions of clearml & clearml-agent are you using? Also, what happens if you try in python 3.9?
Hi MagnificentWorm7 ,
I'm not sure I understand. You're trying to upload files to a dataset from different concurrent processes?
Is this commit local or was it pushed to some branch?
You can set up two agents: one running in docker mode and the other without. Each agent should listen to a different queue. Makes sense?
Or are you trying to change something in the docker compose?
Hi RoundMosquito25 , I think it should work. Give it a try and tell us 🙂
Hi @<1570220852421595136:profile|TeenyHedgehog42> , are you using the latest version of clearml-agent? Can you provide a standalone code snippet that reproduces this behavior for you?
Hi @<1523703572984762368:profile|SlimyDove85> , conceptually I think it's possible. However, what would be the use case? In the end it would all be abstracted to a single pipeline
CheerfulGorilla72 , can you please share how the following settings are configured in your ~/clearml.conf?
api.web_server
api.api_server
api.files_server
That's an interesting question. I think it's possible. Let me check 🙂
VexedCat68 , It appears to be a bug of sorts, we'll sort it out 🙂
ShallowGoldfish8 , I think the best would be storing them as separate datasets per day and then having a "grand" dataset that includes all days and new days are being added as you go.
What do you think?
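A rough sketch of that layout using the Dataset SDK (dataset names, projects, and the local path are hypothetical; this requires a live ClearML server):

```python
from clearml import Dataset

# One dataset per day
daily = Dataset.create(
    dataset_name="data-2023-01-01",
    dataset_project="datasets/daily",
)
daily.add_files("/data/2023-01-01")  # example local path
daily.upload()
daily.finalize()

# A "grand" dataset that lists the daily datasets as parents,
# so each new day only contributes its delta on top of the previous days
grand = Dataset.create(
    dataset_name="all-days",
    dataset_project="datasets",
    parent_datasets=[daily.id],  # plus the ids of the earlier daily datasets
)
grand.finalize()
```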
Hi @<1675675722045198336:profile|AmusedButterfly47> , what is your code doing? Do you have a snippet that reproduces this?
I've also suspected as much. I've asked the team to check out the credentials starting with TX4PW3O (what you provided). They managed to use the credentials successfully, without errors.
Therefore it is a configuration issue.
Hi @<1523701504827985920:profile|SubstantialElk6> , I don't think there is such an option currently. Maybe open a GitHub feature request?
ChubbyOwl99 , are you trying to access http://app.clear.ml ?
Maybe ExasperatedCrab78 might have an idea
Hi @<1768084624061239296:profile|QuaintWoodpecker78> , you have an error when you try to unzip? Are you downloading directly through the webUI? Where was the artifact stored?
@<1561885921379356672:profile|GorgeousPuppy74> , what docs did you find to be lacking?
Hi @<1570220844972511232:profile|ObnoxiousBluewhale25> , you can use the output_uri parameter in Task.init to set a predetermined output destination for models and artifacts
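For example, something along these lines (the project/task names and bucket URI are placeholders; it also accepts a local/shared path or `True` for the default files server):

```python
from clearml import Task

# All models and artifacts produced by this task will be uploaded
# to the given destination instead of being referenced locally
task = Task.init(
    project_name="examples",
    task_name="train",
    output_uri="s3://my-bucket/clearml-artifacts",
)
```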
Hi @<1717350332247314432:profile|WittySeal70> , just to clarify, are you talking about the ClearML server itself or about agents?
From my understanding the AMI is simply an image with the ClearML server preloaded onto it.
Hi BoredBat47 , I'm not sure. However, I doubt that any remote agent would pick up such a configuration