Try it only like this: ostore.xyz.tech:9000
You configure the worker name in clearml.conf, and I think you'll need to re-run it for the change to take effect
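Something like this in clearml.conf, as a minimal sketch assuming the agent section exposes the worker_id / worker_name keys (the values here are made up):
agent {
    # made-up values - pick a name that identifies this machine
    worker_id: "my-machine:gpu0"
    worker_name: "my-machine"
}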
And if you switch back to 1.1.2 in the setup where 1.1.1 worked, does it still fail?
I think you should investigate what happens during docker-compose up to see why the services agent container isn't running
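A quick way to check (assuming the default docker-compose file, where the services agent container is named clearml-agent-services):
docker-compose ps
docker logs clearml-agent-services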
I'm not sure. Maybe @<1523701087100473344:profile|SuccessfulKoala55> can help 🙂
Hi @<1795263699850629120:profile|ContemplativeParrot88> , are you sure the hyper parameters themselves are properly connected? If you run as a single run and change parameters, do they take effect?
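Just so we're aligned, here is a minimal sketch of what I mean by "properly connected" (project/task names and parameters are made up):
from clearml import Task

task = Task.init(project_name="examples", task_name="hpo base")  # hypothetical names
params = {"lr": 0.001, "batch_size": 32}
params = task.connect(params)  # connected values get overridden when the task is cloned/run remotely
print(params["lr"])  # on a remote run this should reflect the value set in the UI or by the optimizer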
I might not be able to get to that but if you create an issue I'd be happy to link or post what I came up with, wdyt?
Taking a look at your snippet, I wouldn't mind submitting a PR for such a cool feature 🙂
Also, please expand the 500 errors you're seeing; they should include some log output
Hi DistressedKoala73 ,
What version of ClearML are you using? Are you using a remote interpreter? You can also connect it manually with https://clear.ml/docs/latest/docs/references/sdk/task#set_script
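A rough sketch of the manual route (the repo and entry point are made up, and please double-check the argument names against the reference):
from clearml import Task

task = Task.init(project_name="examples", task_name="manual script info")  # hypothetical names
task.set_script(
    repository="https://github.com/me/my-repo.git",  # hypothetical repo
    branch="main",
    working_dir=".",
    entry_point="train.py",
)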
HungryArcticwolf62 , I couldn't find anything relevant 😞
AgitatedDove14 , wdyt?
task that reads a message from a queue
Can you give a specific example?
I have tried
task.upload_artifact('/text/temp', 'temp.txt')
but it's not working (I can access the task, but as soon as I click the artifacts tab, it shows a 404 error).
Can you please elaborate on this? Can you please share a screenshot?
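As a side note, upload_artifact takes the artifact name first and the object/file path second, so the call should look roughly like this (the path is just a placeholder):
from clearml import Task

task = Task.init(project_name="examples", task_name="artifact upload")  # hypothetical names
task.upload_artifact(name="temp", artifact_object="/path/to/temp.txt")  # name first, then the file to upload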
Hi @<1657556312684236800:profile|ManiacalSeaturtle63> , can you please elaborate specifically on the actions you took? Step by step
I would suggest adding printouts throughout the code to better understand when this happens
Hi @<1523701122311655424:profile|VexedElephant56> , can you please elaborate a bit more on how you set up the server? Is it on top of a VPN? Is there a firewall? Is it a simple docker compose or on top of K8s?
How is the model being saved/logged into clearml?
Happy to help 🙂
Note that you will get almost all of the information about the task using tasks.get_by_id ; you would then need a few more calls to extract the console logs/scalars/plots/debug samples
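Roughly along these lines (a sketch from memory, so double-check the endpoint names):
from clearml.backend_api.session.client import APIClient

client = APIClient()
task = client.tasks.get_by_id(task="<task_id>")  # most of the task's metadata
log = client.events.get_task_log(task="<task_id>")  # console output
scalars = client.events.scalar_metrics_iter_histogram(task="<task_id>")  # scalar graphs
plots = client.events.get_task_plots(task="<task_id>")  # plot objects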
And additionally, does the
When executing a Task (experiment) remotely, this method has no effect.
part mean that if it is executed on a remote worker inside a pipeline, without the dataset downloaded, the method will have no effect?
I think this means that adding tags specifically will have no effect
Hi @<1625666182751195136:profile|MysteriousParrot48> , I'm afraid that this looks like a pure ElasticSearch issue, I'd suggest checking on ES forums for help on this
Hello CurvedHedgehog15 , I don't think there is such an option. You can, however, add metrics to a completed task.
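A minimal sketch of what I mean, assuming it's acceptable to re-open the task with mark_started(force=True):
from clearml import Task

task = Task.get_task(task_id="<completed_task_id>")
task.mark_started(force=True)  # re-open the completed task
task.get_logger().report_scalar(title="extra", series="metric", value=0.9, iteration=0)
task.flush(wait_for_uploads=True)
task.mark_completed()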
Try running with all of them commented out so it will take the defaults
Hi @<1523702031007617024:profile|GrotesqueDog77> , please refer to the documentation to see all the possibilities you have with the SDK - None (Just scroll down from there)
As a side note, this is the SDK, not the API 🙂
Hi @<1523701283830108160:profile|UnsightlyBeetle11> , I think you can store txt artifacts, so you can store the string there. If it's not too long, you can even fetch it from the preview
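For example, something along these lines (the file name and contents are just placeholders):
from pathlib import Path
from clearml import Task

task = Task.init(project_name="examples", task_name="string artifact")  # hypothetical names
Path("summary.txt").write_text("the string you want to keep")
task.upload_artifact(name="summary", artifact_object="summary.txt")  # short text shows up in the artifact preview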
Can you please also add the payload?