Hi @<1702492411105644544:profile|YummyGrasshopper29> , it looks like the controller is running, but is there an agent listening to the queue the tasks are being pushed to?
Hi @<1535069219354316800:profile|PerplexedRaccoon19> , I think this is what you're looking for 🙂 None
Hi @<1541954607595393024:profile|BattyCrocodile47> , how does ClearML react when you run the scripts this way? Is the repository logged as usual?
Hi @<1570220844972511232:profile|ObnoxiousBluewhale25> , what error are you getting?
Hi @<1523702786867335168:profile|AdventurousButterfly15> , I think this is what you're looking for - None
And you use the agent to set up the environment for the experiment to run?
Were you able to run `clearml-init` and create a `clearml.conf`? Are you getting some error?
When you run your code after adding `Task.init()`, you will get a link in the console. Following that link will take you to the experiment, where you can see its console output. From there you can go into the 'Execution' tab and see it all there 🙂
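For reference, a rough sketch of what that looks like (the project/task names here are just placeholders):
```
from clearml import Task

# Initializing the task prints a direct link to the experiment in the console
task = Task.init(project_name="examples", task_name="my experiment")

# ... the rest of your training code ...
```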
Hi @<1523704157695905792:profile|VivaciousBadger56> , can you provide some screenshots of what you're seeing?
@<1523701087100473344:profile|SuccessfulKoala55> , what is the intended behavior?
Hi @<1534496186793201664:profile|PompousBluewhale96> , can you please elaborate on what is going on and what you expected to happen?
Oh, can you please do the same with developer tools when the user tries to accept?
with the combination of None:port/bucket for `--storage`?
Hi @<1623491856241266688:profile|TenseCrab59> , you need to set `output_uri=True` in `Task.init()`
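For example, roughly (project/task names are placeholders; instead of True you can also pass a specific bucket URI):
```
from clearml import Task

# output_uri=True uploads output models/artifacts to the default files server
task = Task.init(
    project_name="examples",
    task_name="my experiment",
    output_uri=True,
)
```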
Hi @<1539417873305309184:profile|DangerousMole43> , can you add an example of your usage + the errors you were getting?
Then the services agent should be part of it. There should also be a 'services' queue, which the services agent listens to by default
Hi @<1544853695869489152:profile|NonchalantOx99> , what actions exactly would you need to take on the machine? The Genesis autoscaler allows storage on Azure. If you need to add some extra commands to run before the code executes, you can use the setup shell script when running inside a container
Hi @<1670964680270548992:profile|SuperiorOctopus47> , you can manually create experiments and log metrics into them via the REST API - None
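A rough sketch of how that could look using the tasks.create and events.add_batch endpoints; the server address, credentials, project id, and exact payload fields below are placeholders/assumptions, so please verify them against the REST API reference:
```
import requests

API_SERVER = "http://localhost:8008"        # placeholder: your API server address
AUTH = ("<ACCESS_KEY>", "<SECRET_KEY>")     # credentials generated in the web UI

# Create an empty task (experiment) to log into
resp = requests.post(
    f"{API_SERVER}/tasks.create",
    json={"name": "imported run", "project": "<PROJECT_ID>", "type": "training"},
    auth=AUTH,
)
task_id = resp.json()["data"]["id"]

# Report a single scalar point for that task
# (see the events.add_batch docs for the exact payload format)
requests.post(
    f"{API_SERVER}/events.add_batch",
    json=[{
        "task": task_id,
        "type": "training_stats_scalar",
        "metric": "loss",
        "variant": "train",
        "value": 0.42,
        "iter": 1,
    }],
    auth=AUTH,
)
```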
You basically have some older TensorBoard runs that you want to import into ClearML?
Hi @<1547028031053238272:profile|MassiveGoldfish6> , what versions of `clearml` & `pytorch-lightning` are you using? Does this happen to you with the example as well? Are you on a self-deployed or the community server?
It's unrelated. Are you running the example and no scalars/plots are showing?
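If it helps, here is a minimal snippet to verify scalar reporting end to end (project/task names are placeholders):
```
from clearml import Task

task = Task.init(project_name="examples", task_name="scalar sanity check")
logger = task.get_logger()

# These points should show up under the task's Scalars section in the UI
for i in range(10):
    logger.report_scalar(title="debug", series="value", value=i * 0.1, iteration=i)
```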
I'm not sure that is possible. What is your specific use case?
Or do you have your own code snippet that reproduces this?
Or should I set agent.google.storage {}?
Did you follow the instructions in the docs?
Just to make sure, run the code on the machine itself to verify that Python can actually detect the driver
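For example, assuming you're using PyTorch (just a guess about your framework), something like:
```
import torch

# Prints True only if the CUDA driver/runtime is visible to Python
print(torch.cuda.is_available())
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))
```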
Hi @<1607184400250834944:profile|MortifiedChimpanzee9> , yes 🙂
This is exactly how the autoscalers work. Scale from 0 to as many as needed and then back to 0