Do you mean reporting scalars with tensorflow OR having the reported tensorflow scalars show up on ClearML?
Oh, I understand. I'm guessing the next 1-2 months would be a timeframe for a new release of the server.
Are you running it inside a docker yourself or is it run via the agent?
Also, how are you uploading? Even if you don't zip the folder yourself and just upload it with task.upload_artifact('local folder', artifact_object='<PATH_TO_FOLDER>')
This should work
Hmmm interesting. According to bytes it looks like 2GB. What type is the file?
Hi @<1731483438642368512:profile|LoosePigeon2> , you need to set the following:
sdk: {
  development: {
    store_code_diff_from_remote: false
    store_uncommitted_code_diff: false
  }
}
On the machine you're running your pipeline from
Hi @<1742355077231808512:profile|DisturbedLizard6> , I think you need to select the last/max/min options
Hi @<1533619725983027200:profile|BattyHedgehong22> , does the package appear in the installed packages section of the experiment?
Hi @<1523701083040387072:profile|UnevenDolphin73> , can you please elaborate?
In the "Execution" section, go to "Container" and edit the "image" field with an appropriate docker image, i.e. one that ships with the python version you need to run with.
In the UI, you can edit the docker image you want to use. You can then choose an image with the needed python pre-installed
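The same thing can be done from code; a hedged sketch (the image tag here is just an example, and the helper name is mine):

```python
def set_task_docker_image(task, image: str) -> None:
    """Point a ClearML task at a docker image with the python version you need."""
    # set_base_docker tells the agent which container to run the task inside
    task.set_base_docker(image)

# Example (assumes `task` came from clearml.Task.init):
# set_task_docker_image(task, "python:3.9-bullseye")
```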
OK. So now you have some task that you want to run with a specific docker image that you have in hand?
This is the default image. I guess it doesn't have the python version you need to run with.
The json file looks like an error response. Did your log download fail?
I'm not personally familiar, but I'm sure searching for docker images with the python version you need will yield the required results 🙂
Hi @<1529633475710160896:profile|ThickChicken87> , do you mean via the API? I suggest taking a look at what the UI is doing when scrolling through metrics and copying that method of work
Hi @<1625303791509180416:profile|ExasperatedGoldfish33> , I would suggest trying pipelines from decorators. This way you can have very easy access to the code.
Hi TrickyFox41 , are you getting some sort of error?
Dataset.get only fetches the dataset object, it doesn't try accessing files yet. What else are you doing in your code that reproduces your issue?
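To illustrate the two steps, a small sketch (the dataset ID is a placeholder; assumes the clearml SDK is installed and configured):

```python
def get_dataset_files(dataset_id: str) -> str:
    """Fetch the dataset object, then materialize its files locally."""
    from clearml import Dataset  # assumes the clearml SDK is installed

    dataset = Dataset.get(dataset_id=dataset_id)  # metadata only, no file access yet
    return dataset.get_local_copy()  # files are actually downloaded here
```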
I think you can periodically upload them to s3; the StorageManager would help with that. Do consider that artifacts are logged in the system with links (each artifact is a link in the end), so even if you upload to an s3 bucket, in the backend there will still be a link leading to the file-server, and you would have to amend this somehow.
Why not upload specific checkpoints directly to s3 if they're extra heavy?
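For example, a hedged sketch of pushing a heavy checkpoint straight to s3 (the bucket layout and file names are hypothetical):

```python
def checkpoint_url(bucket: str, run_name: str, filename: str) -> str:
    """Hypothetical S3 layout for heavy checkpoints."""
    return f"s3://{bucket}/checkpoints/{run_name}/{filename}"

def upload_checkpoint(local_file: str, remote_url: str) -> str:
    from clearml import StorageManager  # assumes the clearml SDK is installed

    # Uploads the file directly to the remote URL and returns the final URL
    return StorageManager.upload_file(local_file=local_file, remote_url=remote_url)

# upload_checkpoint("epoch_10.ckpt", checkpoint_url("my-bucket", "run-1", "epoch_10.ckpt"))
```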
Hi @<1535069219354316800:profile|PerplexedRaccoon19> , I'm not sure I understand what you mean. Can you elaborate on the use case?
Either that or have a shared mount between the machines
Hi MammothParrot39 , what command do you run the agent with?
Hi MortifiedHorse99 , yes ClearML has a model repository and models are special entities in the system.
You can see the SDK docs here:
https://clear.ml/docs/latest/docs/references/sdk/model_model
By the way, is there a specific functionality you're looking for?
TenseOstrich47 , you could create a monitor task that reads model performance from your database and reports them as some scalar. According to that scalar you can create triggers 🙂
What do you think?
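A rough sketch of what the monitor task could report (the metric names and the value source are placeholders; reading from your database is up to you):

```python
def report_performance(task, value: float, iteration: int) -> None:
    """Report a DB-sourced model metric as a ClearML scalar for triggers to watch."""
    task.get_logger().report_scalar(
        title="model performance",  # placeholder title
        series="accuracy",          # placeholder series
        value=value,
        iteration=iteration,
    )
```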
external trigger
What do you mean? Do you have a reference?
Hi @<1546665634195050496:profile|SolidGoose91> , when configuring a new autoscaler you can click on '+ Add item' under compute resources and this will allow you to have another resource that is listening to another queue.
You need to set up all the resources to listen to the appropriate queues to enable this allocation of jobs according to resources.
Also in general - I wouldn't suggest having multiple autoscalers/resources listen to the same queue. 1 resource per queue. A good way to mana...
Browser thinks it's the same backend because of the domain