Okay great thanks SuccessfulKoala55
g4dn.xlarge (the best price for 16GB of GPU RAM). Not so surprising they would want a switch
This was the response from AWS:
"Thank you for sharing the requested details with us. As we discussed, I'd like to share that our internal service team is currently unable to support any G-type vCPU limit increase request.
The issue is we are currently facing capacity scarcity to accommodate P and G instances. Our engineers are working towards fixing this issue. However, until then, we are unable to expand the capacity and process limit increases."
AgitatedDove14 is anyone working on a GCP or Azure autoscaler at the moment?
(deepmirror) ryan@ryan:~$ python -c "import clearml; print(clearml.__version__)"
1.1.4
We use albumentations with scripts that execute remotely and have no issues. Good question from CostlyOstrich36
```
import os
import glob

from clearml import Dataset

DATASET_NAME = "Bug"
DATASET_PROJECT = "ProjectFolder"
TARGET_FOLDER = "clearml_bug"
S3_BUCKET = os.getenv('S3_BUCKET')

if not os.path.exists(TARGET_FOLDER):
    os.makedirs(TARGET_FOLDER)

with open(f'{TARGET_FOLDER}/data.txt', 'w') as f:
    f.writelines('Hello, ClearML')

target_files = glob.glob(TARGET_FOLDER + "/**/*", recursive=True)

# upload dataset
dataset = Dataset.create(dataset_name=DATASET_NAME, dataset_project=DATASET_PR...
```
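As a side note, the glob pattern used above (`TARGET_FOLDER + "/**/*"` with `recursive=True`) picks up entries at every depth, directories included. A minimal self-contained sketch of that pattern (the temp directory layout is just illustrative):

```python
import glob
import os
import tempfile

# Build a small nested tree and confirm the recursive "**/*"
# pattern matches files at every level.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "sub"))
open(os.path.join(root, "a.txt"), "w").close()
open(os.path.join(root, "sub", "b.txt"), "w").close()

matches = glob.glob(root + "/**/*", recursive=True)
# The pattern also matches directories, so filter to files only.
files = sorted(os.path.basename(p) for p in matches if os.path.isfile(p))
print(files)  # ['a.txt', 'b.txt']
```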
I'd like to get the run time
via the task object.... I think I need to calculate it manually
i.e.
```
task = clearml.Task.get_task(task_id)
time = task.data.last_update - task.data.started
```
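Assuming `started` and `last_update` come back as `datetime` objects (an assumption about the field types), the subtraction yields a `timedelta`. A quick sketch with stand-in values:

```python
from datetime import datetime, timedelta

# Stand-ins for task.data.started and task.data.last_update
started = datetime(2022, 7, 1, 0, 0, 0)
last_update = datetime(2022, 7, 1, 0, 41, 30)

elapsed = last_update - started
print(elapsed.total_seconds())  # 2490.0
```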
```
client.queues.get_default()

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/opt/conda/lib/python3.9/site-packages/clearml/backend_api/session/client/client.py", line 378, in new_func
    return Response(self.session.send(request_cls(*args, **kwargs)))
  File "/opt/conda/lib/python3.9/site-packages/clearml/backend_api/session/client/client.py", line 122, in send
    raise APIError(result)
clearml.backend_api.session.client.client.APIError: APIError: code 4...
```
I also noticed that my queue stats haven't been updated since 7/1/2022 @ 12:41am
This was the error I was getting from uploads using the old SDK:
has been rejected for invalid domain. heap-2443312637.js:2:108655 Referrer Policy: Ignoring the less restricted referrer policy "no-referrer-when-downgrade" for the cross-site request:
Hi SuccessfulKoala55 yes I can see the one upload using 1.6.1, but all old datasets have now been removed. I guess you want people to start moving over?
Looks like it's picking up the projects, but when viewing them in the UI they disappear
I've got it... I just remembered I can call task_id
from the cloned task and check the status of that 🙂
so I can tell if the status has changed from running to completed
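That status check can be wrapped in a small polling loop. A sketch with an injected status callable standing in for something like `Task.get_task(task_id).status` (the exact status strings here are an assumption, shown for illustration):

```python
import time

def wait_for_task(get_status, poll_seconds=0.01, timeout=5.0):
    """Poll a status callable until it reports a terminal status, then return it."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status()
        if status in ("completed", "failed", "stopped"):
            return status
        time.sleep(poll_seconds)
    raise TimeoutError("task did not finish within the timeout")

# Stub: the task reports "running" twice, then "completed".
statuses = iter(["running", "running", "completed"])
result = wait_for_task(lambda: next(statuses))
print(result)  # completed
```

In real use you would pass a closure that re-fetches the task each poll, so the status is always fresh.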
we normally do something like that - not sure why it's freezing for you without more info
Hi SuccessfulKoala55 I gave up after 20 mins and also got a notification from Firefox: "This page is slowing down Firefox. To speed up your browser, stop this page." I'm heading out soon so I could leave it on. Also, I had the same behaviour in Chrome.
Great, thank you, it's working. Just wanted to check before adding all the env vars 🙂