Hadrien, just making sure I get the terminology: a stopped instance means you don't pay for it, just for its storage, right? Or is it up and idling (in which case Martin's suggestion is valid)? Do you get stopped instances instantly when you ask for them?
The upload method (which has an SDK counterpart) allows you to specify where to upload the dataset.
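A minimal sketch of what that might look like via the SDK. The dataset ID and destination URI here are hypothetical, and this assumes the `clearml` package is installed and a server is configured:

```python
def upload_dataset_to(dataset_id: str, destination: str) -> None:
    """Upload a dataset's files to a specific storage destination.

    `destination` is a hypothetical example URI, e.g. 's3://my-bucket/datasets'.
    Assumes the clearml package is installed and a server is reachable.
    """
    from clearml import Dataset  # deferred so the sketch stays importable

    dataset = Dataset.get(dataset_id=dataset_id)
    # output_url tells ClearML where to store the uploaded files
    dataset.upload(output_url=destination)
```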
Hey Tim! What natanM gave you is a fair place to start (albeit probably not up to date on either side). There are a few product overviews online, but they tend to be outdated a month after they're written, so...
As for pricing, we are going to release our new website with updated pricing that will make it more transparent AND easier to compare 🙂
Hey, AFAIK SDK version 1.1.0 disabled the demo server by default (it's still accessible by setting an environment variable).
https://github.com/allegroai/clearml/releases/tag/1.1.0
Is this still an issue even in this version?
Just making sure, you're running the server locally and run the script on jupyter also locally, right?
Hmm, I'm not 100% sure I follow. You have multiple models doing predictions. Is there a single data source that feeds all of them, and they run in parallel? Or is one's output another's input, and they run serially?
Hi HandsomeGiraffe70, David's suggestion is great and the way to go for now. We are working on adding this functionality to the SDK (without using the APIClient) and on documenting it better 🙂 Stay tuned 😄
```
pipe = PipelineController(
    project='examples',
    name='Pipeline demo',
    version='1.1',
    add_pipeline_tags=False,
)
# set the default execution queue to be used (each step can override it)
pipe.set_default_execution_queue('default')
# add pipeline components
pipe.add_parameter(
    name='url',
    description='url to pickle file',
    default=' '
)
pipe.add_function_step(
    name='step_one',
    function=step_one,
    function_kwargs=dict(pickle_data_url='${pi...
```
ReassuredTiger98 Nice digging and Ouch...that isn't fun. Let me see how quickly I can get eyes on this 🙂
Hi MysteriousSeahorse54, how are you saving the models? torch.save()? If you're not specifying output_uri=True, it makes sense that you can't download them, as they are local files 🙂
And when you set output_uri=True, does no model appear in the UI at all?
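For reference, a hedged sketch of what enabling model upload at task creation looks like. The project and task names here are made up, and this assumes `clearml` is installed and configured:

```python
def init_task_with_model_upload():
    """Create a task whose saved models get uploaded, not left as local files."""
    from clearml import Task  # deferred so the sketch stays importable

    # output_uri=True uploads models saved during the run (e.g. via torch.save())
    # to the default files server, so they become downloadable from the UI
    return Task.init(
        project_name='examples',        # hypothetical project name
        task_name='model upload demo',  # hypothetical task name
        output_uri=True,
    )
```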
Hi Binoy, at the moment we only support this feature in our enterprise offering. We're now expanding our paid tier, and this is indeed a candidate feature to be added. Stay tuned 🙂
ImmensePenguin78 we also have a new example for this!
https://github.com/allegroai/clearml/blob/master/examples/reporting/artifacts_retrieval.py
You can use:
```
task = Task.get_task(task_id='ID')
task.artifacts['name'].get_local_copy()
```
get_local_copy() downloads the file to your cache and returns its path.
And yes, we are going to revisit our assumptions about the model object and add more to it. Our goal is for it to carry just enough information to be actionable (i.e., how accurate is it? How fast? How much power does it draw? How big is it? and so on), but not be as comprehensive as a task; something like a lightweight task 🙂 This is one thing we are considering, though.
I'm happy our intention was still clear.
I...Think it's a UI bug? I'll confirm 🙂
Hi Mathis, actually we fixed this in our latest SDK! You can use Task.query_tasks() and you'll get the IDs of all the tasks that match the query. The reason we don't return the Task objects themselves is that the result can be quite large and take a long time to fetch.
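A small sketch of that pattern, assuming `clearml` is installed and a server is configured (the project and task names passed in are whatever you want to match):

```python
def find_task_ids(project_name: str, task_name: str):
    """Return the IDs of tasks matching a query, without fetching full objects."""
    from clearml import Task  # deferred so the sketch stays importable

    # query_tasks() returns matching task IDs only, which keeps the call fast;
    # fetch full Task objects afterwards with Task.get_task() if you need them
    return Task.query_tasks(project_name=project_name, task_name=task_name)
```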
PyTorch wheels are always a bit of a problem; AFAIK this error means there isn't a wheel matching the CUDA version specified/installed on the machine. You can try pinning exact PyTorch versions, which usually solves the issue.
Hi OutrageousSheep60, the plan is to release a version that solves this this week or early next week.
Hi ScaryBluewhale66, I believe the new server that's about to be released (this week or next) will allow you to report a "single value metric". So if you want to report just one number per experiment, you can, and you can also compare it between runs.
In the meantime, report_scalar() with a constant iteration is a hack you can use 🙂
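A sketch of that hack, assuming `clearml` is installed and a task is already running (the title and series names here are made up):

```python
def report_single_value(title: str, value: float) -> None:
    """Report one number per experiment by pinning the iteration to 0."""
    from clearml import Logger  # deferred so the sketch stays importable

    # using a constant iteration means each experiment contributes a single
    # point under this title/series, so runs can be compared side by side
    Logger.current_logger().report_scalar(
        title=title,
        series='value',  # hypothetical series name
        value=value,
        iteration=0,     # the constant-iteration hack described above
    )
```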