CostlyOstrich36 hello, thank you! But what if I want to have it in the open-source version? It’s only one feature I want, and I can’t convince my CTO to buy the PRO tier just because of it 🙂
It’s sad, but due to security measures we have to use the self-hosted version, and it seems like the PRO plan does not provide such an option.
Wow, sounds great! Thank you! I’ll do some research on Terraform
Do you mean that in the Model tab, when you look into the model details, the URL points to a local location (e.g. file:///mnt/something/model)?
Exactly.
And your goal is to get a copy of that model (file) from your code, is that correct?
See, it happens when I try to connect an existing model (in the model registry; the model is already uploaded to remote storage). I query this model and connect it to the task:
from clearml import InputModel

model = InputModel.query_models(model_name=name)
task.connect(model[0])
I resolved the issue. Works like a charm. I disabled framework auto-logging, so ClearML no longer tries to store the local model again.
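A minimal sketch of what disabling auto-logging looks like, assuming it is done through the auto_connect_frameworks argument of Task.init (the project and task names below are placeholders):

from clearml import Task

# Disable automatic framework model logging so ClearML does not
# try to re-upload the local model file when the task runs.
task = Task.init(
    project_name='my_project',       # placeholder
    task_name='my_task',             # placeholder
    auto_connect_frameworks=False,   # or a dict, e.g. {'pytorch': False}, to disable selectively
)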
Let’s say I have a dataset from source A; the dataset is finalised, uploaded, and looks like this:
train_data/data_from_source_A
Each month I receive a new batch of data, create a new dataset, and upload it. After a few months my dataset looks like this:
train_data/data_from_source_A
train_data/data_from_source_B
train_data/data_from_source_C
train_data/data_from_source_D
train_data/data_from_source_E
Each batch of data was added by creating a new dataset and adding files. Now, I have a large da...
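A minimal sketch of how each monthly batch could be added as a new dataset version, assuming parent_datasets is used to chain versions (the dataset name, batch directory, and storage URL are placeholders):

from clearml import Dataset

# Get the latest finalised version and create a child dataset on top of it,
# so only the new batch needs to be added and uploaded.
parent = Dataset.get(dataset_name='train_data')
child = Dataset.create(
    dataset_name='train_data',
    parent_datasets=[parent.id],
)
child.add_files('train_data/data_from_source_B')    # only the new batch
child.upload(output_url='s3://my-bucket/datasets')  # placeholder storage URL
child.finalize()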
Thank you, it’s a good way to handle it. Of course, it would be great to have such functionality in ClearML. This is the only thing stopping me from deploying.
Have you ever benchmarked ClearML Datasets on large datasets? How well does it handle them?
Nothing special
from clearml import Dataset

dataset = Dataset.create(dataset_name='my_dataset', parent_datasets=None, use_current_task=False)
dataset.add_files(dataset_dir, verbose=False)
dataset.upload(output_url='...')
dataset.finalize(verbose=True)
@SuccessfulKoala55 any hints?