Hi PricklyJellyfish35,
You should have the OmegaConf file as part of the task configuration (under the CONFIGURATION tab) - do you have it there?
Hi SquareFish25 ,
I tried the following and succeeded in uploading the file:

```python
import os

os.environ['AWS_ACCESS_KEY_ID'] = "***"
os.environ['AWS_SECRET_ACCESS_KEY'] = "***"
os.environ['AWS_DEFAULT_REGION'] = "***"

from clearml import StorageManager

remote_file = StorageManager.upload_file(<file to upload>, 's3://bucket_name/inner_folder/file_name')
```

Can you try it and update if it works for you?
Hi HelpfulHare30 ,
- Is a dataset a separate object that can be used within different projects, or is it part of a project?

You can use the dataset in every project you like. The dataset task is part of a specific project, but it can be shared between projects:

```python
dataset_folder = Dataset.get(dataset_id="dataset_id").get_local_copy()
```
- Can I find a dataset created with the CLI in the Web UI?

Yes, it should create a Data processing task.
- Can I configure clearml to store datasets by...
Hi DeliciousBluewhale87
Can you share the version you are using? Did you get any other logs? maybe from the pod?
Hi MammothGoat53 ,
Which clearml version are you using? I ran the same and it all worked as expected (I changed the project_name and the task_name to be 4 chars long).
If you'd like to create a plot (not an image plot), use show without imshow.
SpotlessFish46 You can change the models and artifacts destination per experiment with output_uri - https://github.com/allegroai/trains/blob/b644ec810060fb3b0bd45ff3bd0bce87f292971b/trains/task.py#L283 . Can this work for you?
You can always clone a "template" task and change everything (it will be in draft mode). What is your use case? Maybe we already have a solution for it.
BTW, why use the API calls and not the ClearML SDK?
ok, I think I missed something on the way then.
you need to have some diffs, because:

```
Applying uncommitted changes
Executing: ('git', 'apply', '--unidiff-zero'): b"<stdin>:11: trailing whitespace.\n task = Task.init(project_name='MNIST', \n<stdin>:12: trailing whitespace.\n task_name='Pytorch Standard', \nwarning: 2 lines add whitespace errors.\n"
```
Can you re-run this task from your local machine again? You shouldn't have anything under UNCOMMITTED CHANGES this time (as we ...
In the task you cloned, do you have torch as part of the requirements?
not much info 😕
Can you manually run the docker ?
Didn't get that. Are you using http://app.clear.ml or your own server?
Thanks for the information. Do you get any errors or warnings?
Which ClearML agent version are you running?
Hi GreasyWalrus57, sorry, but I didn't get that.
You want to register the data? You can do it with clearml-data and then use this task to connect between tasks and data.
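A rough sketch of that flow from the command line (project name, dataset name, and local path are placeholders):

```shell
# Create a new dataset task under a (placeholder) project
clearml-data create --project "my project" --name "my dataset"

# Add local files to it, then upload and finalize
clearml-data add --files ./data
clearml-data close
```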
Hi ElatedTurtle20 ,
You can increase it in your configuration file - https://github.com/allegroai/clearml/blob/168074acd97589df58436a3ec122a95a077620c2/docs/clearml.conf#L332.
- I added some custom fields to the tasks I am running

Where did you add those? As part of the task's parameters?
Hi HealthyStarfish45
If you are running the task via docker, we don't auto-detect the image and docker command, but you have more than one way to set those:
- You can set the docker manually like you suggested.
- You can configure the docker image + commands in your ~/trains.conf file (on the machine running the agent) - https://github.com/allegroai/trains-agent/blob/master/docs/trains.conf#L130
- You can start the agent with the image you want to run with.
- You can change the base docker image...
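For the "start the agent with the image" option, the launch line looks roughly like this (queue name and image are placeholders; trains-agent is the pre-rename name of clearml-agent):

```shell
# Every task pulled from the "default" queue will run inside the given docker image
trains-agent daemon --queue default --docker nvidia/cuda:10.1-runtime-ubuntu18.04
```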
Hi DeliciousBluewhale87 ,
You can just get a local copy of the dataset with ds.get_local_copy() - this will download the dataset from the dataset task (using cache) and return a path to the downloaded files.
Now, in this path you'll have all the files in the dataset; you can go over them with ds.list_files() (or ds.list_files()[0] if you have only one file) and get the one you want.
maybe something like:
` ds_path = ds.get_local_copy()
iri...
Hi MysteriousBee56 ,
The https://github.com/allegroai/trains/blob/master/examples/services/cleanup/cleanup_service.py is an example how you can add services to manage your experiments.
You can change the criteria for fetching the tasks in this script (in the https://github.com/allegroai/trains/blob/master/examples/services/cleanup/cleanup_service.py#L72 call) to something like a specific tag you can add to the experiments (a delete tag? you can add a tag to multiple tasks) and it should...
Hi GreasyPenguin14 ,
What is the web server configured in your ~/clearml.conf file (api.web_server)?
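For reference, that section of ~/clearml.conf looks roughly like this (the address shown is the self-hosted default, not necessarily yours):

```
api {
    # URL of the ClearML web server the SDK should point at
    web_server: http://localhost:8080
}
```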
Something like:

```
2021-10-11 10:50:33,703 - clearml.util - WARNING - 123 task found when searching for my task name
2021-10-11 10:50:46,494 - clearml.util - WARNING - Selected task my task name (id=1cd66581b6624518862306069d220c8b)
```

where task is the value returned from your Task.init call:

```python
task = Task.init(project_name=<YOUR PROJECT NAME>, task_name=<YOUR TASK NAME>)
```
Try `pip install clearml==0.17.6rc1`
Hi VictoriousPenguin97
sdk.storage.direct_access is part of the extended support in the paid version.
But I think it's not required, since ClearML will simply try to access the path directly as it is, and you don't need to configure it.
Hi SubstantialElk6, you can use task.get_last_iteration() for that - what do you think?
Can you see it in the model? Click on the model link to get into the model
So according to it, you are using the repo requirements, and you have torch there?