I did, but I still have the same issue...
```
tglema@mvd0000xlrndtl2 clearml-src git:(28b8502) ❯ git status
HEAD detached at 0.17.5rc3
```
I did a `python setup.py develop` and ran the script:
```python
from clearml import Dataset

dataset = Dataset.create(dataset_project='test', dataset_name='example')
dataset.add_files('/home/tglema/example.jpeg')
dataset.add_files('/home/tglema/logo.png')
print(dataset.list_files())
dataset.upload()
dataset.finalize()
dataset_new = Dataset.create...
```
but I don't see any change... where is the link to the file removed from?
How are these two datasets different?
Thanks!
IMHO, `remove_files('logo.png')` shouldn't return 0... I think the problem is that the file passed as an argument is not correctly matched against the files stored in the dataset.
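To make the expectation concrete, here's a minimal sketch (the wildcard fallback is my assumption about how the path matching might work, not confirmed behaviour):

```python
from clearml import Dataset

dataset = Dataset.create(dataset_project='test', dataset_name='example')
dataset.add_files('/home/tglema/logo.png')
print(dataset.list_files())  # shows the path the dataset actually stored

# remove_files() matches against the stored dataset path, not the local path;
# it returns the number of entries removed, so 0 means nothing matched
removed = dataset.remove_files('logo.png')
if removed == 0:
    # assumption: a wildcard may match if the file was stored under a prefix
    removed = dataset.remove_files('*logo.png')
print(removed)  # expected: 1
```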
so how do I make a PR?
I don't have write access...
no, only in the clearml.conf file
Now I removed the output_uri from the conf file of the machine that started the task, and when I run it as an agent in GCP it works.
Is this a bug?
if I put `~/clearml` in the `default_output_uri` key and start the task, when run as an agent in GCP I get:
```
clearml.Task - INFO - Completed model upload to file:///$github_proj_directory/~/clearml/$proj_name/$experiment_name
```
so in my main file I have:
```python
from my_package import dummy_module

dummy_module.func(args)
```
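(for context, the layout I'm assuming here — `my_package` is a local package sitting next to the main file; the names are the dummy ones from above:)

```
project/
├── main.py              # the two lines above
└── my_package/
    ├── __init__.py      # empty in my case
    └── dummy_module.py  # defines func(args)
```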
sure, but I don't know whether this breaks something else
thanks! that was the script I used... but for some reason making two sbs was a bit more complicated than just stacking two...
but I was finally able to do it:
if I write `default_output_uri: "/home/tglema/clearml"` in the tl2 conf, in GCP it saves in that same dir
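(for anyone reading later, this is roughly where that key lives in `clearml.conf` — a sketch, with the section nesting assumed from the default config:)

```
sdk {
    development {
        # absolute path works; "~" was not expanded when the agent ran the task
        default_output_uri: "/home/tglema/clearml"
    }
}
```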
great, let me know if I can help you in any way. Thanks!
Did you put anything inside `__init__.py`?
nope
it's my first PR to an open-source project!
so I need to run a `sed` command to replace some lines in one of the TensorFlow files... do you know if I can do this as part of the `setup.py install`?
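One way this could work — a sketch, assuming a setuptools-based install; the target path and the replaced strings are placeholders, and the sed step is done in Python rather than shelling out:

```python
# setup.py
from setuptools import setup
from setuptools.command.install import install


class PatchedInstall(install):
    """Run the normal install, then patch a file in place."""

    def run(self):
        install.run(self)
        target = '/path/to/tensorflow/some_file.py'  # placeholder path
        with open(target) as f:
            text = f.read()
        # placeholder patterns: the equivalent of sed 's/old line/new line/'
        with open(target, 'w') as f:
            f.write(text.replace('old line', 'new line'))


setup(
    name='my_project',  # placeholder
    cmdclass={'install': PatchedInstall},
)
```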
I don't see anything in the CONFIGURATION section.
sounds like you need to run a service to monitor for new commits in PROJ_1, to trigger the pipeline
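A minimal polling sketch of such a service (the repo URL, the poll interval, and the `trigger_pipeline` hook are all placeholders):

```python
import subprocess
import time

REPO_URL = 'https://github.com/org/PROJ_1.git'  # placeholder
POLL_SECONDS = 300


def remote_head(url):
    # `git ls-remote <url> HEAD` prints "<commit-hash>\tHEAD"
    out = subprocess.check_output(['git', 'ls-remote', url, 'HEAD'], text=True)
    return out.split()[0]


last_seen = remote_head(REPO_URL)
while True:
    time.sleep(POLL_SECONDS)
    head = remote_head(REPO_URL)
    if head != last_seen:
        print(f'new commit {head} detected, triggering pipeline')
        # trigger_pipeline(head)  # placeholder: enqueue the pipeline task here
        last_seen = head
```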
I think this is a nice place to start: https://github.com/GoogleCloudPlatform/python-docs-samples/blob/master/compute/api/create_instance.py