Hi AgitatedDove14, when I use Dataset.get, will it also fetch all the child datasets?
It seems I forgot to run clearml-agent init.
I followed this guide: https://clear.ml/docs/latest/docs/guides/ide/google_colab/
Removing this parameter solved it: use_current_task=True
```python
# downloading data from s3
manager = StorageManager()
target_folder = manager.download_folder(
    local_folder='/tmp',
    remote_url=f'...'
)

# upload to clearml
dataset = Dataset.create(
    dataset_project=metadata[2],
    dataset_name=metadata[3],
    dataset_tags=tags,
    output_uri="..."
)
fp_target_folder = os.path.join(target_folder, minio_s3_url)
print('>>...
```
Oh okay, so I need to set that to the SSD path, yeah?
Is it this one? Or is it somewhere else?
```
docker_internal_mounts {
    sdk_cache: "/clearml_agent_cache"
    apt_cache: "path/to/ssd/apt-cache"
    ssh_folder: "/root/.ssh"
    pip_cache: "path/to/ssd/clearml-cache/pip"
    poetry_cache: "/mnt/hdd_2/clearml-cache/pypoetry"
    vcs_cache: "path/to/ssd/clearml-cache/vcs-cache"
    venv_build: "path/to/ssd/clearml-cache/venvs-builds"
    pip_download: "path/to/ssd/cle...
```
Alright, will try, thanks!
It's painful to edit it as a form when there are so many variables to change.
Hi @<1523701070390366208:profile|CostlyOstrich36> , thanks for the response, and sorry for the late reply.
This is my configuration in YAML. I'm having difficulty when a parameter is a list: the form doesn't display a long list in a readable way. Do you have a suggestion? Thanks!
```yaml
download-data:
  dataset_train:
    -
    -
    -
  dataset_test:
    -
    -
    -
train:
  data:
    batch: 4
    input_size: 224
    split:
      t...
```
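One workaround I'm considering (my own sketch, not a ClearML API): flatten the nested config into dotted keys before connecting it, so each list element becomes its own editable field in the UI form. The flatten helper below is plain Python; passing its result to task.connect is my assumption of how it would be wired in.

```python
def flatten(cfg, prefix=''):
    """Flatten nested dicts/lists into dotted keys, e.g. 'train.data.batch'."""
    flat = {}
    for key, value in cfg.items():
        dotted = f'{prefix}{key}'
        if isinstance(value, dict):
            flat.update(flatten(value, prefix=f'{dotted}.'))
        elif isinstance(value, list):
            # one field per list element: dataset_train.0, dataset_train.1, ...
            for i, item in enumerate(value):
                flat[f'{dotted}.{i}'] = item
        else:
            flat[dotted] = value
    return flat


if __name__ == '__main__':
    cfg = {'train': {'data': {'batch': 4, 'input_size': 224}},
           'download-data': {'dataset_train': ['a', 'b']}}
    flat_cfg = flatten(cfg)
    print(flat_cfg)
    # assumption: flat_cfg could then be passed to task.connect(flat_cfg)
```

Each scalar then renders as one row in the hyperparameter form instead of one opaque list blob.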
My current solution is to upload my config to S3; the pipeline downloads and reads it at execution time. But that decreases flexibility.
I see, thanks for clarifying. I just want to find other solutions for storing secret values. Right now I store them as env values in clearml.conf on my workers, but it gets complicated whenever there's a new value: I have to update each worker's conf and redeploy the workers.
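The direction I'm exploring is a sketch like the one below (variable and function names are hypothetical, not a ClearML API): read secrets from environment variables at runtime and fail loudly when a required one is missing, so adding a new secret only means exporting a new env var on the workers rather than editing clearml.conf and redeploying.

```python
import os


def get_secret(name, default=None):
    """Fetch a secret from the environment; fail loudly if it is required."""
    value = os.environ.get(name, default)
    if value is None:
        raise RuntimeError(f'missing required secret: {name}')
    return value


if __name__ == '__main__':
    # hypothetical variable, set once on each worker:
    #   export MY_S3_SECRET_KEY=...
    s3_secret = get_secret('MY_S3_SECRET_KEY', default='dummy-for-local-runs')
    print('loaded secret of length', len(s3_secret))
```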
It seems that when I access it via my DNS name I can't see it, but when I access it via the IP address I can.
Yeah, it seems I'll try option 2 that you mentioned.
I upload my full configuration, and when I'm done using Google Colab the configuration is gone too.
Nope, still looking for a way to set the AWS S3 secret_key without running clearml-agent init.
Wow, okay, I think I'll move all logs/plots/artifacts to my S3 storage. Thanks, really helpful!
Hi @<1523701087100473344:profile|SuccessfulKoala55> , thanks for your response.
I'm not entirely sure about the use of CLEARML_ENV
since I haven't interacted with it before. Could you guide me on what I should set as its value?
Previously, the system was running smoothly. However, I've run into some issues after making certain configuration changes to modify the server permissions. Specifically, I'm curious if these changes might have influenced the agent's permission to access certain...
Hi @<1523701070390366208:profile|CostlyOstrich36> , yes, correct! How do I achieve that? It would save my storage.
I see, it's solved now using default_output_uri, thanks!
I need a custom output_uri for some functions because I split dataset and model artifacts.
Yes, as far as I know, if we want to upload a dataset to ClearML we need to provide a local_path to the data, and ClearML then uploads it to the platform.
My data isn't local, it's in an S3 bucket.
Is there a way to point to the S3 URL directly? My current workflow is to download the data from the S3 bucket to local disk, then upload it to ClearML.
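If I read the docs correctly, ClearML can register S3 objects as external files so they are linked rather than re-uploaded; below is a sketch based on Dataset.add_external_files, whose exact semantics I haven't verified. The s3_url helper and the function name are my own.

```python
def s3_url(bucket, prefix):
    """Build an s3:// URL from a bucket name and key prefix (my own helper)."""
    return f"s3://{bucket.strip('/')}/{prefix.lstrip('/')}"


def register_s3_dataset(project, name, bucket, prefix):
    """Sketch: link S3 objects into a ClearML dataset without downloading them.

    Based on Dataset.add_external_files; behavior unverified by me.
    """
    from clearml import Dataset  # imported lazily; needs a configured server

    dataset = Dataset.create(dataset_project=project, dataset_name=name)
    dataset.add_external_files(source_url=s3_url(bucket, prefix))
    dataset.finalize()
    return dataset
```

If this works as documented, consumers would still call Dataset.get(...).get_local_copy() and ClearML would fetch the linked objects from S3 on demand.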
Hi @<1523701070390366208:profile|CostlyOstrich36> , just want to update:
this was solved by
- removing -f
- changing Task.force_requirements_env_freeze(False, req_path) -> Task.add_requirements(req_path)
- changing my clearml-agent settings
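For reference, the -f I removed was a find-links line in the requirements file; the URL and package below are hypothetical placeholders, not my real entries:

```
# requirements.txt (hypothetical example)
# -f https://example.com/wheel-index/   <- this find-links line was removed
somepackage==1.2.3
```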
Thanks! I've confirmed it can run on the next day, but not on the same day. I hope it can run on the same day too.
```
Syncing scheduler
Waiting for next run, sleeping for 5.13 minutes, until next sync.
Launching job: ScheduleJob(name='fetch feedback', base_task_id='', base_function=<function test_make at 0x7f91fd123d90>, queue=None, target_project='Automation/testing', single_instance=False, task_parameters={}, task_overrides={}, clone_task=True, _executed_instances=None, execution_limit_hours=None, r...
```
Hi @<1523701205467926528:profile|AgitatedDove14> , thanks for the response!
This is my simple code to test the scheduler:
```python
import datetime

from clearml.automation import TaskScheduler


def test_make():
    print('test running', datetime.datetime.now())


if __name__ == '__main__':
    task_scheduler = TaskScheduler(
        sync_frequency_minutes=30,
        force_create_task_name='controller_feedback',
        force_create_task_project='Automation/Controller',
    )
    print('\n[utc_timestamp]...
```
I've attached train.py here; to run it I do python src/train.py. It fails on:
```
from src.net import Classifier
ModuleNotFoundError: No module named 'src'
```
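A common fix for this error (my sketch, nothing ClearML-specific): make the repo root the import root, either by running python -m src.train from the repo root, or by prepending the repo root to sys.path at the top of train.py. The helper below assumes train.py lives in <repo>/src/.

```python
import sys
from pathlib import Path


def repo_root_for(script_path):
    """Given <repo>/src/train.py, return <repo> (assumes that layout)."""
    return Path(script_path).resolve().parents[1]


# at the top of src/train.py, before 'from src.net import Classifier':
if __name__ == '__main__':
    sys.path.insert(0, str(repo_root_for(__file__)))
    print('import root:', sys.path[0])
```

With the repo root on sys.path, 'from src.net import Classifier' resolves whether the script is run directly or by an agent.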
Hmm, yeah, I don't think it's possible to share the whole script here.
Hi @<1523701070390366208:profile|CostlyOstrich36> , I think you can try this to run it standalone:
I ran this at 2:35 AM, but the job hadn't launched by 2:40 AM.
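My current understanding (an assumption from the "until next sync" log line, not verified against the scheduler source): with sync_frequency_minutes=30 the scheduler only re-evaluates its job list every 30 minutes, so a job whose start time falls just after a sync may not be picked up until the following sync. A toy timeline with plain datetime arithmetic:

```python
from datetime import datetime, timedelta


def next_sync(started_at, sync_frequency_minutes, now):
    """Toy model: syncs happen every sync_frequency_minutes after start."""
    elapsed = now - started_at
    periods = int(elapsed / timedelta(minutes=sync_frequency_minutes)) + 1
    return started_at + periods * timedelta(minutes=sync_frequency_minutes)


if __name__ == '__main__':
    started = datetime(2023, 1, 1, 2, 35)   # scheduler started at 2:35 AM
    due = datetime(2023, 1, 1, 2, 40)       # job meant to launch at 2:40 AM
    sync = next_sync(started, 30, due)
    # under this toy model, the 2:40 job is only noticed at the 3:05 sync
    print('job due', due, 'but next sync only at', sync)
```

If this model is right, lowering sync_frequency_minutes (or scheduling the job further from the start time) would make same-day runs more likely.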