I will try recreating my env.
Thx! Fixing the too-big diff (due to a Jupyter notebook) solved the issue.
Well, after taking the file out of my git repo folder it seems to work perfectly. Might it have something to do with the git integration and the uncommitted changes?
nope. still same problem.
Well, I couldn’t find where to add the output_uri.
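For reference, a minimal sketch of where output_uri can go, assuming the standard Task.init signature (the bucket path below is a placeholder):
`
from clearml import Task

# output_uri tells ClearML where to upload models/artifacts
# (placeholder bucket path; adjust to your setup)
task = Task.init(
    project_name='project_name',
    task_name='trial_01',
    output_uri='s3://<some-bucket-path>',
)
`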
I now tried the following:
`
import clearml

local_path = '<some-local-path>'
s3_path = 's3://<some-bucket-path>'

dataset = clearml.Dataset.create(dataset_project='project_name', dataset_name='trial_01')
dataset.add_files(path=local_path, dataset_path=s3_path)
`
but I don’t see the files on the s3 bucket.
I also tried this:
`
dataset.sync_folder(local_path=local_path, dataset_path=s3_path)
`
and still no success. It seems like it uploadin...
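In case it helps: as far as I understand the Dataset API, add_files()/sync_folder() only register files locally, and dataset_path is the destination folder inside the dataset, not a storage URL; the actual push to S3 happens in upload(). A minimal sketch under that assumption:
`
import clearml

local_path = '<some-local-path>'

dataset = clearml.Dataset.create(dataset_project='project_name', dataset_name='trial_01')

# registers the files with the dataset; nothing is uploaded yet
dataset.add_files(path=local_path)

# upload() actually pushes the files; output_url selects the storage
# target (placeholder bucket path)
dataset.upload(output_url='s3://<some-bucket-path>')
dataset.finalize()
`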
It seems to be stuck at the following step:
`
$ python eraseme.py
ClearML Task: created new task id=fa3db29498f241a19de81b75f787887e
ClearML results page:
======> WARNING! Git diff to large to store (1327kb), skipping uncommitted changes <======
2021-07-11 19:17:31,351 - clearml.Task - INFO - Waiting for repository detection and full package requirement analysis
ClearML Monitor: GPU monitoring failed getting GPU reading, switching off GPU monitoring
2021-07-11 19:17:32,821 - clearml.Ta...
`
Hi SuccessfulKoala55!
Did you manage to fix this issue?
Thx!
`
{
    "meta": {
        "id": "09e19b94a3c44b30b3a52fe89ee27fcf",
        "trx": "09e19b94a3c44b30b3a52fe89ee27fcf",
        "endpoint": {
            "name": "tasks.get_all_ex",
            "requested_version": "2.13",
            "actual_version": "1.0"
        },
        "result_code": 400,
        "result_subcode": 12,
        "result_msg": "Validation error (invalid task field): path=hyperparams.input.num_train_epochs.value",
        "error_stack": null,
        "error_data": {}
    },...
`
Nope, it happens for all of them (only on our on-prem server, so it might be a misconfiguration of some kind).
AgitatedDove14
Hi!
Any idea of how to solve this issue?
Thx!
response:
`
{"meta":{"id":"09e19b94a3c44b30b3a52fe89ee27fcf","trx":"09e19b94a3c44b30b3a52fe89ee27fcf","endpoint":{"name":"tasks.get_all_ex","requested_version":"2.13","actual_version":"1.0"},"result_code":400,"result_subcode":12,"result_msg":"Validation error (invalid task field): path=hyperparams.input.num_train_epochs.value","error_stack":null,"error_data":{}},"data":{}}
`
Still getting the same error. Any idea?
It’s a big dataset.
Hi!
Thanks for your response!
That’s what I see in the Network tab:
Thx! will try it tomorrow.
Seems like it. I’m working with git only locally, while the code actually runs on a remote machine (you know, the PyCharm Pro remote setup). So for the git integration I had to reproduce the git repo on the remote machine, and since it’s not updated regularly, the git diffs are large…
Do you have a better solution for the git integration, given the way I work?
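One thing I might try on my side: if I read the default clearml.conf correctly, there is a switch to skip capturing the uncommitted diff altogether (sketch below; key name assumed from the default config):
`
# ~/clearml.conf (snippet)
sdk {
    development {
        # skip storing the uncommitted git diff with the task
        store_uncommitted_code_diff: false
    }
}
`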
request payload:
`
{"id":[],"project":["6d3dc225d3e041768689bac6ea82573c"],"page":0,"page_size":15,"order_by":["-last_update"],"type":["$not","annotation_manual","$not","annotation","$not","dataset_import"],"user":[],"system_tags":["-archived"],"tags":[],"include_subprojects":false,"only_fields":["system_tags","project","company","last_change","started","last_iteration","tags","user.name","type","name","status","project.name","last_update","parent.name","parent.project.id","parent.proje...
`
And anyway, the git diffs are logged long before that, and still the reporting doesn’t upload; I don’t see why it should be related.
SuccessfulKoala55, I sent everything that might help, since I don’t really know what would be most helpful…
Thx!
I want to upload the dataset to S3. Is there a flag that tells it to do so?
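If a global default helps: as far as I can tell, clearml.conf also has a default_output_uri under sdk.development, used when Task.init is given no output_uri (I’m not sure datasets honor it; dataset.upload(output_url=...) is the explicit per-dataset option). A sketch with a placeholder bucket path:
`
# ~/clearml.conf (snippet)
sdk {
    development {
        # default output_uri when Task.init doesn't pass one
        default_output_uri: "s3://<some-bucket-path>"
    }
}
`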