Answered
What Could Be The Reason For Fail Status Of A Task That Seems To Have Completed Correctly? No Information In The Log Whatsoever

What could be the reason for the FAILED status of a task that seems to have completed correctly? There is no information in the log whatsoever.

  
  
Posted 3 years ago

Answers 30


Hmm, it should not make a difference.
Could you verify it still doesn't work with TF 2.4?

  
  
Posted 3 years ago

Well, let me try executing one of your samples.

  
  
Posted 3 years ago

Thanks!

  
  
Posted 3 years ago

I commented out the upload_artifact call at the end of the code, and it finishes correctly now.

  
  
Posted 3 years ago

Okay, could you try running again with the latest clearml package from GitHub?
pip install -U git+

  
  
Posted 3 years ago

yes

  
  
Posted 3 years ago

Could you download and send the entire log?

  
  
Posted 3 years ago

I had to downgrade TensorFlow from 2.4 to 2.2, though. Any idea why?

  
  
Posted 3 years ago

I had to mask some parts 😁

  
  
Posted 3 years ago

File "aicalibration/generate_tfrecord_pipeline.py", line 30, in <module>
  task.upload_artifact('train_tfrecord', artifact_object=fn_train)
 File "/home/usr_341317_ulta_com/.clearml/venvs-builds/3.7/lib/python3.7/site-packages/clearml/task.py", line 1484, in upload_artifact
  auto_pickle=auto_pickle, preview=preview, wait_on_upload=wait_on_upload)
 File "/home/usr_341317_ulta_com/.clearml/venvs-builds/3.7/lib/python3.7/site-packages/clearml/binding/artifacts.py", line 560, in upload_artifact
  self._task.set_artifacts(self._task_artifact_list)
 File "/home/usr_341317_ulta_com/.clearml/venvs-builds/3.7/lib/python3.7/site-packages/clearml/backend_interface/task/task.py", line 1201, in set_artifacts
  self._edit(execution=execution)
 File "/home/usr_341317_ulta_com/.clearml/venvs-builds/3.7/lib/python3.7/site-packages/clearml/backend_interface/task/task.py", line 1771, in _edit
  raise ValueError('Task object can only be updated if created or in_progress')
ValueError: Task object can only be updated if created or in_progress
2021-03-05 22:36:13,111 - clearml.Task - INFO - Waiting to finish uploads
2021-03-05 22:36:13,578 - clearml.Task - INFO - Finished uploading
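
For context, a minimal sketch of the intended pattern behind the failing call above, reusing the project/task/artifact names from the log (the tfrecord path is a placeholder, not from the thread):

from clearml import Task

task = Task.init(project_name='AI Calibration', task_name='Pipeline step 1 dataset artifact')

# ... code that generates the tfrecord file ...
fn_train = 'data/train.tfrecord'  # placeholder path, not from the thread

# upload while the task is still created / in_progress; once the task has
# completed, the backend rejects the edit with the ValueError shown above
task.upload_artifact('train_tfrecord', artifact_object=fn_train)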

  
  
Posted 3 years ago

Just this one... it's marked as completed when executed locally.

  
  
Posted 3 years ago

Hmm... any idea what's different about this one?

  
  
Posted 3 years ago

It's my error: I have tensorflow==2.2 in my venv, but I added Task.add_requirements('tensorflow'), which forces tensorflow==2.4:

Storing stdout and stderr log into [/tmp/.clearml_agent_out.kmqde7st.txt]
Traceback (most recent call last):
  File "aicalibration/generate_tfrecord_pipeline.py", line 15, in <module>
    task = Task.init(project_name='AI Calibration', task_name='Pipeline step 1 dataset artifact')
  File "/home/username/.clearml/venvs-builds/3.7/lib/python3.7/site-packages/clearml/task.py", line 536, in init
    TensorflowBinding.update_current_task(task)
  File "/home/username/.clearml/venvs-builds/3.7/lib/python3.7/site-packages/clearml/binding/frameworks/tensorflow_bind.py", line 36, in update_current_task
    PatchKerasModelIO.update_current_task(task)
  File "/home/username/.clearml/venvs-builds/3.7/lib/python3.7/site-packages/clearml/binding/frameworks/tensorflow_bind.py", line 1412, in update_current_task
    PatchKerasModelIO._patch_model_checkpoint()
  File "/home/username/.clearml/venvs-builds/3.7/lib/python3.7/site-packages/clearml/binding/frameworks/tensorflow_bind.py", line 1450, in _patch_model_checkpoint
    from tensorflow.python.keras.engine.network import Network # noqa
  File "/home/username/.clearml/venvs-builds/3.7/lib/python3.7/site-packages/clearml/binding/import_bind.py", line 59, in __patched_import3
    level=level)
  File "/home/username/.clearml/venvs-builds/3.7/lib/python3.7/site-packages/tensorflow/python/keras/engine/network.py", line 83, in <module>
    class Network(base_layer.Layer):
  File "/home/username/.clearml/venvs-builds/3.7/lib/python3.7/site-packages/tensorflow/python/keras/engine/network.py", line 379, in Network
    @trackable_layer_utils.cache_recursive_attribute('dynamic')
AttributeError: module 'tensorflow.python.training.tracking.layer_utils' has no attribute 'cache_recursive_attribute'

  
  
Posted 3 years ago

0.17.2?

  
  
Posted 3 years ago

Is it OK like this? (see the sketch below)
Task init
params setup
task.execute_remotely()
real code here
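
A minimal sketch of that structure, reusing the project/task names from the log above (the parameter values and queue name are placeholders):

from clearml import Task

# Task init
task = Task.init(project_name='AI Calibration', task_name='Pipeline step 1 dataset artifact')

# params setup
params = {'batch_size': 32, 'epochs': 10}  # placeholder values
params = task.connect(params)  # log/override the parameters on the task

# hand execution over to an agent; by default the local process exits here
task.execute_remotely(queue_name='default')  # queue name is a placeholder

# real code here (this part runs on the agent)
print('running the real pipeline step with', params)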

  
  
Posted 3 years ago

it works!

  
  
Posted 3 years ago

YEY!

  
  
Posted 3 years ago

thanks!

  
  
Posted 3 years ago

Yes
Are you trying to upload_artifact to a Task that is already completed?

  
  
Posted 3 years ago

Your example with the absl package worked.

  
  
Posted 3 years ago

I commented out the upload_artifact call at the end of the code, and it finishes correctly now

upload_artifact caused the "failed" issue?

  
  
Posted 3 years ago

Hmm let me check something

  
  
Posted 3 years ago

BTW:
Task.add_requirements('tensorflow', '2.2') will make sure you get the specified version 🙂
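
For example, a minimal sketch of pinning the version this way (note that add_requirements has to be called before Task.init; the project/task names are placeholders):

from clearml import Task

# pin the version the agent will install, before creating the task
Task.add_requirements('tensorflow', '2.2')

task = Task.init(project_name='AI Calibration', task_name='Pipeline step 1 dataset artifact')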

  
  
Posted 3 years ago

no, to the current task
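
To illustrate the distinction (a sketch only; the task ID is a placeholder): the upload in the sketches above goes to the task created by Task.init in the same process, which is still in_progress, whereas editing a Task object that has already completed is rejected by the backend:

from clearml import Task

# fetching a task that has already finished and trying to edit it is what fails
# with "Task object can only be updated if created or in_progress"
completed = Task.get_task(task_id='<completed-task-id>')  # placeholder ID
# completed.upload_artifact('train_tfrecord', artifact_object='data/train.tfrecord')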

  
  
Posted 3 years ago

Where can I find more info about why it failed?

  
  
Posted 3 years ago

This is odd, and it is marked as failed?
Are all the Tasks marked as failed, or is it just this one?

  
  
Posted 3 years ago

it would be completed right after the upload

  
  
Posted 3 years ago

If I have 2.4 or 2.2 in both, there is no issue.

  
  
Posted 3 years ago

latest

  
  
Posted 3 years ago