well, let me try to execute one of your samples
I understand that it uses time in seconds when no report is being logged... but it has already logged three times.
great, thanks!
```python
from clearml import Task
import argparse

# only create the task, we will actually execute it later
task = Task.init(project_name='examples', task_name='pipeline demo',
                 task_type=Task.TaskTypes.controller, reuse_last_task_id=False)
task.execute_remotely()

args = {'dataset_path': ''}
task.connect(args, name='Args')
```
like this?
Hi! I was going to ask about this, but I didn't understand the solution... currently the logs are >100MB because of this. Is there a way to save the line only once the epoch is done? AgitatedDove14
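For context, a minimal sketch of what I mean by per-epoch reporting; the project/task names and the training loop are placeholders:

```python
from clearml import Task

task = Task.init(project_name='examples', task_name='epoch logging demo')
logger = task.get_logger()

for epoch in range(10):
    # placeholder for a real training epoch that returns a loss value
    loss = 1.0 / (epoch + 1)
    # report one scalar per epoch instead of printing every iteration,
    # so the console log does not grow by hundreds of MB
    logger.report_scalar(title='loss', series='train', value=loss, iteration=epoch)
```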
I've never created an image before... is it normal for it to take this long? 15 minutes?
very possible, yes... but doesn't it fall back to iteration = epoch after that?
all is good 🙂
ahh, I see cool 🙂
not according to the logs:
```
info ClearML Monitor: Reporting detected, reverting back to iteration based reporting
```
If I download each file in the folder using `StorageManager.get_local_copy(remote_url=gs_url)`, it works OK
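A minimal sketch of that per-file workaround; the bucket URL and file names are hypothetical:

```python
from clearml import StorageManager

# hypothetical GCS folder and file list; adjust to the real bucket layout
gs_folder = 'gs://my-bucket/my-dataset'
file_names = ['train.tfrecord', 'val.tfrecord']

local_paths = []
for name in file_names:
    gs_url = '{}/{}'.format(gs_folder, name)
    # downloads (and caches) a single object, returning its local path
    local_paths.append(StorageManager.get_local_copy(remote_url=gs_url))
```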
so it's not running Docker under the hood?
if I have 2.4 or 2.2 in both, there is no issue
I had to downgrade tensorflow from 2.4 to 2.2 though... any idea why?
my_package now works OK 🙂
and then when running in agent mode, it fails because my_package can't be installed using pip... so I have to manually edit the section and remove "my_package"
it seems that I need to add it (`import pandas`) in the main file... even though I don't use it there...
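For reference, a minimal sketch of avoiding that import by forcing the package into the detected requirements with `Task.add_requirements` (the project/task names here are placeholders):

```python
from clearml import Task

# force pandas into the installed-packages section even though the main
# script never imports it directly; must be called before Task.init()
Task.add_requirements('pandas')

task = Task.init(project_name='examples', task_name='requirements demo')
```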
File "aicalibration/generate_tfrecord_pipeline.py", line 30, in <module>
  task.upload_artifact('train_tfrecord', artifact_object=fn_train)
 File "/home/usr_341317_ulta_com/.clearml/venvs-builds/3.7/lib/python3.7/site-packages/clearml/task.py", line 1484, in upload_artifact
  auto_pickle=auto_pickle, preview=preview, wait_on_upload=wait_on_upload)
 File "/home/usr_341317_ulta_com/.clearml/venvs-builds/3.7/lib/python3.7/site-packages/clearml/binding/artifacts.py", line 560, in upload_artifa...
is it OK?
```
Task init
params setup
task.execute_remotely()
real_code here
```
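Spelled out, that flow looks roughly like this (the queue name and params are placeholders):

```python
from clearml import Task

# 1. Task init
task = Task.init(project_name='examples', task_name='remote execution demo')

# 2. params setup
params = {'batch_size': 32}
task.connect(params)

# 3. stop local execution here and enqueue the task for an agent
task.execute_remotely(queue_name='default')

# 4. real code: only runs when an agent executes the task
print('training with batch_size={}'.format(params['batch_size']))
```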
your example with the absl package worked
in this example, my main function is going to be the script that creates the pipeline controller
the task would be marked completed right after the upload
it's my error: I have tensorflow==2.2 in my venv, and I added `Task.add_requirements('tensorflow')`, which forces tensorflow==2.4:
```
Storing stdout and stderr log into [/tmp/.clearml_agent_out.kmqde7st.txt]
Traceback (most recent call last):
  File "aicalibration/generate_tfrecord_pipeline.py", line 15, in <module>
    task = Task.init(project_name='AI Calibration', task_name='Pipeline step 1 dataset artifact')
  File "/home/username/.clearml/venvs-builds/3.7/lib/python3.7/site-packages/clearm...
```
great, thanks! 🙂