<tf.Tensor 'Loss/RPNLoss/localization_loss:0' shape=() dtype=float32>
This is what is being logged as a scalar in the OD API
yes, that works..but wasn't the issue with logging tensors?
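If the blocker is that report_scalar wants a plain number rather than a tf.Tensor, here's a minimal workaround sketch (the variable names and the iteration counter are my assumptions, and it assumes TF2 eager mode):

import tensorflow as tf
from clearml import Logger

# hypothetical stand-in for the OD API's localization loss tensor
loss_tensor = tf.constant(0.42)

# convert the 0-d tensor to a plain Python float before reporting it
Logger.current_logger().report_scalar(
    title="Loss/RPNLoss",
    series="localization_loss",
    value=float(loss_tensor.numpy()),  # assumes eager execution
    iteration=0,                       # assumed step counter
)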
so how do I make a PR? 😅
I don't have write access..
Hi! I'm also interested in this feature..let me know how I can help 🙂
IMHO, remove_files('logo.png') shouldn't return 0..and I think the problem is that the file passed as argument is not correctly matched against the files stored in the dataset.
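For context, a minimal sketch of the call in question (project/dataset names and the parent id are assumptions):

from clearml import Dataset

# hypothetical new dataset version based on an existing one
ds = Dataset.create(
    dataset_project="my_project",
    dataset_name="my_dataset",
    parent_datasets=["<parent_dataset_id>"],
)

# remove_files returns the number of files it removed; 0 means the argument
# did not match anything, so it presumably has to match the relative path as
# stored in the dataset (e.g. "images/logo.png"), not just the bare file name
removed = ds.remove_files("logo.png")
print(removed)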
FYI, in case it is useful for someone else:
import tensorflow as tf
import numpy as np
from plotly.subplots import make_subplots
import plotly.graph_objects as go
import itertools
from clearml import Task

def get_trace(z, series, classes, colorscale='blues', showscale=True, verbose=False):
    if verbose:
        print(z)
    ind = '1' if series == 'train' else '2'
    trace = dict(
        type="heatmap",
        z=z[::-1],
        x=classes,
        y=classes[::-1],
        ...
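A minimal sketch of how the truncated snippet could continue, reusing the imports above (make_subplots, go, Task), assuming the goal is the side-by-side train/val confusion matrices mentioned below; the titles and series names are my assumptions:

def report_confusion_matrices(cm_train, cm_val, classes, iteration=0):
    # build one figure with the two heatmaps next to each other
    fig = make_subplots(rows=1, cols=2, subplot_titles=("train", "val"))
    fig.add_trace(go.Heatmap(z=cm_train[::-1], x=classes, y=classes[::-1],
                             colorscale="blues"), row=1, col=1)
    fig.add_trace(go.Heatmap(z=cm_val[::-1], x=classes, y=classes[::-1],
                             colorscale="blues", showscale=False), row=1, col=2)
    # report the combined figure as a single plot on the current task
    Task.current_task().get_logger().report_plotly(
        title="Confusion matrix", series="train/val",
        iteration=iteration, figure=fig)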
Hi! Were you able to reproduce the issue, CostlyOstrich36?
are you planning on changing to f-strings incrementally?
I would like to be able to compare side-by-side train/val confusion matrices
if I write in tl2 conf:
default_output_uri: "/home/tglema/clearml"
in GCP it saves in that same dir
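In case it helps, a Python-side sketch (project name and bucket are assumptions): the same setting can also be passed per task via output_uri in Task.init, overriding sdk.development.default_output_uri from the conf, e.g. to store artifacts in a GCS bucket instead of a local path on the GCP machine:

from clearml import Task

# hypothetical bucket; overrides the conf's default_output_uri for this task only
task = Task.init(
    project_name="my_project",
    task_name="example",
    output_uri="gs://my-clearml-bucket/artifacts",
)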
if I have 2.4 or 2.2 in both there is no issue
I had to downgrade tensorflow from 2.4 to 2.2 though.. any idea why?
the my_package now works ok 🙂
and then when running in agent mode, it fails because my_package can't be installed using pip... so I have to manually edit the installed packages section and remove "my_package"
it seems that I need to add it (import pandas) in the main file... even though I don't use it there...
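An alternative sketch, assuming the goal is only to get pandas into the task's detected requirements without importing it in the main script: Task.add_requirements can register the package explicitly, and it has to be called before Task.init (project/task names and the version string are assumptions):

from clearml import Task

# register pandas explicitly so the agent installs it, even though the main
# script never imports it directly
Task.add_requirements("pandas")             # use the locally installed version
# Task.add_requirements("pandas", "1.3.5")  # or pin a specific version

task = Task.init(project_name="my_project", task_name="example")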
File "aicalibration/generate_tfrecord_pipeline.py", line 30, in <module>
  task.upload_artifact('train_tfrecord', artifact_object=fn_train)
 File "/home/usr_341317_ulta_com/.clearml/venvs-builds/3.7/lib/python3.7/site-packages/clearml/task.py", line 1484, in upload_artifact
  auto_pickle=auto_pickle, preview=preview, wait_on_upload=wait_on_upload)
 File "/home/usr_341317_ulta_com/.clearml/venvs-builds/3.7/lib/python3.7/site-packages/clearml/binding/artifacts.py", line 560, in upload_artifa...
is it ok?
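For reference, a minimal sketch of the call that appears in the traceback above (the tfrecord path is an assumption); wait_on_upload=True makes the call block until the artifact upload actually finishes, which relates to the "completed right after the upload" point further down:

from clearml import Task

task = Task.init(project_name="AI Calibration",
                 task_name="Pipeline step 1 dataset artifact")

fn_train = "train.tfrecord"  # assumed path to the generated tfrecord file

# block until the upload finishes before the script continues / exits
task.upload_artifact("train_tfrecord", artifact_object=fn_train,
                     wait_on_upload=True)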
Task init
params setup
task.execute_remotely()
real_code here
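A minimal sketch of that flow (project/queue names and the params dict are assumptions); everything up to execute_remotely runs locally, the call enqueues the task and stops the local process, so the "real code" part only runs on the agent:

from clearml import Task

# Task init
task = Task.init(project_name="my_project", task_name="remote_step")

# params setup
params = {"epochs": 10, "lr": 1e-3}
params = task.connect(params)  # editable in the UI, overridden when enqueued

# stop local execution here and enqueue the task for an agent
task.execute_remotely(queue_name="default")

# real_code here: only executed by the agent
print("training with", params)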
your example with absl package worked
in this example my main func is going to be the script that creates the pipeline controller
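As a rough sketch of that layout (pipeline/step names and the queue are assumptions, it presumes a reasonably recent clearml, and the step task is assumed to already exist), the main script would only build and start the controller:

from clearml import PipelineController

pipe = PipelineController(
    name="AI Calibration pipeline",  # assumed name
    project="AI Calibration",
    version="0.0.1",
)
pipe.add_step(
    name="stage_data",
    base_task_project="AI Calibration",
    base_task_name="Pipeline step 1 dataset artifact",
)
pipe.start(queue="services")  # the controller itself runs on the services queue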
it would be completed right after the upload
it was my error: I have tensorflow==2.2 in my venv, and added Task.add_requirements('tensorflow'),
which forces tensorflow==2.4 (see the sketch after the traceback below):
Storing stdout and stderr log into [/tmp/.clearml_agent_out.kmqde7st.txt]
Traceback (most recent call last):
 File "aicalibration/generate_tfrecord_pipeline.py", line 15, in <module>
  task = Task.init(project_name='AI Calibration', task_name='Pipeline step 1 dataset artifact')
 File "/home/username/.clearml/venvs-builds/3.7/lib/python3.7/site-packages/clearm...
great, thanks! 🙂
the issue is with StorageManager.download_folder
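For anyone trying to reproduce it, a minimal sketch of the call (the remote URL and the local target folder are assumptions):

from clearml import StorageManager

# download every object under the remote prefix into a local folder
local_path = StorageManager.download_folder(
    remote_url="gs://my-bucket/some/prefix/",  # assumed remote location
    local_folder="/tmp/downloaded_prefix",     # assumed local target
)
print(local_path)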