Answered

Hello all,

I am trying to report a confusion matrix for my Output Model as follows:

output_model.report_confusion_matrix(
    title="Validation Confusion Matrix",
    series="Validation Data",
    matrix=best_total_cm,
    xaxis="Predicted Labels",
    yaxis="True Labels",
    xlabels=list(classes.keys()),
    ylabels=list(classes.keys()),
)

However, when I update the weights later in the script, I get the following error:

2023-07-18 14:15:34,407 - clearml.Metrics - ERROR - Action failed <400/131: events.add_batch/v1.0 (Events not added: Event must have a 'task' field=1)>

The weights upload successfully, but the confusion matrix does not appear in the ClearML web app. Does anyone know what could be causing this?
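
For reference, the rough order of operations in the script (a trimmed-down sketch, not the full script; it assumes the weights are saved via OutputModel.update_weights, based on "update the weights later in the script", and the project name, label map, confusion matrix, and weights file are placeholders):

import numpy as np
from clearml import Task, OutputModel

task = Task.init(project_name="demo", task_name="cm-repro")
output_model = OutputModel(task=task, framework="PyTorch")

classes = {"cat": 0, "dog": 1}        # placeholder label map
best_total_cm = np.eye(len(classes))  # placeholder confusion matrix

# 1) The confusion matrix is reported first...
output_model.report_confusion_matrix(
    title="Validation Confusion Matrix",
    series="Validation Data",
    matrix=best_total_cm,
    xaxis="Predicted Labels",
    yaxis="True Labels",
    xlabels=list(classes.keys()),
    ylabels=list(classes.keys()),
)

# 2) ...and the weights are registered later, which is when the
#    events.add_batch error above shows up.
open("best_weights.pt", "wb").close()  # dummy weights file for the sketch
output_model.update_weights(weights_filename="best_weights.pt")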

  
  
Posted 9 months ago

Answers 6


Reporting the confusion matrix AFTER updating the weights fixed the issue.
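
In code, the fix amounts to swapping the two steps (same placeholder names as the sketch in the question above):

# Register the weights first...
output_model.update_weights(weights_filename="best_weights.pt")

# ...and only then report the confusion matrix; per this thread,
# reporting before update_weights triggered the events.add_batch 400.
output_model.report_confusion_matrix(
    title="Validation Confusion Matrix",
    series="Validation Data",
    matrix=best_total_cm,
    xaxis="Predicted Labels",
    yaxis="True Labels",
    xlabels=list(classes.keys()),
    ylabels=list(classes.keys()),
)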

  
  
Posted 9 months ago

Ok, thanks for trying

  
  
Posted 9 months ago

What does it look like when you instantiate the output_model object?

  
  
Posted 9 months ago

Hi @<1592326527742119936:profile|ThoughtfulSeahorse27> , this indeed seems like a bug - can you please open a GitHub issue? 🙂

  
  
Posted 9 months ago

task = Task.init(
    project_name=params['project name'],
    task_name="-".join([params['task name'], params['net']['model'], 'lr', str(params['opt']['lr'])]),
    auto_connect_frameworks={"tensorflow": False, "tensorboard": False, "pytorch": False},
)

output_model = OutputModel(
    task=task,
    framework="PyTorch",
    config_dict=params,
    tags=[params['project name']] + params['tags'],
)
  
  
Posted 9 months ago

That looks good to me, not sure

  
  
Posted 9 months ago