Answered
Hello all, I am trying to report a confusion matrix for my output model as follows

Hello all,

I am trying to report a confusion matrix for my OutputModel as follows:

output_model.report_confusion_matrix(
    title="Validation Confusion Matrix",
    series="Validation Data",
    matrix=best_total_cm,
    xaxis="Predicted Labels",
    yaxis="True Labels",
    xlabels=list(classes.keys()),
    ylabels=list(classes.keys()),
)

However, when I update the weights later in the script, I get the following error:

2023-07-18 14:15:34,407 - clearml.Metrics - ERROR - Action failed <400/131: events.add_batch/v1.0 (Events not added: Event must have a 'task' field=1)>

The weights upload successfully, but the confusion matrix never appears in the ClearML web app. Does anyone know what could be causing this?

  
  
Posted 10 months ago

Answers 6


Ok, thanks for trying

  
  
Posted 10 months ago

That looks good to me, not sure.

  
  
Posted 10 months ago

Hi @<1592326527742119936:profile|ThoughtfulSeahorse27> , this indeed seems like a bug - can you please open a GitHub issue? 🙂

  
  
Posted 10 months ago

What does it look like when you instantiate the output_model object?

  
  
Posted 10 months ago

from clearml import Task, OutputModel

task = Task.init(
    project_name=params['project name'],
    task_name="-".join([params['task name'], params['net']['model'], 'lr', str(params['opt']['lr'])]),
    auto_connect_frameworks={"tensorflow": False, "tensorboard": False, "pytorch": False},
)

output_model = OutputModel(
    task=task,
    framework="PyTorch",
    config_dict=params,
    tags=[params['project name']] + params['tags'],
)
  
  
Posted 10 months ago

Reporting the confusion matrix AFTER updating the weights fixed the issue.
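For anyone hitting the same error, here is a minimal sketch of the working order, assuming the weights are uploaded with update_weights. The project/task names, weights file name, and the classes/matrix values below are illustrative placeholders, not from the original script:

from clearml import Task, OutputModel
import numpy as np

# Hypothetical task/model setup for illustration.
task = Task.init(project_name="example", task_name="cm-after-weights")
output_model = OutputModel(task=task, framework="PyTorch")

classes = {"cat": 0, "dog": 1}        # placeholder label mapping
best_total_cm = np.eye(len(classes))  # placeholder confusion matrix

# Upload the weights FIRST (per this thread, reporting before the upload
# triggered the events.add_batch 400 error).
output_model.update_weights(weights_filename="best_model.pt")  # hypothetical file

# Only THEN report the confusion matrix against the uploaded model.
output_model.report_confusion_matrix(
    title="Validation Confusion Matrix",
    series="Validation Data",
    matrix=best_total_cm,
    xaxis="Predicted Labels",
    yaxis="True Labels",
    xlabels=list(classes.keys()),
    ylabels=list(classes.keys()),
)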

  
  
Posted 10 months ago