Answered
Hi,
I've been trying to use hyperparameter optimization for YOLOv11. When I try to get the metrics of the top experiments using

# Optimization is complete; print the IDs of the top-performing experiments
k = 3
top_exp = optimizer.get_top_experiments(top_k=k)
print('Top {} experiments are:'.format(k))
for n, t in enumerate(top_exp, 1):
    print('Rank {}: task id={} |result={}'
          .format(n, t.id, t.get_last_scalar_metrics()['val']['metrics/mAP50(B)']['last']))

t.get_last_scalar_metrics() returns an empty dict. I've tried using different titles and series names but haven't had any success. Any ideas on what I'm missing here?
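(For what it's worth, a defensive lookup along the lines of the sketch below at least avoids a KeyError while the dict comes back empty. The helper name is made up; it only assumes the {title: {series: {field: value}}} nesting used in the snippet above.)

```python
def get_scalar(metrics, title, series, field='last', default=None):
    """Safely read metrics[title][series][field] from a possibly-empty
    nested scalar-metrics dict; return `default` on any missing key."""
    return metrics.get(title, {}).get(series, {}).get(field, default)

# With the nesting used in the snippet above:
full = {'val': {'metrics/mAP50(B)': {'last': 0.42}}}
print(get_scalar(full, 'val', 'metrics/mAP50(B)'))               # 0.42
print(get_scalar({}, 'val', 'metrics/mAP50(B)', default='n/a'))  # n/a
```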
Thanks!

Posted 26 days ago

Answers 6


Hi CostlyOstrich36, I ran the example provided here:
None
I can see the hyperparameters and the reported scalars for the base task. Running the optimizer doesn't show any errors, but printing the results from the top experiments returns this:

Rank 1: task id=0b9a695218424ac7bb42eb7cca490f72 |result={}
Rank 2: task id=a599229be44a48f0b77e3cad2e6bd4a4 |result={}
Rank 3: task id=b70bebfa88d54e6c96b73ce22c95c405 |result={}
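Since the dicts come back empty, one way to double-check which title/series pairs a task actually reports is a small flattening helper like this sketch (the function name is an assumption; it relies only on the {title: {series: {...}}} shape expected above):

```python
def list_scalar_keys(metrics):
    """Flatten a nested {title: {series: {...}}} scalar-metrics dict
    into a sorted list of (title, series) pairs for inspection."""
    return sorted((title, series)
                  for title, series_map in metrics.items()
                  for series in series_map)

# The empty dicts above produce no pairs at all:
print(list_scalar_keys({}))  # []
print(list_scalar_keys({'val': {'metrics/mAP50(B)': {'last': 0.5}}}))
# [('val', 'metrics/mAP50(B)')]
```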

Posted 24 days ago

Hi ContemplativeParrot88! Are the scalars visible in the UI for the optimization tasks (not the base task)?

Posted 17 days ago

The scalars are in the base task. They don't show up in the optimization tasks.

Posted 17 days ago

Then there is likely a problem with those tasks. For example, it could be that the hyperparameters get values that are too low or too high, which breaks the training.

Posted 17 days ago

CostlyOstrich36 Any ideas on how I can resolve this?

Posted 17 days ago

Hi ContemplativeParrot88, are you sure the hyperparameters themselves are properly connected? If you do a single run and change the parameters, do they take effect?

Posted 25 days ago