
Error Codes - Object Detection

MLdebugger error codes are labels that classify how models make mistakes. This guide defines the error codes for Object Detection tasks (shared between 2D and 3D).

About Issue Category

For the definition of Issue Categories (Stable Coverage, Hotspot, etc.) to which error codes belong, see Evaluation and Result.

Error Code Format

Object Detection task error codes are expressed in the following format.

Basic Format

DET:{det_error_code}:CLS{t}->{y}
  • det_error_code: Detection error code indicating the type of detection result
  • t: Annotated ground truth class ID
  • y: Predicted class ID

Examples:
  • DET:A:CLS0->0: Aligned detection, class 0 is correct and predicted as class 0
  • DET:M:CLS3->5: Localization error, class 3 is correct but predicted as class 5
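As a quick sanity check, the basic format can be parsed with a regular expression. This is a minimal sketch; the field names `gt_class` and `pred_class` are ours for illustration and not part of the MLdebugger API.

```python
import re

# Matches DET:{det_error_code}:CLS{t}->{y}; the det part may contain
# '|' (uncertain candidates) and a trailing '*' (uncertain marker).
ERROR_CODE_RE = re.compile(r"^DET:(?P<det>[A-Z|*]+):CLS(?P<t>[^-]+)->(?P<y>.+)$")

def parse_error_code(code: str) -> dict:
    """Split a DET:{det_error_code}:CLS{t}->{y} string into its fields."""
    m = ERROR_CODE_RE.match(code)
    if m is None:
        raise ValueError(f"not a detection error code: {code!r}")
    return {
        "det_error_code": m.group("det"),
        "gt_class": m.group("t"),
        "pred_class": m.group("y"),
    }

parse_error_code("DET:M:CLS3->5")
# → {'det_error_code': 'M', 'gt_class': '3', 'pred_class': '5'}
```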

Detection Error Code (det_error_code)

det_error_code | Name            | Description
-------------- | --------------- | -----------
A              | Aligned         | Correctly matched detection with sufficient IoU
M              | Matched         | Matched to GT but with localization error
SP             | Separated Preds | Prediction split into multiple boxes for a single GT
SG             | Separated GTs   | Multiple GT boxes matched to a single prediction
D              | Duplicate       | Duplicate detection for the same GT
ON             | Overlap No Pair | Prediction with no GT match but partially overlapping other predictions
N              | No Pair         | Prediction with no GT match (No GT Pair)
U              | Undefined       | Detection with indeterminate matching result
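The table above can be carried in code as a simple lookup, e.g. for labeling plots or reports. This is a hypothetical helper, not part of the SDK:

```python
# Lookup from det_error_code to its display name, transcribed from the
# table above. Hypothetical helper for labeling analysis output.
DET_ERROR_NAMES = {
    "A": "Aligned",
    "M": "Matched",
    "SP": "Separated Preds",
    "SG": "Separated GTs",
    "D": "Duplicate",
    "ON": "Overlap No Pair",
    "N": "No Pair",
    "U": "Undefined",
}
```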

Display Names in GUI

In the monitoring screen (GUI), some codes are displayed with different names.

  • N → NG (No GT Pair)
  • ON → ONG (Overlap No GT Pair)

Unmatched GTs (diagnosis: no_prediction) are displayed as NP (No Pred Pair).

Uncertain Detection Suffix

When a single detection error type cannot be determined with enough confidence, a * suffix is appended to det_error_code as an uncertain marker.

This happens when the error probability is below the threshold (1 - error_proba_thresh, default 0.8). In particular, for classes the model has not learned (error probability is always 0), detections are always marked as uncertain.

DET:N|SP*:CLS1->3

This indicates the detection is uncertain between No-pair (N) and Separated Pred (SP).
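A consumer can split such a code into its candidate error types and an uncertainty flag. A minimal sketch, assuming only the string conventions described above:

```python
def split_uncertain(det_error_code: str) -> tuple[list[str], bool]:
    """Split e.g. 'N|SP*' into candidate codes and an uncertain flag."""
    uncertain = det_error_code.endswith("*")
    body = det_error_code.rstrip("*")
    return body.split("|"), uncertain

split_uncertain("N|SP*")  # → (['N', 'SP'], True)
split_uncertain("A")      # → (['A'], False)
```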

Predicted Class (y)

Notation              | Meaning
--------------------- | -------
Single digit (e.g. 0) | Predicted class ID
x                     | Unmatched GT (no prediction made)
**                    | Prediction matches GT class
{y1}\|{y2}            | Fluctuation between multiple classes (e.g. 2\|3)
*                     | Uncertain top prediction(s)
?                     | Unknown (used when GT class is also unknown)
ODD                   | Out-of-Distribution (model has not learned this class)
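The notation table can be turned into a small interpreter for the {y} field. A hypothetical helper that assumes only the notations listed above; the description strings are ours:

```python
def describe_pred_class(y: str) -> str:
    """Translate the {y} field into a human-readable description."""
    if y == "x":
        return "unmatched GT (no prediction made)"
    if y == "ODD":
        return "out-of-distribution: model has not learned this class"
    if y == "?":
        return "unknown (GT class is also unknown)"
    suffix = ""
    if y.endswith("**"):          # prediction matches GT class
        suffix = ", matches GT class"
        y = y[:-2]
    elif y.endswith("*"):         # uncertain top prediction
        suffix = ", uncertain top prediction"
        y = y[:-1]
    if "|" in y:                  # fluctuation between classes
        return "fluctuates among classes " + y.replace("|", ", ") + suffix
    return "predicted class " + y + suffix

describe_pred_class("2|3")  # → 'fluctuates among classes 2, 3'
```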

With Fluctuation

DET:{det_error_code}:CLS{t}->{y1}|{y2}

When internal features are unstable and the predicted class fluctuates between multiple classes, the candidate classes are listed separated by |.

Examples:
  • DET:M:CLS0->2|3: Loc Error, class 0 is correct but predicted class fluctuates between 2 and 3
  • DET:A:CLS5->7|1|9: Aligned, class 5 is correct but predicted class fluctuates among 7, 1, and 9

Critical/Aleatoric Marker

DET:{det_error_code}:CLS{t}->{y}**

Within Critical Hotspot or Aleatoric Hotspot, when the most plausible class based on internal features matches the GT class, ** is appended.

Example:
  • DET:D:CLS0->0**: Duplicate detection, class 0 is correct and prediction matches GT (high-confidence duplicate)

Unmatched GT (no_prediction)

GT boxes with no matching prediction are recorded with diagnosis field set to no_prediction.

Example:
  • DET:U:CLS2->x: GT class 2 was not detected (no prediction)

Unmatched Predictions

DET:N:CLS?->?

When a prediction has no GT match and the class cannot be determined, ? is used for both GT class and estimated class.

Out-of-Distribution (ODD)

DET:{det_error_code}:CLS{t}->ODD

When the model has not learned a particular class (i.e. no error probability model exists for that pred_class_id), all error probabilities are 0. In this case, the predicted class is set to ODD (Out-of-Distribution Detection). Since confidence for a single error type cannot be established, these detections are always marked as uncertain (* suffix on det_error_code).

Examples:
  • DET:N*:CLS?->ODD: Uncertain No-pair detection for an unlearned class
  • DET:A*:CLS3->ODD: Uncertain Aligned detection where the model has not learned class 3

Error Code Interpretation Examples

Example 1: DET:M:CLS0->2|3 (Hotspot)

  • Detection error code: Loc Error (matched but with localization error)
  • Ground truth class: 0
  • Predicted class: Fluctuates between 2 and 3
  • Issue Category: hotspot (Unstable)
  • Interpretation: GT class 0 is detected but with poor localization, and the class estimate is unstable
  • Action: Add training data for class 0 with clearer boundaries, review anchor settings

Example 2: DET:D:CLS0->0** (Critical Hotspot)

  • Detection error code: Duplicate
  • Ground truth class: 0
  • Predicted class: 0 (matches GT)
  • Issue Category: critical_hotspot (NMS issue)
  • Interpretation: Duplicate detections of class 0 where the model is confident but NMS fails to suppress
  • Action: Tune NMS threshold, review post-processing pipeline

Example 3: DET:U:CLS2->x (Recessive Hotspot / no_prediction)

  • Detection error code: Undefined (unmatched GT)
  • Ground truth class: 2
  • Predicted class: Not detected
  • diagnosis: no_prediction
  • Issue Category: recessive_hotspot (Under-Confidence)
  • Interpretation: Class 2 objects are being missed entirely
  • Action: Add training data for class 2, adjust score threshold

Example 4: DET:N:CLS?->? (Critical Hotspot / Aleatoric)

  • Detection error code: No-pair (false positive)
  • Ground truth class: Unknown
  • Predicted class: Unknown
  • Issue Category: critical_hotspot (Aleatoric)
  • Interpretation: False positive detections with no corresponding GT, possibly due to unclear data/labels
  • Action: Review annotations, add hard negative examples

Example 5: DET:N*:CLS?->ODD (Critical Hotspot / ODD)

  • Detection error code: No-pair, uncertain (*)
  • Ground truth class: Unknown
  • Predicted class: ODD (out-of-distribution)
  • Issue Category: critical_hotspot
  • Interpretation: The model has not learned this class. Error probabilities are all 0, so the detection is always uncertain. Likely a class that was not present in training data
  • Action: Add training data for the missing class, or verify that the class should be in scope

Debug Code

Each error code is assigned a debug_code indicating the debugging direction.

debug_code            | Description                                                   | Action
--------------------- | ------------------------------------------------------------- | ------
model_epistemic       | Due to model's lack of knowledge                              | Add data, retrain
postprocess_nms       | NMS-related issue (D, SP, SG, etc.)                           | Tune NMS/IoU threshold
postprocess_threshold | Score threshold issue                                         | Adjust confidence threshold
annotation_review     | Annotation review needed (N, ON, etc. where estimation is OK) | Review annotations
aleatoric             | Due to data ambiguity itself                                  | Review annotation, reconsider task definition

3D Object Detection debug_codes

3D Object Detection uses only model_epistemic and aleatoric. postprocess_nms, postprocess_threshold, and annotation_review are specific to 2D OD.

Using Result API

# Get issue list
issues_df = result.get_issues()

# Distribution by error code
error_distribution = issues_df.groupby("error_code")["counts"].sum()

# Distribution by debug_code
debug_distribution = issues_df.groupby("debug_code")["counts"].sum()

# Extract only NMS-related issues
nms_issues = issues_df[issues_df["debug_code"] == "postprocess_nms"]
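Building on the issues_df columns used above (error_code, debug_code, counts), the string conventions from this page can be used to slice the issues further, for example isolating uncertain detections or unlearned (ODD) classes. A sketch with made-up sample rows; only standard pandas string operations are assumed:

```python
import pandas as pd

# Sample rows for illustration only; in practice use result.get_issues().
issues_df = pd.DataFrame({
    "error_code": ["DET:M:CLS0->2|3", "DET:N*:CLS?->ODD", "DET:D:CLS0->0**"],
    "debug_code": ["model_epistemic", "model_epistemic", "postprocess_nms"],
    "counts": [12, 4, 7],
})

# Uncertain detections carry a '*' suffix on the det_error_code field
det_field = issues_df["error_code"].str.split(":").str[1]
uncertain = issues_df[det_field.str.endswith("*")]

# Out-of-distribution detections have ODD as the predicted class
odd = issues_df[issues_df["error_code"].str.endswith("ODD")]
```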

Visualization Examples

# Get distribution by category
category_view = result.get_view(
    groupby=["category"],
)

# Detailed error code distribution in Hotspot
hotspot_detail = result.get_view(
    query="category == 'hotspot'",
    groupby=["error_code"],
)

# Distribution by debug_code
debug_view = result.get_view(
    groupby=["debug_code", "category"],
)

Next Steps