Answered
Hi everyone, has any of you tried to track your SHAP plots with ClearML? Somehow in my dashboard the tracked plots are empty. Might they be too complex or something? BR Sophie
Posted 2 months ago

Answers 12


Hey Martin, it's really weird, but running the same code you added here still leads to these empty figures in the "Plots" section of the ClearML experiment.

We have now decided to use the report_image option, which at least displays the plots in the "Debug Samples" section of the experiment.
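For context, the report_image route mentioned above takes a local image path, so the figure has to be saved to disk first. A minimal matplotlib-only sketch of producing that path (the actual ClearML logger call is left as a comment, since it needs a live Task and is assumed from the thread, not verified here):

```python
import os
import tempfile

import matplotlib
matplotlib.use("Agg")  # headless backend for this sketch
import matplotlib.pyplot as plt

# Draw any figure (stands in for the SHAP plot)
fig = plt.figure()
plt.bar(["a", "b", "c"], [1, 3, 2])

# Save it to disk first, then close the figure
path = os.path.join(tempfile.gettempdir(), "shap_plot.png")
fig.savefig(path)
plt.close(fig)

# With a live Task (hypothetical call, per the thread's workaround):
# task.get_logger().report_image(
#     title="SHAP", series="beeswarm", iteration=0, local_path=path)
print(os.path.exists(path))
```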

Thank you very much for your help 🙏

  
  
Posted 2 months ago

"plt" comes from matplotlib.pyplot and as I understand the clearML documentation, matplotlib plots are logged automatically. In other scripts, this works just fine but not with these SHAP-plots that are just displayed as empty plots:
image

  
  
Posted 2 months ago

Another thought: see what happens if you remove the .save and .close calls and stay with .show, maybe the close call somehow interferes with it?
Otherwise, if you can test one of the SHAP examples and see whether they fail in your setup, that's another avenue for reproducing the issue.
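The reasoning behind removing the close call: plt.close destroys the figure, so anything that hooks plt.show or inspects open figures afterwards has nothing left to capture. A matplotlib-only sketch of that effect:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend for this sketch
import matplotlib.pyplot as plt

fig = plt.figure()
plt.plot([0, 1, 2], [0, 1, 4])
num = fig.number

# While the figure is alive, a logger hooking into matplotlib can still see it
print(plt.fignum_exists(num))   # True

plt.close(fig)

# After close, the figure is gone; a later capture hook finds nothing
print(plt.fignum_exists(num))   # False
```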

  
  
Posted 2 months ago

AgitatedDove14 can you please tell me which SHAP, matplotlib, and ClearML versions you are using? And would it be possible to add a title to the plot you are tracking, or would plt.title() again break something there?

  
  
Posted 2 months ago

Hey, good day and thank you for your quick replies! So this is the code snippet I was using to create the plots (see appended image). I also tried removing the plt.savefig() part, or the plt.show() part, or manually adding the report_matplotlib_figure call for the task, but nothing seems to make a difference.
image

  
  
Posted 2 months ago

Hmm. Could you check if it makes a difference importing ClearML before shap?
If that changes nothing, could you put together a standalone script to reproduce the issue?
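One common reason import order matters is that a library like ClearML wraps another library's functions (here, matplotlib's) at import/init time; if the other library grabbed references before the patch, the wrapper never runs. A toy, stdlib-only sketch of that mechanism (the module and function names are made up for illustration):

```python
import types

# Stand-in for matplotlib.pyplot
plotting = types.ModuleType("plotting")
plotting.show = lambda: "original show"

def patching_library_init(mod):
    # Stand-in for ClearML wrapping plt.show when it is imported/initialized
    original = mod.show
    def wrapped():
        return "captured + " + original()
    mod.show = wrapped

# "import clearml" first: the wrapper is in place before anyone calls show()
patching_library_init(plotting)
print(plotting.show())   # captured + original show
```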

  
  
Posted 2 months ago

This should work:

from clearml import Task

task = Task.init(project_name="examples", task_name="shap example")
import xgboost
import shap

# train an XGBoost model
X, y = shap.datasets.california()
model = xgboost.XGBRegressor().fit(X, y)

# explain the model's predictions using SHAP
# (same syntax works for LightGBM, CatBoost, scikit-learn, transformers, Spark, etc.)
explainer = shap.Explainer(model)
shap_values = explainer(X)

# visualize the first prediction's explanation
shap.plots.waterfall(shap_values[0])

image

  
  
Posted 2 months ago

Notice that in your example you have

plt.figure()

This actually opens a fresh, empty matplotlib figure; that is why we are getting a first white image and then the actual plot.
Once I removed it I got a single plot (no need for the manual reporting):

import pandas as pd
import matplotlib.pyplot as plt
import shap
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Step 1: Create a Synthetic Regression Dataset
X, y = make_regression(
    n_samples=100,     # Number of samples
    n_features=10,     # Number of features
    noise=0.1,         # Standard deviation of the Gaussian noise
    random_state=42    # For reproducibility
)

# Convert to DataFrame for better feature naming
feature_names = [f"Feature {i+1}" for i in range(X.shape[1])]
X = pd.DataFrame(X, columns=feature_names)

# Step 2: Split Data into Train/Test Sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Step 3: Train a Simple Model
model = RandomForestRegressor(random_state=42)
model.fit(X_train, y_train)

# Step 4: Use SHAP to Explain Model Predictions
explainer = shap.Explainer(model, X_train)
shap_values = explainer(X_test)  # Returns a SHAP Explanation object in new SHAP versions

shap.summary_plot(shap_values, X_test)  # draws the beeswarm summary plot on the current figure (returns None)

plt.show()

image
image

  
  
Posted 2 months ago

Hey Martin, unfortunately the adapted import order did not change anything. Thank you anyway 🙂 I will try to provide some standalone code so the issue can be reproduced.

  
  
Posted 2 months ago

I think latest:

clearml==1.17.0
matplotlib==3.6.2
shap==0.46.0

Python 3.10

  
  
Posted 2 months ago

Hey AgitatedDove14, thank you for your help. Unfortunately it did not help to remove the close call. I also created a dummy notebook to reproduce the issue, but it does not seem to work: my plot in the ClearML dashboard is still empty. I also tried not using plt at all and uploading the SHAP plot as is:

# Create the Beeswarm Plot
shap_plot = shap.summary_plot(shap_values, X_test)  # draws the beeswarm plot; returns None by default

clearml_task.get_logger().report_matplotlib_figure(title="Shap beeswarm plot", series="My SHAP Plots", iteration=0, figure=shap_plot, report_interactive=True)

But this just leads to an empty plot even without a title (see image).
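A possible explanation for the empty plot in this snippet: shap.summary_plot draws on the current matplotlib figure and, by default, returns None, so figure=shap_plot would pass None to the logger. If a figure handle is needed for manual reporting, plt.gcf() can grab it before show/close. A matplotlib-only sketch of that pattern (draw_summary is a made-up stand-in for summary_plot):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend for this sketch
import matplotlib.pyplot as plt

def draw_summary():
    # Stand-in for shap.summary_plot: draws on the current figure, returns None
    plt.scatter([1, 2, 3], [3, 1, 2])
    return None

result = draw_summary()
fig = plt.gcf()   # grab the figure the plot was drawn on, before show/close
print(result is None, fig is not None)   # True True
```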

Could it be connected to the version of ClearML I use? Our team right now works with the Community version; might it only work in the Pro version?
image

  
  
Posted 2 months ago

Hi BlandCormorant75, how are you logging those plots? Can you provide a standalone snippet that reproduces your behaviour?

  
  
Posted 2 months ago