2025, Dec 21 21:00

Make subtle structure visible in Plotly: subtract column/row means, plot residuals, use HDR colorscales

Reveal subtle variations in Plotly contour plots: subtract column or row means to detrend, plot the residuals, and use a high dynamic range colorscale to recover fine detail.

When a matrix carries a strong baseline or near-linear trend across columns or rows, small but meaningful deviations get buried in a standard false-color plot. The visual dynamic range is spent on the dominant structure, so fine variations are barely visible even with a good colormap.

Reproducing the issue

The following example renders a 2D contour from the given data. The logic mirrors a typical Plotly flow and shows why subtle changes across columns disappear behind the larger trend.

import numpy as np
from plotly.subplots import make_subplots
import plotly.graph_objects as go
x_cap = 1000
t_span = 1
matrix_raw = np.array([[  0.        , 234.        , 568.33333333, 881.66666667],
                       [  0.        , 234.66888889, 569.44555556, 865.03806017],
                       [  0.        , 235.33627407, 570.06910753, 849.9592539 ],
                       [  0.        , 235.9989178 , 570.2632452 , 836.28943309]])
# Build coordinate grids: x across columns, time descending down the rows
rows, cols = matrix_raw.shape
xs = np.linspace(0, x_cap, cols)
ts = np.linspace(t_span, 0, rows)
X, Y = np.meshgrid(xs, ts)
canvas = make_subplots(rows=1, cols=1)
z_contour = go.Contour(
    z=matrix_raw,
    x=X[0, :],
    y=Y[:, 0],
    colorscale='Inferno'
)
canvas.add_trace(z_contour, row=1, col=1)
canvas.update_layout(
    xaxis_title="x-axis",
    yaxis_title="time (Year)",
    height=900,
    width=855,
    font=dict(size=30),
    paper_bgcolor='white',
    margin=dict(l=20, r=20, t=350, b=20)
)
canvas.update_xaxes(title_font=dict(size=24))
canvas.update_yaxes(title_font=dict(size=24))
canvas.show()

Why the plot hides the variations

The matrix contains a dominant structure across columns. In a 256-step false-color mapping, the strongest signal or trend consumes nearly all contrast, leaving too few display levels to show column-wise micro-variations. Even if the data are accurate, the visible contrast does not scale with your interest in subtle changes; it scales with the maximum range present in the array.
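With the example matrix, the imbalance is easy to quantify: the colormap must span the full data range, while the column-wise deviations of interest are tiny by comparison. A quick check:

```python
import numpy as np

matrix_raw = np.array([[  0.        , 234.        , 568.33333333, 881.66666667],
                       [  0.        , 234.66888889, 569.44555556, 865.03806017],
                       [  0.        , 235.33627407, 570.06910753, 849.9592539 ],
                       [  0.        , 235.9989178 , 570.2632452 , 836.28943309]])

full_range = matrix_raw.max() - matrix_raw.min()   # range the colormap must cover
residuals = matrix_raw - matrix_raw.mean(axis=0)
fine_range = np.abs(residuals[:, :3]).max()        # spread left in the first three columns

print(full_range)   # ~881.7 units spent on the baseline
print(fine_range)   # ~1.2 units of fine structure
```

With 256 display levels, one level covers roughly 881.7 / 256 ≈ 3.4 units, so variations of about 1.2 units in the first three columns collapse into a single color.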

A practical fix: subtract the column averages and plot residuals

A simple and effective way to regularise such data is to remove the column-wise average before plotting. This isolates the deviations from the main trend. After this transform, the residuals carry the fine structure, and the false-color plot can spend its contrast on what matters.

Applying this operation to the data produces the residuals matrix:

0   -1.00101999  -1.194477075   23.42831321
0   -0.3321311   -0.082254845    6.799706712
0    0.33525408   0.541297125   -8.279099557
0    0.99789701   0.735434795  -21.94892037

Here is the same Plotly pipeline rendering the residuals instead of the raw values.

import numpy as np
from plotly.subplots import make_subplots
import plotly.graph_objects as go
x_cap = 1000
t_span = 1
matrix_raw = np.array([[  0.        , 234.        , 568.33333333, 881.66666667],
                       [  0.        , 234.66888889, 569.44555556, 865.03806017],
                       [  0.        , 235.33627407, 570.06910753, 849.9592539 ],
                       [  0.        , 235.9989178 , 570.2632452 , 836.28943309]])
# Subtract each column's mean so only deviations from the trend remain
col_avg = matrix_raw.mean(axis=0)
residual_grid = matrix_raw - col_avg
rows, cols = residual_grid.shape
xs = np.linspace(0, x_cap, cols)
ts = np.linspace(t_span, 0, rows)
X, Y = np.meshgrid(xs, ts)
canvas = make_subplots(rows=1, cols=1)
residual_contour = go.Contour(
    z=residual_grid,
    x=X[0, :],
    y=Y[:, 0],
    colorscale='Inferno'
)
canvas.add_trace(residual_contour, row=1, col=1)
canvas.update_layout(
    xaxis_title="x-axis",
    yaxis_title="time (Year)",
    height=900,
    width=855,
    font=dict(size=30),
    paper_bgcolor='white',
    margin=dict(l=20, r=20, t=350, b=20)
)
canvas.update_xaxes(title_font=dict(size=24))
canvas.update_yaxes(title_font=dict(size=24))
canvas.show()

This change preserves the original logic of the plot, but allocates the visual dynamic range to deviations rather than the baseline. If the principal structure is column-driven, subtract the column means; if it is row-driven, subtract the row means. You can also combine both if appropriate to your analysis goal.
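For a row-driven trend the same idea applies along the other axis; `keepdims=True` keeps the row means broadcastable against the matrix. A minimal sketch with a toy matrix (the double-centering line removes both trends at once):

```python
import numpy as np

matrix = np.array([[ 1.0,  2.0,  3.0],
                   [11.0, 12.0, 13.5],
                   [21.0, 22.2, 23.0]])

# Remove each row's average (row-driven baseline)
row_resid = matrix - matrix.mean(axis=1, keepdims=True)

# Double-centering: remove both column and row means in one step
both_resid = (matrix
              - matrix.mean(axis=0)
              - matrix.mean(axis=1, keepdims=True)
              + matrix.mean())

print(row_resid[0])   # → [-1.  0.  1.]
```

After double-centering, every row and every column of the residuals averages to zero, which is the appropriate transform when both directions carry a strong baseline.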

An alternative: a high dynamic range CLUT for faint structure

Another option is a more sophisticated 256-shade false-color mapping designed for visualising high dynamic range data. A family derived from space-filling curves on the RGB cube is particularly useful for making fine details visible. One realisation, often described in relation to Hilbert-like traversal of the color cube, can be generated from simple integer rules. This approach can reveal subtle features and also amplify noise, so it should be used with caution.

The following function builds a 253-entry colorscale from these integer rules: one black entry followed by seven 36-step segments that traverse edges of the RGB cube (black, blue, magenta, red, yellow, white, cyan, green). The choice of starting edge is arbitrary and can be swapped if desired.

def peano0_colorscale():
    def to_hex(r, g, b):
        return f"#{r:02x}{g:02x}{b:02x}"
    scale = []
    total = 252  # indices 0..252 inclusive
    for N in range(0, total + 1):
        if N == 0:
            r, g, b = 0, 0, 0
        else:
            segment = (N - 1) // 36
            k = 1 + ((N - 1) % 36)
            V = (255 * k + 18) // 36
            if segment == 0:
                r, g, b = 0, 0, V
            elif segment == 1:
                r, g, b = V, 0, 255
            elif segment == 2:
                r, g, b = 255, 0, 255 - V
            elif segment == 3:
                r, g, b = 255, V, 0
            elif segment == 4:
                r, g, b = 255, 255, V
            elif segment == 5:
                r, g, b = 255 - V, 255, 255
            elif segment == 6:
                r, g, b = 0, 255, 255 - V
            else:
                r, g, b = 0, 0, 0
        pos = N / total
        scale.append([pos, to_hex(r, g, b)])
    return scale

You can plug this colorscale into the same Plotly contour to emphasise faint structure in high dynamic range arrays.

custom_scale = peano0_colorscale()
canvas = make_subplots(rows=1, cols=1)
contour_hd = go.Contour(
    z=residual_grid,  # or matrix_raw, depending on what you want to examine
    x=X[0, :],
    y=Y[:, 0],
    colorscale=custom_scale
)
canvas.add_trace(contour_hd, row=1, col=1)
canvas.update_layout(height=900, width=855)
canvas.show()

It is also worth noting another transform that can reveal very faint details when an image is mostly background or blown out. A greedy histogram equalisation on 256 bins can bring up subtle features dramatically, though it destroys quantitative fidelity and should be reserved for exploratory visualisation only.
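One standard variant of this idea (not necessarily the greedy scheme mentioned above) can be sketched with NumPy: bin the values into a 256-bin histogram, then remap each value through the normalised cumulative distribution so that display levels are used roughly uniformly regardless of the original range. The helper name `equalise_256` is invented for this sketch; exploratory use only, as noted.

```python
import numpy as np

def equalise_256(z):
    """Map array values onto 0..255 so display levels are used ~uniformly."""
    hist, edges = np.histogram(z, bins=256)
    cdf = hist.cumsum().astype(float)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())        # normalise CDF to [0, 1]
    bin_idx = np.clip(np.digitize(z, edges[:-1]) - 1, 0, 255)  # bin index of each value
    return (cdf[bin_idx] * 255).astype(int)

rng = np.random.default_rng(0)
z = rng.normal(size=(50, 50)) ** 3    # heavy-tailed toy data
zeq = equalise_256(z)
print(zeq.min(), zeq.max())           # → 0 255
```

Plotting `equalise_256(matrix_raw)` in place of the raw values spreads the colorscale over the data's rank order, which reveals faint features but discards all quantitative meaning of the color axis.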

Why this matters

In high dynamic range arrays the default mapping compresses what you care about. If your data encode a dominant baseline or monotonic trend, directly plotting the raw values delegates the entire contrast budget to the largest excursions. Removing the trend or switching to a mapping that traverses 3D color space more uniformly can prevent false negatives in your visual analysis and guide you toward the right quantitative follow-up.

Conclusion

When small variations must be visible, transform the data or the colormap, not the message. Subtracting the column or row averages focuses the plot on residuals and makes subtle structure stand out. If you need even more sensitivity, a high dynamic range colorscale built from a space-filling traversal of the color cube can expose fine details, with the caveat that it may also accentuate noise. For quick visual exploration of very faint signals in largely saturated arrays, histogram equalisation is a useful but non-quantitative tool. Choose the approach that aligns with what you want to see and what you need to measure.