The low-temperature performance of a commercial lean NOₓ trap catalyst was evaluated using infrared thermography (IRT) before and after a high-temperature aging step. Reaction tests included propylene oxidation, oxygen storage capacity measurements, and simulated cycling conditions for NOₓ reduction, with H₂ as the reductant during the regeneration step of the cycle. Tests with and without NO in the lean phase made it possible to distinguish, thermally, the reductant consumed in reducing stored oxygen from that consumed in nitrate decomposition and reduction. IRT clearly showed where NOₓ trapping and regeneration occurred spatially as a function of regeneration conditions, with variables including the hydrogen content of the regeneration phase and the lean- and rich-phase cycle times. As expected, lower reductant concentration led to incomplete regeneration, confining nitrate decomposition, and therefore NOₓ trapping, to the upstream portion of the catalyst. More reductant, delivered via longer regeneration time or higher reductant concentration, resulted in more of the catalyst being used for trapping, with the length of catalyst involved in trapping determined by the amount of reductant delivered during the regeneration phase. Tests at 200°C and 300°C also showed differences in the amount of catalyst used for trapping NOₓ, related to the efficiency of reductant use during the rich phase, with 200°C giving the poorer performance. Tests with the thermally aged catalyst showed the same trends, but with measurable differences in the efficiency of H₂ use during regeneration. The temperature measurements were consistent with all of the concentration trends, indicating that such thermal measurements can predict subsequent catalyst activity and serve as a measure of the extent of degradation.
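
As a rough illustration of how the reductant dose scales with rich-phase conditions, the relation below estimates the moles of H₂ delivered per regeneration event; the symbols (y_H₂ for H₂ mole fraction, Q for volumetric flow rate, t_rich for rich-phase duration, V_m for molar volume) and the example numbers are assumed for illustration only and are not values from this study.

% Illustrative estimate of reductant dose per rich phase (assumed symbols and values, not data from this work):
\[
n_{\mathrm{H_2}} \;=\; \frac{y_{\mathrm{H_2}}\, Q\, t_{\mathrm{rich}}}{V_m}
\]
% Example: y_H2 = 0.01 (1%), Q = 5 L/min, t_rich = 5 s, V_m = 22.4 L/mol
% gives n_H2 = 0.01 x (5/60 L/s) x 5 s / 22.4 L/mol ≈ 1.9e-4 mol H2;
% to first order, doubling y_H2 or t_rich doubles the dose and hence the axial
% length of catalyst that can be regenerated and used for subsequent NOx trapping.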