You are correct to consider the moisture factor in this conversation. The HVAC system has to remove roughly 970 Btu for every pound of water it condenses (the latent heat of vaporization) before it changes the sensible indoor air temperature. That is a lot.
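To put that 970 Btu/lb figure in perspective, here is a minimal sketch of the arithmetic. The 3 lb/hr moisture rate is just an illustrative number, not from any particular system:

```python
# Rough sketch: how much cooling capacity is consumed condensing moisture
# (latent heat) before any sensible temperature change happens.
# Uses the standard latent heat of vaporization of water, ~970 Btu/lb.

LATENT_HEAT_BTU_PER_LB = 970  # Btu removed per pound of water condensed

def latent_load_btu_per_hr(lbs_water_per_hr):
    """Cooling capacity consumed just condensing moisture, in Btu/hr."""
    return lbs_water_per_hr * LATENT_HEAT_BTU_PER_LB

# Example: condensing 3 lb of water per hour
print(latent_load_btu_per_hr(3))  # 2910 Btu/hr -- about a quarter ton of capacity
```

None of that capacity shows up as a temperature change on your thermometer.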
Your answer is not cut and dried.
There are numerous factors in the operating conditions that cannot be analyzed with just a thermometer. You need to determine the Btu/hr, not just the temperature change (Delta-T).
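Why Delta-T alone is not enough can be shown with the standard sensible-heat rule of thumb for sea-level air, Btu/hr ≈ 1.08 × CFM × Delta-T. The airflow numbers below are assumptions for illustration:

```python
# Sketch of the standard sensible-heat rule of thumb (sea-level air):
#   Btu/hr = 1.08 x CFM x Delta-T
# The same Delta-T at different airflows yields very different Btu/hr,
# which is why a thermometer alone cannot tell you the capacity.

def sensible_btu_per_hr(cfm, delta_t_f):
    """Sensible cooling in Btu/hr from airflow (CFM) and Delta-T (F)."""
    return 1.08 * cfm * delta_t_f

# Same 15 F Delta-T, two different (hypothetical) airflows:
print(sensible_btu_per_hr(400, 15))   # ~6,480 Btu/hr
print(sensible_btu_per_hr(1200, 15))  # ~19,440 Btu/hr
```

An identical reading on the thermometer corresponds to three times the capacity when the airflow triples.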
How can you get a 15°F Delta-T on 72°F indoor air by setting the t-stat to 60°F, when 72°F minus 15°F is 57°F?
To get a 15°F Delta-T on 72°F air, you have to operate the A/C outside its design envelope.
ASHRAE sets the design outdoor air (OA) temperature for every city in the US. If that design temperature is 95°F, then whenever the outdoor air is above 95°F the unit is too small to handle the Btu load: the Btu/hr entering the building exceeds the HVAC capacity. In this event the unit never shuts off, and because it is operating beyond its design load, its design Sensible Heat Ratio changes. It continues to remove latent heat (moisture), lowering the wet bulb temperature, which increases the evaporation of your perspiration; you can essentially be freezing at 85°F indoor air. If this condition is occurring at the time of your inspection, you will not get any Delta-T. Does this mean the unit is not removing 12,000 Btu/hr/ton? Absolutely not.
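Here is a hedged sketch of why Delta-T collapses when the Sensible Heat Ratio (SHR) shifts toward latent removal. The 3-ton capacity, airflow, and SHR values are illustrative assumptions, not data from any specific unit:

```python
# Illustrative sketch: same total Btu/hr, but a lower sensible heat ratio
# (SHR) means less of the capacity shows up as a temperature drop.

def split_capacity(total_btu_hr, shr):
    """Split total capacity into (sensible, latent) portions by SHR."""
    sensible = total_btu_hr * shr
    return sensible, total_btu_hr - sensible

def delta_t(sensible_btu_hr, cfm):
    """Back out Delta-T (F) from the 1.08 x CFM x Delta-T rule of thumb."""
    return sensible_btu_hr / (1.08 * cfm)

TOTAL = 36000  # a nominal 3-ton unit, still removing its full 36,000 Btu/hr
CFM = 1200     # assumed airflow

for shr in (0.75, 0.50):
    sens, lat = split_capacity(TOTAL, shr)
    print(shr, round(delta_t(sens, CFM), 1), lat)
# SHR 0.75 -> ~20.8 F Delta-T; SHR 0.50 -> ~13.9 F, yet total Btu/hr is unchanged
```

The unit is doing the same total work in both cases; the thermometer just sees less of it.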
Can your thermometer tell you if any of this is occurring? Absolutely not…
From the other perspective: if the HVAC unit is oversized (and therefore wrong), you can easily get your 15-20°F Delta-T, because its design sensible heat ratio works on temperature, not moisture. The result is a cold, damp cave with mold growing all over the place. But your thermometer said everything was just okey-dokey! That is, until an industrial hygienist stops by about the mold and determines the HVAC is wrong.