To put it simply, I know very little about thermal imaging, but I have an old B-cam, and if I point it at something hot, I can tell that it's hot. If I point it at something that has both hot and cold areas, I can see the differences. If I point it at something with a wet spot, and the water and the material adjacent to it aren't the same temperature, I can see that too. I understand that a roof leak with saturated substrate will show up better when temperatures are changing, because different materials change temperature at different rates.
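That last point about different materials changing temperature at different rates can be sketched with a simple lumped-capacitance model (Newton's law of cooling). This is a hypothetical illustration, not a calibrated roof model; the time-constant values are made up for the example.

```python
# Hypothetical sketch of why a wet spot stands out while temperatures change:
# Newton's law of cooling with two different time constants (lumped model).
# The tau values below are illustrative assumptions, not measured data.
import math

def temp_after(t_start_c, t_ambient_c, tau_hours, hours):
    """Temperature (degC) after `hours`, given time constant `tau_hours`."""
    return t_ambient_c + (t_start_c - t_ambient_c) * math.exp(-hours / tau_hours)

# Roof deck at 40 degC cooling toward a 15 degC evening:
dry = temp_after(40, 15, tau_hours=1.0, hours=2)  # dry insulation: small thermal mass
wet = temp_after(40, 15, tau_hours=4.0, hours=2)  # saturated substrate: large thermal mass
print(round(dry, 1), round(wet, 1))  # the wet area lags behind, so it shows in IR
```

Two hours into the cool-down, the dry area has nearly reached ambient while the saturated area is still warm, which is exactly the contrast the camera picks up.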
This IR camera doesn't look as complicated as my digital camera to use. I point this thing at something, the temperature shows up on the screen; yellow's hot, blue's not. It's not rocket science. Color palettes? I don't care what color it is, I just want to see the problem.
1. If all I want to do is spot leaks and electrical problems in homes, why should I fork out thousands of dollars for training when I can see them with NO training? I don't want to be a thermographer, I just want to do these two things. Leaks and electrical.
2. Why should I care about emissivity?
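For what it's worth, emissivity is one place where the numbers can bite. A radiometric camera converts received radiation into a temperature using an emissivity setting; if the target's real emissivity is far from that setting, the reading can be wildly off. Here's a simplified, hypothetical sketch using the Stefan-Boltzmann law only; it ignores reflected ambient radiation and atmospheric effects, which real cameras also compensate for.

```python
# Hypothetical sketch: why emissivity matters for a radiometric IR camera.
# Simplified Stefan-Boltzmann model; ignores reflected ambient radiation
# and atmospheric transmission, which real cameras also account for.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2*K^4)

def reported_temp_c(actual_temp_c, actual_emissivity, assumed_emissivity):
    """Temperature the camera reports when its emissivity setting is wrong."""
    t_actual_k = actual_temp_c + 273.15
    radiated = actual_emissivity * SIGMA * t_actual_k ** 4  # what the sensor sees
    # The camera inverts the model using its *assumed* emissivity:
    t_reported_k = (radiated / (assumed_emissivity * SIGMA)) ** 0.25
    return t_reported_k - 273.15

# A shiny metal lug (emissivity ~0.2) at 100 degC, camera left at 0.95:
print(round(reported_temp_c(100.0, 0.2, 0.95), 1))  # reads far below actual
```

In this toy example a 100 degC shiny connector reads as below freezing, so a hot electrical lug could look cold. Spotting that something is hotter than its neighbors still works; trusting the number on the screen is where emissivity matters.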
3. All this talk about how complicated interpretation is: show me an IR photo of a leak or an electrical problem that I need $1,800 to $5,000 of training to interpret correctly.
If I’d paid a lot of money for training, I’d be going on and on about how important training is. If I made my living teaching thermography, I’d be going on and on about how important training is. But since I haven’t and I don’t, tell me why I should.