I'm not sure what you mean by sensitivity here. Leaks mainly seem to (substantially) affect the time constant. They seem to have a pretty much undetectable effect on the eventual steady-state deltaT (I think because the intentional leak out of the exhaust opening in the top of the calorimeter is so much bigger in comparison). I assume that the increase in sensitivity appearing as the difference between the blue and red traces in Paradigmnoia's plot is due to the use of extra insulation.
The leak at the fan, which is 'sucking' air (ideally only from inside the box), lets in unheated outside air. That dilutes the heated stream before the outlet sensor, lowering the apparent outlet temperature and therefore the apparent power across the whole experiment.
A leak of air almost anywhere else into the box is not much different from air coming in through the inlet: there is little pressure difference to drive substantial leaks compared to the large (75 mm) inlet hole. The small leaks add up to a small decrease in the overall calculated output efficiency, which is a real but minor effect.
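The fan-leak dilution above can be sketched numerically. This is a toy model with made-up numbers (flow rate, temperatures, and leak fraction are my assumptions, not measured values): a fraction f of unheated air mixed in at the fan pulls the measured outlet temperature toward the inlet temperature, so the apparent power scales by roughly (1 − f) for the whole run.

```python
def apparent_power(m_dot, cp, t_in, t_out, leak_fraction):
    """Apparent power when a fan leak dilutes the outlet stream.

    m_dot: air mass flow (kg/s), cp: specific heat of air (J/(kg*K)),
    t_in / t_out: inlet and true outlet temperatures (deg C),
    leak_fraction: fraction of the measured flow that is unheated leak air.
    """
    # The sensor sees a mix of heated air and unheated leak air.
    t_measured = (1 - leak_fraction) * t_out + leak_fraction * t_in
    return m_dot * cp * (t_measured - t_in)

# Illustrative numbers only: 10 g/s of air heated from 20 to 40 deg C.
no_leak = apparent_power(0.01, 1005.0, 20.0, 40.0, 0.0)   # -> 201.0 W
fan_leak = apparent_power(0.01, 1005.0, 20.0, 40.0, 0.1)  # -> 180.9 W
# A 10 % leak at the fan makes the whole experiment read ~10 % low.
```

By contrast, a leak elsewhere in the box just substitutes for inlet air before heating, so it barely changes the outlet reading.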
The removal of about 20 kg of total mass from inside the calorimeter envelope accounts for all of the rate increases: less thermal mass to warm through means a shorter time constant, not a different steady state.
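That mass-versus-rate point can be shown with a first-order thermal model. All the numbers here are assumptions for illustration (heat capacities, flow, power): the internal heat capacity C sets the time constant tau = C / (m_dot * cp), while the steady-state delta-T = P / (m_dot * cp) does not depend on C at all.

```python
import math

def delta_t(t, power, m_dot, cp, heat_cap):
    """Air delta-T at time t (s) for a first-order calorimeter model.

    power: heat input (W), m_dot: air flow (kg/s),
    cp: specific heat of air (J/(kg*K)), heat_cap: internal thermal mass (J/K).
    """
    tau = heat_cap / (m_dot * cp)        # time constant (s)
    ss = power / (m_dot * cp)            # steady-state delta-T (K)
    return ss * (1.0 - math.exp(-t / tau))

m_dot, cp, p = 0.01, 1005.0, 200.0       # assumed: 10 g/s air, 200 W input
c_heavy = 15_000.0                       # J/K with the extra internal mass
c_light = 5_000.0                        # J/K after removing ~20 kg of metal
# Both converge to the same ~19.9 K delta-T; the lighter box just gets
# there about three times faster (tau ~500 s instead of ~1500 s).
```

That matches the observation: leaks and mass changes move the time constant around while the eventual steady-state delta-T stays put.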
Heat only the air, ideally.