Hi all,
I’m trying to understand a discrepancy between my gas usage (in MJ) and the theoretical energy required to heat water. The numbers don’t seem to line up, even after accounting for system efficiency.
Meter Data
Read Date: 23 Mar 2026
Read Type: Actual
Start Read: 1,887
End Read: 3,342
Usage: 1,455 units
Unit Size: 10 L (i.e. tens of litres)
Conversion Factor: 0.477937 (MJ per litre, implied by the totals below)
Total Energy: 6,954 MJ
So the total hot water volume used:
1,455 units × 10 L/unit = 14,550 L
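For anyone checking the arithmetic, here is a minimal Python sketch of the billing conversion. The variable names are mine; the numbers are straight from the bill:

```python
# Reproduce the bill's unit -> litre -> MJ conversion.
start_read = 1_887       # meter units
end_read = 3_342         # meter units
unit_size_l = 10         # litres per meter unit
mj_per_litre = 0.477937  # conversion factor from the bill

usage_units = end_read - start_read    # 1,455 units
volume_l = usage_units * unit_size_l   # 14,550 L
energy_mj = volume_l * mj_per_litre    # ~6,954 MJ

print(usage_units, volume_l, round(energy_mj))  # 1455 14550 6954
```

So the billed MJ figure is internally consistent with the reads and the conversion factor.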
Expected Energy to Heat Water
Assuming:
Starting temp = 15°C
Target temp = 60°C
Temperature rise = 45°C
Specific heat capacity of water = 4.186 kJ/kg·°C
1 L ≈ 1 kg
Energy required:
Energy = 14,550 kg × 4.186 kJ/kg·°C × 45 °C
≈ 2,741,000 kJ ≈ 2,741 MJ
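The same Q = m × c × ΔT calculation as a quick script, under the assumptions listed above:

```python
# Theoretical heat required: Q = m * c * dT (1 L of water ~ 1 kg)
mass_kg = 14_550       # total hot water volume, in kg
c_kj_per_kg_c = 4.186  # specific heat capacity of water
delta_t_c = 60 - 15    # temperature rise, degrees C

q_kj = mass_kg * c_kj_per_kg_c * delta_t_c  # ~2,740,784 kJ
q_mj = q_kj / 1_000
print(round(q_mj))                          # 2741
```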
Accounting for System Efficiency
Assuming a 70% efficient gas storage hot water system:
Required input energy = 2,741 MJ ÷ 0.70
≈ 3,916 MJ
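And the efficiency adjustment, treating 70% as a stated assumption rather than a measured value:

```python
# Scale the theoretical requirement by an assumed 70% system efficiency.
assumed_efficiency = 0.70
required_input_mj = 2_741 / assumed_efficiency
print(round(required_input_mj))  # 3916
```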
Comparison
Calculated requirement (with efficiency): ~3,916 MJ
Metered usage: 6,954 MJ
The metered figure is ~78% higher than the efficiency-adjusted estimate, and roughly 2.5× the raw theoretical heating requirement.
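Put another way, the metered energy implies an end-to-end efficiency well below the assumed 70%. A quick check using the numbers above:

```python
# Implied end-to-end efficiency: useful heat / metered energy.
implied_efficiency = 2_741 / 6_954
print(f"{implied_efficiency:.0%}")  # 39%
```

So the whole system would need to be running at roughly 39% overall efficiency for the bill to be physically consistent.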
Question
Why is the MJ figure from the meter/conversion factor so high?
Even allowing for:
Storage tank standing heat loss
Pipe/distribution losses
Burner/combustion inefficiency
…it still seems significantly above what physics would suggest is required to heat that volume of water.