Losses on low-voltage networks are often substantial; in the UK, for example, they have been estimated at 4% of the energy supplied by low-voltage networks. However, how these losses break down across individual conductors and over time is poorly understood, as generally only peak demands and average loads over several months have been recorded. The introduction of domestic smart meters has the potential to change this. This study analyses how domestic smart meter readings can be used to estimate the actual losses. In particular, the accuracy of using 30 min readings compared with 1 min readings, and how this accuracy could be improved, were investigated. This was achieved by assigning the data recorded by 100 smart meters, at a time resolution of 1 min, to three test networks; smart meter data from three sources were used. It was found that 30 min resolution data underestimated the losses by between 9 and 24%. By fitting an appropriate model to the data, it was possible to reduce the inaccuracy by ∼50%. Increasing the smart meter time resolution from 30 min to 10 min gave little improvement in accuracy.
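The underestimation described above has a simple mathematical cause: resistive losses are proportional to the square of the load, and the mean of squared values always exceeds the square of the mean (Jensen's inequality), so averaging demand over 30 min windows smooths out peaks and discards loss. The sketch below illustrates this effect on a synthetic 1 min demand profile; the profile, the resistance constant, and the function names are illustrative assumptions, not the paper's data or method.

```python
import random

random.seed(42)

# Synthetic 1-min domestic demand profile for one day, in kW.
# (Illustrative random values only -- not the study's smart meter data.)
demand = [max(0.0, random.gauss(0.5, 0.4)) for _ in range(1440)]

R_PROXY = 1.0  # arbitrary resistance constant; losses taken as proportional to P^2


def losses(profile, minutes_per_sample):
    # I^2 R losses are proportional to squared load at fixed voltage,
    # so each sample contributes P^2 weighted by its duration.
    return sum(p * p for p in profile) * minutes_per_sample * R_PROXY


def average_profile(profile, window):
    # Collapse the 1-min series into window-minute means, mimicking
    # what a 30-min smart meter reading records.
    return [sum(profile[i:i + window]) / window
            for i in range(0, len(profile), window)]


loss_1min = losses(demand, 1)
loss_30min = losses(average_profile(demand, 30), 30)

# By Jensen's inequality the averaged profile underestimates the
# squared-load losses whenever demand varies within a window.
underestimate = 100 * (loss_1min - loss_30min) / loss_1min
print(f"30-min data underestimates losses by {underestimate:.1f}%")
```

Running this on any fluctuating profile yields a positive underestimate; the 9–24% range reported in the study reflects how much real domestic demand varies within each half-hour window on the test networks.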