Computed tomography (CT) is the gold-standard imaging modality for the diagnosis and follow-up of urolithiasis. Before CT came into use, intravenous urography (IVU) was the imaging modality of choice. The use of CT remains contentious because of the cancer risk associated with radiation exposure above a threshold level. We aimed to compare the radiation dose received by the average patient with urolithiasis in the CT era with that in the IVU era.

Our hospital's medical records database was searched for patients who presented to the Emergency Department over a 1-month period in 1990 with a diagnosis of renal colic. Patients with the same presentation, from the same month in 2013, were also identified. A total of 14 patients from each year fulfilled the inclusion criteria. The estimated effective radiation dose for each patient was calculated using data from population-based studies.

The median effective radiation dose per patient in the 1990 group, for initial diagnosis and subsequent follow-up, was 4.05 mSv (interquartile range [IQR], 3.7-4.4 mSv). The corresponding median dose in the 2013 group was 4.2 mSv (IQR, 4.2-4.9 mSv); there was no evidence of a statistically significant difference between the groups (p = 0.8).

Despite the controversy surrounding serial CT scanning, our study demonstrates that for the radiological investigation and follow-up of urolithiasis, the estimated effective radiation dose per patient is only marginally higher than it was in the IVU era, with improvements in length of hospital stay and time to definitive diagnosis.