Terrestrial laser scanning (TLS) has become an important tool for monitoring ground surface movements during and after construction. To interpret measured surface changes correctly, a sound understanding of digital elevation model (DEM) accuracy is essential. A recent study showed that a linear model can represent DEM error, expressed as root mean square error (RMSE), for typical data spacings of TLS point clouds. However, the effect of measurement noise on that model is not well understood. In this study, measurement noise is added as a controlled parameter to the points of a semi-artificial point cloud that is assumed to be free of measurement errors, producing a series of noise-contaminated point cloud datasets. These datasets are then analysed with a statistical resampling technique to quantitatively investigate the effects of random measurement errors on the coefficients of the linear model.
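The workflow described above can be sketched in a few lines of Python. This is an illustrative toy, not the study's actual pipeline: the analytic ground surface, the nearest-node "DEM", the noise level, the spacings, and the bootstrap count are all assumptions chosen for demonstration. Controlled Gaussian noise contaminates a synthetic error-free surface, DEM residuals are computed at several data spacings, and bootstrap resampling yields distributions for the coefficients of a linear RMSE-vs-spacing model.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_surface(x, y):
    # Assumed smooth, error-free ground surface (illustrative).
    return 0.5 * np.sin(x) + 0.3 * np.cos(y)

# Dense check points where DEM error is evaluated against the true surface.
qx, qy = np.meshgrid(np.linspace(0, 10, 100), np.linspace(0, 10, 100))
qx, qy = qx.ravel(), qy.ravel()
qz_true = true_surface(qx, qy)

NOISE_SIGMA = 0.01            # controlled measurement noise, in metres (assumed)
spacings = [0.2, 0.5, 1.0, 2.0]   # nominal data spacings, in metres (assumed)
n_boot = 200                       # bootstrap replicates

def dem_residuals(s):
    # Simplified nearest-node "DEM": snap each check point to the nearest
    # grid node at spacing s; node elevations carry the injected Gaussian
    # noise (one draw per snapped sample, a deliberate simplification).
    gx = np.round(qx / s) * s
    gy = np.round(qy / s) * s
    gz = true_surface(gx, gy) + rng.normal(0.0, NOISE_SIGMA, len(gx))
    return gz - qz_true

resid = {s: dem_residuals(s) for s in spacings}
slopes, intercepts = [], []
for _ in range(n_boot):
    # Bootstrap: resample residuals with replacement before computing RMSE,
    # then fit the linear model RMSE ~ a + b * spacing to each replicate.
    rmses = [np.sqrt(np.mean(rng.choice(resid[s], size=len(resid[s])) ** 2))
             for s in spacings]
    b, a = np.polyfit(spacings, rmses, 1)
    slopes.append(b)
    intercepts.append(a)

print(f"slope     : {np.mean(slopes):.4f} +/- {np.std(slopes):.4f}")
print(f"intercept : {np.mean(intercepts):.4f} +/- {np.std(intercepts):.4f}")
```

The spread of the bootstrapped slope and intercept estimates indicates how sensitive the linear model's coefficients are to the injected random measurement errors; rerunning with different `NOISE_SIGMA` values traces out that dependence.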