We describe a “data model” that treats any observed data as comprising three components: signal, background, and noise. Multiscale transform methods, such as the wavelet transform, can be used effectively to distinguish signal from background, and noise filtering in wavelet space is also very often helpful. We discuss data entropy (or informativeness) in this context. We exemplify this general approach to data analysis using a small dataset produced by a new optical engineering technique for the fingerprinting of beverages and other liquids.
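The idea of noise filtering in wavelet space can be sketched as follows. This is a minimal illustrative example, not the paper’s specific method: it uses a single-level Haar transform with hard thresholding of the detail coefficients, and the function names and threshold value are assumptions made here for illustration.

```python
import math

def haar_step(x):
    # One level of the orthonormal Haar wavelet transform:
    # pairwise scaled sums (approximation, the smooth/background part)
    # and scaled differences (detail, where noise concentrates).
    approx = [(x[i] + x[i + 1]) / math.sqrt(2) for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / math.sqrt(2) for i in range(0, len(x), 2)]
    return approx, detail

def inverse_haar_step(approx, detail):
    # Exact inverse of haar_step: recombine each (approx, detail) pair.
    out = []
    for a, d in zip(approx, detail):
        out.append((a + d) / math.sqrt(2))
        out.append((a - d) / math.sqrt(2))
    return out

def wavelet_denoise(x, threshold):
    # Hard thresholding: detail coefficients smaller in magnitude than
    # the threshold are treated as noise and set to zero before
    # reconstructing the signal.
    approx, detail = haar_step(x)
    detail = [d if abs(d) > threshold else 0.0 for d in detail]
    return inverse_haar_step(approx, detail)
```

For example, `wavelet_denoise([1.0, 1.1, 5.0, 5.1], 0.2)` suppresses the small pairwise fluctuations while preserving the large jump between the two halves of the signal.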