Journal of Medical Systems, vol. 21, no. 5, pp. 267-274, 1997 (SCI-Expanded)
The implementation of the Time Domain Correlation Method for blood flow velocity estimation requires the conversion of the reflected ultrasonic signals into digital format so that a computer can be used to investigate the technique. A number of quantization errors affect the velocity estimate, namely that the A/D converter has a finite bit resolution, that only a limited number of samples of the waveform may be taken per wavelength, and that the correlation function exists only at discrete points. This paper discusses the effect of the errors associated with the A/D conversion process on the accuracy of the velocity estimate. A computer simulation was performed to achieve this goal. A band-passed white Gaussian noise segment was used to simulate the reflected ultrasonic signal. The jitters of the velocity estimates corresponding to various A/D resolutions were calculated and plotted. The effects of interpolating the correlation function, rather than determining the complete function, were also presented.
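The sketch below illustrates, under assumed parameters, the kind of simulation the abstract describes: a band-passed white Gaussian noise segment stands in for the reflected ultrasonic signal, a known sub-sample time shift is applied between two echoes, both are quantized at a chosen A/D bit resolution, and the shift is re-estimated from the discrete cross-correlation with parabolic interpolation of the peak. This is not the authors' original simulation; the centre frequency, sampling rate, pulse repetition interval, and velocity are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

rng = np.random.default_rng(0)
fs = 40e6            # sampling rate (Hz) -- assumed, 8 samples per 5 MHz cycle
f0 = 5e6             # ultrasound centre frequency (Hz) -- assumed
c_sound = 1540.0     # speed of sound in tissue (m/s)
T_prf = 1e-4         # pulse repetition interval (s) -- assumed
n = 2048             # samples per simulated echo segment


def bandpassed_noise(n):
    """White Gaussian noise band-pass filtered around the carrier."""
    b, a = butter(4, [0.6 * f0, 1.4 * f0], btype="bandpass", fs=fs)
    return filtfilt(b, a, rng.standard_normal(n))


def fractional_delay(x, delay_samples):
    """Delay a signal by a (possibly fractional) number of samples.
    FFT-based, hence circular; edge wrap is negligible for small delays."""
    freqs = np.fft.rfftfreq(len(x), d=1.0)
    spectrum = np.fft.rfft(x) * np.exp(-2j * np.pi * freqs * delay_samples)
    return np.fft.irfft(spectrum, n=len(x))


def quantize(x, bits):
    """Uniform quantizer with the given A/D bit resolution."""
    levels = 2 ** (bits - 1) - 1
    scale = np.max(np.abs(x))
    return np.round(x / scale * levels) / levels * scale


def estimate_shift(x, y):
    """Time shift (in samples) of y relative to x from the cross-correlation
    peak, refined by parabolic interpolation between discrete lags."""
    corr = np.correlate(y, x, mode="full")
    k = np.argmax(corr)
    denom = corr[k - 1] - 2 * corr[k] + corr[k + 1]
    frac = 0.5 * (corr[k - 1] - corr[k + 1]) / denom if denom != 0 else 0.0
    return (k - (len(x) - 1)) + frac


true_velocity = 0.5                                    # m/s, assumed
true_shift = 2 * true_velocity * T_prf / c_sound * fs  # samples between echoes

for bits in (4, 6, 8, 10, 12):
    estimates = []
    for _ in range(200):
        echo1 = bandpassed_noise(n)
        echo2 = fractional_delay(echo1, true_shift)
        shift = estimate_shift(quantize(echo1, bits), quantize(echo2, bits))
        # convert the time shift back to a velocity estimate (zero beam angle)
        estimates.append(shift / fs * c_sound / (2 * T_prf))
    jitter = np.std(estimates)
    print(f"{bits:2d}-bit A/D: velocity jitter = {jitter * 100:.3f} cm/s")
```

Printing the standard deviation of the velocity estimates for each bit resolution mirrors the jitter-versus-resolution curves discussed in the paper, and the parabolic fit around the correlation peak corresponds to interpolating the correlation function rather than evaluating it on a finer lag grid.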