What is an ADC?
An analog-to-digital converter (ADC) converts a continuous analog voltage into a digital value by sampling the signal at discrete times and quantizing its amplitude into a finite number of levels.
What are common ADC architectures and their trade-offs?
Common types include flash (very fast but expensive, usually lower resolution), SAR (successive-approximation; good balance of speed, power, and resolution), sigma-delta (high resolution via oversampling and noise shaping, slower), and integrating/dual-slope (robust to noise, slow; common in measurement instruments such as digital multimeters).
What is sampling, and why is the sampling rate important?
Sampling measures the analog signal at discrete times. The sampling rate must be at least twice the highest signal frequency (Nyquist) to avoid aliasing; use an anti-aliasing filter before the ADC.
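Undersampling folds high frequencies back into the band below fs/2. A minimal numeric sketch (with assumed example values: a 7 kHz tone sampled at 10 kHz, which folds down to 3 kHz) shows that the two tones produce identical samples:

```python
import math

fs = 10_000            # sampling rate, Hz (assumed example value)
f_high = 7_000         # tone above Nyquist (fs/2 = 5 kHz)
f_alias = fs - f_high  # folds down to 3 kHz

for n in range(8):
    t = n / fs
    s_high = math.sin(2 * math.pi * f_high * t)
    # The folded tone appears phase-inverted: sin(2*pi*7000*t) == -sin(2*pi*3000*t)
    s_alias = -math.sin(2 * math.pi * f_alias * t)
    assert abs(s_high - s_alias) < 1e-9

print("7 kHz sampled at 10 kHz is indistinguishable from a 3 kHz tone")
```

Because the sample values are identical, no post-processing can tell the tones apart; only an analog anti-aliasing filter before the converter prevents this.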
What is quantization, and how does it affect accuracy?
Quantization maps the continuous amplitude to the nearest available digital level, introducing quantization error (quantization noise). This limits accuracy; the ideal SNR for a full-scale sine input is approximately 6.02n + 1.76 dB for an n-bit ADC.
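The 6.02n + 1.76 dB figure can be checked numerically. A minimal sketch (assuming a mid-rise uniform quantizer and a full-scale sine, both illustrative choices):

```python
import math

def quantize(x, n_bits, full_scale=1.0):
    """Mid-rise uniform quantizer over [-full_scale, +full_scale]."""
    levels = 2 ** n_bits
    lsb = 2 * full_scale / levels
    code = min(levels - 1, max(0, int((x + full_scale) / lsb)))
    return (code + 0.5) * lsb - full_scale  # reconstruct at the level's midpoint

n_bits = 12
N = 10_000
sig = [math.sin(2 * math.pi * k / N) for k in range(N)]   # full-scale sine
err = [quantize(s, n_bits) - s for s in sig]

p_sig = sum(s * s for s in sig) / N
p_err = sum(e * e for e in err) / N
snr_db = 10 * math.log10(p_sig / p_err)

print(f"measured SNR = {snr_db:.1f} dB, ideal = {6.02 * n_bits + 1.76:.1f} dB")
```

For 12 bits the measured SNR lands close to the ideal 74.0 dB; reducing the input amplitude lowers the signal power while the quantization-noise power stays roughly fixed, so SNR drops accordingly.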
What does resolution mean in an ADC?
Resolution is the number of bits, n, in the output code. There are 2^n levels; one least-significant bit (LSB) equals the full-scale range divided by 2^n. Each added bit doubles the number of levels, halves the LSB, and adds about 6.02 dB of dynamic range.
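These relationships are easy to tabulate. A short sketch, assuming an illustrative 3.3 V full-scale range:

```python
full_scale = 3.3  # volts; assumed example reference, not from the original text

for n in (8, 10, 12, 16):
    levels = 2 ** n                      # number of quantization levels
    lsb = full_scale / levels            # volts per step
    dr_db = 6.02 * n                     # approximate dynamic range
    print(f"{n:2d}-bit: {levels:6d} levels, "
          f"LSB = {lsb * 1e6:9.1f} uV, dynamic range = {dr_db:5.1f} dB")
```

For example, at 12 bits a 3.3 V range gives an LSB of about 806 µV, so signals smaller than that step are lost in quantization.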