KAD5510P
FN7693.2
May 2, 2011
User-Initiated Reset
Recalibration of the ADC can be initiated at any time by driving the
RESETN pin low for a minimum of one clock cycle. An open-drain driver
with less than 0.5mA open-state leakage is recommended so the
internal high impedance pull-up to OVDD can assure exit from the reset
state. As is the case during power-on reset, the SDO, RESETN and DNC
pins must be in the proper state for the calibration to successfully
execute.
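The one-clock-cycle minimum low time scales with the sample clock. As a minimal sketch (the function name and the 500MSPS example rate are illustrative, not from the datasheet), the required RESETN low time can be computed from the clock rate:

```python
# Hedged sketch: minimum RESETN low time for a user-initiated reset.
# The text requires RESETN low for at least one clock cycle; this helper
# converts the sample clock rate into that minimum pulse width.

def min_resetn_low_s(clock_hz: float, cycles: int = 1) -> float:
    """Return the minimum RESETN low time in seconds for `cycles` clock periods."""
    if clock_hz <= 0:
        raise ValueError("clock rate must be positive")
    return cycles / clock_hz

# At a 500MSPS clock, one cycle is 2ns:
print(min_resetn_low_s(500e6))  # 2e-09
```

In practice the pulse is usually made much longer than one cycle; the calculation only sets the lower bound.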
The performance of the KAD5510P changes with variations in
temperature, supply voltage or sample rate. The extent of these
changes may necessitate recalibration, depending on system
performance requirements. Best performance will be achieved
by recalibrating the ADC under the environmental conditions at
which it will operate.
A supply voltage variation of less than 100mV will generally
result in an SNR change of less than 0.5dBFS and an SFDR change
of less than 3dBc.
In situations where the sample rate is not constant, best results will be
obtained if the device is calibrated at the highest sample rate. Reducing
the sample rate by less than 75MSPS will typically result in an SNR
change of less than 0.5dBFS and an SFDR change of less than 3dBc.
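The two guidelines above can be restated as a simple decision rule. This is a hedged sketch that only encodes the 100mV and 75MSPS figures from the text; the function name and thresholds-as-code framing are illustrative, not a device API:

```python
# Hedged sketch of the recalibration guidance: supply excursions under
# 100 mV and sample-rate reductions under 75 MSPS typically keep SNR
# within 0.5 dBFS and SFDR within 3 dBc of the calibrated values, so
# larger excursions suggest a user-initiated recalibration.

def recalibration_advised(delta_supply_mv: float,
                          sample_rate_drop_msps: float) -> bool:
    """True when either excursion exceeds the text's 'small change' bounds."""
    return abs(delta_supply_mv) >= 100 or sample_rate_drop_msps >= 75

print(recalibration_advised(50, 20))   # False: both within bounds
print(recalibration_advised(120, 0))   # True: supply moved too far
```

System requirements tighter than 0.5dBFS/3dBc would of course call for proportionally tighter thresholds.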
Figures
24 and
25 show the effect of temperature on SNR and
SFDR performance with calibration performed at -40°C, +25°C,
and +85°C. Each plot shows the variation of SNR/SFDR across
temperature after a single calibration at -40°C, +25°C and
+85°C. Best performance is typically achieved by a user-initiated
calibration at the operating conditions, as stated earlier.
However, it can be seen that performance drift with temperature
is not a very strong function of the temperature at which the
calibration is performed. Full rated performance will be achieved
after power-up calibration regardless of the operating conditions.
Analog Input
The ADC core contains a fully differential input (VINP/VINN) to
the sample and hold amplifier (SHA). The ideal full-scale input
voltage is 1.45V, centered at the VCM voltage of 0.535V as shown in Figure 26.
Best performance is obtained when the analog inputs are driven
differentially. The common-mode output voltage, VCM, should be
used to properly bias the inputs as shown in Figures
27 through
29. An RF transformer will give the best noise and distortion
performance for wideband and/or high intermediate frequency
(IF) inputs. Two different transformer input schemes are shown in the figures that follow.
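The input-range numbers above imply the per-pin swing. As a minimal sketch, assuming the 1.45V ideal full scale refers to the peak-to-peak differential swing (VINP − VINN), each pin swings half of that about VCM (the variable names are illustrative):

```python
# Hedged sketch: per-pin swing implied by the stated input range,
# assuming 1.45V is the full-scale differential swing, so each pin
# carries half of it (0.725 V p-p) centered on VCM = 0.535 V.

V_FS_DIFF_PP = 1.45   # ideal full-scale differential input, V p-p
V_CM = 0.535          # common-mode voltage, V

pin_swing_pp = V_FS_DIFF_PP / 2      # per-pin peak-to-peak swing
pin_min = V_CM - pin_swing_pp / 2    # lowest per-pin voltage
pin_max = V_CM + pin_swing_pp / 2    # highest per-pin voltage

print(round(pin_swing_pp, 4), round(pin_min, 4), round(pin_max, 4))
# 0.725 0.1725 0.8975
```

The 0.725V per-pin figure matches the value annotated in Figure 26.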
FIGURE 23. CALIBRATION TIMING
[Timing diagram: CLKP/CLKN, CLKOUTP, RESETN and ORP waveforms; calibration begins when RESETN is released and completes after the calibration time.]
FIGURE 24. SNR PERFORMANCE vs TEMPERATURE
[Plot: SNR change (dBFS), -4 to +3, vs temperature (°C), -40 to +85; curves for calibration done at -40°C, +25°C and +85°C.]
FIGURE 25. SFDR PERFORMANCE vs TEMPERATURE
[Plot: SFDR change (dBc), -15 to +15, vs temperature (°C), -40 to +85; curves for calibration done at -40°C, +25°C and +85°C.]
FIGURE 26. ANALOG INPUT RANGE
[Diagram: INP and INN waveforms, each 0.725V peak-to-peak, centered on VCM = 0.535V.]