I am using the 8753 OPT011 in a test setup to measure the small-signal gain of power amplifiers in various wireless bands (0.8 to 3 GHz, with 10 to 60 MHz TX bands). I want to know the most useful signal dynamic range for the RX ports R, A, and B, and the associated accuracy at any input level in that range. I cannot find this information in any manual or application note. I have amplifiers with different gains, output power levels, and input return losses, both for testing and for the thru calibration. In addition, the setup can also switch to ACPR/CCDF measurements using ESGs and PSAs, so I have to mind the input levels into those detectors as well.
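For the ACPR/CCDF path, my level bookkeeping is roughly the sketch below; the peak-to-average ratio and the analyzer mixer limit in it are assumed numbers for illustration, not PSA specifications.

```python
# Rough headroom check for the ACPR/CCDF path into the PSA (Python sketch).
# The peak-to-average ratio and mixer-level limit are assumed numbers for
# illustration only, not PSA specifications.

def required_pad_db(avg_pout_dbm, par_db, max_mixer_dbm=-10.0):
    """Attenuation/coupling needed so the signal peaks stay at or below the limit."""
    peak_dbm = avg_pout_dbm + par_db      # envelope peaks of the modulated carrier
    return max(0.0, peak_dbm - max_mixer_dbm)

# e.g. 48 dBm average Pout and an assumed ~8 dB PAR for the multi-carrier signal
print(required_pad_db(48.0, 8.0))         # -> 66.0 dB of attenuation/coupling
```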
The various amplifier scenarios are (1.93 - 1.99 GHz):
1.0 Output Stage: G = 13 dB (small signal); Pout (large signal) = 48 dBm; Pout (small signal) = 33 dBm or 22 dBm; Input Return Loss (small signal) = -10 to -25 dB
2.0 Complete Amplifier: G = 60 dB (small signal); Pout (large signal) = 48 dBm; Pout (small signal) = 33 dBm or 22 dBm; Input Return Loss (small signal) = -10 to -25 dB
3.0 Driver Stage: G = 30 dB (small signal); Pout (large signal) = 36 dBm; Pout (small signal) = 22 dBm; Input Return Loss (small signal) = -10 to -25 dB
The large-signal power levels are used with modulated carriers (LTE, CDMA, WCDMA, with up to 60 MHz instantaneous bandwidth and mixed-mode signals). The small-signal measurements use the 8753 source. A thru response calibration is used mostly. Usually a wider band is swept (e.g., 1.8 to 2.2 GHz for a PCS-band amplifier) to check the out-of-band response (needed for DPD characterization).
As the different gains and levels show, each amplifier (including the driver amplifiers) needs a somewhat different setup, and the small-signal levels into the receiver can vary by 20 to 30 dB. What is the optimum RX level range, for best accuracy, at the R, A, and B inputs of the 8753 OPT011? Thanks.
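For reference, below is the kind of level bookkeeping I am trying to consolidate across the three scenarios. The receiver window limits in it are placeholders, since that window is exactly the number I am asking for; it just shows why the DUT and thru-cal levels are hard to keep in one range with a single output pad.

```python
# Level bookkeeping for the small-signal path into the B receiver (Python sketch).
# RX_MAX_DBM / RX_MIN_DBM are placeholders for the receiver window I am asking
# about, not 8753 OPT011 specifications.

RX_MAX_DBM = -10.0   # assumed upper limit for good receiver linearity (placeholder)
RX_MIN_DBM = -50.0   # assumed lower limit before trace noise dominates (placeholder)

scenarios = {
    "output stage":       13,   # small-signal gain in dB
    "complete amplifier": 60,
    "driver stage":       30,
}

def pad_needed_db(src_dbm, gain_db, target_dbm=RX_MAX_DBM):
    """Output attenuation needed to bring the DUT output down to the target level."""
    return max(0.0, src_dbm + gain_db - target_dbm)

src_dbm = -20.0  # example small-signal drive level from the 8753 source (assumption)
for name, gain_db in scenarios.items():
    pad = pad_needed_db(src_dbm, gain_db)
    dut_level = src_dbm + gain_db - pad   # level at B with the DUT in line
    thru_level = src_dbm - pad            # level at B during the thru cal, same pad
    note = "  <- thru cal below assumed floor" if thru_level < RX_MIN_DBM else ""
    print(f"{name:>18}: pad {pad:4.0f} dB, DUT {dut_level:5.1f} dBm, "
          f"thru {thru_level:5.1f} dBm{note}")
```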