I interfaced an FPGA (Kintex-7, LVDS_25, Vadj = 1.8 V) with an external board to provide the two inputs (analog voltage and reference voltage) to an LVDS input pair. I set the signal generators' frequency to 1 µHz and their amplitude to the lowest possible value, 10 mV. When both inputs are at 10 mV, the comparator output is zero. I kept the analog voltage at 10 mV and increased the reference voltage until the comparator turned on (at 51 mV). In the next step, the reference voltage was kept the same and the analog voltage was increased until the comparator turned on again. This process was repeated up to the maximum voltage levels. Please find the attached file for the values noted down. I'm unable to relate this to theory: the analog voltage is always less than the reference voltage, so why does the comparator keep switching? In the beginning the difference is 41 mV, but later it grows to 100 mV, 250 mV... why? It would be helpful if someone could explain the input/output behavior of an LVDS pair used as a comparator, considering the values noted in the file.
There are actually two methods of doing this:
A self-running oscillator that adds noise onto the signal, causing statistical toggling. The toggling is (over-)sampled, reduced down to the corresponding noise generator's frequency, and then both streams are compared by decimation. The residue left after subtracting the noise from the captured input is the signal.
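To illustrate the idea behind the first method, here is a minimal Python sketch (not the actual FPGA design): a 1-bit comparator whose input is dithered with known noise is heavily oversampled, and the duty cycle of the bitstream is decimated back into a voltage estimate. The function name, the uniform-dither model, and all parameter values are assumptions for illustration only.

```python
import random

def dithered_comparator_adc(vin, vref=0.0, dither_pp=1.0, n_samples=100_000, seed=42):
    """Estimate vin by oversampling a 1-bit comparator whose input is
    dithered with uniform noise of peak-to-peak amplitude dither_pp.
    The mean of the comparator output encodes how far vin sits above vref."""
    rng = random.Random(seed)
    ones = 0
    for _ in range(n_samples):
        noise = rng.uniform(-dither_pp / 2, dither_pp / 2)
        if vin + noise > vref:      # the LVDS pair acts as this comparator
            ones += 1
    duty = ones / n_samples          # fraction of '1' samples
    # With uniform dither: duty = 0.5 + (vin - vref) / dither_pp (within range),
    # so inverting that relation "subtracts the noise" and leaves the signal.
    return vref + (duty - 0.5) * dither_pp

print(dithered_comparator_adc(0.123))  # close to 0.123
```

Without the dither, a DC input below vref would never toggle the comparator at all; the noise is what makes the duty cycle carry amplitude information.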
A PDM integrator, which uses the incoming comparator output as feedback and decimates the resulting bitstream. The integrator and the comparator can be realized entirely in the FPGA, but using an LVDS input requires calibration and/or signal processing to get rid of the bias.
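The second method is essentially a first-order sigma-delta modulator. A minimal Python model of that loop (my own sketch, with assumed names and a trivial averaging decimator rather than a proper CIC filter):

```python
def sigma_delta_pdm(samples, vref=1.0):
    """First-order sigma-delta modulator: the comparator decision is fed
    back (as +/-vref) into the integrator, so the 1-bit stream's average
    is forced to track the input voltage."""
    integ = 0.0
    bits = []
    for x in samples:
        bit = 1 if integ > 0 else 0   # the comparator (LVDS pair in hardware)
        fb = vref if bit else -vref   # 1-bit feedback DAC
        integ += x - fb               # integrate the error
        bits.append(bit)
    return bits

def decimate_mean(bits, vref=1.0):
    """Simplest possible decimation: average the bitstream and map the
    duty cycle back to a voltage in the range [-vref, +vref]."""
    duty = sum(bits) / len(bits)
    return (2 * duty - 1) * vref

stream = sigma_delta_pdm([0.3] * 4096)
print(decimate_mean(stream))  # close to 0.3
```

In the FPGA version the integrator is typically an external RC network driven by an output pin, and any offset in the LVDS receiver shows up as a constant error in the decimated value, which is why the calibration mentioned above is needed.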
I once combined both techniques: a PDM-based "noise" is applied onto the incoming signal to lift it accordingly, driving the input and removing the bias dynamically.
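One way to picture that dynamic bias removal is a servo loop: with the input tied to the reference, inject a correction voltage (which in hardware could come from a PDM output) and adjust it until the comparator toggles 50/50; the correction then equals the negative of the receiver's offset. This is my own reconstruction, not necessarily the poster's circuit; all names and values are hypothetical.

```python
import random

def make_comparator(offset, noise_pp=0.2, seed=1):
    """Model an LVDS receiver as a comparator with a fixed input offset
    plus uniform noise that makes its decisions near threshold stochastic."""
    rng = random.Random(seed)
    def comp(vin, vref):
        n = rng.uniform(-noise_pp / 2, noise_pp / 2)
        return 1 if vin + offset + n > vref else 0
    return comp

def servo_bias(comp, vref=0.0, iters=20_000, gain=0.001):
    """Drive an injected correction so the comparator's duty cycle settles
    at 50% with the input tied to vref; the correction then cancels the
    offset and can keep tracking it as it drifts."""
    corr = 0.0
    for _ in range(iters):
        bit = comp(vref + corr, vref)
        corr -= gain * (bit - 0.5)   # integrate the duty-cycle error
    return corr

comp = make_comparator(offset=0.05)
print(servo_bias(comp))  # settles near -0.05
```

Because the loop runs continuously, it removes the bias dynamically instead of requiring a one-time calibration constant.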