During offset calibration we see unexpected spikes in the sampled data.
With the analog inputs left open we would expect all samples to be around the mid-code (511), but instead we sometimes see values from 0 to 400 and from 550 to 650. Normally these outliers occur on only one or two channels while the others are completely normal.
This issue occurs on all four of our FMC 126 boards.
To rule out an error on the digital interface we let the ramp test run for several minutes without a single error.
Additionally, this error sometimes disappears after writing new offset values.
Is this a known and expected behaviour of the ADC, or is there something wrong?
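For reference, this is roughly how we flag the spikes in a capture (a minimal Python sketch of our post-processing; the function name and the ±10-code tolerance are our own illustration, not part of the board tooling):

```python
def find_spikes(samples, mid=511, tol=10):
    """Flag samples far from the expected mid-code.

    With open analog inputs every sample should idle near `mid`;
    `tol` is an assumed acceptance band in ADC codes.
    """
    return [(i, s) for i, s in enumerate(samples) if abs(s - mid) > tol]

print(find_spikes([510, 512, 640, 505, 132]))  # → [(2, 640), (4, 132)]
```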
I am not sure I understand. Are you seeing this issue with the calibration package we sell? Or are you trying to implement it yourself and getting unexpected results?
One thing that is worth trying in any case: clean the FMC connectors with some alcohol and a toothbrush; any dirt in there will cause trouble at such high speeds.
Best Regards, Arnaud
over 2 years ago
We did this with our own implementation, which is based on the VC707 example design.
We tried this with the cables not connected, with the cables connected but the signal generator turned off, and with the cables connected and a DC signal applied. The results were the same every time.
After some more testing we made the following observations:
1. Lowering the sampling rate results in fewer of these errors.
2. Applying no offset at all normally results in no errors. This depends on whether the signal is already inside the mid-code range (errors) or not (no errors).
3. If, for example, the floating signal gives a mean value of 500, there are no errors. After writing the appropriate offset to shift the signal to 511.5, the errors appear. Writing the same offset again normally increases the error rate, which is really strange.
As I said before, the LVDS interface from the ADC to the FPGA was tested with the ADC's ramp test, and there were no errors in a 30-minute window.
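For clarity, the offset step from observation 3 can be sketched as follows (a simplified illustration; the sign convention and the actual register write on the FMC 126 are board-specific and not shown here):

```python
def offset_correction(samples, target=511.5):
    """Correction in ADC codes needed to move the channel mean onto `target`.

    511.5 is the mid-code of the 10-bit range (0..1023); the returned
    value would then be written to the offset register (scaling assumed 1:1).
    """
    mean = sum(samples) / len(samples)
    return round(target - mean)

# e.g. a floating input with a mean of 500 needs a correction of about +12 codes
```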
over 2 years ago
I added the histogram of channel A before and after setting the offset value.
As can be seen in the picture histogramBeforeOffset.jpg, the mean is somewhere near 502 and there are no values below 496 or above 506.
After setting the appropriate offset value we get the histogram shown in histogramAfterOffset.jpg: the mean is at around 511.5, but we also see some values that are far off: 640, 641, 132, 471, 128, 133, 255.
Do you have any explanation for this behaviour?
over 2 years ago
What you describe is almost certainly a digital-domain error. A ramp test is not the ultimate test; a noisy input around 0 is much more stressful than a ramp, so you should not conclude from a passing ramp test that everything is right.
In our one-channel @ 5 GSps firmware we use different buffering and a different PHY interface. We have a calibration package (firmware and software) that works out of the box on both the ML605 and the VC707; the cost is 2400 euros, so maybe this is something to consider on your side.
To get back to your question: no, we have not seen that, but we made really sure the digital domain is fine by means of a bit-alignment machine (which does not use the ADC ramp test; I think it uses the pseudo-random mode). Only then do we work on the analog compensation.
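As a quick sanity check on the digital-domain hypothesis, you could look at how many bit flips separate each outlier code from the nominal idle codes; a capture or alignment fault often shows up as a small number of flipped bits. A rough Python sketch (my own illustration, not part of our package; open inputs idle across the 511/512 boundary, so both are taken as nominal):

```python
def min_bit_flips(code, nominals=(511, 512)):
    """Smallest number of bit flips turning either nominal idle code into `code`."""
    return min(bin(code ^ n).count("1") for n in nominals)

# Outlier codes reported earlier in this thread
for code in (640, 641, 132, 471, 128, 133, 255):
    print(code, min_bit_flips(code))
```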
I am attaching the user manual of this calibration package in case you are interested in avoiding some headache.