Hello,
So I am trying to stream the data from one ADC running at 1 GS/s over the PCIe bus to the host PC using the example firmware (CID672, pc821_fmc120_fmc120_axi_ku115).
Theoretically this should not be a problem: 1 GS/s at 2 bytes per sample is ~2 GB/s, and the PCIe bandwidth for Gen 2.0 x8 is ~4 GB/s. The throughput of the PCIe interface when running CID621 and AxiMemTest is about 75 MB/s shy of 4 GB/s.
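For reference, this is the arithmetic I am working from (theoretical link rate only, before protocol overhead):

```c
#include <stdio.h>

/* Sanity check of required vs. available throughput.  The PCIe figure is
 * the theoretical payload rate for Gen 2.0 (5 GT/s per lane with 8b/10b
 * encoding, i.e. 500 MB/s per lane) before any protocol overhead. */
int main(void)
{
    double sample_rate  = 1.0e9;                        /* 1 GS/s         */
    double sample_bytes = 2.0;                          /* 16-bit samples */
    double adc_stream   = sample_rate * sample_bytes;   /* ~2 GB/s        */

    double gen2_x8      = 8.0 * 500.0e6;                /* ~4 GB/s        */

    printf("ADC stream   : %.2f GB/s\n", adc_stream / 1e9);
    printf("PCIe Gen2 x8 : %.2f GB/s\n", gen2_x8    / 1e9);
    return 0;
}
```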
With that goal in mind, the changes I made to the CID672 example code are fairly minor. I don't loop through the ADC capture blocks, and I enlarge the dma_buffer. Next, I set the capture numberBurst to 0 (I have never gotten anything other than 0 or 1 to work) and set burstSize to 0. My understanding is that setting burstSize to zero forces the capture block to continually stream data to whatever is listening.
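Roughly, the shape of the change looks like the sketch below. The function names (configure_capture, start_capture, dma_read) and the buffer size are placeholders standing in for the corresponding calls and values in the CID672 example host code, not the real API; the point is just the numberBurst/burstSize values I am writing.

```c
#include <stddef.h>

/* Sketch of the capture setup I am attempting.  The three functions
 * declared here are placeholders for whatever the CID672 example host
 * code actually calls; they are NOT the real API, just a way to make
 * the values I am writing explicit. */
void configure_capture(int channel, unsigned numberBurst, unsigned burstSize);
void start_capture(int channel);
void dma_read(void *dst, size_t nbytes);

#define DMA_BUFFER_BYTES (64u * 1024u * 1024u)   /* enlarged from the example */
static short dma_buffer[DMA_BUFFER_BYTES / sizeof(short)];

void stream_one_adc(void)
{
    /* Single ADC only, so no loop over the capture blocks. */
    configure_capture(/* channel     */ 0,
                      /* numberBurst */ 0,   /* only 0 and 1 have worked for me */
                      /* burstSize   */ 0);  /* 0 = stream continuously, as I
                                                read the documentation          */
    start_capture(0);
    dma_read(dma_buffer, DMA_BUFFER_BYTES);  /* pull the raw samples to host */
}
```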
The problem I am seeing is a clear discontinuity every 2**18 samples (exactly half a megabyte), where it has dropped a number of samples on the order of 2**18 (I am pretty sure it is dropping exactly half a megabyte). The signal source was an Agilent signal generator, so essentially known good. My guess is that I am not doing something quite right with the capture block numberBurst/burstSize/triggering setup. I am looking for some insight into this problem; it would be very useful to be able to capture tens of megabytes of raw ADC data without any gaps.
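For what it is worth, this is roughly how I am localizing the drops in the captured sine: with a known single tone, the largest sample-to-sample step is bounded by the tone's slew rate, so anything larger gets flagged. It is a coarse heuristic (a drop that happens to land near the same phase can slip through), and the function below is my own test code, not part of the example design:

```c
#include <math.h>
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Flag samples where the step from the previous sample exceeds the maximum
 * slew of the known input tone.  Heuristic only: a drop landing near the
 * same phase of the sine will not be caught. */
void flag_glitches(const int16_t *buf, size_t nsamples,
                   double f_tone, double f_sample, double amplitude)
{
    const double PI = 3.14159265358979323846;
    /* Max |x[k] - x[k-1]| for a sine is A * 2*pi*f_tone/f_sample; pad 20 %. */
    double max_step = 1.2 * amplitude * 2.0 * PI * f_tone / f_sample;

    for (size_t k = 1; k < nsamples; k++) {
        double step = (double)buf[k] - (double)buf[k - 1];
        if (fabs(step) > max_step)
            printf("possible drop near sample %zu (step %.0f)\n", k, step);
    }
}
```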
John Gard
NIST - Boulder
1 Comment
Johnathon Gard said over 6 years ago
I would like to add an addendum: there is a PCIe difference between CID672 and CID621; CID621 is actually running Gen 3.0 PCIe. CID672 would need some work to meet timing at a full internal 250 MHz.
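For comparison, the rough numbers behind that statement (theoretical link rates; the 256-bit / 250 MHz figure is my assumption about the user-side AXI-Stream width of the Gen3 x8 core, not something I have confirmed for CID621):

```c
#include <stdio.h>

/* Rough numbers behind the addendum.  Link rates are theoretical payload
 * rates per lane (Gen2: 5 GT/s, 8b/10b -> 500 MB/s; Gen3: 8 GT/s,
 * 128b/130b -> ~985 MB/s).  The 256-bit / 250 MHz line is an assumed
 * user-side AXI-Stream width for the Gen3 x8 core. */
int main(void)
{
    double gen2_x8 = 8.0 * 500.0e6;                        /* ~4.0 GB/s */
    double gen3_x8 = 8.0 * 8.0e9 * (128.0 / 130.0) / 8.0;  /* ~7.9 GB/s */
    double axi_256 = (256.0 / 8.0) * 250.0e6;              /*  8.0 GB/s */

    printf("PCIe Gen2 x8          : %.2f GB/s\n", gen2_x8 / 1e9);
    printf("PCIe Gen3 x8          : %.2f GB/s\n", gen3_x8 / 1e9);
    printf("256-bit AXI @ 250 MHz : %.2f GB/s\n", axi_256 / 1e9);
    return 0;
}
```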