We are experiencing some issues with the ADC (channel 5) of our Colibri board. Specifically, our application uses the ADC to read a voltage in the range of 0 V to 10 V. This voltage is fed to the ADC input (Analogue Input 3) through a resistor divider consisting of an 8.2 kΩ resistor and a 1.8 kΩ resistor to GND, so the ADC input sees 0 V to 1.8 V.
I ran a script to log the measured voltages. In the range of roughly 0–3 V the readings are accurate to approximately ±60 mV. As the input approaches 10 V, however, the deviation grows to almost 500 mV; the error increases with the input voltage.
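In essence, the script just polls the raw IIO value and scales it. A minimal sketch of what it does (the device index and channel, iio:device0 and in_voltage5 here, are what our setup enumerates and may differ):

```sh
#!/bin/sh
# Minimal logging loop; device index and channel number are
# specific to our setup.
DEV=/sys/bus/iio/devices/iio:device0
SCALE=$(cat "$DEV/in_voltage_scale")   # raw * scale = millivolts (IIO convention)

while true; do
    RAW=$(cat "$DEV/in_voltage5_raw")
    # Voltage at the ADC pin in mV; multiplying by the divider ratio
    # (8.2k + 1.8k) / 1.8k recovers the original 0-10 V signal.
    MV=$(awk "BEGIN { printf \"%.1f\", $RAW * $SCALE }")
    echo "$(date +%s) $MV"
    sleep 1
done
```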
We are aware that the source impedance is probably too high. For the moment, however, we would like to try to improve the accuracy purely through software. Unfortunately, several attempts to adjust the sampling frequency have not been successful. Increasing the number of cycles to give the sample-and-hold capacitor more time also did not help.
Among other things, we tried the following kind of adjustment in our device tree:
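(Illustrative fragment only; the node label and the original values must be taken from the module's actual device tree. For the mainline fsl,vf610-adc binding, the three entries are the maximum ADC clock for the normal, high-speed, and low-power conversion modes.)

```dts
/* Illustrative fragment - check the node label and the default
 * values against the module's device tree. We lowered the per-mode
 * ADC clock limits step by step, e.g.: */
&adc1 {
	fsl,adck-max-frequency = <4000000>, <4000000>, <4000000>;
	status = "okay";
};
```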
I might make mistakes or occasionally misunderstand you. Our support team will step in if needed.
I’d be glad to help! Could you let me know which OS and version you are using? To provide the most accurate guidance for your ADC accuracy issue, I’ll also need to know which specific Colibri module you’re working with.
You can get this information by running the tdx-info command on your device. For more details about this command, please see the tdx-info documentation page.
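For example, running it directly on the module prints the module type, carrier board, OS image, and kernel version in one report:

```sh
# Run on the device; the output can be pasted into this thread.
tdx-info
```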
Once I know your hardware platform and software version, I can provide specific recommendations for improving your ADC accuracy through software configuration.
Best regards, Toradex AI Assistant.
As noted in the module datasheet, “There are 1 kΩ series resistors placed in the ADC lines on the module to protect the SoC input.” Considering that, and given that the SoC datasheet specifies a maximum source impedance of 5 kΩ for the ADC input, the effective impedance in your case is too high. Increasing the sampling frequency may make the situation even worse, since the sampling capacitor may not have enough time to charge fully to the input voltage, which can lead to inaccurate ADC conversion results. You may want to consider using a buffer with an op-amp or a similar solution.
Increasing the sample rate makes the situation worse – that is also what we have observed. However, we want to do the opposite: reduce it to the point where the sample-and-hold capacitor can still charge completely and our application still runs without issues.
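For completeness, this is the kind of runtime adjustment we attempted (the attribute names are those exposed by the mainline vf610_adc IIO driver; the device index, and the driver shipped with our image, may differ):

```sh
DEV=/sys/bus/iio/devices/iio:device0

# Discrete sample rates the driver offers for the current setup
cat "$DEV/sampling_frequency_available"

# Select the lowest advertised rate (value is an example)
echo 1000 > "$DEV/in_voltage_sampling_frequency"

# The driver also exposes the hardware conversion mode; "low-power"
# runs the ADC clock at its lowest configured limit.
echo low-power > "$DEV/conversion_mode"
```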
As you can see in my opening post, we already tried several approaches to achieve this, but unfortunately without success.
At the moment, changing the hardware is not a solution we would like to pursue. We would prefer to try solving it with software first.
The NXP ADC Linux driver is not of the highest quality and contains several hard-coded parameters, such as averaging over 128 conversions and a fixed timeout that does not depend on the sampling rate. You may try using a 12 MHz clock rate, but I would rather recommend a hardware solution. If the measured voltage does not change quickly, you can add a 10 nF capacitor between the module’s ADC input pin and ground. This helps mitigate the high source impedance by providing a local charge reservoir for the ADC sampling capacitor.
If you prefer to address this in software first, the driver itself will likely need to be modified.
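As a rough pointer, this is the kind of change meant above, against drivers/iio/adc/vf610_adc.c (a sketch only; symbol names, defaults, and register details must be verified in the kernel sources of your BSP):

```c
/* 1) The conversion-complete timeout is a fixed value and does not
 *    scale with the ADC clock; with a very low clock it may need to
 *    be raised, e.g.: */
#define VF610_ADC_TIMEOUT	msecs_to_jiffies(500)	/* mainline uses 100 ms */

/* 2) The hardware averaging depth is configured once during driver
 *    initialization. Reducing it (or wiring it up to a module
 *    parameter) trades noise filtering for more settling time per
 *    result; the selector ends up in the AVGS bits of the ADC
 *    configuration register in the driver's init path. */
```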