Good morning,
I have some doubts about the ADC behaviour, because we always read an offset (about 10%, not fixed in absolute value) when reading an external sensor.
We configured ADC1 in our DT to use gpio1.00 and gpio1.01.
The HW circuit consists of a 150 ohm resistor connected to the GPIO pin to measure the voltage.
Then, if we simulate an external sensor generating 4 mA / 8 mA / 12 mA / 16 mA / 20 mA, we read the values shown in the attached figure.
But we are expecting values like:
current - raw value (current * 150 ohm)
4 mA - 600
8 mA - 1200
12 mA - 1800
16 mA - 2400
20 mA - 3000
We measured the resistor and it is definitely 150 ohm.
Do we need to calibrate the ADC, or change some parameters?
Another unresolved question: what is the ‘in_voltage_scale’ value stored in the file shown in the figure?
We understood it is related to reg_module_3v3_avdd declared in the DT, but it does not seem to be used. We tried changing reg_module_3v3_avdd to <4096000> to obtain in_voltage_scale = 1, but the ADC raw values remain the same.
So, what exactly does the in_voltage_scale value do?
It seems the ADC you are using is the i.MX6ULL SoC one, right? If yes, where did you get this num-channels property from?
Also, please share the HW schematics that you are using to generate the voltage. The ground connection is really important: which ground are you using?
In our schematic, AI1_4-20mA is directly connected to the external connector, while AI1_0.6-3V is connected to pin SODIMM 8 of the Colibri SoM. GND is common to the entire board. In between there is a component protecting the uC, and you can see the 150 ohm resistor.
Do you have a separate supply for your ADC going into pins 10/12 of the Colibri? You’ll need something filtered going in there (not the standard 3.3 V rail used by the rest of the board).
I’ve also never seen a design without some sort of opamp between your output voltage and the Colibri ADC input.
You could also back up and just feed a direct voltage into the pin without the external components and see what you get.
Those are the expected millivolts. in_voltage_raw holds raw ADC counts, so you should expect
raw = (mA * 150 ohm) / Vref[mV] * 4096 (2^12, for the 12-bit ADC)
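To make that concrete, here is a minimal C sketch (not from the original posts) that computes the expected in_voltage_raw counts for each test current; the 3300 mV reference is an assumption, so adjust vref_mv to whatever your reg_module_3v3_avdd actually supplies.

    /* Expected raw ADC counts for a 4-20 mA loop into a 150 ohm shunt,
     * assuming a 12-bit ADC and Vref = 3300 mV. */
    #include <stdio.h>

    int main(void)
    {
        const double shunt_ohm  = 150.0;   /* measured shunt resistor */
        const double vref_mv    = 3300.0;  /* assumed reference, check your board */
        const int    full_scale = 4096;    /* 2^12 counts */
        const double currents_ma[] = { 4, 8, 12, 16, 20 };

        for (int i = 0; i < 5; i++) {
            double v_mv = currents_ma[i] * shunt_ohm;   /* voltage across the shunt in mV */
            double raw  = v_mv / vref_mv * full_scale;  /* expected in_voltage_raw */
            printf("%4.0f mA -> %6.1f mV -> raw ~ %4.0f\n", currents_ma[i], v_mv, raw);
        }
        return 0;
    }

With these assumptions 16 mA gives roughly 2979 counts rather than 2400, which is why the values in your table should be read as millivolts, not raw counts.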
And then you get less than expected. One reason is how you set up the IOMUXC: 0x3000 enables the 100 kOhm pull-down, which together with your filter resistor gives a 100/(10+100) voltage divider. So for 16 mA you should read 2978 without the divider and 2708 taking the divider into account. Pull resistance values aren’t very precise, so 2708 vs 2650 is pretty close.
Try disabling the pull device by using 0x0 instead of 0x3000 in the pad settings.
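For reference, a small C sketch of that divider math (the 100 kOhm pull-down and the 10 kOhm series filter resistor are the approximate values assumed in the calculation above; check your actual pad configuration and schematic):

    /* Loading effect of the internal pull-down on the ADC input, assuming
     * ~100 kOhm pull-down (enabled by the 0x3000 pad setting) and a ~10 kOhm
     * external filter resistor in series. */
    #include <stdio.h>

    int main(void)
    {
        const double vref_mv       = 3300.0;
        const int    full_scale    = 4096;
        const double r_filter_kohm = 10.0;   /* series filter resistor, assumed */
        const double r_pull_kohm   = 100.0;  /* internal pull-down, approximate */

        double v_mv        = 16.0 * 150.0;                        /* 16 mA through 150 ohm */
        double raw_ideal   = v_mv / vref_mv * full_scale;         /* pull device disabled */
        double divider     = r_pull_kohm / (r_filter_kohm + r_pull_kohm);
        double raw_divided = raw_ideal * divider;                 /* pull-down enabled */

        printf("ideal: ~%.0f counts, with divider: ~%.0f counts\n", raw_ideal, raw_divided);
        return 0;
    }

This reproduces the ~2978 vs ~2708 figures quoted above for the 16 mA case.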
It is the number you multiply the raw value by to get the voltage in mV.
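In other words, millivolts = in_voltage_raw * in_voltage_scale, and the scale itself is typically Vref[mV] / 4096 for this 12-bit ADC. A minimal userspace sketch of that conversion (the iio:device0 path and the in_voltage0_raw channel name are assumptions; check your own sysfs layout):

    /* Hypothetical usage sketch: read the raw count and the scale from sysfs
     * and convert to millivolts. Device index and channel number may differ
     * on your module; look under /sys/bus/iio/devices/. */
    #include <stdio.h>

    static double read_double(const char *path)
    {
        double v = 0.0;
        FILE *f = fopen(path, "r");
        if (f) {
            if (fscanf(f, "%lf", &v) != 1)
                v = 0.0;
            fclose(f);
        }
        return v;
    }

    int main(void)
    {
        double raw   = read_double("/sys/bus/iio/devices/iio:device0/in_voltage0_raw");
        double scale = read_double("/sys/bus/iio/devices/iio:device0/in_voltage_scale");

        /* With a 3.3 V reference, scale is about 3300 / 4096 = 0.8056 mV per count. */
        printf("raw = %.0f, scale = %.6f, voltage = %.1f mV\n", raw, scale, raw * scale);
        return 0;
    }

That also explains your earlier experiment: setting the AVDD regulator to 4096 mV would indeed make the scale 1, but it only changes the conversion factor, not the raw counts themselves.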