A customer has a product that uses a 3.7V lithium battery to power a control circuit, which drives actuators such as motors according to certain logic.
The processor needs to monitor the battery voltage so that, once the battery discharges below a certain level, it can stop responding to user input and send out an alert signal.
Because of cost considerations, the customer chose a very low-cost, simple processor that operates over 3.0V-5.0V and whose ADC has no dedicated reference-voltage input; instead, it uses the power supply as its reference.
In previous, similar products the lithium battery fed the processor directly. If the battery voltage is then sensed through a fixed resistor divider, the ADC's reference voltage changes together with the battery voltage, so the ADC reading stays at a fixed value and never tracks the actual battery voltage.
The customer's idea was to add a 3.3V LDO: the lithium battery is regulated down to 3.3V by the LDO to power the processor, so that the processor's ADC gets a fixed reference voltage.
In my view, this method has several problems:
1) Dropout voltage. When the load current is above 100mA, the voltage drop between the input and output of the LDO reaches at least 300mV, meaning the input must reach 3.6V or more before the output can hold a constant 3.3V, and a Li-Ion battery still has quite a lot of capacity left at 3.6V.
2) Low-voltage behavior. As the lithium battery discharges and its voltage drops below the point where the LDO can regulate, the LDO's output simply follows its input. The ADC reference (VDD) and the sensed battery voltage then fall together, so the ADC reading stays at the same fixed value (e.g., with a 2:1 divider it stays near half of full scale), and the low battery voltage is misjudged as a normal, healthy voltage.
3) Cost. The LDO and its peripheral components add at least 0.5 RMB, which is a huge cost for small appliances that are extremely cost sensitive.
Based on this analysis, I propose another solution that uses a TL431 as the reference.
When the processor needs to measure the battery voltage, PA4 outputs a high level, equal to the supply voltage VDD, which powers the TL431 through R26 so that the TL431 holds the PA5 node at its 2.5V reference.
The ADC then samples PA5 to get the value AD. For a 12-bit ADC, the supply voltage is obtained from the following formula:
VDD = 2.5 * 4096 / AD.
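A minimal firmware sketch of this measurement is shown below; gpio_write() and adc_read() stand in for whatever HAL this low-cost processor actually provides, so the names, pins, and settling delay are assumptions, not a specific part's API:

```c
#include <stdint.h>

/* Hypothetical HAL functions; substitute the real ones for the target MCU. */
extern void     gpio_write(uint8_t pin, uint8_t level);  /* drive a GPIO high/low */
extern uint16_t adc_read(uint8_t channel);               /* 12-bit result, 0..4095 */

#define PIN_PA4        4      /* powers the TL431 through R26 */
#define ADC_CH_PA5     5      /* senses the TL431 cathode */
#define VREF_MV        2500   /* TL431 reference voltage, in millivolts */
#define ADC_FULL_SCALE 4096   /* 12-bit ADC */

/* Measure VDD in millivolts using the TL431's 2.5V as the known input.
 * The ADC reference is VDD itself, so AD = 2.5V / VDD * 4096 and therefore
 * VDD = 2.5V * 4096 / AD. Integer math avoids floating point on a tiny core. */
uint16_t measure_vdd_mv(void)
{
    uint16_t ad;

    gpio_write(PIN_PA4, 1);           /* enable the TL431 reference */
    /* a short settling delay may be needed here before sampling */
    ad = adc_read(ADC_CH_PA5);
    gpio_write(PIN_PA4, 0);           /* power the reference back down */

    if (ad == 0)                      /* guard against divide-by-zero */
        return 0;
    return (uint16_t)(((uint32_t)VREF_MV * ADC_FULL_SCALE) / ad);
}
```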
Because the TL431 can regulate with a cathode current as low as about 0.5mA, R26 must be chosen so that the circuit still works properly down to VDD = 3.0V. The constraint is:
(VDD - 2.5) / R26 >= 0.5mA => R26 <= (3.0 - 2.5) / 0.5mA = 1K,
so R26 is selected as 1K.
When the battery voltage drops below the point where the TL431 can still regulate, the TL431 stops sinking current, and the PA5 pin is pulled up through R26 to PA4's output level, i.e. to VDD.
The AD value detected on PA5 is then full scale: VDD/VDD * 4096 = 4096 (a real 12-bit code saturates at 4095, which does not change the conclusion).
The program therefore calculates the supply voltage as VDD = 4096 * 2.5 / 4096 = 2.5V, which is below any sensible low-battery threshold and completely different from the result obtained with the LDO scheme.
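Building on the sketch above, the decision logic could look like the following; LOW_BATT_MV and the two application hooks are assumed placeholders, and the point is that the TL431's failure mode degrades into a reading of about 2.5V, which still lands on the low-battery branch:

```c
/* Assumed cutoff; the real value depends on the product's requirements. */
#define LOW_BATT_MV 3000

/* Hypothetical application hooks. */
extern void stop_responding_to_inputs(void);
extern void send_low_battery_alert(void);

void battery_check(void)
{
    uint16_t vdd_mv = measure_vdd_mv();

    /* Fail-safe: once the TL431 stops regulating, the ADC saturates and
     * vdd_mv computes to about 2500, below any sensible threshold, so the
     * low-battery path is still taken. */
    if (vdd_mv < LOW_BATT_MV) {
        stop_responding_to_inputs();
        send_low_battery_alert();
    }
}
```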
From the point of view of DFMEA analysis, the LDO scheme misjudges the battery voltage as normal at low voltage and keeps performing normal actions, which may lead to serious consequences; this is unacceptable.
Designing a product should not focus only on normal function. It also requires an in-depth DFMEA analysis to ensure that abnormal modes cannot cause the product to fail in a serious way.