#11
> The battery measured 12.7V both with and without the charger connected.
> So the charger (putting out 13.7V and 500mA) doesn't have enough juice,
> er current, to change this. So right now 500mA is being converted to heat.

I think there may be another interpretation possible.

What does your charger design look like? When it's in a "no load" situation, and when you're measuring 13.7 volts, is there actually enough load on the regulator IC's output to ensure proper regulation? If I recall correctly, an LM317 requires a minimum of 10-20 mA of load on its output to regulate correctly - without this, the output voltage creeps up above what you'd expect. It's possible that under light-to-moderate load (say, 100 mA) your regulator's output voltage is dropping well below 13.7 and might need to be adjusted.

If you haven't already done this, try the following: stick a reasonable resistive load on the charger (maybe 30 ohms, 5 watts) so that you're actually drawing an appreciable fraction of the charger's normal output, and then readjust to 13.7. Also, use an ammeter to make sure that the regulator is actually working correctly and is truly delivering the amount of current you expect.

Oh... did you heatsink the regulator? The regulator might be limiting the current flow (by dropping the output voltage) in order to protect itself.

I don't think that 500 mA is being converted to heat. I think it's actively charging the battery, which is probably at least somewhat "run down". The time you'd see the power being dissipated as heat would be when the charger's output had risen to 13.7 and the battery was truly being "floated".

I suspect that you've looked at the situation shortly after connecting the charger to the battery, while the charger was actively charging the battery to overcome the previous discharge. If you were to leave the charger connected for a few hours or days, I believe you'd see that the battery terminal voltage had risen to 13.7 volts, and that the charger was delivering rather less than its maximum current. This would be the "battery is fully charged, and is now being floated" state.

As an example: I have a 45-amp-hour glassmat battery, hooked to a well-regulated charger (13.5 volts) which is powered from a 450 mA solar panel. If I hook up the battery after a period of moderate use, what I see is:

- Before hookup, the battery voltage is somewhere down around 12.3 volts.
- Upon hookup, the charger begins drawing maximum current from the solar panel. The battery voltage jumps up to around 12.6 volts. The charger turns on its "I am not limiting the voltage, as the load is drawing more than my input can supply" light. [If I use a 3-amp bench supply in place of the solar panel, the battery draws the full 3 amps, at least briefly.]
- Gradually, over a period of an hour or more, the battery voltage rises and the current drawn from the panel slowly decreases.
- After a few hours, the battery voltage reaches 13.5. The charger switches into "voltage regulation" mode.
- The current continues to drop off, flattening out to a few tens of mA after a while and remaining there.

I believe that if you monitor your charger and battery for a period of time, you will see a very similar pattern of behavior.

> This begs the question: what then is the point in regulating the charge
> voltage to 13.3V (or 14.2V at freezing temperatures)? Wouldn't a charger
> regulated at, say, 12.9V do just as well at keeping a full charge?
If I recall correctly: the reason for using a slightly higher voltage has to do with the way that electrical charge is distributed in a battery. My recollection is that the charge consists (conceptually) of two parts: a fairly evenly-distributed charge in the plates, and a "surface charge" on the surfaces of the plates and crystals which is present during charging.

The distributed charge is what gives you the 12.7 volts - it's the "steady state" charge within the battery. When you start driving more current into the battery, the "surface charge" appears (on the surfaces of the lead sulphate plates and crystals) as the electrochemical reactions begin to occur. If you stop driving current in, the surface charge decays away over a period of a few minutes or hours (or quite rapidly if you start drawing current from the battery) and the battery terminal voltage drops back to 12.7, or whatever its steady-state voltage is.

The surface charge creates an additional voltage which the charger must overcome in order to force current into the battery. If you try to use a 12.9-volt charging circuit, you won't get very much additional power pushed into the battery before the surface charge rises to 0.2 volts, the battery terminal voltage rises to 12.9 volts, and the battery stops charging. If the battery had been somewhat depleted (say, down to 12.3 volts), the surface charge will still jump up fairly quickly and cut down the charging rate, and it will take a long time to "top up" the battery to full charge.

The 13.7-volt setting is, to some extent, a compromise. It's high enough to allow a battery to be trickle-charged to full in a reasonable amount of time (it overcomes quite a bit of surface-charge backpressure), but it's not high enough to cause a fully-charged battery to begin electrolyzing the water out of its cells.

> This comes full circle on my original thread postulation. There is NO
> point in regulating the voltage, just connect a properly sized wall wart
> and you're done. The proof is right here.

The battery makers say you're in error - or at least oversimplifying, and taking risks with your battery. Lots of people's experience says likewise. Go ahead if you wish.

In certain very specific special cases, what you propose _may_ be safe. These would be the cases where the wall wart's maximum output current does not exceed the sum of [1] the static load on the battery, and [2] the self-discharge current and loss-by-heating that would limit the battery's terminal voltage to no higher than about 13.7 volts. Because the self-discharge and battery cell voltages are somewhat temperature-sensitive, I think you'd find that no single wall wart would produce optimum results with a single battery under all circumstances.

In the more general case, one of two things is very likely to be true:

- The wall wart is smaller than ideal, and isn't capable of delivering enough current to pull the battery up to 13.7 volts in "steady state" operation. The battery will probably charge, but more slowly than would otherwise be the case.
- The wall wart is larger than ideal, and it pulls the battery up to well above the optimal float voltage. The battery begins gassing, and its life is shortened.

That's why a properly-regulated float-charging circuit is very desirable. It allows for a rapid recharge if the battery is run down (because you can use a nice, hefty DC supply) but ensures a stable floating voltage once the battery reaches steady state.
And, a single such circuit can be used with a wide range of battery capacities - you don't need to carefully hand-select a wall wart to match each specific battery.

--
Dave Platt AE6EO
Hosting the Jade Warrior home page: http://www.radagast.org/jade-warrior
I do _not_ wish to receive unsolicited commercial email, and I will boycott any company which has the gall to send me such ads!
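A rough back-of-envelope way to see Dave's point about oversized wall warts: treat the unregulated wall wart as an ideal source behind a fixed series resistance (a crude simplification of a real transformer supply) and plug in the 18 V open-circuit / 500 mA at 12 V figures quoted earlier in the thread. That implies roughly 12 ohms of source resistance, and the sketch below (a model, not a measurement) shows how much current the bare wall wart would keep pushing at voltages well above a sensible float point.

```python
# Crude model: an unregulated wall wart as an ideal voltage source behind a
# fixed series resistance.  Real transformer/rectifier supplies are not this
# linear, so treat the results as ballpark numbers only.
V_OC = 18.0                    # open-circuit voltage quoted in the thread, volts
R_SRC = (18.0 - 12.0) / 0.5    # implied source resistance: (18 V - 12 V) / 0.5 A = 12 ohms

def bare_wall_wart_current(v_batt):
    """Current (amps) the unregulated wall wart would push into a battery at v_batt."""
    return max(0.0, (V_OC - v_batt) / R_SRC)

for v in (12.3, 12.7, 13.7, 14.4):
    print(f"battery held at {v:.1f} V -> roughly {bare_wall_wart_current(v) * 1000:.0f} mA")
```

Under this admittedly crude model the bare wall wart would still force roughly 300 mA into the battery even at 14.4 V, well above a sensible float voltage, whereas a regulated 13.7 V output tapers off to little more than the battery's self-discharge plus the ~10 mA standing drain once the terminal voltage reaches the setpoint.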
#12
You have a few potential problems with using an unregulated charger to maintain or charge a lead-acid battery.

The first problem is that if the battery attempts to draw more current than the unregulated supply can handle, you may overheat the transformer or other components in the supply.

The second problem is that if, on the other hand, the supply can deliver more current than the maximum bulk charge rating of the battery, then you may overheat and damage the battery if it is connected to the supply when it is not fully charged.

The third problem results if the supply can output a voltage higher than 13.8 volts and can also supply the necessary charging current. The battery voltage will eventually climb to the supply voltage (above 13.8 volts), continue to draw charging current, and boil the water out of the cells, damaging the battery.

To be safe, you really need to regulate the voltage at 13.8 V maximum AND limit the current, to protect both the battery and the charger. If you also want a fast charge on a discharged battery, then you need a multi-stage charger that will apply a higher voltage (about 14.5 V) at a limited current until the battery is almost fully charged, and then switch to 13.8 V to top it off and maintain a "float" charge.

I expect your battery did not reach the no-load supply voltage because the supply was not capable of producing that voltage at the trickle current needed by the battery.
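For what it's worth, the bulk-then-float logic described above can be sketched in a few lines of Python. This is only an illustration of the control idea, not a tested or safe charger design; read_battery_voltage(), read_charge_current() and set_output_voltage() are placeholders for whatever hardware interface is actually available.

```python
# Conceptual sketch of the two-stage (bulk, then float) behaviour described
# above.  Illustration of the control idea only - NOT a tested charger design.
import time

BULK_V  = 14.5    # bulk/absorption setpoint (volts), as mentioned above
FLOAT_V = 13.8    # float setpoint (volts), as mentioned above
I_TAPER = 0.05    # amps; once bulk current tapers below this, drop to float

def run_charger(read_battery_voltage, read_charge_current, set_output_voltage):
    state = "bulk"
    while True:
        v = read_battery_voltage()
        i = read_charge_current()
        if state == "bulk":
            # the hardware is assumed to enforce its own current limit here
            set_output_voltage(BULK_V)
            if v >= BULK_V - 0.1 and i < I_TAPER:
                state = "float"      # battery nearly full: switch to float
        else:
            set_output_voltage(FLOAT_V)
        time.sleep(10)               # re-check every 10 seconds
```

Commercial multi-stage chargers add timeouts, temperature compensation and fault handling on top of this, but the bulk-until-the-current-tapers, then-hold-at-float structure is the core of it.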
#14
"Bruce W...1" wrote in message ...
Not long ago and in another thread many of you gave me great advice on how to make a car battery float charger. I wanted to just connect a properly sized wall wart, but everyone recommended voltage regulation. So I connected a voltage regulator (13.6V) to a 500mA wall wart. The wall wart has an open-circuit voltage of 18V and is rated 500mA at 12 V. Further background, I built this charger to prevent my having to start a friends car once a week while they're on extended vacation. Now two weeks later I check the battery. Its voltage is 12.7V. The charger circuit measures 13.7V. And I measured the drain, from the alarm and radio, it is 10mA. The ambient temperature on average is about 40F. What went wrong? Why is the battery only 12.7V instead of 13.7? Lacking a better solution from you guys it seems we need more power, ugh, ugh. 2A ought to do it. Spec's say that car batteries (at room temperature) are best regulated at 13.3V. For 32 degrees F 14.2V is better. Yet the failure analysis remains incomplete. Where did we go wrong? Thanks for your help. **** a Duck , Bruce, this is becoming increasingly metaphysical - the mental effort you (and everyone else) is putting into debating a car battery is ludicrous. Let me make a few points. 1. If the battery is more than 4 years old its probably stuffed or close to it. Sad but true. 2. Go and buy a hygrometer (they are about $3 - people used them before digital multimeters were invented) - have a look at the SG in the cells. If its green, its OK. Check all cells, if 1 or 2 are very different SG then its stuffed. 3. Do a load test on the thing, turn on all the lights and see how much the voltage drops. Leave them on for 0.5 hour, if it drops much below 12v then its stuffed. How much does a new battery cost anyway?......... de VK3BFA ANdrew |
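As a companion to points 2 and 3: the usual published rule-of-thumb figures relate a rested open-circuit voltage to approximate state of charge. The numbers below are typical textbook values (they vary a little with battery type and temperature), not something taken from Andrew's post:

```python
# Rule-of-thumb mapping from *rested* open-circuit voltage to approximate
# state of charge for a 12 V flooded lead-acid battery.  Published figures
# vary a little by manufacturer and temperature; treat these as rough guides.
SOC_TABLE = [
    (12.65, "about 100 %"),
    (12.45, "about 75 %"),
    (12.24, "about 50 %"),
    (12.06, "about 25 %"),
    (11.89, "essentially flat"),
]

def approx_state_of_charge(v_rested):
    """Very rough state-of-charge estimate from a rested terminal voltage."""
    for threshold, label in SOC_TABLE:
        if v_rested >= threshold:
            return label
    return "deeply discharged - suspect sulphation or a bad cell"

print(approx_state_of_charge(12.7))   # the 12.7 V battery in this thread
```

By that table the 12.7 V reading discussed in this thread is essentially a full battery at rest, which squares with Dave's earlier point that the charger probably just hadn't pulled it up to float at the moment it was measured.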
#18
Dave Platt wrote:

> [Dave's explanation of float-charging behavior and the surface-charge
> rationale, quoted in full - see post #11 above.]

You may have something here. It would sure explain a lot. I need to do some more testing. Thanks.
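If the extra testing turns into the "monitor it for a few hours or days" experiment Dave suggested, even a trivial logging script makes the trend easier to see. The sketch below is just a convenience - the filename is arbitrary and the readings are still taken by hand with a meter:

```python
# Tiny convenience script for the "watch it for a few days" experiment:
# run it each time you take a meter reading and it appends a timestamped
# row to a CSV for plotting later.
import csv
import datetime

LOGFILE = "float_charge_log.csv"

def log_reading(volts, amps=None):
    """Append one timestamped battery-voltage (and optional current) reading."""
    with open(LOGFILE, "a", newline="") as f:
        csv.writer(f).writerow(
            [datetime.datetime.now().isoformat(timespec="seconds"), volts, amps]
        )

if __name__ == "__main__":
    v = float(input("Battery terminal voltage (V): "))
    a = input("Charger current (A), or blank: ")
    log_reading(v, float(a) if a else None)
```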