Originally Posted by drivesafe
Hi Tin, and while that guy may have had good intentions, he really has no idea what he's talking about.
He states that because his mate (or whoever) had used 10G cable instead of 6G cable, his battery would never fully charge.
Completely wrong.
While the battery is in a low state of charge, yes, it will draw the full 30 amps from the DC/DC charger, and that 30 amp draw would cause the voltage drop he stated.
But as the battery gets to around 70 to 80% state of charge, the current it draws tapers off, and as the current draw reduces, so does the voltage drop.
So the battery will eventually be supplied with a low current charge at very near the highest voltage the DC/DC charger puts out.
So, contrary to what his limited knowledge of how batteries charge led him to believe, if the battery needed that higher voltage to fully charge, it would be getting it.
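A rough worked example of how that taper plays out (the 30 amp figure comes from the discussion above; the cable resistance and the tapered current are assumed values purely for illustration):

```python
# Rough sketch of how the cable voltage drop follows the charge current
# (Ohm's law: drop = current * resistance). The 30 A figure is from the
# post above; the cable resistance and the tapered current are assumed
# values for illustration only.

CABLE_RESISTANCE_OHMS = 0.02  # assumed round-trip resistance of the cable run


def cable_drop(current_amps: float) -> float:
    """Voltage lost along the cable at a given charge current."""
    return current_amps * CABLE_RESISTANCE_OHMS


# Battery at a low state of charge, pulling the full 30 A:
print(f"Drop at 30 A: {cable_drop(30.0):.2f} V")   # 0.60 V

# Battery around 70-80% charged, current tapered to an assumed 5 A:
print(f"Drop at  5 A: {cable_drop(5.0):.2f} V")    # 0.10 V
```

At the full 30 amps the assumed cable loses around 0.6V, but once the current has tapered the loss is only a fraction of that, which is why the battery ends up seeing close to the charger's full output voltage.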
Next, even if the voltage stayed at the reduced level and did not rise as the battery charged, the battery could still be fully charged at that lower voltage; it would just take a little longer.
All batteries can be fully charged with as little as 13.8V.
Next, he calculated the voltage drop between the CRANKING battery and the DC/DC charger based on the cranking battery supplying the same current into the DC/DC device as that device was supplying to the AUXILIARY battery.
No DC/DC device works this way.
All DC/DC devices lose some power in the conversion, and because they are boosting the voltage, they draw considerably more current in than they can supply out to the battery being charged.
So his calculations are off by a considerable margin.
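To make that concrete, here is a small sketch of the power balance across the charger. The 30 amp output is the figure from the post; the output voltage, input voltage and efficiency are assumed illustrative numbers, not the specs of any particular charger:

```python
# Sketch of the power balance across a DC/DC charger, showing why the
# current drawn from the cranking battery exceeds the current delivered
# to the auxiliary battery. The 30 A output is the figure from the post;
# the voltages and efficiency are assumed illustrative numbers only.

OUTPUT_CURRENT_A = 30.0   # current into the auxiliary battery
OUTPUT_VOLTAGE_V = 14.4   # assumed charger output voltage
INPUT_VOLTAGE_V = 12.5    # assumed voltage at the charger's input terminals
EFFICIENCY = 0.90         # assumed conversion efficiency

output_power_w = OUTPUT_CURRENT_A * OUTPUT_VOLTAGE_V    # ~432 W into the aux battery
input_power_w = output_power_w / EFFICIENCY              # ~480 W drawn from the cranking side
input_current_a = input_power_w / INPUT_VOLTAGE_V        # ~38 A, not 30 A

print(f"Current drawn from the cranking battery: {input_current_a:.1f} A "
      f"(vs {OUTPUT_CURRENT_A:.0f} A delivered to the auxiliary battery)")
```

With those assumed figures the cranking-battery side carries roughly 38 amps rather than 30, so a voltage drop worked out at the output current understates the drop on the input cable by about a quarter.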
If he knew this, he would have realised that his point about mounting the charge device, be it a DC/DC charger or a solar charger, as close as possible to the battery being charged was even more important than he knew.