Alternator charging musings

I thought I would share my thoughts on this subject again. Our KS energy lithium batteries can be charged at 1C, which means a charge current numerically equal to their amp-hour capacity. Thus, in the case of a KS-100, they will accept a recharge current of 100A.

Now then, when you have an expensive lithium battery like this, surely you want to take full advantage of all of its benefits? Our batteries have a low internal resistance that stays almost flat across the charge cycle. This results in a charge curve more akin to a capacitor than an old-style lead-acid battery, which means these batteries can be recharged quickly.

So why would you want to limit the charging current to typically 40 or 60 amps using an expensive battery-to-battery charger when you could recharge them much more quickly using an appropriately rated split-charging set-up?
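To put some rough numbers on that, here is a minimal sketch comparing recharge times at typical battery-to-battery charger currents against the 1C figure. The 80% depth of discharge and ~99% charge efficiency are my own illustrative assumptions, not KS specifications, and the sketch ignores the absorption tail at the end of the charge:

```python
# Rough recharge-time comparison for a 100 Ah lithium battery.
# Assumes a near-constant bulk charge current and ~99% coulombic
# efficiency (typical for lithium chemistries); real times will vary.

def recharge_hours(capacity_ah, depth_of_discharge, current_a, efficiency=0.99):
    """Hours needed to replace the discharged amp-hours at a given current."""
    return (capacity_ah * depth_of_discharge) / (current_a * efficiency)

for amps in (40, 60, 100):  # B2B charger currents vs the 1C split-charge figure
    hours = recharge_hours(100, 0.8, amps)  # 80% depth of discharge assumed
    print(f"{amps:3d} A -> {hours:.1f} h")
```

Under those assumptions, the 1C split-charge set-up replaces the charge in well under an hour, while a 40A charger takes roughly two hours for the same job.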

I cannot seem to find an answer to this conundrum, and the only argument I can find out there is that running your alternator near to or at its maximum capacity will drastically shorten its life. But really? I mean, "REALLY??" Where is the evidence for this?

I cannot find any credible published literature that actually shows this to be true. To my mind, an alternator has windings and a regulator which, even in the worst scenario, should have been designed to handle its stated output and maintain it for its entire life, which would surely be until the engine wears out. Certainly this should be the case if the alternator is an OEM item!

Let's quickly consider modern alternator construction. The brushes rest on smooth slip rings and carry only a small fraction of the current handled by the stationary windings, so they should never wear out, and what wear there is should not be current dependent. That is, if there are brushes at all: many designs do away with them these days. Then there are the bearings, which are sealed for life and cooled by fans, often at either end, so overheating is unlikely. Lastly, there is a solid-state rectifier, often controlled externally by the ECU for increased efficiency.

Recently I had an elderly Australian couple stay with me, as I run an AirBnB. It's fascinating who you meet. The elderly gent had spent much of his working life with GM, in its electrical QA/test department. He explained that they used to run their alternators at full capacity for thousands of hours, testing them to destruction by using an environmental chamber to produce conditions well above any worst-case scenario.

He explained that there really ought not to be any harm in constantly running at full capacity. Interestingly, he went on to say that the cooling was designed to cope with very high ambient temperatures, in excess of a typical Australian climate, and that it was only when the alternators were run above such temperatures that failure would result. The typical failure mode, as he explained it, was the resin within the copper windings softening, allowing the windings to move and short; and above worst-case humidity, corrosion became a problem.

So unless your vehicle has something other than a standard alternator, such as a combined alternator/starter or a regenerative-braking alternator, where is the case for a battery-to-battery charger?

I would therefore like to hear from anyone out there who can make a case for actually using a battery-to-battery charger over a split-charger set-up, and I will certainly publish any sensible comments. Please email me at

Until next time, all the best, Neal.