The relationship between a battery and a charger appears straightforward: plug in and recharge. While this simplicity holds true for devices like smartphones and laptops, the reality for electric vehicles (EVs) and industrial machinery is far more complex.
The data itself illustrates the complexity. Surveys show traditional lead-acid batteries are now favored in applications like uninterruptible power supplies (UPS) and telecom systems, while lithium-ion batteries are becoming the preferred choice in the automotive, consumer electronics and industrial sectors.
Why the divergence between lead-acid and lithium-ion? In a word: integration. The choice of battery chemistry alone carries distinct tradeoffs in power, charge time, battery size and a dozen other factors, each of which must precisely match the surrounding components and the intended application.
That’s why it's critical for original equipment manufacturers (OEMs) to understand the intricacies of battery and charger integration to guide strategic decisions in their electrification journey.
Lead-Acid Batteries: The Chemistry That Started it All
First developed in 1859, lead-acid batteries revolutionized energy storage with a compact and rechargeable power source. Technological refinement led to variant designs, such as maintenance-free “sealed” lead-acid batteries that immobilize the electrolyte as a gel, eliminating the need for watering.
Today, the simple act of plugging in a lead-acid battery to a modern, high-frequency charger sets into motion a three-stage series of complex electrical reactions:
- The charger first delivers constant current, allowing the battery to absorb energy until it reaches a voltage threshold that depends on battery chemistry.
- Voltage is then held constant as the battery charges, while the current gradually tapers to the value needed for the third stage.
- Upon nearing 95% charge, the IUoU configuration (the DIN designation for 3-stage charging) ensures 100% state of charge is achieved, then switches to a continual, low-draw float state that safely maintains the charge even after extensive non-use.
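As a rough illustration, the three stages map naturally onto a small control-loop state machine. The sketch below is a simplified model; the per-cell voltage thresholds and C/20 taper cutoff are illustrative assumptions, not a calibrated charging profile:

```python
def iuou_step(voltage, current, capacity_ah, stage):
    """One control step of a simplified IUoU (3-stage) lead-acid charge.
    Returns the next stage and the commanded charger mode. All thresholds
    are illustrative per-cell values, not a calibrated profile."""
    V_ABSORB = 2.40                 # assumed bulk-to-absorption threshold (V/cell)
    V_FLOAT = 2.27                  # assumed float voltage (V/cell)
    I_CUTOFF = capacity_ah / 20.0   # assumed taper cutoff (C/20)

    if stage == "bulk" and voltage >= V_ABSORB:
        stage = "absorption"        # I -> Uo: voltage limit reached
    if stage == "absorption" and current <= I_CUTOFF:
        stage = "float"             # Uo -> U: current has tapered off
    if stage == "bulk":
        return stage, ("constant_current", None)
    if stage == "absorption":
        return stage, ("constant_voltage", V_ABSORB)
    return stage, ("constant_voltage", V_FLOAT)

stage, mode = iuou_step(voltage=2.41, current=10.0, capacity_ah=100, stage="bulk")
# the charge has now entered the constant-voltage absorption stage
```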
Given the maturity of the lead-acid market, these standards are well established. However, not all chargers properly support the precise electrical reactions necessary for a safe and optimal recharge.
Although lead-acid batteries do not benefit from the more advanced integration features discussed later, consulting with a professional charger supplier is still advisable. This allows specialists to develop a charging profile specifically calibrated to your equipment's use case, lowering maintenance costs, preventing possible battery damage and maximizing return on investment.
Lithium-Ion Batteries: Expanded Options, Enhanced Integrations
Lithium-ion technology is rapidly reshaping the battery landscape. Research began in the 1970s, and the first commercial lithium-ion battery was introduced in 1991. The chemistry’s unique capabilities have seen it dominate the industrial battery market, offering:
- higher energy density,
- longer life cycles and
- lower maintenance requirements.
Comparing compound annual growth rates (CAGR) provides a telling snapshot: over the same 5-year period of 2024-2029, lithium-ion is projected to grow at a staggering 14.46%, against lead-acid's modest 4.40%.
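Compounding those rates over the five-year window shows how quickly the gap widens (a quick arithmetic check using the figures above):

```python
def compound_growth(cagr_pct, years):
    """Total growth factor implied by a compound annual growth rate."""
    return (1 + cagr_pct / 100) ** years

li_ion = compound_growth(14.46, 5)    # ~1.96x: the market nearly doubles
lead_acid = compound_growth(4.40, 5)  # ~1.24x over the same window
```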
A key driver of this growth is the pressing need for electrification, a door lithium-ion has unlocked for an increasing number of applications. Nowhere is this clearer than in the EV market, where nearly 67% of lithium-ion batteries are used in automotive applications.
This natural progression benefits industrial OEMs, as many electrified vehicles powered by lithium-ion now favor on-board chargers. The challenge, again, lies in the subtleties of integration. Although lithium-ion unlocks new efficiencies across every link in the value chain, the logistical and technical hurdles also intensify.
The Importance of Battery Management Systems
Battery management systems (BMS) play a crucial role in controlling the charging process of lithium-ion batteries. Unlike battery monitors, which passively check and log battery performance, a BMS takes a far more active role, tracking:
- voltage of individual cells,
- temperature of individual cells and
- state of charge (SoC).
By communicating these and other key data between the battery and charger (like the required electric current), a BMS adjusts the charging profile in real time. This prevents overcharging, overheating and other potential issues that could reduce the battery’s lifespan. These optimizations are also required to achieve higher charging rates without sacrificing safety.
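In code terms, that real-time adjustment is a feedback loop: each telemetry frame from the BMS derates the current the charger is allowed to deliver. A minimal sketch, with assumed (not pack-specific) voltage and temperature limits:

```python
def requested_charge_current(cell_voltages, cell_temps_c, max_current_a):
    """Derate the charger's current request from BMS telemetry.
    All limits below are illustrative, not taken from any real pack."""
    V_CELL_MAX = 4.20   # assumed per-cell voltage ceiling (V)
    T_MAX_C = 45.0      # assumed ceiling for full-rate charging (deg C)
    T_MIN_C = 0.0       # assumed floor: no charging below freezing

    if min(cell_temps_c) < T_MIN_C:
        return 0.0                       # too cold: suspend charging
    if max(cell_voltages) >= V_CELL_MAX:
        return 0.0                       # a cell at its ceiling: stop
    current = float(max_current_a)
    if max(cell_temps_c) > T_MAX_C:
        current *= 0.5                   # hot pack: halve the rate
    headroom = V_CELL_MAX - max(cell_voltages)
    if headroom < 0.10:                  # taper near the voltage ceiling
        current *= headroom / 0.10
    return round(current, 2)
```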
Matching Communication Protocols and Signals
A key aspect of integrating a BMS with a charger is matching their communication protocols. This is often the first barrier to system integration, and a tall one given the lack of industry-wide standardization. While the J1939 protocol is gaining traction as a de facto standard, in most cases either the charger or the battery must adapt to the other's protocol to ensure compatibility.
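For a flavor of what protocol matching involves, a J1939 frame carries a 29-bit CAN identifier that packs a priority, an 18-bit parameter group number (PGN) and a source address. A sketch of the encoding; the proprietary-range PGN value used here is an arbitrary example, not a real battery-charger assignment:

```python
def j1939_id(priority, pgn, source_addr):
    """Pack a 29-bit J1939 CAN identifier: 3-bit priority, 18-bit
    parameter group number (PGN), 8-bit source address. (PDU1 PGNs also
    carry a destination address in the low PGN byte, omitted here.)"""
    assert 0 <= priority < 8 and 0 <= pgn < 2**18 and 0 <= source_addr < 256
    return (priority << 26) | (pgn << 8) | source_addr

def j1939_parse(can_id):
    """Split a 29-bit identifier back into (priority, pgn, source_addr)."""
    return (can_id >> 26) & 0x7, (can_id >> 8) & 0x3FFFF, can_id & 0xFF

cid = j1939_id(priority=6, pgn=0xFF10, source_addr=0xF3)  # arbitrary proprietary-range PGN
```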
Some batteries also require additional signals to initiate charging. This “wake-up” process could involve:
- closing a charger-controlled relay to create a short circuit,
- shorting specific pins on the DC battery connector or
- applying a live voltage (12 or 24 V).
Certain options are more complex than others, as not all chargers support these features. Careful consideration, customization and consultation with charger professionals are required to ensure successful integration.
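A charger-side wake-up routine might try each supported method in turn until the BMS responds. Every function name in this sketch is a hypothetical hardware-abstraction hook invented for illustration, with a stub standing in for real hardware:

```python
class StubCharger:
    """Stand-in for a charger hardware layer; every method name here is a
    hypothetical hook for illustration, not a real charger API."""
    def __init__(self, supported, responds_after):
        self._supported = supported        # wake-up methods this unit offers
        self._responds_after = responds_after
        self.attempts = []
    def supports(self, method):
        return method.__name__ in self._supported
    def bms_responding(self, timeout_s):
        return bool(self.attempts) and self.attempts[-1] == self._responds_after
    def pulse_wake_relay(self):            # short circuit via controlled relay
        self.attempts.append("pulse_wake_relay")
    def short_connector_pins(self):        # short pins on the DC connector
        self.attempts.append("short_connector_pins")
    def apply_wake_voltage(self):          # apply a live 12/24 V signal
        self.attempts.append("apply_wake_voltage")

def try_wake_battery(charger):
    """Attempt each supported wake-up method until the BMS responds."""
    for method in (charger.pulse_wake_relay,
                   charger.short_connector_pins,
                   charger.apply_wake_voltage):
        if not charger.supports(method):
            continue                       # not every charger offers every option
        method()
        if charger.bms_responding(timeout_s=2.0):
            return True
    return False
```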
Advanced Charger Technology Brings Range of Capabilities
Modern chargers have evolved beyond a simple medium for electricity to enter the battery. They are intelligent and integrated, utilizing advanced algorithms, Controller Area Network (CAN) bus protocols and other wireless technology to facilitate two-way communication between battery and charger.
These same integrations are moving beyond closed systems to establish external communications, unlocking even greater efficiencies such as:
- Remote room monitoring – Connected systems that oversee the status and performance of multiple batteries and chargers from a centralized location. Such integrations allow operators to monitor environmental or usage conditions that could affect battery performance, such as high temperatures or poor user discipline (for instance, excessive opportunity charging). With real-time alerts and data visualization tools, remote room monitoring enables advanced troubleshooting and enhanced safety.
- Data mining – The process of analyzing large data sets generated by battery and charger systems to extract useful insights. This includes charging/discharging cycle patterns, failure rates and performance under different conditions. By applying data mining techniques, operators can predict maintenance needs, optimize charging protocols and make analytics-informed decisions to extend battery life and reduce operational costs.
- First-In-First-Out (FIFO) queues – A system that manages the order in which batteries are charged and used. FIFO ensures that the batteries that have been idle the longest are utilized first, optimizing battery usage and reducing wear. For example, in large-scale operations with multiple batteries, a FIFO queue prevents certain batteries from being repeatedly cycled while others remain underutilized.
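The FIFO rotation described above is straightforward to model with a queue. A minimal sketch, assuming a simple pool where freshly recharged batteries rejoin at the back:

```python
from collections import deque

class BatteryPool:
    """FIFO rotation: the battery that has rested longest is issued first,
    and freshly recharged batteries rejoin at the back of the queue."""
    def __init__(self, battery_ids):
        self._ready = deque(battery_ids)
    def check_out(self):
        """Issue the longest-resting battery to an operator."""
        return self._ready.popleft()
    def check_in(self, battery_id):
        """Return a recharged battery to the back of the queue."""
        self._ready.append(battery_id)

pool = BatteryPool(["BAT-1", "BAT-2", "BAT-3"])
issued = pool.check_out()   # BAT-1 has been idle longest
pool.check_in(issued)       # after recharging, BAT-1 waits behind the others
```

Because every battery must pass through the back of the queue before being reissued, no unit can be repeatedly cycled while others sit idle.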
The benefits of these technologies scale to the operation, allowing fleet managers to weaponize the data gathered from day-to-day operations to combat inefficiencies and maximize the value of their investments.
READ MORE: Revenue Potential Increasing for Electric Truck Powertrain Components
OEM Considerations and Tips for Successful Integration
Again, unlocking the benefits of advanced battery and charger integration requires expert input — preferably from the earliest design phases.
The value of these technologies lies in the ability to engineer solutions precisely matched to the application. Off-the-shelf components are appealing for cost and lead time, but the reality is that they are often either in short supply or not an optimal match for every application.
Consider a few common integration challenges that highlight the necessity of an expert partner:
- Pre-charging – When either the battery or charger lacks an internal pre-charge solution, the sudden inrush of current can damage sensitive contactors and relays. This common problem must be addressed by implementing a pre-charge sequence via the communication protocol.
- System fault tolerance – A failed recharge, whether the fault lies with the battery or the charger, greatly impacts the application, as it can prevent the vehicle from being operated. Optimizing system fault tolerance with carefully crafted error-confinement strategies provides a backup in case of such failure, minimizing the impact at the system level and improving operational quality.
- Application scalability – Beyond meeting immediate needs, electrified machinery should scale to future needs. For instance, what if you want to add an external, off-board charger to your on-board solution, or vice versa? Do you hope to prevent simultaneous charging or achieve it? What are the electrical and communication roadblocks to doing so? These questions are best worked through with experts.
- Advanced protocols – Employing certain protocols is often a matter of technical integration. For example, Unified Diagnostic Services (UDS) provides a standardized way to access and interpret data from the battery and charger systems.
- Battery chemistry – Charging profiles must be optimized for specific chemistries to achieve maximum performance. For instance, thin plate pure lead (TPPL) batteries offer significant improvements over traditional lead-acid batteries — greater durability, reduced maintenance needs, shock and vibration resistance and cost-efficiency — yet rarely find proper charger support.
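The pre-charge sequence mentioned in the first item above typically closes a resistor-limited relay until the DC-link voltage approaches the battery voltage, and only then closes the main contactor. A sketch under assumed thresholds (95% of battery voltage, 2 s timeout), with a stub standing in for the real power electronics:

```python
import time

class StubBus:
    """Stand-in for the power electronics; real hardware calls are assumed."""
    def __init__(self, battery_v):
        self._battery_v = battery_v
        self._link_v = 0.0
        self.main_closed = False
    def close_precharge_relay(self): pass
    def open_precharge_relay(self): pass
    def close_main_contactor(self): self.main_closed = True
    def battery_voltage(self): return self._battery_v
    def link_voltage(self):
        # Model the DC link charging up through the pre-charge resistor
        self._link_v = min(self._battery_v, self._link_v + 20.0)
        return self._link_v

def precharge(bus, threshold=0.95, timeout_s=2.0):
    """Charge the DC link through a resistor before closing the main
    contactor; the 95% threshold and 2 s timeout are assumptions."""
    bus.close_precharge_relay()
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if bus.link_voltage() >= threshold * bus.battery_voltage():
            bus.close_main_contactor()   # inrush is now negligible
            bus.open_precharge_relay()
            return True
        time.sleep(0.01)
    bus.open_precharge_relay()           # fault: the link never came up
    return False
```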
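As a taste of the UDS example above: at the application layer, a ReadDataByIdentifier request (service 0x22 in ISO 14229) is simply the service byte followed by a 16-bit data identifier, with transport framing (ISO-TP over CAN) handled separately. A sketch:

```python
def uds_read_data_by_identifier(did):
    """Build the application-layer payload for UDS service 0x22
    (ReadDataByIdentifier); ISO-TP transport framing is omitted."""
    assert 0 <= did <= 0xFFFF
    return bytes([0x22, (did >> 8) & 0xFF, did & 0xFF])

def uds_parse_positive_response(payload):
    """A positive response echoes the service ID + 0x40, the DID, then data."""
    if payload[0] != 0x62:               # 0x22 + 0x40
        raise ValueError("negative or unexpected response")
    return (payload[1] << 8) | payload[2], bytes(payload[3:])

request = uds_read_data_by_identifier(0xF190)  # 0xF190: the standardized VIN DID
```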
In addition to these considerations, OEMs must factor in Electric Vehicle Supply Equipment (EVSE), which includes the infrastructure and resources necessary to charge electric vehicles. Most often, this is done through region-specific sockets and charging standards:
- Type 1 sockets, also known as J1772 or SAE J1772, are generally used in North America and Japan.
- Type 2 sockets, sometimes referred to as Mennekes or IEC 62196-2, are more common in Europe. These sockets are based on the EN 61851-1 charging standard.
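With both socket families, the supply equipment advertises its available AC current to the vehicle over a PWM control pilot, where the duty cycle maps to amperage. A sketch of the commonly cited SAE J1772 / IEC 61851-1 mapping (verify edge cases against the standard itself before relying on them):

```python
def pilot_duty_to_amps(duty_pct):
    """Map a control-pilot PWM duty cycle to the advertised AC charge
    current, per the commonly cited SAE J1772 / IEC 61851-1 formula."""
    if 10 <= duty_pct <= 85:
        return duty_pct * 0.6            # linear range, e.g. 50% -> 30 A
    if 85 < duty_pct <= 96:
        return (duty_pct - 64) * 2.5     # high-current range
    raise ValueError("duty cycle outside the valid charging range")
```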
Integrating these sockets requires careful consideration of the communication protocols and electrical connections, although chargers with integrated EVSE support simplify this process. BMS software or other electronic control units (ECUs) in the vehicle must also accommodate EVSE to handle variations in direct current (DC) and to manage alternating current (AC) socket actuator unlocking policies.
This complexity underscores the importance of in-the-field experience, as modern on-board applications typically require both EVSE support and the ability to parallel multiple chargers.
READ MORE: Full-System Solutions Ease Electric Vehicle Design
Key Takeaways for Comprehensive Integration
Given the complex, ever-changing nature of battery and charger technology, the need for experienced suppliers and strategic partners cannot be overemphasized.
By partnering with a single supplier capable of delivering holistic charger expertise, such as both on-board and off-board chargers, OEMs can save on development expenses, navigate critical integration challenges and speed up time to market.
The value of experience and adaptability in this field goes beyond what can be measured with financial metrics. Forward-thinking OEMs are already securing partnerships with reliable experts, setting the stage for success in a sustainable, electrified future.
This article was written and contributed by Nicola Pini, Software Manager at ZIVAN.