When evaluating new switching transistor circuits, designers often focus solely on the transistor specifications. However, a critical factor influencing the robustness of the overall design is the
gate driver circuit. To understand the impact of the driver parameters, let us first examine the ideal case using an example IGBT (IKW20N60H3).
According to the transistor datasheet at 25°C, the relevant parameters are:
Maximum gate-emitter voltage: Vge,max = ±20 V
Gate-emitter threshold voltage: Vge(th) = 4.1 V to 5.7 V
Based on these values, a gate driver supply of +15V relative to GND is sufficient. Under ideal conditions, the corresponding driver circuit would appear as follows:
Fig. 1: Simple gate driver circuit for an ideal IGBT
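As a quick plausibility check, the chosen +15 V supply can be compared against the datasheet limits listed above. A minimal sketch (the margin calculation is illustrative, not part of the datasheet):

```python
# Datasheet limits for the IKW20N60H3 at 25 °C (from the text above)
VGE_MAX = 20.0   # V, absolute maximum gate-emitter voltage (±)
VTH_MAX = 5.7    # V, upper end of the threshold voltage range
VTH_MIN = 4.1    # V, lower end of the threshold voltage range

V_DRIVE = 15.0   # V, proposed gate driver supply

# The drive voltage must stay below the absolute maximum rating...
assert V_DRIVE < VGE_MAX
# ...and comfortably above the highest possible threshold, so the
# IGBT is driven fully on under all device-to-device variation.
overdrive = V_DRIVE - VTH_MAX  # overdrive margin in volts
```

With 9.3 V of overdrive above the worst-case threshold and 5 V of headroom to the absolute maximum rating, +15 V is a sensible choice under these ideal conditions.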
Looks pretty simple! However, when the parasitic elements are considered, the real-life model becomes more complex:
Fig. 2: Realistic gate driver circuit including IGBT parasitic components
If we now take into account that the gate-emitter threshold also varies over the temperature range, it becomes clear that the threshold voltage decreases significantly with increasing temperature (by several mV/K) and, in the worst case, can fall well below the minimum value of 4.1 V specified at 25°C.
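To get a feel for the magnitude, the threshold at an elevated junction temperature can be estimated with a linear model. A sketch, assuming a temperature coefficient of -11 mV/K as a typical order of magnitude (not a datasheet value) and 150 °C as an illustrative hot junction temperature:

```python
def vth_at_temperature(vth_25, tj, tc=-0.011):
    """Linear estimate of the gate-emitter threshold voltage.

    vth_25 : threshold voltage at 25 °C (V)
    tj     : junction temperature (°C)
    tc     : temperature coefficient (V/K), assumed value
    """
    return vth_25 + tc * (tj - 25.0)

# Worst case: minimum 25 °C threshold combined with a hot junction
vth_hot = vth_at_temperature(4.1, 150.0)
```

With these assumed numbers the effective threshold drops to roughly 2.7 V, which is why the driver's off-state margin must be judged at maximum junction temperature, not at 25 °C.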
Fig. 3: Gate-Emitter threshold voltage variation with temperature
The driver circuit must be designed to prevent unwanted turn-on under all operating conditions. Otherwise, this can lead to shoot-through short circuits, which may result in increased power losses, greater component stress, reduced service life, degraded EMC performance, and in extreme cases, the destruction of the transistor.
Essentially, there are two types of unwanted switch-on events:
An unwanted turn-on due to the effect of the Miller capacitance (Creverse)
An unwanted turn-on due to the effect of the parasitic inductances (Lgate and Lemitter)
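The Miller mechanism lends itself to a simple first-order estimate: during a fast collector-emitter voltage transient, the displacement current through Creverse develops a voltage across the gate resistance, and if that voltage exceeds the (temperature-reduced) threshold, the IGBT turns back on. A minimal sketch, where all component values are illustrative assumptions rather than datasheet figures:

```python
# First-order estimate of the Miller-induced gate voltage:
#   v_ge ≈ R_gate * C_reverse * dv/dt
# (valid while the driver holds the gate low only through R_gate)

R_GATE = 10.0        # ohm, external gate resistance (assumed)
C_REVERSE = 50e-12   # F, reverse transfer (Miller) capacitance (assumed)
DV_DT = 10e9         # V/s, collector-emitter slew rate of 10 kV/us (assumed)

v_ge_induced = R_GATE * C_REVERSE * DV_DT  # induced gate voltage spike, V
```

With these assumed values the spike reaches about 5 V, well above a hot-junction threshold, which illustrates why a low-impedance off-state path (negative gate supply or an active Miller clamp) is often required.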