Wholesale electricity prices have risen enormously in the past year, and the spotlight is once again turning to energy efficiency. But while users, in increasing numbers, are discovering that a variable speed drive is one of the most effective ways to offset the rise in costs, few appreciate that there are real differences between drives, as Geoff Brown explains
Variable speed drives are one of the most effective weapons in the energy manager's fight against rising costs. The control they bring to the speed of motors allows processes and machines to be run at exactly the right speed, saving electricity and cutting running costs.
The total efficiency of the drive system depends on the losses in both the motor and its control, and both sets of losses are dissipated as heat. Input power to the drive system is electrical, while output power is mechanical, which is why calculating the overall efficiency requires knowledge of both electrical and mechanical engineering.
Modern semiconductor devices provide a very efficient switching mechanism. In a frequency converter, fixed losses are generally low, typically around 0.2% of its rating, while the losses in its main circuit vary with load and switching frequency; the total rises to perhaps 1.5% to 3% at full load, depending on design and rating.
Motor losses always form the majority of the losses in a drive system; when the motor is fed from the mains they can vary from nearly 20% for a very small motor to around 3% for a very large one. These losses increase when the motor is fed by a frequency converter, and in the past many motor manufacturers anticipated a de-rating of 15% or even 20% in inverter-fed mode; however, little or no de-rating is required with most modern systems. It is therefore important to consider the total drive train efficiency, rather than that of the individual components, when evaluating a drive.
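As a rough illustration of why the total matters, converter and motor efficiencies multiply when the two operate in series. A minimal sketch, using illustrative loss figures assumed from the ranges quoted above rather than measured values:

```python
def drive_train_efficiency(converter_loss_frac, motor_loss_frac):
    """Overall efficiency of a converter and motor in series.

    Each stage passes on (1 - loss fraction) of its input power.
    """
    return (1.0 - converter_loss_frac) * (1.0 - motor_loss_frac)

# Assumed example: ~2% converter losses at full load and ~5% motor
# losses for a mid-sized motor (illustrative values, not measurements).
eff = drive_train_efficiency(0.02, 0.05)
print(f"Total drive train efficiency: {eff:.1%}")
```

A percentage point shaved off either loss figure feeds straight through to the total, which is why the whole drive train, rather than the converter alone, should be evaluated.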
Not all drives are the same
Most variable speed drives are sold for the purpose of improving energy efficiency. However, while many customers now treat drives as a commodity, there are very substantial differences in the performance achieved by different drives. Real data on the overall effect of running a motor with a frequency converter is notoriously difficult to obtain, so ABB commissioned an independent survey at a German university to compare a number of competitive products, all powering the same motor on a common test rig.
While all the drives achieve substantially the same efficiency at full load and speed, enabling a claim of 100% motor loadability to be substantiated, the object of a drive is to reduce the speed of a centrifugal fan or pump. Time spent running at full load will normally be limited to emergency situations, such as smoke extraction in the event of a fire. It is therefore the partial-load efficiency that needs to be considered.
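The reason partial load dominates for centrifugal fans and pumps is the cube-law relationship between speed and shaft power (the standard affinity-law approximation, not a figure from the survey). A minimal sketch:

```python
def relative_power(speed_fraction):
    """Affinity-law approximation: shaft power scales with the cube of
    speed for centrifugal fans and pumps."""
    return speed_fraction ** 3

# Slowing a fan to 80% speed cuts its power demand to roughly half.
for speed in (1.0, 0.8, 0.5):
    print(f"{speed:.0%} speed -> {relative_power(speed):.1%} power")
```

This is why even modest speed reductions over long running hours produce the large electricity savings the article describes.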
The investment in developing drive technology is growing all the time, and this has led to both the size and cost of a drive falling dramatically, while the reliability has steadily improved. Typically, a 15kW drive which cost around £2,500 in 1985, and occupied 0.12m³ of panel space, could be replaced today by a unit costing perhaps £1,000 and occupying 0.03m³. The newer unit would also offer far superior performance, as improvements have taken place in several areas, notably semiconductor technology and control logic.
With a conventional voltage source inverter, the output waveform is synthesised by a set of semiconductor switches, and these have now become standardised on the Insulated Gate Bipolar Transistor or IGBT. Even the IGBT has changed substantially since its first commercial introduction. But the main limitation of this device is the losses it produces every time it is switched. These losses must be dissipated, and thus govern the size of the cooling system used in the inverter. This means that there must be a compromise between high switching frequencies - often seen as giving lower acoustic noise from the motor - and consequent high losses in the inverter. In most cases a switching frequency of 3-4kHz will give the optimum performance.
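The compromise described above can be sketched numerically: conduction losses depend mainly on load, while switching losses grow roughly linearly with switching frequency. The figures below (a 150W conduction loss and 30mJ dissipated per switching event) are assumptions for illustration only, not data for any real device:

```python
def inverter_losses(conduction_w, energy_per_event_j, switching_hz):
    """Total semiconductor losses in watts: a load-dependent conduction
    term plus switching losses proportional to switching frequency."""
    return conduction_w + energy_per_event_j * switching_hz

# Hypothetical drive: 150 W conduction loss, 30 mJ per switching event.
for f_khz in (1, 3, 4, 8, 16):
    watts = inverter_losses(150.0, 0.030, f_khz * 1000)
    print(f"{f_khz:>2} kHz switching: {watts:5.0f} W total loss")
```

On these assumed numbers, raising the switching frequency from the 3-4kHz region to 16kHz more than doubles the heat the cooling system must remove, which is why 3-4kHz usually gives the best overall compromise.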
By improving the control logic in the inverter stage the losses in the motor can be minimised. Historically, a switching frequency of 1kHz and a fixed pulse pattern created by the modulator would need a motor de-rating of perhaps 10%. In more modern drives, improved pulse patterns will reduce the harmonics seen in the output voltage waveform and allow full torque, without de-rating. However, maintaining a minimised output harmonic content means retaining a substantial number of pulses per cycle and will necessarily result in a slight reduction in the maximum output voltage.
Older drive technology may switch to one or three pulses per cycle at maximum voltage, which can lead to total harmonic distortion (THD) in the output voltage of up to 30%. Modern drives provide substantially more pulses, reducing the THD to perhaps 5% to 10%, and the resulting spectrum looks very different. The use of improved control algorithms also improves overall efficiency by ensuring that the motor flux is continuously optimised.
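THD itself has a simple definition: the RMS sum of the harmonic amplitudes relative to the fundamental. A minimal sketch with assumed per-unit spectra (illustrative values chosen to fall in the ranges quoted above, not measured data):

```python
import math

def thd(harmonic_amplitudes, fundamental=1.0):
    """Total harmonic distortion: RMS of the harmonics over the fundamental."""
    return math.sqrt(sum(a * a for a in harmonic_amplitudes)) / fundamental

# Assumed example spectra, per-unit relative to the fundamental.
older_pattern  = [0.25, 0.15, 0.08]  # few pulses per cycle
modern_pattern = [0.05, 0.04, 0.03]  # many pulses per cycle
print(f"Older pattern THD:  {thd(older_pattern):.1%}")
print(f"Modern pattern THD: {thd(modern_pattern):.1%}")
```

Because the distortion adds as a root-sum-of-squares, spreading the output energy over many small pulses rather than a few large ones brings the figure down sharply.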
Although drives have obvious advantages in motor operation, recent press reports have claimed that the harmonics produced by drives are making motors in variable speed drive operation unacceptably inefficient. This is a misconception based on the fact that few people appreciate the differences between drives. The switching techniques employed by the most modern drives are different to those of earlier generations of the product and can lead to big differences in efficiency.
These differences in switching strategy show up most clearly at partial load and reduced speed. Furthermore, a seemingly small improvement in the efficiency of the drive can make a big difference to the overall efficiency of the drive system. Selecting the right drive can save thousands of pounds per year, and motors fed by modern drives are actually very efficient.
Geoff Brown is drive applications consultant at ABB