The variable-voltage input (VVI) drive is the technology used in some of the earliest AC variable-frequency drives. Because these early drives did not have microprocessor chips to generate the switching signals for the inverter, they relied on existing technology such as oscillators. The diagrams below show a block diagram of this type of drive and a typical output waveform.
From the block diagram you can see that the basic sections of the drive, the rectifier, filter, and inverter, are much the same as in modern drives. The major difference is that SCRs are used in the rectifier section instead of diodes. At the time, diodes were not manufactured in ratings as large as those of SCRs, so SCRs were chosen because they were more robust. An SCR's firing angle can also be adjusted by a signal from the regulator section of the drive to change the amount of voltage and current delivered to the inverter, which provided enough adjustment to meet most of the varying torque demands.
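The relationship between the SCR firing angle and the DC bus voltage can be illustrated with the standard textbook formula for a three-phase fully controlled SCR bridge (continuous conduction assumed). This is a minimal sketch, not taken from the article itself; the function name and the 460 V example supply are illustrative assumptions.

```python
import math

def scr_bridge_dc_voltage(v_line_line_rms: float, alpha_deg: float) -> float:
    """Average DC output of a three-phase fully controlled SCR bridge.

    Textbook relation (continuous conduction assumed):
        Vdc = (3 * sqrt(2) / pi) * V_LL * cos(alpha)  ~  1.35 * V_LL * cos(alpha)
    Delaying the firing angle alpha lowers the DC bus voltage, which is
    how the regulator section varies the voltage delivered to the inverter.
    """
    alpha = math.radians(alpha_deg)
    return (3 * math.sqrt(2) / math.pi) * v_line_line_rms * math.cos(alpha)

# Example: a hypothetical 460 V supply at two firing angles.
print(scr_bridge_dc_voltage(460, 0))    # firing angle 0: full DC output
print(scr_bridge_dc_voltage(460, 60))   # delayed firing: roughly half the output
```

Firing at 0 degrees behaves like a plain diode bridge; retarding the angle toward 90 degrees drives the average output toward zero, which is the adjustment mechanism the regulator exploits.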
From the waveform diagram you can see that the output voltage is a
six-step signal. The switching devices in the inverter section are
turned on and off at specific points in each cycle to produce the six
steps. The current waveform looks much more like a sine wave because
the inductance of the motor opposes rapid changes in current, smoothing
out the square edges of the six-step voltage. You will still encounter
a few drives with this early technology because they have paid for
themselves many times over. In many cases, when a problem occurs with
the original drive, it is replaced with a newer microprocessor-based
programmable drive. Changing to a modern drive is also more practical
because it can provide a wider range of torque for all the applications
the motor may encounter.
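The six-step pattern described above can be sketched in a few lines. This is an illustrative model, not the article's own material: it uses the classic textbook line-to-neutral pattern for a three-phase inverter with 180-degree conduction, where each 60-degree interval holds one of four discrete voltage levels and only the switching timing is controlled.

```python
VDC = 1.0  # DC bus voltage (per-unit, for illustration)

# One electrical cycle = six 60-degree steps; these are the classic
# line-to-neutral levels for 180-degree conduction.
SIX_STEP_LEVELS = [VDC / 3, 2 * VDC / 3, VDC / 3,
                   -VDC / 3, -2 * VDC / 3, -VDC / 3]

def six_step_phase_voltage(angle_deg: float) -> float:
    """Phase-to-neutral voltage at a given electrical angle (degrees)."""
    step = int(angle_deg % 360) // 60   # which 60-degree interval we are in
    return SIX_STEP_LEVELS[step]

# Sampling one cycle shows the stepped waveform; lowering the output
# frequency simply stretches each step in time, and the rectifier's
# firing angle lowers VDC to keep the volts-per-hertz ratio constant.
one_cycle = [six_step_phase_voltage(a) for a in range(0, 360, 30)]
print(one_cycle)
```

The motor's inductance acting on this stepped voltage is what yields the near-sinusoidal current waveform mentioned above.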