Copyright 2000-2002 by Frank J. Hoose, Jr.
This page explains how SCR and FET (MOSFET) speed controls work and how they differ. The information was extracted from posts to the 7x interest group.
Posted by Harvey White (msg #53845)
The SCR is capable of being turned on with a gate signal, but unlike a FET, the SCR latches on until (basically) power is removed. The supply to the motor and SCR is raw pulsating DC, coming from a bridge rectifier. A gate signal is generated from a ramp synchronized to the DC waveform coming in (this is one way of doing it). With a ramp rising from, say, 0 to 10 volts and repeating every 1/120 of a second (for 60 Hz power), you compare the ramp voltage against a pot voltage (speed control). When the ramp voltage exceeds the pot voltage, the comparator flips on, and this signal is used to turn on the SCR. The circuit resets when the rectified DC voltage falls below the SCR's holding voltage/current limit.
For full speed, you trigger at the beginning of the half cycle, so you get the benefit of the full AC voltage half cycle, 0 to peak to 0. At the half setting, you trigger at the middle of the cycle (and get your worst spike for this circuit). You've "ignored" the first half of the half cycle. At low RPM, you trigger very late in the cycle, and you're getting the part of the waveform that's at, say, 25 volts and descending to zero going to the motor.
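The firing-angle behavior described above can be sketched numerically. This is a minimal, idealized model (the function name, peak voltage, and angles are my assumptions, not from the post): integrating the rectified sine from the firing angle to the end of the half cycle gives the average voltage the motor sees.

```python
import math

def scr_average_voltage(v_peak, firing_angle_deg):
    """Average DC voltage over one half cycle of full-wave rectified AC
    when the SCR fires at the given angle.  Integrating
    v_peak*sin(theta) from the firing angle alpha to 180 degrees gives:
        V_avg = v_peak * (1 + cos(alpha)) / pi
    """
    alpha = math.radians(firing_angle_deg)
    return v_peak * (1.0 + math.cos(alpha)) / math.pi

v_pk = 170.0  # approximate peak of 120 V RMS mains (assumption)
for angle in (0, 90, 150):
    print(f"fire at {angle:3d} deg -> avg {scr_average_voltage(v_pk, angle):6.1f} V")
```

Firing at 0 degrees delivers the full half cycle; firing at 90 degrees delivers exactly half the average voltage; firing very late delivers only the small descending tail of the waveform, matching the low-RPM case in the post.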
Since motor torque is related to the supply voltage, the torque falls from half speed down to zero. It's the same effect as if you had a variable DC supply. That's why running a DC motor off a variable DC supply gives you "bad" low-end torque: low motor voltage means low motor current.
Where a PWM controller is superior is that the motor voltage is always the full value. Since the supply is switched on and off at a rapid rate, the current flowing through the motor (when it stabilizes) is always the design normal value, and the torque in the motor is always (or mostly) the same, regardless of speed setting. Since the motor is an inductor, and since you can't change the current through an inductor instantaneously, there will be a practical limit to the minimum width of the pulse through the motor. Below this limit, the current through the motor does not have the opportunity to reach full value, and the motor torque drops. However, the theory says that with a good PWM supply, the motor torque is higher at low speeds than with an SCR supply.
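The minimum-pulse-width limit mentioned above comes from the motor's R-L time constant. Here is a small sketch with assumed winding values (the resistance, inductance, and pulse widths are illustrative, not measurements of any real motor): current in an inductive winding rises exponentially, so a pulse much shorter than the time constant never reaches full current.

```python
import math

# Current rise in an R-L motor winding during one PWM "on" pulse:
#     i(t) = (V/R) * (1 - exp(-t/tau)),   tau = L/R
R = 2.0      # winding resistance, ohms (assumption)
L = 4e-3     # winding inductance, henries (assumption)
V = 90.0     # supply voltage, volts (assumption)

tau = L / R  # electrical time constant: 2 ms here

for width_us in (200, 2000, 10000):  # pulse widths in microseconds
    t = width_us * 1e-6
    frac = 1.0 - math.exp(-t / tau)
    print(f"{width_us:6d} us pulse -> {100 * frac:5.1f}% of full current")
```

A pulse one-tenth of the time constant reaches only about 10% of full current, while a pulse of five time constants reaches essentially all of it, which is why torque drops once the PWM pulse gets too narrow.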
However, SCRs are more robust than FETs, so they go poof less....
There used to be devices called GTO SCRs, for Gate Turn Off SCR (SCR is Silicon Controlled Rectifier). You could turn them off, you could turn them on.
Ironically, you could make one with an SCR and a MOSFET....
Posted by Rick Dickinson (msg #53864)
For those who aren't familiar with PWM, here's the Cliff Notes version: Pulse Width Modulation uses some sort of fast, low-resistance electronic switch (like a MOSFET) to turn the DC voltage driving the motor on and off thousands of times per second. By changing the percentage of the time that the switch is "on" rather than "off", the average power being applied to the motor varies proportionally, and the speed varies as well.
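The on/off-percentage idea above reduces to simple arithmetic. A minimal sketch (function names and the 24 V figure are my assumptions for illustration):

```python
def duty_cycle(on_time_us, off_time_us):
    """Fraction of each switching period the switch spends 'on'."""
    return on_time_us / (on_time_us + off_time_us)

def pwm_average_voltage(v_supply, duty):
    """Average voltage the motor sees when switching far faster than
    the motor can respond: just the supply scaled by the duty cycle."""
    assert 0.0 <= duty <= 1.0
    return v_supply * duty

d = duty_cycle(750, 250)                # on 750 us, off 250 us -> 75%
print(pwm_average_voltage(24.0, d))     # -> 18.0
```

The motor's inductance and mechanical inertia smooth the pulses out, so it behaves as if fed this average voltage, while each individual pulse is still at the full supply level.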
By contrast, an SCR-based motor controller uses devices called "Silicon Controlled Rectifiers" to turn on and off the power to the motor. On the "pro" side, SCRs are very robust devices, and hold up well to abuse without releasing their magic smoke. However, on the "con" side, they are peculiar devices: they can only be turned on by a triggering signal, not off. How do you use a switch that you can only turn on, not off, to control the power going to a motor? By taking advantage of one other feature of SCRs: they actually do turn off when the voltage across them drops to zero. So, if you use them with an AC signal (a 60 Hz 110V RMS sine wave from the power outlet, for instance), you can turn them on at whatever point in the sine wave you want, and they shut off all by themselves every time the voltage passes zero (120 times a second, for 60 Hz AC). Rectify the output, so that all of the "humps" of the sine wave are positive, and you've got a robust source of pulsing voltage, and you can control what percentage of total power it delivers by changing the point at which you turn the SCRs on.
Now, since MOSFET-based PWM motor controllers turn on and off thousands of times per second, you have very fine granularity across the whole range from full on to full off. However, SCR-based controllers are dependent on the 60 Hz AC sine wave from your power outlet. If you've ever looked at a sine wave, you can see that the curve slopes a lot more right near where it crosses zero than it does near the peaks. This means that, at the low end of the power curve (near full off), a small adjustment makes a big jump in speed. Also, no matter what speed is selected for a PWM-controlled motor, the motor always sees pulses of full voltage, which means that the motor always gets a full-strength "kick" to get it started moving as soon as the pulse hits it. An SCR-based controller is sending rounded pulses (shaped like part of a sine wave). At low speeds, the motor never sees full peak voltage, which makes it more likely that the motor will stall at low speeds. So, to summarize: PWM with MOSFETs gives smoother control over the whole range of speeds, while SCR-based controllers are more robust, and give their best control at mid-to-high-range speeds.
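The "slopes a lot more near the zero crossing" point can be checked directly. In this sketch (the 0.2 ms shift and 170 V peak are my assumed numbers), the same small timing shift changes the instantaneous mains voltage far more near the zero crossing than near the peak:

```python
import math

v_pk = 170.0        # approximate peak of 120 V RMS mains (assumption)
f = 60.0            # mains frequency, Hz
dt = 0.2e-3         # a small 0.2 ms timing shift (assumption)
w = 2 * math.pi * f

def v(t):
    """Instantaneous mains voltage at time t seconds."""
    return v_pk * math.sin(w * t)

near_zero = abs(v(dt) - v(0.0))            # shift starting at a zero crossing
t_peak = 1.0 / (4 * f)                     # quarter period = sine peak
near_peak = abs(v(t_peak + dt) - v(t_peak))

print(f"shift near zero crossing: {near_zero:.1f} V")
print(f"same shift near peak:     {near_peak:.1f} V")
```

The same 0.2 ms shift moves the voltage by roughly 13 V near the crossing but by only about half a volt near the peak, which is why an SCR control's low-speed end (firing late, near a zero crossing) is so touchy.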
Posted by Dave Martindale
> IR - Nothing ever explains what "IR compensation" is. I'm assuming
> it's IR as in Ohm's law (E=IR) and it's a kinda open-loop way of
> detecting motor speed by measuring the voltage drop across it and
> compensating for loading by increasing supply voltage (current,
> whatever :-) in the hope of speeding the motor back up.
> Anyway, my lathe had always been pretty easy to stall by hand at its
> lowest speed so I turned this up a bit. I think I may have turned it
> up a little too much, though, because now when doing high-load stuff
> (like parting) the spindle ends up speeding up - I guess I've got "IR
> over-compensation."

You're basically right. What the controller does is measure the current through the motor (via a plug-in resistor that's matched to the motor size) and then boost the voltage an amount that's proportional to the current. If it's set just right, the additional voltage exactly compensates for the voltage drop in the motor armature, brushes, and external wiring, and the motor speed remains almost constant as the load increases. The reason for the adjusting pot is that the controller has no way of measuring the actual internal resistance of the motor; it can only measure current, so the pot matches the control to the motor by adjusting the ratio of voltage boost to armature current. If the speed increases with increasing load, you're over-compensating. It should stay the same, or maybe drop just a bit.
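The under/exact/over-compensation behavior can be illustrated with an idealized back-EMF model (all resistances, voltages, and currents here are assumed illustrative values, not measurements of the real controller): the supply voltage splits into the armature IR drop and the back EMF, and the back EMF tracks speed.

```python
# Idealized DC motor model: V_out = E_back + I * R_motor, so the
# back EMF (which tracks speed) is whatever voltage is left after
# the armature/brush IR drop.  The controller boosts its output by
# I * R_comp, the pot setting.
def back_emf(v_base, i_load, r_motor, r_comp):
    v_out = v_base + i_load * r_comp   # IR-compensated controller output
    return v_out - i_load * r_motor    # back EMF ~ motor speed

R_MOTOR = 1.5  # armature + brush resistance, ohms (assumption)

for r_comp in (0.0, 1.5, 3.0):         # under-, exact, and over-compensation
    e_light = back_emf(90.0, 1.0, R_MOTOR, r_comp)   # light load, 1 A
    e_heavy = back_emf(90.0, 4.0, R_MOTOR, r_comp)   # heavy load, 4 A
    trend = ("drops" if e_heavy < e_light
             else "rises" if e_heavy > e_light else "stays constant")
    print(f"R_comp = {r_comp} ohms: speed {trend} under load")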
Posted by Dewayne Hafenstein, Tue Jun 18, 2002, #55716
The mini-lathe controller, and I assume the mini-mill controller too, use a pulse width modulation approach with feedback from the motor to regulate the speed and current limit (mill only) to protect the motor. This is accomplished by sensing the current waveform across a sense resistor in series with the motor. When the pulse is active from the PWM, the voltage developed across the resistor determines the motor current. When the pulse is off, the voltage represents the flyback effect of the motor's magnetic field collapsing, and is proportional to the speed of the motor. This signal is sampled and used to provide a feedback correction to the PWM gating so that the pulse width can be lengthened or shortened as appropriate to keep the speed constant.
Now, it is my guess (guess mind you) that the "Offset voltage" adjustment is used to balance the feedback of the motor speed sense, the "output voltage" adjustment is to null the speed control, and the cut-off voltage adjustment is probably for current limiting. But, I have not yet completed reverse engineering the circuit and these assumptions should be suspect until proven accurate.
Regarding the use of other types of speed controls:
Posted by Geir Soland (msg #65309)
I think this router speed control unit is built around a zero-crossing semiconductor device, which means that it will only control energy/speed on AC circuits/motors, and only on universal motors like AC drills, die grinders, routers and so on (noisy, high-speed motors). These speed controls use the principle of regulating the amount of supplied energy/power to control speed. This is done by using triacs/thyristors to turn off power during part of each 50/60 Hz cycle (a kind of AC PWM, pulse width modulation). These units will not function with induction motors, whose speed is determined by the supply frequency and the number of poles. The only ways to control speed on these motors are either to change the frequency with a VFD or to alter the number of poles. There are a couple of ways to control speed electronically on DC motors, and the main principles are based either on varying the amount of DC power supplied over time, as in PWM (Pulse Width Modulation), or on some kind of variable resistor (voltage-drop control) in series with the motor.
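The triac phase-angle scheme described above can also be sketched as a power fraction. This minimal model (my formulation, assuming an ideal resistive load) integrates sin² over the conducting part of each half cycle to get the fraction of full power delivered:

```python
import math

def triac_power_fraction(firing_angle_deg):
    """Fraction of full sine-wave power delivered to a resistive load
    when the triac fires at angle alpha in each half cycle.
    Integrating sin^2 from alpha to pi and normalizing gives:
        P/P_full = ((pi - alpha) + sin(2*alpha)/2) / pi
    """
    a = math.radians(firing_angle_deg)
    return ((math.pi - a) + math.sin(2 * a) / 2.0) / math.pi

for deg in (0, 90, 135):
    print(f"fire at {deg:3d} deg -> {100 * triac_power_fraction(deg):5.1f}% power")
```

Firing at the zero crossing delivers full power, firing at 90 degrees delivers exactly half, and late firing delivers only a small fraction, which is the "regulating the amount of supplied energy" idea in the post.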