Picked up a nice desktop CNC mill with ball screws and servos that looks like it could be commercial rather than a conversion, probably 1980s vintage. The ball screws are 5 TPI as close as I can measure and the encoders are 100 line. Given 4x decoding of the encoders, that gives me 0.0005" resolution. I have another small ballscrew mill of similar vintage (but not similar construction) that has significant belt reductions, giving something like 40,000 steps per inch.
Just wondering what resolution most conversions have. Without microstepping, it would seem most would be close to, or slightly finer than, the 0.0005" of the first machine. Also, what is typical/realistic for accuracy? Clearly the second machine has much more resolution than accuracy. I am thinking that 0.0005", while not tool-and-die-shop resolution, should be more than adequate for most things.
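The resolutions quoted above follow from simple arithmetic; here is a quick sketch (my own, not from any particular controller) that checks both machines:

```python
# Linear resolution of a ballscrew axis = screw travel per rev / counts per rev.

def resolution_inches(pitch_in, counts_per_rev):
    """Return axis resolution in inches for a given screw pitch and count rate."""
    return pitch_in / counts_per_rev

# Machine 1: 5 TPI screw (0.2" per rev), 100-line encoder with 4x decoding.
machine1 = resolution_inches(0.2, 100 * 4)
print(f"Machine 1: {machine1:.4f} in")    # 0.0005 in

# Machine 2: roughly 40,000 steps per inch from the belt reduction.
machine2 = 1 / 40_000
print(f"Machine 2: {machine2:.6f} in")    # 0.000025 in
```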
Mostly the gear reductions are done for torque multiplication rather than resolution.
My mill runs direct drive steppers to 5mm pitch ballscrews, with 10x microstepping. That gives me 2.5µm (1 ten-thousandth) resolution.
LongRat
www.fulloption.co.uk
Hi,
my mini-mill, which I made seven-plus years ago, had 5mm pitch C5 ground screws direct driven by size 23 5-phase steppers (500 steps/rev) through integral low-lash 10:1 planetary
gearboxes. The lash of the gearboxes was quoted as <2 arc min; it was so small I never succeeded in measuring it.
The supposed linear backlash from 2 arc min of rotational lash:
(2/60)° / 360° × 5mm = 0.00046mm, or 0.46µm.
I decided therefore that there was no point in trying to get a resolution better than that, in fact I decided that 1um was more than enough.
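That arc-minute-to-linear conversion is just the fraction of a revolution times the screw pitch; a one-liner makes it easy to redo for other gearboxes (my sketch, using the numbers quoted above):

```python
# Convert rotational lash (in arc minutes) to linear backlash at the screw.

def lash_mm(arc_min, pitch_mm):
    """Fraction of a revolution (arc_min / 60 / 360) times screw pitch."""
    return (arc_min / 60) / 360 * pitch_mm

# 2 arc min of gearbox lash on a 5 mm pitch screw:
print(f"{lash_mm(2, 5) * 1000:.2f} um")    # 0.46 um
```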
The natural resolution of 5-phase steppers is 500 pulses/rev, but with the 10:1 reduction that means 5000 pulses per screw rev. At 5mm pitch, one pulse = 0.001mm
or 1µm. Very convenient and easy to set up...and that is what I did. My mini-mill had a resolution of 1µm, but for various reasons I considered its repeatable accuracy to be 4.7µm.
I used it that way for seven years......great.
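The mini-mill's step size falls straight out of the drivetrain numbers Craig gives; a minimal check:

```python
# Step size of the mini-mill drivetrain described above: a 500 step/rev
# 5-phase stepper through a 10:1 planetary onto a 5 mm pitch ballscrew.

steps_per_rev = 500    # 5-phase stepper, full steps
reduction = 10         # planetary gearbox ratio
pitch_mm = 5.0         # ballscrew pitch

step_mm = pitch_mm / (steps_per_rev * reduction)
print(f"{step_mm * 1000:.0f} um per step")    # 1 um
```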
I have recently (six months ago) built a new mill, much bigger and more powerful. It has 750W Delta B2 servos (160,000 count per rev encoders) on each axis, direct coupled to 32mm BNFN double-nut
THK C5 ground ballscrews. I decided that I would go for the same resolution as I had used on my mini-mill with such success. Thus I set the electronic gearing (within the servo drive
programming) to enact a 5000 pulse per rev regime. That results in a 1µm step per pulse. I pushed my servos out to 5000 rpm, so I required a pulse rate of:
max pulse rate = 5000/60 × 5000
= 416.66kHz
which is comfortably inside the 500kHz max differential signaling speed of the drives. That gives rapids of 25m/min while still keeping 1µm resolution.
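The pulse-rate budget above can be sketched in a few lines (my arithmetic, using the figures Craig quotes for his drives):

```python
# Servo pulse-rate check: 5000 pulses/rev electronic gearing at 5000 rpm,
# against the drive's quoted 500 kHz differential input limit.

rpm = 5000
pulses_per_rev = 5000    # set via electronic gearing in the drive
pitch_mm = 5.0           # ballscrew pitch

pulse_rate_hz = rpm / 60 * pulses_per_rev
rapid_m_min = rpm * pitch_mm / 1000

print(f"{pulse_rate_hz / 1000:.1f} kHz")    # 416.7 kHz, inside the 500 kHz limit
print(f"{rapid_m_min:.0f} m/min rapids")    # 25 m/min
```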
Servos eat steppers for breakfast!!!!...and overload.....have I told you about overload? Servos are great.
The bottom line is that you should try to get a resolution somewhat better than your accuracy but there's no point in overdoing it. My new mill has a resolution of 1um but I regard
it as a 10µm (0.01mm) accuracy machine. It's nice, for instance, to cut a hole using circular interpolation and be able to use go/no-go gauges to detect a programmed diameter
change of 0.01mm. The machine is better than my skill.......
Craig