
Actual size for a 0.2 mm ruby probe



Depends on the grade of the ball. Most stylus suppliers use Grade 5 rubies, which have a diameter tolerance of ±1 micron from nominal and a maximum form error of 0.125 microns.

The diameter you see after qualifying is NOT the actual diameter of the ball; it's a calculated correction value and depends largely on the bending parameters of the stylus.
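To make that distinction concrete, here is a minimal sketch of how a qualification routine could arrive at such a correction value: probe a reference sphere of known size, fit a sphere to the recorded probe-center points, and take the effective tip diameter as twice the difference between the fitted radius and the reference radius. All numbers (a 25 mm reference sphere, a 0.2 mm nominal tip) are illustrative assumptions, and the synthetic points stand in for real qualification data.

```python
import numpy as np

def fit_sphere(points):
    # Linear least-squares sphere fit: |p - c|^2 = r^2 expands to
    # 2 p.c + (r^2 - |c|^2) = |p|^2, which is linear in c and (r^2 - |c|^2).
    A = np.hstack([2 * points, np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = x[:3]
    radius = np.sqrt(x[3] + center @ center)
    return center, radius

REF_RADIUS = 12.5  # mm, assumed 25 mm reference sphere

# Synthetic "qualification" data: the probe-center points lie on a sphere
# whose radius is the reference radius plus the effective tip radius
# (here 0.0995 mm, slightly under the 0.1 mm nominal).
rng = np.random.default_rng(0)
dirs = rng.normal(size=(50, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
pts = (REF_RADIUS + 0.0995) * dirs

_, fit_radius = fit_sphere(pts)
effective_diameter = 2 * (fit_radius - REF_RADIUS)
print(f"effective tip diameter: {effective_diameter:.4f} mm")  # 0.1990 mm
```

On a real machine the fitted radius also absorbs pre-travel and stylus bending, which is why the qualified value comes out under (or, for scanning probes, over) the nominal ball diameter.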


Which is why this value should be smaller than the actual diameter. The stylus bends in opposite directions at opposite sides of the sphere, resulting in a slightly smaller effective diameter.
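A back-of-the-envelope sketch of that effect, with an assumed pre-travel value purely for illustration: the ball over-travels before the trigger fires, in opposite directions on opposite sides of the sphere, so the qualified diameter shrinks by twice the pre-travel.

```python
NOMINAL_D = 0.200   # mm, nominal ball diameter (from the thread title)
PRETRAVEL = 0.0005  # mm, assumed stylus bending before the trigger fires

# Opposite sides of the sphere bend in opposite directions, so the
# pre-travel is lost once per contact direction, i.e. twice in total.
effective_d = NOMINAL_D - 2 * PRETRAVEL
print(f"{effective_d:.4f} mm")  # 0.1990 mm
```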

It's quite different for touch-trigger and scanning probes. With touch-trigger probes the effective diameter is mostly smaller than the nominal (although there are exceptions depending on the module force), while for scanning probes the effective diameter is usually larger (exceptions are possible there too, and I can imagine it varies from system to system, e.g. Renishaw vs. Zeiss VAST vs. Zeiss XXT). I can't fully explain it, but with scanning systems the bending parameters are determined independently of the actual calibration, and the calibration results are then offset with those bending parameters.

And then, like I said, it varies with the ruby ball grade. Are you always sure what grade the balls of your styli are? A ball with a form error close to, or even larger than, the measuring uncertainty of your CMM completely ruins the stated uncertainty of your machine.
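To put rough numbers on that claim (all values are assumed for illustration, and the root-sum-square combination is a crude model rather than a full uncertainty budget):

```python
import math

U_MACHINE = 0.3       # µm, assumed CMM measuring uncertainty
FORM_GRADE5 = 0.125   # µm, max form error of a Grade 5 ruby ball
FORM_GRADE25 = 0.625  # µm, assumed form error of a lower-grade (Grade 25) ball

def combined(u_machine, form_error):
    # Crude root-sum-square combination, treating the ball's form error
    # as one more uncorrected error source on top of the machine's.
    return math.sqrt(u_machine ** 2 + form_error ** 2)

print(f"Grade 5:  {combined(U_MACHINE, FORM_GRADE5):.3f} um")   # 0.325 um
print(f"Grade 25: {combined(U_MACHINE, FORM_GRADE25):.3f} um")  # 0.693 um
```

With the Grade 5 ball the form error barely moves the total; with the lower-grade ball it dominates, more than doubling the machine's nominal uncertainty.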

Good point, Daniel. I guess I should probably avoid using one that looks like this:


[image attachment]
