
Checking calipers and micrometers



We are trying to go through our measuring equipment and make sure everyone's tools are accurate, but I don't know how to set this up or which settings are best. I can make the results say whatever I want, but I have no idea which approach is the most accurate (which is what I am going for). I do have a fairly basic understanding of Calypso and of metrology in general (compared to most of the people on these forums). I would really appreciate some help here.
Thanks

I really don't understand the question, but if you are looking to verify calipers and mics, buy some lab-grade gage blocks. I have two sets that I use just for that purpose.

I have never used a CMM to validate calipers or mics.

Thank you guys for the input, I really appreciate it. But I am curious as to why you guys don't think the CMM can do it. I'm a newer CMM programmer and I don't fully understand why it can't or shouldn't be done.
Thanks,
Hayden

I've been told by quality people much smarter than myself that CMMs are not good at qualifying calipers and mics because the measuring force that is normally present while some operator has an ape fist on the handle is not present during qualification, so it's a matter of similarity of measurement conditions.
Mics have ratcheting thimbles because force matters.
I know from my brief stints in gage cal (I always find a way to get myself out of gage cal) that calipers often require adjustments to the rear tension screws, and that can cause errors of .002 inch or more when not adjusted properly.

It would also be slower, having to run calipers at .050, .500, 1.000, 2.000, 3.000, 4.000, 5.000, and 6.000.
Ain't nobody got time for that!

I'm not 100% sure, but calibrating gages or calipers with a CMM would NOT go over very well with an ISO auditor. I've been a CMM Programmer for almost 15 years and no company I have ever worked for would even attempt calibrating ANYTHING with a CMM in regard to gaging or inspection equipment. Auditors will be looking for Certifications as well.

However, I do sometimes check a ring gage with a disk probe just to see whether it's close to its marked size.

I've been in this field for 30 years and have never seen a calibration laboratory use a CMM to calibrate hand gauges; it's just not feasible.
Aside from all the other comments (especially regarding measuring force and all the different stages), the accuracy required is NOT within the general accuracy of a CMM.

Take a micrometer with a stated accuracy of 0.0001 inch [0.0025 mm] as an example:
You'd need a reference standard with at least a 4 to 1 accuracy ratio, traceable to NIST, to calibrate that micrometer, meaning the standard used to calibrate the micrometer would have to have a guaranteed accuracy of 0.000025 inch [0.0006 mm], and very few if any CMMs have that kind of accuracy. If you use the 10 to 1 accuracy ratio that a lot of certified calibration laboratories use, no CMM made is going to be accurate enough.
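If it helps to see the arithmetic behind those ratios, here is a minimal sketch (plain Python, since the thread has no code of its own; the function and variable names are mine, and the 0.0001-inch figure is just the example from this post):

```python
# Reference-standard accuracy needed for a given test accuracy ratio (TAR).
# The 0.0001 in micrometer accuracy is the example figure from the post above;
# function and variable names are my own.

IN_TO_UM = 25.4 * 1000  # microns per inch

def required_reference_accuracy(gage_accuracy_in, ratio):
    """Accuracy (inches) the reference standard must hold for a given TAR."""
    return gage_accuracy_in / ratio

mic_accuracy_in = 0.0001  # stated micrometer accuracy, inches

for ratio in (4, 10):
    ref_in = required_reference_accuracy(mic_accuracy_in, ratio)
    print(f"{ratio}:1 ratio -> reference must hold {ref_in:.6f} in "
          f"(about {ref_in * IN_TO_UM:.2f} um)")
# 4:1 requires roughly 0.000025 in (~0.6 um); 10:1 roughly 0.000010 in (~0.25 um)
```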

If you or your company are under any kind of quality system standard, ISO 9001 or IATF 16949, etc., then to avoid the "fox watching the hen house" most require that gauge calibration be done by an outside source, and a lot of them require that outside calibration company to be ISO 17025 accredited. In short, don't take this the wrong way (we all learn as we go), but if you're audited against one of these quality standards, the auditors are not going to be happy if you're calibrating your hand gauges with a CMM.

As Owen already stated, it's all about the accuracy of the check.

Calibration Gage Blocks (AA) have a tolerance of +0.10 μm to −0.05 μm.

Most CMMs have a linear spec something like "1.4 + L/333", where 1.4 is the allowable error in microns at 0 mm of measuring length and L is the measuring length in millimetres. Run the calculation at, say, 666 mm and you get 2 more microns, which means at 666 mm you would have an allowable error of 3.4 μm (1.4 μm + 2.0 μm).

The key here is the leading number, the 1.4, which is in microns. So at zero millimetres it is acceptable per the machine spec to be off by 1.4 μm, and that is 14 times the maximum error allowed on a calibration gage block.

The spec for a Zeiss Xenos, which is one of the most accurate CMMs made, is 0.3 + L/1,000, which means even the Xenos allows 3 times the error of the gage block.

So, assuming you did the check with gage blocks, the measurement uncertainty contributed by the reference would be 0.15 μm; if you performed the same check with the CMM, the uncertainty contributed by the machine would be 2.8 μm (assuming a spec of 1.4 + L/333).
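To put those comparisons in one place, here is a minimal sketch (plain Python; function and variable names are mine) that evaluates the length-dependent spec quoted above and compares it against the +0.10 μm upper tolerance of a calibration-grade block, which is where the "14 times" and "3 times" figures come from:

```python
# Evaluate a CMM length-error spec of the form MPE_E = A + L/K (result in
# microns, L in millimetres) and compare it against the upper tolerance of a
# calibration-grade (AA) gage block. The specs and tolerance are the figures
# quoted in the posts above; the function and variable names are my own.

def mpe_microns(a_um, k, length_mm):
    """Allowable CMM error at a given measuring length, in microns."""
    return a_um + length_mm / k

BLOCK_UPPER_TOL_UM = 0.10  # +0.10 um upper tolerance quoted for AA blocks

cmm_specs = {
    "typical CMM (1.4 + L/333)":  (1.4, 333),
    "Zeiss Xenos (0.3 + L/1000)": (0.3, 1000),
}

for name, (a_um, k) in cmm_specs.items():
    for length in (0, 666):
        err = mpe_microns(a_um, k, length)
        print(f"{name} at {length} mm: MPE = {err:.2f} um, "
              f"about {err / BLOCK_UPPER_TOL_UM:.0f}x the block tolerance")
```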

  • 2 months later...
Do you guys check your gage blocks on a CMM? Is there an accurate way of doing this, or is this also not recommended?
(If there is a good way, how?) If not, how do you suggest I document the accuracy of my gage blocks to show my micrometers are accurate?

Send your gage blocks out to be calibrated by a laboratory; you will receive a certificate with traceability to national standards (NIST).
Then, when you calibrate your calipers and mics, you just note in your gage cal program that you used gage block set 1 (GB1) to do the calibration, and all your auditors will be very happy.
Using a variable gage to calibrate another variable gage is just too much variation.

Micrometers and calipers are force-dependent tools: you put pressure on the part when measuring. Checking them with a CMM measures them in a free state. Theoretically your micrometers have backlash, so even though you set them at 1.0000 they may actually be coming out at 1.0002.
I've tried this.
