
Standard Probe Deviation


---

I have a probe qualification program set up to qualify all my styli and report the Standard Probe Deviation for each. To report the Standard Probe Deviation, I use the following formula in a Result Element characteristic: "getProbe("X-","Star Probe").stdProbeDev", with the upper limit set to .0004". Over the years I haven't had any issues keeping our styli within this specification, until last week.
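For reference, the Result Element is set up roughly like this (just a sketch of my own characteristic):

// formula field of the Result Element characteristic
getProbe("X-","Star Probe").stdProbeDev
// upper limit entered for the characteristic: .0004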

After getting our new Contura installed, I assembled all the stylus systems with all new components and loaded them in the MSR. I ran the Probe Qualification program, and two styli reported a deviation of .0012", way out of spec. I swapped out the styli with different new ones and got the same result. I pulled an identical stylus assembly from our older machine and ran it through the program on our new machine, and it reported .0002", within spec. I then ran the suspect stylus through the Probe Qualification on our older machine, and it reported exactly the same as it did on the new machine, .0012". So it must be the styli, right?

With the Zeiss tech on site we did a little digging, and he seemed to think that my Probe Qualification program was reporting the Standard Probe Deviation in microns instead of inches, even though the "units" of the program are set to inches and the global machine settings are set to inches. We looked at the sigma values of the styli in question and those looked fine. So now I'm very confused: if in fact I have been reporting microns the whole time, do I need to change my upper limit to reflect the .0004" we are shooting for? Should I be reporting the sigma value instead of the standard probe deviation? Did I get a batch of bad styli?
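For what it's worth, here is the rough arithmetic if the value really is metric (treating the stored number as millimeters, which is my assumption, not something I've confirmed):

// if .stdProbeDev is stored in mm, the .0012 reading is 1.2 µm,
// which converted to inches is well inside our .0004" limit
0.0012 / 25.4    // ≈ .000047 in
// while a limit of .0004 read as mm would be only about 16 millionths of an inch
0.0004 / 25.4    // ≈ .0000157 in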

I personally think my program is fine, as I have never had issues before, and that I should return the styli I ordered. Any thoughts?

Unit conversion aside, if the probe checks the same on both machines, you know the program you were running on the old machine is the same, and the old probes still check good on both CMMs, it has to be the stylus.

Look at your default printout and see what it says. I believe it will report in mm only on the default printout if you're using the probing system qualification found under Resources/Utilities, and if you're getting anything different there on the default printout, again, it has to be the stylus.
If you convert 0.0004 inch, it comes out to about 0.010 mm.
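If the characteristic is pulling the metric value, one option would be to leave the formula alone and enter the limit in mm instead (a sketch, reusing the formula from the original post):

// keep the metric value and convert the tolerance instead
getProbe("X-","Star Probe").stdProbeDev
// upper limit for the characteristic: 0.0004 in * 25.4 = 0.01016 mm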
Who are you ordering the styli from?

I suspect the tech is correct, that .getProbe() is grabbing the stored metric values.

But that has nothing to do with the fact that your values are much worse than what you're used to seeing. I would consider 1.2 micron to be way too high for a new stylus, but then my experience is with Micuras, not Conturas.

I would turn my focus to the adapter plates. You haven't ruled out that they are defective, or that there is some metallic debris stuck to them.

PCM results are not converted automatically..

// this returns a metric result even in an inch program..
getProbe("2","015").stdProbeDev

// this returns the standard deviation in inches
getProbe("2","015").stdProbeDev/25.4

That said.. if you were actually showing all your probes within .0004 mm, then I am shocked you were getting results that good all along, since that would be .0000157 in inches, and I frequently get brand new probes with a standard deviation higher than that.

I do agree with Aaron though.. 1.2 microns seems excessive for a new probe.

3 weeks later...
Okay, sorry it has been so long since I posted a response; I have been very busy getting our new lab and new Contura up to speed and running smoothly.

In response to Owen: we always order our styli and components directly from Zeiss. I know some of the other vendors sell "seconds", and we steer clear of those.

My solution ultimately came from changing the probing dynamics on the styli in question to 20%, which then yielded an acceptable qualification result (stdProbeDev) of .0004". That would suggest it is in fact an issue of manufacturing variation in the styli themselves (bending parameters). But still in question: (1) if that is reporting in millimeters, then how in the heck am I meeting that? (2) If it's confirmed that it's reporting in metric, I need to apply /25.4 to the formula to convert to inches. And (3) what is the difference, if any, between a stylus' sigma value and its standard deviation, and IF there is a difference, which one is more beneficial to track?
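If (2) is confirmed, I'm assuming the Result Element formula would simply become something like this (same probe and stylus names as before, per the earlier reply):

// convert the stored metric value to inches before reporting
getProbe("X-","Star Probe").stdProbeDev / 25.4
// upper limit stays at .0004 (inches)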

