Position Result Changes



I was hoping someone could help me understand something. I have a cylinder whose position I'm checking back to two datums: the primary is a plane and the secondary is a cylinder. The datums are set by filling in the datum boxes, not by using a secondary alignment.

If I set the tolerance to zero, the result changes when I switch the feature between MMC and RFS. By result I mean the "Actual" value from the characteristic window. If I give it a "big" tolerance of, say, .1mm, the result stays the same when I switch between RFS and MMC. If I give it a "small" tolerance of, say, .005mm, the result changes between RFS and MMC, and if I keep making the tolerance smaller the result keeps changing until it equals the result a zero tolerance yields. If I instead use a secondary alignment built from the same datums, the result never changes no matter what the tolerance and material condition are set to.
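For context, here's a rough sketch of how I'd expect the math to work: the "Actual" depends only on the deviation from nominal, while the stated tolerance and any MMC bonus only decide pass/fail. The numbers are made up and this is not Calypso's code, just the textbook model:

```python
import math

def position_actual(dx, dy):
    # Diametral position error: twice the radial deviation from nominal.
    return 2 * math.hypot(dx, dy)

def in_tolerance(actual, stated_tol, bonus=0.0):
    # MMC adds a bonus equal to the feature's departure from MMC size;
    # RFS means bonus = 0. Either way the "Actual" itself is untouched.
    return actual <= stated_tol + bonus

act = position_actual(0.004, 0.003)          # 0.010 mm on made-up deviations
print(in_tolerance(act, 0.1))                # "big" tolerance: passes
print(in_tolerance(act, 0.005))              # "small" tolerance, RFS: fails
print(in_tolerance(act, 0.005, bonus=0.02))  # "small" tolerance + MMC bonus: passes
```

Under that model the Actual should never move when I change the tolerance or the material condition, which is exactly why this has me confused.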

Does anyone know what's happening here?

1 month later...
So I've done some messing around and this is what I've figured out. If any of this is wrong hopefully someone will chime in and correct me.

The MMC/RFS setting or the tolerance was not really the issue. The issue is that the characteristic failed while the DRF was not fully constrained. It looks like when you fill in the datum boxes for a position characteristic and the DRF isn't fully constrained, Calypso uses the "Tolerance-2D-Best-Fit" to find the result. I can't find anything on how that fit works except for a PDF from R.L. Guimont Co. Inc. that says its "goal is to iterate all elements into tolerance, if possible."

My theory on how it tries to do that: it starts by looking at the data, guesses which fit (Gauss, Minimum Zone, or L1) would have the least error, and uses that. If the characteristic fails, it goes through them all until it finds the one with the least error and reports that. At first I thought it just used Gauss first and switched if that failed, but I was testing with real data at the time and Gauss happened to be the fit it tried first. When I switched to simulated data (dispersion .015mm - .25mm) it started with a Minimum Zone fit. I'm going to take a wild guess and say it looks at the feature's form result under each of the three fits and uses that to pick which one to try first. But that's probably absolutely wrong.
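To make that theory concrete, here's a crude, translation-only sketch of what "iterate all elements into tolerance" might look like: try a Gauss (L2) shift, an L1 shift, and a minimum-zone (Chebyshev) shift of the deviations, then report whichever one leaves the smallest worst-case position error. The deviations are made up, the per-axis midrange is only an approximation of a true minimum-zone fit, and the real fit presumably also rotates and respects the constrained degrees of freedom, so treat this as a guess at the behavior, not Calypso's actual algorithm:

```python
import statistics

# Made-up deviations of measured feature centers from nominal, in mm.
dev = [(0.012, -0.003), (0.009, 0.004), (0.015, 0.001), (0.011, -0.006)]

def worst_position(dev, sx, sy):
    # Worst diametral position error after shifting the pattern by (sx, sy).
    return max(2 * ((dx - sx) ** 2 + (dy - sy) ** 2) ** 0.5 for dx, dy in dev)

xs, ys = zip(*dev)
candidates = {
    "Gauss (L2)":   (statistics.mean(xs), statistics.mean(ys)),      # least squares
    "L1":           (statistics.median(xs), statistics.median(ys)),  # least absolute
    "Minimum zone": ((min(xs) + max(xs)) / 2, (min(ys) + max(ys)) / 2),  # midrange
}
for name, (sx, sy) in candidates.items():
    print(f"{name}: worst position = {worst_position(dev, sx, sy):.4f} mm")

best = min(candidates, key=lambda n: worst_position(dev, *candidates[n]))
print("fit that would get reported:", best)
```

If something like this is happening, the Actual you see depends on which criterion wins for your data, which would explain why it moves around as the tolerance shrinks and the characteristic starts failing.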

Anyways, that's how you can get the position result to change just by changing the amount of tolerance.
