[Ow...] Posted July 23, 2018

We have a Duramax HTG (High Temperature Gradient) CMM, and the department that owns it is under the impression that everything is good from 59°F to 104°F (15°C to 40°C). However, the CMM manual says the max temperature change is 3 K (kelvin) per hour. 3 kelvin is about 5.4°F (if I'm converting it right), which suggests that if the temperature changes more than about 5°F, you should re-qualify the CMM to maintain accuracy, but nowhere does the manual actually say this. For clarity, can anybody verify or confirm this should be done? Our own trial-and-error testing has shown a significant change after 10°F, and even using the temperature compensation wires does not have as much error-correction effect as simply re-qualifying the probes at the higher temperature.

[Attached image: Duramax HTG.jpg]
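For anyone double-checking that conversion: a temperature *difference* in kelvin converts to Fahrenheit with only the 9/5 scale factor, since the +32 offset applies to absolute temperatures, not to deltas. A minimal sketch:

```python
# Convert a temperature *difference* (not an absolute temperature)
# from kelvin to degrees Fahrenheit. Deltas use only the 9/5 factor.
def delta_k_to_f(delta_k: float) -> float:
    return delta_k * 9.0 / 5.0

print(delta_k_to_f(3.0))  # 5.4 -> the manual's 3 K/hour limit is 5.4 degF/hour
```

So the manual's 3 K/hour gradient limit works out to 5.4°F per hour, consistent with the figure above.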
[Ow...] Posted July 24, 2018 (Author)

I guess I should have specified that the only deviation we are seeing with temperature change is diameter size. The CMM axes are controlled/compensated internally, but the actual probe and qualification sphere sizes are not. Ruby has a thermal expansion coefficient of about 5.8 µm/(m·K), so with 30+ degrees of temperature change it's going to grow a little, right? We're only seeing about 2 to 3 microns of change on a 16mm bore after re-calibrating the probes once the temperature has changed in the afternoon. A small amount of deviation, I know, and it's close to within the CMM's accuracy limits, but sometimes it can make the difference between acceptance and rejection. My thoughts are that if we see an improvement after calibrating when the temperature changes, we should just make them re-calibrate after a significant temperature change and go on with it. But time is money, and I was looking for some documentation that would validate/approve the added time. Any input on the subject is appreciated.
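As a sanity check on the magnitudes, the linear expansion formula ΔL = α·L·ΔT puts those numbers in range. A rough sketch, where the 5 mm stylus ball size and the steel part material are assumptions for illustration, not stated in the post:

```python
# Linear thermal expansion: delta_L = alpha * L * delta_T
# alpha in 1/K, length in mm, delta_T in K; result returned in microns.
def expansion_um(alpha_per_k: float, length_mm: float, delta_t_k: float) -> float:
    return alpha_per_k * length_mm * delta_t_k * 1000.0  # mm -> um

DT = 30.0 * 5.0 / 9.0  # a 30 degF afternoon shift is about 16.7 K

# Assumed 5 mm ruby stylus ball (alpha ~= 5.8e-6 per K):
print(expansion_um(5.8e-6, 5.0, DT))    # ~0.48 um of ball growth

# 16 mm bore, assuming a steel part (alpha ~= 11.5e-6 per K):
print(expansion_um(11.5e-6, 16.0, DT))  # ~3.1 um -- in line with the 2-3 um observed
```

If the part really is steel, uncompensated part expansion alone is on the order of the deviation described, while the stylus ball itself contributes only a fraction of a micron.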
[Ri...] Posted July 24, 2018

A 2 to 3 micron change can mean the difference between acceptance and rejection. When you're fighting microns, everything matters, and so everything must be controlled. If this were me and my quality system, parts with tolerances this tight would only be measured in a room that was temperature and humidity controlled, and only after the part has had time to soak in that room. If that's not possible, then I would segregate any rejected product to be measured in a temperature-controlled room. Temperature compensation is a compromise. It's not a magical software fix for physical reality. It works well when applied properly.
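For anyone unfamiliar with what the compensation is actually doing: a typical linear correction scales the measured size back to the 20°C reference temperature (ISO 1) using the part's expansion coefficient. A minimal sketch of that idea, not Zeiss's actual algorithm; the coefficient and temperature reading here are illustrative:

```python
# Linear temperature compensation: scale a measured size back to the
# 20 degC reference temperature (ISO 1) using the part's CTE.
# L20 = L_measured / (1 + alpha * (T_part - 20))
def compensate_to_20c(measured_mm: float, alpha_per_k: float, part_temp_c: float) -> float:
    return measured_mm / (1.0 + alpha_per_k * (part_temp_c - 20.0))

# Example: a 16.003 mm reading on a steel bore measured at 36 degC
print(compensate_to_20c(16.003, 11.5e-6, 36.0))  # ~16.000 mm at reference temp
```

The catch, as Richard says, is that the correction is only as good as the assumed coefficient and the sensor's reading of the actual part temperature, which is why soaking the part in a controlled room beats compensating after the fact.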
[Ow...] Posted July 24, 2018 (Author)

Thanks Richard for your feedback. Yes, you nailed it exactly: these parts *should be* measured in a controlled environment, and yes, I've re-checked parts rejected on the floor in a controlled room after they have acclimated to temperature. There's talk about adding tolerance to the upper end, which would eliminate most rejects; all kinds of compromises to get out of spending the big money to air-condition the department.
[Ri...] Posted July 24, 2018

Increasing tolerance is one way of making more "good" parts.