On 6/13/2013 11:02 AM, [email protected] wrote:
> Boeing and Lockheed only buy off on the IGES files; protocol demands they
> be checked against the IGES file, period. Not a "good master" - the reason
> being that a good master is simply a part that is "in tolerance", and
> stackup can burn you.
Understandable, as would any customer that knowledgeable - perhaps it's 
semantics, or the evolution of the discussion, but I'd certainly concede 
that the "interim masters" need not be only visually scanned off of the 
part in the vise (although that was my original statement for 
simplicity's sake). To be honest, the "interim master" could (should?) 
be entirely virtual, developed from the final part model. For the sake 
of discussion, I'm looking at what I consider a computer's inability to 
infer that a counterbore in the final part isn't a fault when the 
machine simply hasn't cut it yet, or that an IGES without modeled 
threads isn't a fault when the real part has them. I'm limiting the 
dataset to a purely visual comparison, where the computer only has 
surface boundaries and, in its simplest form, distills EVERY surface 
boundary down to a single "confidence" value.
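
Just to make that concrete, a toy version of the "distill to one number" 
idea might look like the snippet below (plain Python; the tolerance band 
and the outlier penalty are placeholders I made up for illustration, not 
anything from a real inspection spec):

    # Collapse per-point deviations (scanned point vs. reference surface)
    # into a single 0..1 "confidence" value.  Tolerance and weighting are
    # illustrative placeholders only.
    def confidence(deviations_mm, tol_mm=0.1):
        if not deviations_mm:
            return 0.0
        in_band = sum(1 for d in deviations_mm if abs(d) <= tol_mm)
        frac_in_band = in_band / len(deviations_mm)
        worst = max(abs(d) for d in deviations_mm)
        # Penalize the worst outlier so one gross miss (say, a counterbore
        # the model has but the part doesn't) drags the score down even if
        # most points agree.
        penalty = min(1.0, worst / (5 * tol_mm))
        return frac_in_band * (1.0 - penalty)

    print(confidence([0.02, -0.05, 0.08, 0.6]))   # one gross miss -> near zero

Of course, that snippet is exactly the problem I'm describing: it can't 
tell a "real" miss from a feature the machine simply hasn't cut yet.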

In a perfect graphically-oriented app, perhaps the process would be to 
load the finished IGES and click on the inner wall of the first-op hole 
- and in doing so set the logic that this diameter is the reference for 
the visual interim inspection of the first op, not what one sees on the 
face of the finished-part IGES (counterbore and chamfer). While this 
concept is nothing new to, say, Brown & Sharpe, I'm unsure we would be 
successful at integrating it into LCNC anytime soon.
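
In code terms, that "click the inner wall" step might reduce to something 
like the sketch below (plain Python, every name here is hypothetical): 
the picked surface supplies an axis and a nominal radius, and the 
first-op check then scores probed points against that cylinder rather 
than against the counterbore/chamfer on the finished-part IGES:

    import math

    # Hypothetical: the picked inner wall gives an axis (point + unit
    # direction) and a nominal radius; probed points are judged against
    # that cylinder only.
    def radial_deviation(point, axis_point, axis_dir, nominal_radius):
        v = [p - a for p, a in zip(point, axis_point)]
        along = sum(vi * di for vi, di in zip(v, axis_dir))     # along the axis
        perp = math.sqrt(max(0.0, sum(vi * vi for vi in v) - along * along))
        return perp - nominal_radius                            # + = oversize

    # A point probed 5.02 mm off the axis of a nominal 5.00 mm radius hole:
    print(radial_deviation((5.02, 0.0, 3.0), (0, 0, 0), (0, 0, 1), 5.0))  # ~0.02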

> Now to the really important thing: any documentation on your Faro arm build?
> I would be highly interested in seeing links to this, and is it capable of
> comparing to an IGES file, or just a good master?
Ahh, I've been called! And I've also been lax about getting the blogs 
updated (the Tsugami is at casafrog.com/cfblog, also in need of updating 
and round tuits) - so I suppose I do need to put up stuff on the arm. 
I've no problem making it open source, anyway - I don't have any logos 
on the models, and the Frankenstein arm as assembled isn't one that a 
particular company in Lake Mary, FL ever actually made, nor does it use 
any of their electronics, firmware or software.... I guess I was just 
getting past the whole infringement thing.

It certainly isn't a 0.001 mm tolerance device - there are micrometers 
and indicators for that - but I consider it quite adequate for what I 
use it for, and for its cost. I've been able to measure stackups of my 
Asian grade-B gauge blocks (also what was used to calibrate the arm, so 
I suppose at some point I need to reference against another gauge) with 
repeatability to 0.08 mm, and volumetric precision of about +/-0.1 mm (I 
think a good day would be +/-0.08 mm). Neither of those numbers is 
amazing compared to a real Faro, which is an order of magnitude better, 
but my application doesn't warrant NIST traceability, either. Since the 
structure and encoders are not likely to be the difference when 
comparing the two, it comes down mostly to what I did with it - ignored 
thermal growth, maybe missed the occasional encoder count, maybe didn't 
get a dimension or a section of the arm modeled correctly (these were 
measured parts, not taken from engineering drawings) - and yes, the 
parts (so I'm told) came from a dumpster and were in the back of some 
guy's pickup. None of which helps maintain the precision the arm is 
capable of.
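
To put rough numbers on those contributors - and every figure in the 
little sketch below is an assumption for illustration, not something 
measured on my arm - one dropped encoder count or one mis-measured link 
length already lands in the same ballpark as the numbers above:

    import math

    # Back-of-envelope error budget.  All values are assumed/illustrative.
    counts_per_rev = 65536      # assumed 16-bit encoder
    reach_mm = 600.0            # assumed joint-to-probe-tip distance
    link_error_mm = 0.1         # assumed error in one measured link length

    # One missed count on a base joint, felt out at the probe tip:
    tip_error_mm = (2 * math.pi / counts_per_rev) * reach_mm     # ~0.06 mm

    # Root-sum-square of the two (assumed independent) contributors:
    rss_mm = math.sqrt(tip_error_mm ** 2 + link_error_mm ** 2)   # ~0.12 mm
    print(round(tip_error_mm, 3), round(rss_mm, 3))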

I fully concur that when you stack up some of those values you can end 
up with a part that isn't precise - but then again, if we've 
mis-calculated the pitch on a leadscrew, you can end up with an 
imprecise part too! At the end of the day, I'd trade the precision of a 
job for the fun of the hobby every time. At least until the wife and the 
bank account decide otherwise....

One of the biggest challenges I've had with the arm is mounting - I've 
sat in on demos (since Faro is about an hour north of me), and I've seen 
them pull out a tripod and hit dead-zero bullseyes every time. I would 
have figured a tripod would be the least accurate mount of anything, and 
yet..... Often I'm clamping the base of my arm to a workbench. And it 
flexes. And I'm out 2 mm at 1 m distance. Then I cry.

Although far from a releasable package, my process is my little Unity 
game, either on a Win32 laptop or an Android device, that gets both a 
continuous and a "time-locked" series of encoder angles from my PICs in 
the arm (via serial). The continuous stream is nice for driving the 
visual rendering of the arm on the screen, while the "time-locked" 
series is the important set - basically six encoder angles that are 
grabbed and stored when I push a button on the arm itself, so there's 
little chance of losing sync during the streaming process.
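
The host side of that "time-locked" grab is conceptually no more than 
the sketch below - written here in Python with pyserial as a stand-in 
for the Unity-side serial code, and with an assumed packet format (one 
ASCII line of six comma-separated angles), which is not necessarily what 
my PICs actually send:

    import serial  # pyserial, standing in for the Unity serial handling

    # Assumed packet: one ASCII line per button press, six encoder angles
    # in degrees, comma separated.  The real wire format isn't shown here.
    def read_locked_sample(port="/dev/ttyUSB0", baud=115200):
        with serial.Serial(port, baud, timeout=5) as s:
            line = s.readline().decode("ascii").strip()
            angles = [float(f) for f in line.split(",")]
            if len(angles) != 6:
                raise ValueError("expected 6 angles, got %d" % len(angles))
            return angles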

The app also does the heavy lifting of the forward kinematics - figuring 
out where the touch probe is in 3D space given the lengths/offsets of 
the arm components and the encoder angles - then determines an xyz 
coordinate of the probe tip relative to a defined home position, and 
appends that xyz to a text file which I import into Rhino as a point 
cloud.
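
That forward-kins step is just a chain of rotations and translations; a 
stripped-down sketch looks roughly like the following (plain Python - 
the link lengths, offsets and axis twists below are placeholders, not my 
arm's actual geometry):

    import math

    def rot_z(t):
        c, s = math.cos(t), math.sin(t)
        return [[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]

    def rot_x(a):
        c, s = math.cos(a), math.sin(a)
        return [[1, 0, 0, 0], [0, c, -s, 0], [0, s, c, 0], [0, 0, 0, 1]]

    def trans(x, y, z):
        return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
                for i in range(4)]

    # Placeholder link table: (length along the joint axis, twist to the
    # next axis).  A real model carries the measured lengths/offsets.
    LINKS = [(0.0, math.pi / 2), (300.0, 0.0), (0.0, math.pi / 2),
             (280.0, math.pi / 2), (0.0, -math.pi / 2), (90.0, 0.0)]

    def probe_tip(angles_deg):
        """Six encoder angles -> probe tip xyz (mm) relative to home."""
        t = trans(0, 0, 0)                              # start at home
        for (length, twist), angle in zip(LINKS, angles_deg):
            t = matmul(t, rot_z(math.radians(angle)))   # joint rotation
            t = matmul(t, trans(0, 0, length))          # out along the link
            t = matmul(t, rot_x(twist))                 # re-orient next axis
        return t[0][3], t[1][3], t[2][3]                # translation = tip

    print(probe_tip([10, 45, -30, 5, 60, 0]))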

Although I can just do generic point clouds that way, I've been 
separating the triggers into segments so that, through some RhinoScript, 
I can join the points together with lines - either open curves, or 
polylines based upon the points (like the circle example). I then use 
that info, much like a SolidWorks sketch, to make solids.
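
For the curious, the join-the-dots step looks roughly like this in 
Rhino's Python flavour (rhinoscriptsyntax) rather than classic 
RhinoScript; the file layout - one "x,y,z" per line with a blank line 
between trigger segments - is an assumed layout for the sketch, not 
necessarily what my exporter writes:

    import rhinoscriptsyntax as rs

    # Assumed layout: one "x,y,z" per line, blank line between segments.
    def import_segments(path):
        segments, current = [], []
        with open(path) as f:
            for line in f:
                line = line.strip()
                if not line:
                    if current:
                        segments.append(current)
                        current = []
                    continue
                current.append(tuple(float(v) for v in line.split(",")))
        if current:
            segments.append(current)
        return segments

    for seg in import_segments("probe_points.txt"):
        rs.AddPoints(seg)              # raw point cloud for reference
        if len(seg) > 1:
            rs.AddPolyline(seg)        # straight segments between triggers
            # rs.AddInterpCurve(seg)   # or an open curve through the points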

From there, I can import a companion IGES and, through absolutely no 
automatic work whatsoever, rotate/move the parts for comparison. It is a 
grueling, manual process between pressing the button and a finished 
comparison - certainly an area worthy of more development. Additionally, 
I don't yet have automatic tip offset, so I'm careful about how I probe 
(0.5 mm ruby ball, typically) - again, something that will see more 
help.....eventually!
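
For completeness, the missing tip compensation is conceptually just a 
shift of each recorded ball-centre point along the surface (or approach) 
normal by the ball radius - the hard part, which I don't do yet, is 
knowing that normal for each touch. A minimal sketch, assuming the 
normal is supplied somehow:

    import math

    BALL_RADIUS_MM = 0.25   # 0.5 mm ruby ball

    def compensate(center_xyz, normal_xyz, radius=BALL_RADIUS_MM):
        """Shift the ball centre along the outward normal onto the surface."""
        n = math.sqrt(sum(c * c for c in normal_xyz))
        return tuple(c - radius * d / n for c, d in zip(center_xyz, normal_xyz))

    # Ball centre 0.25 mm off a face whose outward normal is +Z:
    print(compensate((10.0, 5.0, 3.25), (0, 0, 1)))   # -> (10.0, 5.0, 3.0)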

The framework is generic enough that it could be adapted to "any" 
digitizing arm of similar style - and if Win32, Mac, or Linux is your 
acquisition platform of choice (as opposed to Android), then it can all 
be compiled with the indie version of Unity. I'm using PICs as my 
encoder counters, on a little I2C net to a master that sends the packets 
of angles out to the Unity app via RS232 or BT. But it's not the only 
way.

This wouldn't really fit as the "poor man's dig arm", but it does fit as 
"this poor man's dig arm"!

Ted.
