Model size and regen

Discussion in 'Pro/Engineer & Creo Elements/Pro' started by kenny, Jan 18, 2007.

  1. kenny

    kenny Guest

    Whilst doing some tests to understand the difference between relative
    and absolute accuracies I came upon this weirdness.

    Set up a directory with three files
    A - typical motorbike panel model size 1200mm
    B - empty file
    C - deliberately large block size 100 000mm

    Each set starts with Erase Not Displayed.

    Bring in A suppress to #1, resume all, OK

    Bring in B do not touch, bring in A suppress to #1, resume all, OK

    Bring in C do not touch, bring in A suppress to #1, resume all, NO REGEN

    Bring in C suppress to #1, bring in A suppress to #1, resume all OK

    This was repeated for a number of runs. It doesn't get me any closer to
    what I was researching, but what is the meaning of this? Our templates
    are set to rel acc 0.0012 and not touched in this session. Do the parts
    in session have some cumulative effect?

    thanks
     
    kenny, Jan 18, 2007
    #1
  2. Jeff Howard

    Jeff Howard Guest

    I'm not sure what I'm reading.
    "Bring in" = Open or Place in assy or ...?
    "resume all, NO REGEN" = the feature(s) were not Resumed?

    Blundering on like I understand what's happening:
    I don't believe that what's in session should affect part
    accuracy or, for that matter, that anything external to the part
    will affect its accuracy (a possible exception might be
    where external geometry is referenced or copied?).

    What, specifically, is it you're trying to figure out?
    You might take a look at a discussion titled
    "relative accuracy and datum features"
    from late October and see if there's anything there that
    sheds any light. In a nutshell relative accuracy calculates
    an effective accuracy (essentially a variable absolute accuracy)
    based on model size and feature characteristics. It's (I think)
    calc'd and adjusted for each feature in the model.
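    As a rough illustration of that idea (my reading of it, not PTC's documented internals), effective accuracy can be sketched as relative accuracy times overall model size; the function and numbers below are illustrative only:

```python
# Hedged sketch: treat relative accuracy as a variable absolute accuracy,
# i.e. effective accuracy ~= relative accuracy * overall model size.
# Pro/E's real per-feature adjustment is more involved; this is the idea only.

def effective_accuracy(rel_acc: float, model_size: float) -> float:
    """Approximate smallest geometry resolvable at a given relative accuracy."""
    return rel_acc * model_size

# kenny's 1200 mm panel at the default 0.0012: roughly 1.44 mm
print(effective_accuracy(0.0012, 1200.0))
# the deliberately large 100 000 mm block: roughly 120 mm
print(effective_accuracy(0.0012, 100_000.0))
```

    On that reading, the same 0.0012 setting resolves very different feature sizes on the 1200 mm panel and on the 100 000 mm block.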

    Jeff Howard, Jan 18, 2007
    #2
  3. kenny

    kenny Guest

    As I said, this came about as I was researching something else; that can
    wait. The point was that in test 3 below the part failed just
    because another part was in session. There were no assemblies and no
    references between the parts, only those 3 unrelated parts in the working
    directory. So the question is: why does a part fail to regen just
    because another part is in session? This was repeated a number of times
    and on two machines. We have had spurious failures before and didn't
    come to any conclusions. I guess that I was 'lucky' to find such a
    short example of the problem. But why is this happening?


    Set up a directory with three files
    A - typical motorbike panel model size 1200mm
    B - empty file
    C - deliberately large block size 100 000mm

    Each set starts with Erase Not Displayed.

    1) Open file A suppress to #1, resume all, resumes OK. Erase session.

    2) Open file B do not touch, open file A suppress to #1, resume all,
    resumes OK. Erase session.

    3) Open file C do not touch, open file A suppress to #1, resume all,
    FAILS at #2. Erase session.

    4) Open file C suppress to #1, open file A suppress to #1, resume all,
    OK.
     
    kenny, Jan 19, 2007
    #3
  4. Jeff Howard

    Jeff Howard Guest

    Ok, I see.

    PTC might pay a bounty on that critter if they can reproduce it. `;^)

    Was the panel part chosen at random or does it have a "history"?
    Can you duplicate with any of your other, similar, parts?

    If you step thru the features one at a time instead of Resume
    All or Cancel Insert Mode will the feats regen? I'm not sure
    what that'll tell you but I've seen that work more than a few
    times (while we're on the subject of weirdness).

    Have you tried on a different build?
    What version and build are you using?

    Can you share the model or create something similar that
    duplicates the behavior that you can share? I'm curious. Would
    like to see it, the feat that fails, etc. I tried to duplicate
    with a couple of parts (aerodynamic fairing pieces, .001" abs
    acc) and couldn't. Then managed to find a rel acc they'd rebuild
    to with just a couple of geom checks and still couldn't duplicate.
    I can read thru WF2.

     
    Jeff Howard, Jan 19, 2007
    #4
  5. kenny

    kenny Guest

    It was chosen at random. Feature #5 is an EXTERNAL COPY GEOM but set to
    independent; this is typical of the stuff on this project. That's the
    point it's failing on, with the usual 'failed regen, feature aborted,
    geometry was overlapping'.
    Not tried a different build yet.

    Pity, but I can't share it; it's a current project.

    I've just done one more thing

    5) Open file C, open file A suppress to #1, close file C, DO NOT erase
    session, now A resumes OK!!

    So now it's something with having the other window open.


    If I have time I'll reduce this to an example I can share. Is your
    address as shown here real?
     
    kenny, Jan 19, 2007
    #5
  6. Jeff Howard

    Jeff Howard Guest

    "So now it's something with having the other window open."

    The plot thickens. `;^)
    The address is real but I can't read WF3. If you can get something
    worked up you might post to mcadcentral where people using WF3,
    possibly more recent builds (?), can get their hands on it and try.

    I'd also try contacting a distributor with it even if you aren't
    on maint. I've done so and they've either sent back a note saying
    they'd fwd to PTC or told me it couldn't be reproduced in newer builds.
    Might be worth a try.
     
    Jeff Howard, Jan 19, 2007
    #6
  7. John Wade

    John Wade Guest

    There are surfacing bugs up 'til at least cut M150 (control points
    don't work properly in the boundary blend tool), but this one looks
    completely off the wall - there's no link between the models, yet one
    influences the other's regenerability.
    Since Pro must use a finite model space, it's feasible that
    ridiculously huge models force lower accuracy levels on other
    models, which seems to be what you're describing, but I won't pretend
    to have the faintest idea what I'm talking about...
    Why are you trying to do vehicle panel work with relative accuracy? Or
    indeed, anything? Apart from a minor performance hit with large models,
    and the fact that poorly constructed models may stop regenerating,
    there's almost no downside to absolute accuracy, and considerable upside.
     
    John Wade, Jan 22, 2007
    #7
  8. kenny

    kenny Guest

    The model is only ridiculously huge in model space; the file is 110Kb
    compared to the failing model's 330Kb.
    We cannot agree among ourselves what the pros and cons are and just left
    it at the default relative setting. What is the consensus among users on
    this? I know the subject has been tackled a number of times; has anyone
    got a definitive summary on the matter?
     
    kenny, Jan 22, 2007
    #8
  9. John Wade

    John Wade Guest

    The consensus seems to be 0.01 for metric models, which is used, I
    think, by Cat, Cummins, & a few other big users.
     
    John Wade, Jan 22, 2007
    #9
  10. David Janes

    David Janes Guest

    It'd be nice if you could precis the sides/arguments so we'd know what the controversy boiled down to. Maybe what's been discussed in this group, over the years, is pertinent. Googling this group for "accuracy" brings up a number of discussions, the longer (number of posts) the better, though none has been, nor strived to be, definitive. Here's what I came up with:
    http://groups.google.com/group/comp.cad.pro-engineer/search?q=accuracy&start=0&hl=en&

    Generally, though, they've pointed out the following:
    * If you're in contract manufacture and just make parts, relative accuracy may be just fine, though you'll never get away with this gross setting (.0012) for any large or relatively complex sheet metal product. I've proved the vulnerability of sheet metal parts to relative accuracy by doing the following: made a part 6" x 10" x .03" with a .032" hole in it, then steadily doubled the part size (the hole eventually fails) or steadily decreased the hole/thickness values (something eventually fails). This is because the smallest feature size calculable is determined as the accuracy value times the largest feature/overall model size. So, when the sheet metal part gets too thin to be calculated (given the current accuracy value times overall size), the feature fails. And simply moving the accuracy value's decimal one place left often solves this kind of relative accuracy problem.

    * If you're in some other line of work, one that involves multiple models, assemblies, top down modeling, merge features or merge cutouts (as in mold work), or anything from the Advanced modeling menu, you'll probably have to get acquainted with the "Absolute Accuracy" menu. Getting all those pesky relative accuracy values to sync with each other requires a master accuracy, a value that encompasses all of them ~ an accuracy of accuracies, a ratio of ratios, on an "absolute scale". Maybe it would be simpler to say: on the larger scale, how do you translate the relative accuracies of the microbe to those of the solar system? Or something bigger (the galaxy) or something smaller (a mouse).

    * Lastly, with similar relative accuracy values but vastly differently scaled parts, how do you derive a compatible absolute accuracy value? Well, start by imagining a universe of parts, none of which ever violates a relative accuracy of .0012. So the smallest feature of the microbe is never smaller than .0012 of the overall size of the microbe, and the smallest feature of the human is never smaller than .0012 of the overall human size. (Naturally, as we know, all these larger features encompass all the measurable smaller features, until gaseous galaxies are composed of molecules, ions and subatomic particles, in the real world.) Only from a modeling standpoint, a computer representational standpoint, does this business of "accuracy" count. (PTC and others started from the necessity that they didn't want to expend the same computational resources to evaluate/represent features that could be calculated with integer values as those that represented some necessarily more precise scale of real numbers [scale to be determined].) In effect, all you are looking for is a different, higher/lower accuracy (scaling) value, anything that is more accurate than your default .0012 value. So, if 90% of your accuracy values are .0012, but you have one that is .00012, you'll set your absolute accuracy (ratio of ratios) to .1, which will allow the translation of the more accurate into the less accurate realm.

    Absolutely agreed! In many cases, absolutely essential: small features on large sheet metal parts, geometry copied from smaller parts into larger ones, merge cutouts of complex parts from larger, simpler featured blocks. Much top-down modeling work where more complex features are copied/referenced into models of a grosser scope of accuracy. All of these are suited to the application of absolute accuracy. And, these days, with computational power as it is, not much is lost by applying "too high" a value. So, if you calculate the scale of accuracy values at .01 or .001, not much is lost by setting the value at .0001. If it is more than needed, Pro/E compensates. It seems not to mind too high a value and only crashes on too low a one.
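    David's sheet metal doubling experiment above can be mimicked with a toy check. The pass/fail rule here (a feature survives only if it's at least relative accuracy times overall size) is a simplification of his description; the 6" x 10" x .03 part and .032" hole are his numbers:

```python
# Toy version of the sheet metal experiment: assume a feature regenerates
# only if it is at least rel_acc * overall model size. This rule is a
# simplification; real Pro/E behavior is more subtle.

def feature_ok(feature_size: float, model_size: float, rel_acc: float = 0.0012) -> bool:
    return feature_size >= rel_acc * model_size

hole = 0.032   # the .032" hole
size = 10.0    # largest dimension of the 6" x 10" part
while feature_ok(hole, size):
    size *= 2.0   # steadily double the part, as David did
print(f"hole fails once the part reaches {size:g} in")

# Moving the accuracy decimal one place left rescues it:
print(feature_ok(hole, size, rel_acc=0.00012))
```

    Under this simplified rule the hole gives up once the part has doubled past 20", and dropping the relative accuracy to .00012 brings it back, matching the decimal-shift fix described above.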

    David Janes
     
    David Janes, Jan 23, 2007
    #10
  11. John Wade

    John Wade Guest

    Hi Dave,
    It could also be worth mentioning that relative accuracy determines the
    smallest edge size that will be calculated when the feature that
    creates it is made. So if you have a model which changes size as you
    progress through its creation (like, er, most of them), then you may
    well see features which you can create at one point in your modelling
    process, but not at another, when the size of the model, and thus its
    accuracy, has changed.

    A few years ago, I looked at accuracy from the point of view that
    objects about 5 inches across always seem to work okay with default
    0.0012 relative accuracy. My interpretation of this (from the
    description of relative / absolute accuracy in the V18 manual) was that
    this was roughly equivalent to an absolute accuracy of 0.02 for parts
    that size. I used that value for a while until the rest of the world
    settled on 0.01 (at least in my industry - and there are exceptions)
    when, not being one to try and push water up hill, I changed as well.
    It works fine for gearboxes and crankshafts and that sort of stuff.

    To finally get to my point, before deciding what value of absolute
    accuracy suits you, consider the smallest feature you're likely to make
    (so, if you work in a chip fab plant, and take my broad recommendation
    that 0.01 is probably going to be okay, then I'm afraid you're going to
    end up thinking I'm a bit of a moron, and not without some cause) and
    divide it by ten. That really should be the maximum value you use. Then
    talk with the people you collaborate with, and make sure that value
    will suit them, and only then make the switch.
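    That sizing rule is easy to write down; the divide-by-ten and the 0.01 metric default are John's recommendations above, and everything else (units, the example feature sizes) is illustrative:

```python
# John's rule of thumb: absolute accuracy should be at most a tenth of the
# smallest feature you expect to model; 0.01 is the common metric default.
# Units follow your part units (mm assumed here).

def pick_absolute_accuracy(smallest_feature: float, default: float = 0.01) -> float:
    """Return the common default, tightened if the smallest feature demands it."""
    return min(default, smallest_feature / 10.0)

# A 0.5 mm edge break is comfortably covered by the 0.01 default:
print(pick_absolute_accuracy(0.5))
# A 0.05 mm detail forces a tighter value:
print(pick_absolute_accuracy(0.05))
```

    As John says, treat the result as a maximum and confirm it with the people you exchange models with before switching.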

    Be warned: last time I spoke to PTC, they were still recommending that
    users stick to relative accuracy. If you work for a dinosaur company
    where IT think anyone dumb enough to work there is too stupid to listen
    to, your local IT 'guru' may ring PTC and take the 'recommendation' of
    some chimp in a call centre over your considered professional advice.
    Far better for you & the other users to 'just do it'.
     
    John Wade, Jan 23, 2007
    #11