model file size vs. methodology

From: mjs (MSHIDELER)
3957.1
Michael G (or any other kernel experts, voice up):

How much impact does the method used to build a model have on the model's file size?

A simple example might be a cube with a center through hole, with radii to break all the edges.

One approach would be to extrude a solid cube, add the through hole, and finally break all the sharp edges with a nice finger-safe fillet.

The second method might be to extrude six surfaces to represent a cube, cut a hole in two opposite faces, and extrude a surface between the circular edges of the two holes. Then (OK, this is a pain in the a$$ process, but it is for example only) I could use sweep features to create the fillets, and finally stitch all the surfaces together to make the final geometry.
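For concreteness, here is roughly what the first (solid-first) approach looks like scripted; I'm assuming CadQuery on top of the OpenCascade kernel purely because it's easy to script, and the dimensions and file name are made up:

    # Method 1: solid-first. Extrude a cube, cut the through hole,
    # then break every remaining sharp edge with a fillet.
    # (CadQuery over OpenCascade; all dimensions are illustrative.)
    import os
    import cadquery as cq

    solid = (
        cq.Workplane("XY")
        .box(20, 20, 20)              # the base cube
        .faces(">Z").workplane()
        .hole(6)                      # center through hole
        .edges().fillet(1.0)          # finger-safe rounds on all edges
    )

    cq.exporters.export(solid, "cube_solid_first.step")
    print(os.path.getsize("cube_solid_first.step"), "bytes")

Scripting the second, surface-by-surface method is exactly the pain described above, which is part of the point.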

Now, I would assume the second method creates the larger file.

However, is there some sort of command or tool that can be run to minimize model database size while still maintaining actual model topology?

The reason I ask is that on a large model (and we see lots of great-looking, potentially large models in the postings here... love it) it is hard to maintain a "best practice" method of model creation. On a simple model, the point at which you add a fillet is not really important. However, on a complex model there are times when you may want to add your fillets and radii at the sketch level rather than after the fact.
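To make that sketch-level versus feature-level distinction concrete, here is a rough sketch of both styles (again CadQuery, again my choice purely for illustration, with invented dimensions):

    # Sketch-level fillets: round the 2D profile's corners first,
    # so the rounds ride along with the extrusion.
    import cadquery as cq

    profile = cq.Sketch().rect(20, 20).vertices().fillet(2.0)
    sketch_level = cq.Workplane("XY").placeSketch(profile).extrude(10)

    # Feature-level fillets: extrude the sharp prism first, then
    # apply a 3D fillet feature to the vertical edges afterward.
    feature_level = (
        cq.Workplane("XY")
        .box(20, 20, 10)
        .edges("|Z")
        .fillet(2.0)
    )

Both end up with the same shape and topology; whether the resulting database (and file) differs is exactly what I'm asking about.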

I thought about testing this in various modelers using the two methods I threw out there, but not understanding how the modeling kernels work underneath, I didn't want to make a bad assumption based on testing only a couple of models.
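If anyone does run the experiment, the measuring side at least is easy to script. This assumes the two builds have already been saved out to neutral files (the file names below are placeholders):

    # Compare on-disk size and confirm the topology matches between
    # two builds of the same part. File names are placeholders.
    import os
    import cadquery as cq

    for path in ("cube_solid_first.step", "cube_surface_stitch.step"):
        shape = cq.importers.importStep(path)
        print(
            path,
            os.path.getsize(path), "bytes,",
            shape.faces().size(), "faces,",
            shape.edges().size(), "edges",
        )

Matching face and edge counts with different byte counts would at least show the methodology, not the topology, is what drives the size.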

Coming from a history-based modeling background, I have seen what appear to be simple models end up with huge files due to poor practice. But my experience could also be based on assumptions that are not applicable or are too general.

Or, is there such a thing as a general best practice?

mshideler