Hiller Associates posted the following article at ENGINEERING.COM yesterday. You can read it there at this link, or just keep reading below!
Another solid piton in the cliff of making product cost mainstream in CAD / PLM Products?
CATIA users can now get a faster and more effective way to design and source composites products profitably by bringing cost estimation closer to the designer’s native environment. Galorath Incorporated debuted its newest integration of SEER® for Manufacturing with the Dassault Systèmes 3DEXPERIENCE® Platform in CATIA at the JEC Composites conference in Paris. The new product is called the SEER Ply Cost Estimator.
Who is involved?
Galorath Incorporated grew out of a consulting practice focused on product cost estimation, control, and reduction that began in the late 1970s. Over the last 30 years, Galorath has built its costing knowledge into the SEER suite of software products. Galorath is one of the independent product cost estimating software companies.
Hiller Associates spoke with Galorath CEO Dan Galorath, Vice President of Sales & Marketing Brian Glauser, and SEER for Manufacturing product lead Dr. Chris Rush and got a full product demo.
What does this integration do?
The integration allows users of CATIA to use SEER’s costing software for composite materials within the CATIA environment. In CATIA, the engineer designs a lay-up for a composite part, generating a Ply Table (a critical design artifact for composite parts that specifies material, geometry, and some manufacturing info). That activates the integrated SEER Ply Cost Estimator so that the designer (or the cost engineer or purchasing person assisting them) can set up more part-specific costing choices and preferences within the CATIA environment.
When ready, the user pushes the cost analysis button. The information is processed by SEER Ply Cost Estimator which sends the ply table data and other information to the interfacing SEER-MFG software to compute cost. The cost data is returned and presented to the user, once again within a native CATIA screen.
How broad is the capability?
Currently, the integration of SEER is applicable for parts made of composite materials. It’s a strong starting point for the integration partnership because SEER has a long experience in the field of costing composites, working with companies in the defense and aerospace verticals. Composites are also becoming more mainstream in other industries, such as automotive and consumer products. Galorath has been a major player in the US Government’s Composites Affordability Initiative (CAI), a 50/50 government/industry funded consortium including Boeing, Lockheed and Northrop Grumman that was formed to drive down the costs of composites. Galorath has also worked with Airbus in the area of composites parts costing.
Galorath’s Brian Glauser says that the SEER Ply Cost Estimator has hundreds of man-years invested, much from previous work with CAI and with aerospace companies that resulted in several of the modules already in the SEER-MFG standalone product.
The first version of the SEER Ply Cost Estimator handles many composites manufacturing processes, materials, concepts of complexity, and both variable and tooling costs. However, it does not yet directly cost the assembly of one part to another.
The initial integration is with CATIA V5, but Galorath and Dassault have signed a V6 agreement as well. That integration will follow later.
Galorath (and likely Dassault) are hoping that the SEER Ply Cost Estimator will be well received by customers and help drive many product cost benefits. If this happens, there may be demand from Dassault’s end customers not only to improve the SEER Ply Cost Estimator, but to expand the SEER/CATIA integration to other manufacturing processes covered in SEER’s stand-alone software products, such as machining, plastics, sheet metal, and assembly processes.
What does it mean for Functional Level Groups?
Philippe Laufer, the CEO of CATIA, was quoted as saying:
“Using Galorath’s SEER for Manufacturing in CATIA…will help companies perform early trade-off analysis on the use of various materials and composites processes before manufacturing even takes place.”
Well, that has been one of the goals in Product Cost Management for a long time. If your company uses composites, the tool has the following possibilities:
- Engineering – identify which design choices are driving cost and by how much
- Purchasing/Manufacturing – get an early warning of what to expect from suppliers before asking for quotes or estimates (should-cost)
- Cost Engineering – provide a focal point for the cross-functional discussion about cost, driving participation, especially from engineering
It’s important to realize that this integration will have its limitations, as with any costing product. First, the current integration applies only to composites. While expensive, composites are only one type of part on the Bill of Material (BOM). You will have to go beyond the current integration of SEER/CATIA to cost the full BOM, perhaps to SEER’s standalone costing product or to those of its competitors.
Second, remember that cost is far harder to “accurately” estimate than many physical performance characteristics. While meeting an absolute cost target is important, even more important are the following:
- Continuous Design Cost Improvement – If your company consistently designs lower cost products because you have superior cost estimation information, you WILL beat your competitors.
- Correct Cost and Margin Negotiation – If your company is better at negotiating quotes because it can give suppliers a better understanding of what it will cost them to make your part and you can negotiate a margin that is not too high, but adequate to keep your suppliers healthy, you WILL beat your competitors1.
What does it mean for the C-Level?
Philippe Laufer of CATIA also says:
“This [the SEER Composites integration] leads to finding out the most efficient way of manufacturing a product while meeting cost, performance, functionality, and appearance requirements.”
The C-suite doesn’t really care about composites or ply tables in and of themselves, but it does care about revenue and profit. Of course every well-marketed product will claim to improve these metrics. Regarding product cost, the good news is that Galorath and Dassault are aiming at a big target. Companies that use a lot of composites can have very high costs. For example, Boeing and Airbus have Cost of Goods Sold of 84.6% and 85.5% and Earnings before Tax of 7.2% and 3.6%, respectively2. Those COGS figures are big targets on top of a highly leveraged COGS/Profit ratio.
What does it mean for Product Cost Management becoming mainstream in the enterprise software stack?
We asked Dan Galorath how long it would be before Product Cost Management was as much a part of the PLM ecosystem as finite element analysis, manufacturing simulation, or environmental compliance. He replied:
“Cost estimation software will never be on every designer’s workstation, but I don’t believe that is what it means for Product Cost Management to be considered ‘mainstream.’ It’s not fully mainstream yet, but a greater need is being driven by outsourcing and the tight business environment. The procurement folks can’t only rely on internal manufacturing knowledge like they used to. They need tools like SEER to fill the gap and move the cost knowledge base forward.”
We agree with Mr. Galorath. This is another step, another piton to secure Product Cost Management onto the PLM cliff, as PCM continues its steep climb.
This is the first integration point between independent Product Cost Management software companies and the PLM/ERP world since September 2012, when Siemens PLM purchased Tsetinis Perfect Costing3. PTC has built some level of cost tracking ability into Windchill, and Solidworks (owned by Dassault) has developed the first couple versions of a costing calculation module for their product.
There is still a lot of ground to cover. There are quite a few independent product cost management software tools that have costing intellectual property that can accelerate the process, especially if the big PLM companies acquire them or partner with them. When that will happen is anybody’s guess, but for now it looks like CATIA users, at least, have a viable solution for composites costing… and maybe more in the future.
1 For more information, see the Hiller Associates Industry Week Article: Your Should-cost Number is Wrong, But It Doesn’t Matter
2 Per www.morningstar.com, trailing 12 months COGS, 3/13/2014
3 Siemens buys Perfect Costing Solutions (Tsetinis), Hiller Associates, September 2012
Numbers. They have such a comforting certainty to them, don’t they?
Words can be interpreted. But numbers, they have that beautiful mathematical ring of truth. I was thinking about this the other day, when I got a number from a friend. I was helping him review a model he had made, and I asked him what the median result from the model was. He told me 16.42%. I asked him, “Do you believe it’s 16.42%?” He responded, “Yes, 16.42%.” This was a very smart guy, with multiple advanced degrees in engineering from a great school. However, the data set from which he was calculating this percentage came from a group of people who were giving him estimates of the money they had spent on certain activities, as well as data from an accounting system. And yet, he was quite positive that the result was 16.42%. That is, he thought that the result he calculated from the inputs had enough precision to justify FOUR significant figures.
Now, I’m sure that he would have realized, if he had sat down and thought about it for a second, that expecting this kind of precision when the inputs had virtually no precision at all (at least not the precision of four significant figures) was ludicrous. However, that’s the great thing about computers, especially when using spreadsheets like Microsoft Excel. They will give you as much precision as you want. In fact, to what does Excel default? Two digits after the decimal point.
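The mismatch between input precision and reported precision is easy to demonstrate. Here is a minimal Python sketch (the 16.42% figure stands in for my friend’s model output; the helper function is purely illustrative):

```python
from math import floor, log10

def round_sig(x, sig):
    """Round x to `sig` significant figures."""
    if x == 0:
        return 0.0
    return round(x, -int(floor(log10(abs(x)))) + (sig - 1))

result = 16.4237  # a model output, quoted to four significant figures
print(round_sig(result, 4))  # what the spreadsheet happily shows: 16.42
print(round_sig(result, 2))  # what rough inputs actually justify: 16.0
```

The spreadsheet never complains; it is the user who has to remember that the second number is the honest one.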
What I find really funny about this is that most engineers have learned the hard way, over time, that there is this thing called “tolerance stack-up.” In other words, no matter what you specify on a CAD model or drawing, a machine only has so much physical capability to hold that dimension. Therefore, engineers become very proficient at specifying tolerances. In recent years, they have even become much better at understanding the stack up of these tolerances on the final dimensions of a part. In fact, there are very sophisticated software packages dedicated to helping engineers do this.
In more general usage, Monte Carlo modeling became all the rage 10 to 20 years ago. Monte Carlo was an attempt to recognize the inherent noise in numbers that we measure, and how that uncertainty affects the models that we make, especially financial models. However, the funny thing is that when it comes to calculating product costs, people ignore the precision question, and just assume they have the precision they wish they had.
Take a look at the figure below. Let’s go through a simple product costing in concept. For the part we are looking at, we first need to know the physical quantities used in making it. For example, we need to know the mass of the part, but that’s a tricky thing, because we have scrap, and varying amounts of material can be used up in certain processes. So, we might be +/- 1-3% off in our estimate of how much was used. Similarly, we need to know how much time is actually spent on each machine. However, this varies batch to batch, and measurements aren’t always so accurate. There may be many processes that make up the part, including extra inspections and re-work. Let’s say our measurement of the time it takes has an uncertainty of 5 to 15%.
Until this point in the analysis, at least we’ve been dealing with physical quantities, not financial quantities. But, if we move to financial quantities, the problem gets much worse. Even material rates are not such a certain thing. They move around over time with various surcharges for this and that from the different material providers. And, the number depends on what material is sent to what lines, etc. Labor rates and overhead rates are far more black magic. Accountants with green eye shades spend endless hours calculating these rates from monstrous ERP systems, using Byzantine Activity Based Costing allocation schemes. We hope that the allocated rates are close to the real truth on the floor, but I don’t think we can really expect them to be within +/- 10-20% of what’s really going on.
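To see how these input uncertainties stack up, here is a quick Monte Carlo sketch in Python. All the nominal values (a 2 kg part, a $10/kg material, a 12-minute cycle, a $150/hr burdened rate) are hypothetical; only the uncertainty ranges come from the discussion above:

```python
import random

def part_cost_sample():
    """Draw one plausible cost for a hypothetical part, applying the
    uncertainty ranges discussed above as uniform noise."""
    mass = 2.0 * random.uniform(0.97, 1.03)        # kg used, +/- 3% scrap noise
    mat_rate = 10.0 * random.uniform(0.95, 1.05)   # $/kg, surcharge drift
    cycle_min = 12.0 * random.uniform(0.90, 1.10)  # minutes, +/- 10% timing noise
    burden = 150.0 * random.uniform(0.85, 1.15)    # $/hr labor + overhead, +/- 15%
    return mass * mat_rate + (cycle_min / 60.0) * burden

samples = sorted(part_cost_sample() for _ in range(10_000))
lo, hi = samples[250], samples[-251]  # central ~95% of outcomes
print(f"nominal cost: $50.00; 95% of simulated costs fall in ${lo:.2f}-${hi:.2f}")
```

Even with this modest input noise, the believable answer is a band several dollars wide, not a single figure quoted to the penny.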
Never fear though! At the end of the calculation, we have calculated that this particular part cost is $93.45. Why $93.45? Well, that’s what our spreadsheet model or our product cost management software told us. And, of course, a cost NEEDS to be within 10% of what we think the real cost is.
If the product cost management user actually calculated the tolerance stack-up of the uncertainties of the inputs that went into that cost, they would probably find that the costs are more than +/- 10% from the true cost. If they seriously considered the possible precision, would they say the part cost $93.40-$93.50? I doubt it. Would they say it costs $92.00-$93.00? Nope. They probably would say that the part could cost between $88 and $97. But a range like that is not very comforting. It’s much more fun to hit that little “$” format button on Excel or cut & paste the number from the product cost management software.
It’s $93.45. That’s what it is. Because ignorance is truly bliss in the world of Product Cost Management.
We are still on our epic quest to find the DARPA study (a.k.a. the legendary seminal study reported to say that ~80% of product cost is determined in the first ~20% of the product lifecycle). However, during our search we have been aided by Steve Craven from Caterpillar. No, Steve did not find the DARPA study, but he did send us a study attempting to refute it.
Design Determines 70% of Cost? A Review of Implications for Design Evaluation. Barton, J. A., Love, D. M., Taylor, G. D., Journal of Engineering Design, March 2001, Vol. 12, Issue 1, pp. 47-58.
Here’s a summary of the paper and our comments and thoughts about this provocative article.
Where’s DARPA and Can We Prove this 70-80% number?
First, the authors question the existence of the DARPA study and note that most studies supporting DARPA’s findings reference other corporate studies that are alleged to support them. Most of these corporate studies are difficult to trace. The authors then analyze a Rolls-Royce study (Symon and Dangerfield 1980) that investigates “unnecessary costs.” In the Rolls-Royce study, Symon and Dangerfield find that the majority of unnecessary costs are induced in early design. However, Barton, Love, and Taylor make the point that unnecessary costs are NOT the same as the TOTAL cost of the part itself. That’s fair.
The authors then go into a more “common sense” line of discussion about how the costs induced at different stages of the product lifecycle are difficult to disaggregate. The difficulty occurs because design choices often depend on other upstream product cost choices and the knowledge or expectation of downstream supply chain and manufacturing constraints. This section of the paper concludes with a reference to a study by Thomas (The FASB and the Allocation Fallacy from Journal of Accountancy) which says that “allocations of this kind are incorrigible, i.e. they can neither be refuted nor verified.”
We at Hiller Associates agree with these assertions in the sense that these statements are tautologically true. Maybe someone should have given this study to Bob Kaplan of Harvard Business School before he invented Activity Based Costing in the 1980’s in collaboration with John Deere? After all, wasn’t ABC all about the allocation of costs from indirect overhead? However, Kaplan’s attempt illustrates the reality of the situation outside of academia. We in industry can’t just throw up our hands and say that it’s impossible to allocate precisely. We have to make a reasonable and relevant allocation, regardless. If it is not ‘reliable’ from a canonical accounting definition point of view, we just have to accept this.
Is DARPA Actually Backwards in Its Cost Allocations?
What if the DARPA study’s 80/20 claim is more than an allocation problem? What if DARPA is actually promoting the opposite of the truth? The authors reference a paper by Ulrich and Pearson that may reverse DARPA. Ulrich and Pearson investigated drip coffee makers and concluded that design accounted for 47% of product cost, whereas manufacturing accounted for 65% of product cost variation. They did, of course, make their own assumptions about the types of possible manufacturing environments that could have made the 18 commercially available coffee makers.
Considering the pre-Amazon.com world in 1993 when the Ulrich and Pearson study was done, it brings a smile to my face thinking of MIT engineering grad students at the local Target, Kmart, or Walmart:

CLERK: Can I help you?
GRAD STUDENTS: Uh, yeah, I hope so. We need coffee makers.
CLERK: Oh, well we have a lot of models, what is your need…
GRAD STUDENTS: Awesome, how many do you have?
CLERK: Uhh… I guess 17-18 models, maybe.
GRAD STUDENTS: Score! We need 18.
CLERK: 18 of which model?
GRAD STUDENTS: Oh, not 18 of one model. One of each of the 18 models.
CLERK: What! Huh… wha-why?
GRAD STUDENTS: We’re from MIT.
CLERK: Ooohhhhh…. right…
GRAD STUDENTS: Uhh… Say, what’s your name?
CLERK: Um… Jessica… like my name tag says. You say you go to MIT?
GRAD STUDENTS: Um, yeah, well Jessica, we’re having a party at our lab in Kendall Square this Friday. If you and your friends want to come, that would be cool. What do you say?
CLERK: Uh, yeah right… how about I just get you your “18” different coffee makers. Good luck.
… but we digress. Is product cost determined over 50% by manufacturing technique rather than design? That seems a bit fishy.
Design for Existing Environment
With the literature review out of the way, the authors get to business and propose their hypothesis:
That consideration of decisions further down the chain are beneficial can be illustrated with a new ‘design for’ technique, Design For the Existing Environment (DFEE) that aims to take into account the capacity constraints of the actual company when designing the product… This contrasts with the conventional DfX techniques that take an idealized view of the state of the target manufacturing system.
They then describe a simulation that they built, which attempts to take into account inventory, profit, cash flow, missed shipments to customers, etc. They run five scenarios through their simulation:
- A baseline with New Design 1, which lacks sufficient capacity to meet customer demand
- A New Design 2 that uses DFEE to fit the existing manufacturing environment and can meet customer demand
- Making New Design 1 by buying more capacity (capital investment)
- Pre-building New Design 1 to meet demand
- Late delivery of New Design 1
Not surprisingly, the authors show that scenario 2, using their DFEE technique, beats the other alternatives, considering all the metrics that they calculate.
Thoughts from Hiller Associates
This article is from over ten years ago, but it is thought provoking. Is 80% of the cost determined in the first 20% of design? We don’t know. We certainly believe that over 50% of the cost is determined by design. In our professional experience, a large part is controlled by design, even allowing for the relationships between design, purchasing, manufacturing, and supply chain. We’ve personally observed cases in which moving from one design to another allowed for the use of another manufacturing process that reduced total cost by 30%-70%.
Overall, the authors bring up a valid point that goes beyond the traditional ringing of the Total Cost of Ownership (TCO) bell. They present a simulation in which they claim to calculate Total Cost of Ownership in a rigorous way. The problem is that the calculation is too rigorous (it took them 4 hours per simulation). That kind of time and, moreover, the complexity underlying such a model are likely not practical for most commercial uses. A more simplified estimation of Total Cost of Ownership is more appropriate. In fact, Hiller Associates has helped our clients’ teams use flexible tools like Excel, along with a well-designed process, to estimate a Total Cost of Ownership. Is that an end point? No, but it is a beginning. Later, as a client’s culture, process, and team improve, more advanced Product Cost Management tools can be added into the mix. And we do mean TOOLS in the plural, because no one tool will solve a customer’s Product Cost Management and Total Cost of Ownership problems.
Hopefully, we will see some more academic work on the product cost problem. But, until then, we’re still searching for the original DARPA Study. Anyone know where it is?
- Design Determines 70% of Cost? A Review of Implications for Design Evaluation, Barton, J. A., Love, D. M., Taylor, G. D., Journal of Engineering Design, March 2001, Vol. 12, Issue 1, pp 47-58
- Symon, R.F. and Dangerfield, K.J., 1980, Application of design to cost in engineering and manufacturing. NATO AGARD Lecture Series No. 107, The Application of Design To Cost And Life Cycle Cost to Aircraft Engines (Saint Louis, France, 12-13 May; London, UK, 15-16 May), pp. 7.1-7.17.
- Thomas, A.L., 1975, The FASB and the Allocation Fallacy, Journal of Accountancy, 140, 65-68.
- Ulrich, K.T., and Pearson, S.A., 1993, “Does product design really determine 80% of manufacturing cost?” Working Paper WP#3601-93 (Cambridge, MA: Alfred P. Sloan School of Management, MIT).
It’s been a couple of weeks since we discussed the Voices series, so if this post is interesting to you, you may want to go back and read the first two:
- Do you hear the voices? (Voices Series, Part 1)
- The Voices of Discord in Product Cost Management (Voices Series, Part 2)
In these first two articles we introduced several of the voices that are always present in the Product Cost Management conversation, including:
- The Voice of Hopefulness – the Pollyanna voice that assumes product cost will just work itself out in the end. It is a voice of justification to ignore Product Cost Management, because the team is just too busy at XYZ point in the development process to seriously consider product cost. Hope is NOT a strategy.
- The Voice of Resignation – the nihilist voice that assumes that you have to accept high prices because the three suppliers that purchasing quoted gave you pricing far higher than what seems reasonable.
- The Voice of Bullying – the seemingly unreasonable scream of the customer telling you what your product should cost — not based on reality, but based on the customer’s own financial targets.
However, there is another voice in the conversation that can bring some reason to the cacophony. It is a voice of reason — the Voice of Should-cost.
Buck-up Cowboy. The Voice of Should-cost Can Help
Should-cost is just what it sounds like, using one or more techniques to provide an independent estimate of what the cost of a part or product “should” be. The question is, what does “should” really mean? For many, the definition depends on the type of cost being calculated, as well as personal should-cost calculation preferences. I will provide my own definition here, mostly targeted at providing a should-cost for a discretely manufactured part.
Should-Cost – The process of providing an independent estimate of cost for a part, assembly, component, etc. The should-cost is based on a specific design that is made with a specific manufacturing process at a supplier with a specific financial structure. Or, the should-cost is calculated assuming a fictitious supplier in a given region of the world that uses the best manufacturing technology and operates efficiently at maximum sustainable capacity.
I realize that this is a broad definition, but as I said, it depends what you want to estimate. For instance, do you know the supplier’s exact manufacturing routing, overhead and labor rates, machine types, etc.? In this case, do you want to estimate what it “should” cost to manufacture the part under these conditions? OR… do you want to know what the cost “should” be for a new supplier who is well-suited to manufacture your design and has a healthy but not overheated order book? Although you could make many other assumptions, the point is: KNOW YOUR ASSUMPTIONS. You will note that I said nothing about margin. Some people call this a “Should-Price,” while others call it a “Should-Cost” referring to what they will pay vs. what the part costs the supplier to make. The only difference is that you will also make an assumption for a “reasonable” margin for a Should-Price.
The important point is that the team relying on the should-cost information must define the scenario for which they want a should-cost estimate. There is nothing wrong with wanting an answer for all these scenarios. In fact, it’s preferable. Run the calculation / estimate more than once.
Should-cost Should Be a Choir, Not a Solo Act
Manufacturing cost is a very tricky thing to calculate. I often say that the true cost of the economic resources to make a part or product is a number known but to God. Put statistically, you can’t know the true mean or standard deviation of a population; you can only estimate it from the samples that you take. People take two common approaches to should-cost.
The Pop Star Solo Act
The popular solution that too many people pursue is the solution pictured at the right.
They want the easy button — the single source of truth. They want the plasticized, overproduced, solo pop star version of should-cost, i.e. the easy button tool. There’s nothing wrong with this, and there are some really good should-cost solutions available, but none of them are infallible. In addition, it is not appropriate to put the same should-cost effort into every part or assembly; one should focus where the money is. However, too many people, especially cost management experts, become sycophants of one particular tool to the exclusion of others.
Looking at the diagram to the left, you can see what the landscape looks like when you make your comparisons to one point in cost space. It is an uncertain, scary world when you only have one point of reference. In this case, all one can do is try to force a supplier to match the should-cost output of your favorite tool.
The Andrews Sisters, Competitive Trio Quoting
The other very popular approach comes from the purchasing department: three competitive quotes. If the auto-tuned single pop star should-cost is too uncertain, purchasing will listen to a trio instead. Why three quotes?
No one seems to know, but in EVERY purchasing department with which I have ever worked, three shall be the number of the quoting, and the number of the quoting shall be three. [If you are an engineer, you know my allusion. If not, watch the video to the left!] The trio of quotes in the diagram to the right do help clarify the picture a little better, but there is still too much uncertainty and what I call “commercial noise” to really believe that the quotes alone bound what the should-cost plus a reasonable margin is in reality.
An Ensemble of Should-Cost Estimates
Returning to our statistics example, one of the first things you learn in statistics is that it takes roughly 30 samples to characterize a bell curve distribution. At that point, you can start to approximate the true mean and standard deviation of the actual population. I am not saying that one needs 30 estimates of should-cost to triangulate on the true cost, but you should get as many as you can within a reasonable time frame. Have a look at the diagram at the right to see this illustrated. Instead of the single pop star approach or the Andrews Sisters trio of quotes, hopefully what you get is a well-tuned small chorus of voices who start to drown out the Voices of Resignation, Hope, and Bullying. The chorus of should-cost estimates starts to bound the “true” should-cost of the part or product and can give the team a lot more confidence.

Sometimes the team does not have time to assemble all the voices of should-cost. Not all parts or products are worth assembling the full choir. More often than not, though, the organization is either unaware of the should-cost voices at its disposal or is just too lazy to assemble them.
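As a sketch of the “chorus” idea, here is how a team might combine several independent estimates into a should-cost band (all the dollar figures below are invented for illustration):

```python
from statistics import median

# Hypothetical should-cost estimates ($) for one part, gathered from
# several independent voices: a costing tool, three supplier quotes,
# a parametric model, and an in-house expert.
estimates = {
    "costing tool":     91.10,
    "supplier quote A": 97.50,
    "supplier quote B": 94.80,
    "supplier quote C": 102.30,
    "parametric model": 89.40,
    "expert estimate":  93.00,
}

values = sorted(estimates.values())
print(f"should-cost band: ${values[0]:.2f}-${values[-1]:.2f}")
print(f"median of the chorus: ${median(values):.2f}")
```

It is the band, not any single number, that gives the team confidence when the Voices of Resignation, Hope, and Bullying start singing.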
Don’t let your organization be lazy or sloppy with respect to should-cost, and remember that the best music is made when groups of instruments and voices work together, not when one person sings in isolation.
P.S. Bonus PCM points if you can guess which a cappella group is pictured in the thumbnail to the post.
I happened to stumble upon an article on SpendMatters from a few weeks ago by Sheena Moore:
The article is about the manufacturing cost versus the price of a new bracelet at Tiffany. If you don’t know what Tiffany is, you’re probably unmarried and have not been dating. Some say you can’t put a price on love; Tiffany disagrees and will help you do it! The first great thing about the article is Sheena’s calling out of Tiffany’s deceptive marketing. Apparently, they told her the bracelet is made of a golden “metal” called “Rubedo.” No, ladies, it’s not gold; it’s something better; it’s Rubedo. (Rubedo is actually just an alloy that helps Tiffany water down the gold to make more $$$. Sheena and I had a good laugh about this on the phone.)
Sheena’s article caught my eye for two reasons. First, I’m just really cheap, and the idea of a $7,500 bracelet made of 55% Copper and 31% Gold flabbergasted me. However, more interesting than my miserly instincts was that Sheena does a nice little product cost analysis of the bracelet. In doing so, she highlights another form of fool’s gold: Material Cost Multipliers.
The Material Cost Multiplier
Material Cost Multipliers are a simple idea. They postulate that one can first calculate the cost of a product’s raw material and multiply it by a number to get the overall “Piece Part” cost. But wait, you may object: how can this be valid? Why would someone vastly oversimplify the product cost calculation like that? That’s simple: calculating actual cycle times and tooling costs for each machine needed in the product’s manufacture is HARD, and it requires a lot of manufacturing knowledge.
Material Cost Multipliers just sweep all that nastiness under the rug… or into the multiplier, in this case. They have the following assumptions:
- Assumption 1: Parts is Parts. Remember the old Wendy’s commercial making fun of the contents of Chicken McNuggets? No? Well I do, and you can too, by watching the video below.
The Material Cost Multiplier inherently assumes that all parts that you are manufacturing require the same processes and have the same complexity of design. For example, assume that our Tiffany bracelet and this Gucci Earring had the same mass:
Would you guess that both of these items take the same effort to make? If you said ‘no,’ you are right.
- Assumption 2: The Biggest Loser – The Material Cost Multiplier also assumes that the part mass is very highly correlated to the part’s processing costs. Therefore, the more mass you lose, the more your processing cost goes down in DIRECT correlation. There is no doubt that many manufacturing costs do have a correlation to the mass of the part, but many do not. For example, the polishing or burnishing of the Tiffany bracelet is much more dependent on the surface area burnished, the complexity of the surface, and the hardness (composition) of the material than the mass of the item.
The Cost of the Tiffany Bracelet
Sheena had heard from a colleague that material is typically only about 25% of the cost of an item. So, she first did a nice material cost analysis of the bracelet, putting the cost of material at $1,500. Although she does not account for scrap or loss, this is a pretty good assumption, given that this type of material can be re-melted and the manufacturing process is likely a net-shape process with virtually no material loss for this specific design. I would, however, question the assumption that:
- Material Cost = 25% * Piece Part Cost.
- Or, Material Cost * 4 = Piece Part Cost. Basically, 4 is her Material Cost Multiplier.
First of all, that seems backwards for the world of simple metal part manufacturing, which, in my experience, would be more likely to have:
- Material Cost = 75% * Piece Part Cost
- Material Cost * 1.33 = Piece Part Cost.
In fact, I think the processing costs are even lower than my general assumption. Just looking at the picture of the bracelet, my guess is that this is made by a routing such as:
Extrusion is very efficient and cheap, especially for a straight cylinder. I would shoot from the hip and say the processing is definitely under $20 (probably under $10). Let’s say we have the $1,500 raw mat’l cost + $20 processing/logistics + $100 for marketing (which might be outrageously high). That’s roughly a $1,600 Fully Burdened Cost for the high-class Wonder Woman wrist bracer (you’ll need 2 for Halloween). At a price of $7,500, just one bracelet generates about $5,900 of PROFIT (over 360% markup on cost)! I did a product cost analysis in one of the commercial Product Cost Estimation tools for a very similar-looking part to the Wonder Woman Tiffany Bracer, and I got a result of $5.25 (Extrusion = $2.20, Flaring = $0.70, Marking = $0.50, Polishing = $1.30, Packaging = $0.55). My former co-founder’s wife owns a florist and gift shop and once told me that the typical margin for jewelry is ~50% of the price, so the bracelet should be priced (at most) at about $3,200, not $7,500.
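To make the back-of-the-envelope math explicit, here is a small Python sketch of the bottoms-up roll-up, using the figures quoted above (the $20 and $100 lines are the same rough guesses, not measured data):

```python
# Bottoms-up cost roll-up for the bracelet, using the article's figures
material = 1500.00             # raw material cost; no scrap/loss assumed
routing = {                    # per-operation costs from the estimation tool
    "extrusion": 2.20,
    "flaring":   0.70,
    "marking":   0.50,
    "polishing": 1.30,
    "packaging": 0.55,
}
processing_logistics = 20.00   # conservative shoot-from-the-hip figure
marketing = 100.00             # probably outrageously high

fully_burdened = material + processing_logistics + marketing
price = 7500.00
profit = price - fully_burdened
markup_on_cost = profit / fully_burdened

print(f"tool's processing estimate: ${sum(routing.values()):.2f}")
print(f"fully burdened cost: ${fully_burdened:,.2f}")
print(f"profit at ${price:,.0f}: ${profit:,.2f} ({markup_on_cost:.0%} markup on cost)")
```

Even with the deliberately generous $20 processing and $100 marketing allowances, processing remains a rounding error next to the $1,500 of material.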
So are Material Cost Multipliers bad?
No, they are not necessarily bad or inaccurate… but they often are, because they are misapplied. It’s important to know:
- What processes will be used to make a product? Each major process probably needs its own multiplier for accuracy.
- What physical part attribute most strongly drives cost in each process?
- If someone gives you a multiplier, make sure it is based on these considerations.
Consider the differences:
- Sheena’s general manufacturing Material Cost Multiplier = 4x –> Piece Part Cost = $6,000, i.e., Processing Cost = $4,500!
- Eric’s general simple metal part manufacturing Material Cost Multiplier = 1.33x –> Piece Part Cost ≈ $2,000, i.e., Processing Cost ≈ $500!
- Eric’s manufacturing “judgment” from experience, given the routing Eric assumed: Processing Cost = $20 –> Material Cost Multiplier ≈ 1.01x!!!
- The Product Cost Estimation Tool’s estimate: Processing Cost = $5.25 –> Material Cost Multiplier ≈ 1.004x!!!
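One consistent way to compare the four approaches is to back out the piece-part cost and the multiplier each one implies, using the definition from the top of the article (Piece Part Cost = Material Cost × Multiplier). A quick sketch with the article’s numbers:

```python
material = 1500.0

# Processing-cost estimates (dollars) implied by each of the four approaches
estimates = {
    "Sheena's 4x multiplier":  4500.0,  # 4.0 * material - material
    "Eric's 1.33x multiplier":  495.0,  # 1.33 * material - material
    "Eric's judgment":           20.0,
    "Cost-estimation tool":       5.25,
}

for label, processing in estimates.items():
    piece_part = material + processing
    multiplier = piece_part / material  # Multiplier = piece-part / material
    print(f"{label}: piece-part ${piece_part:,.2f}, multiplier {multiplier:.3f}x")
```

The spread — from 4x down to barely above 1x — is the whole point: a multiplier borrowed from the wrong process family can be off by a factor of hundreds on processing cost.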
There is no doubt in my judgment that the Product Cost Estimation tool is the closest to reality. Regardless, a fast back-of-the-envelope calculation is far better than nothing. I am a big fan of common sense and back-of-the-envelope reality checks. I applaud Sheena’s effort, which, honestly, is more than many design engineers or purchasing engineers would do in considering the profit impact of their decisions.
- Material Cost Multipliers are useful, but can be dangerous. They should be applied by experts or with expert guidance.
- My analysis shows that Sheena is even MORE correct: the bracelet is not worth its price.
- Kudos to Tiffany for Jedi Mind Tricking girls into believing a $1,600 bracelet is worth 3x as much.
- Ladies, your boyfriend’s/fiancé’s/husband’s willingness to buy you the Tiffany Rubedo bracelet may mean he’s filthy rich, desperate, or not too smart… but it may not necessarily mean he loves you. Admittedly, that’s just my guess… but then again, I’m a product cost guy, not the love doctor.
p.s. Guys, perhaps you would be interested in buying the woman of your dreams the Hiller Associates RubedA bracelet. It’s just like the Tiffany RubedO bracelet, except MINE is 35% gold, not 31% like Tiffany’s. The only other differences are that my bracelet will say “H&CO” where Tiffany’s says “T&CO”, and likewise mine says “Hiller’s” instead of “Tiffany’s”. It’s a bargain at $4,999, versus Tiffany’s $7,500. H&CO: “Don’t just show her your love; show her your intelligence.”