Measuring product design performance through metrics

I am in the midst of researching the topic of metrics as it applies to product design. Measuring the performance of product design, whereby one version of a product is chosen over another, can be an elusive target. If the mandatory parameters of the product design goal have been determined and achieved, then what is the process by which the outcomes are measured against each other?

I am the sole designer for a large corporation. The product is retail packaging goods. Let’s say I design 12 iterations from design briefs that fall within the parameters of the product goals, all different yet plausible and viable for manufacturing, with thought given to factors such as sustainability, case/pallet count, ease of production, assembly, robustness, label area, and ergonomics. Internal and consumer testing can whittle the field down to 4 or fewer, but from there, it’s more of a subjective beauty contest. Some designs are just going to be chosen over others.
The question that I want to focus on is: how is it determined that a product design is deemed best, or, backing up a bit in the design process, how does the best product come to be designed? The question then becomes: is the designer here really doing an average job of delivering results in an on-time, orderly manner, or is the designer striving to achieve the best he can, and in doing so, should he be recognized for above-average performance?

This, my friends, is the real root of the issue, and here’s why: my manager (an engineer by training) is a data-driven individual, always asking for numbers, proof, and results. In order to evaluate the performance of my designs, and the performance of my position within the company, there is a need to show that an evaluation process is in place. Where new designs are replacing existing packaging, the task of evaluating one versus the other is easy: purchase interest can be measured. For new product development in an area that is different or has no competition, the product can be measured against other brands or stand on its own. What I would like to discuss is the criteria by which the products are compared. Beyond plain purchase interest, which can often be based on price alone, there are other factors that lead to choosing one product over another.

The research that I have come across has been thin. Design metrics often miss the point above; instead they focus on how many patents are awarded, or the quantity of new product launches.

I’ll stop here to allow others to comment on what I have laid out. thanks for your time

Sorry I didn’t see this sooner. Let’s see if it flies in Design Research (where it probably should have been posted to begin with). :wink:

Great question. I haven’t seen much research into either. At the end of the day, design is a qualitative subject, not quantitative. Even then, as for reviews, it’s easy to measure productivity, but much harder to measure performance. If one goes by sales, who is to say that the design was the key difference? What about marketing, sales, engineering, etc.?

I like to post the following articles from a few years ago whenever the topic comes up:

You could use the logic of Net Promoter Score: ask the respondent which product/design they would recommend.
The question framed from an advocacy point of view may get different responses than from a practical point of view.
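For anyone unfamiliar with how NPS is tallied: on a 0–10 "would you recommend this design?" scale, 9–10 are promoters, 0–6 are detractors, and the score is the percentage of promoters minus the percentage of detractors. A minimal sketch (the survey responses below are made up for illustration):

```python
def net_promoter_score(ratings):
    """NPS = % promoters (ratings of 9-10) minus % detractors (0-6)."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Hypothetical responses to "would you recommend this design?"
responses = [10, 9, 8, 7, 9, 4, 10, 6, 9, 8]
print(net_promoter_score(responses))  # 5 promoters, 2 detractors -> 30.0
```

Running the same question against each of the finalist designs would give you a comparable number per design, which is exactly the kind of thing a data-driven manager tends to like.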

When I was working as an engineer, my company used something I can only describe as a combination of pairwise comparison and the analytic hierarchy process to choose between designs. If someone knows the correct terminology for the method, please share.

The best way I can describe it is that we had our criteria, which were weighted with percentages that totaled 100%. Each criterion had a scale rating (e.g. 1 to 5) for how well the design fit it. This was an inter-departmental process to get input on how each department felt the design fit its goals. The final step was to add up the ratings for each criterion from each department, apply the weighted percentages, and get a final total score. Final scores were then compared among the designs, with the highest one winning.

However, the comparison was based on gut feeling and how the design worked for that particular department. The more I tried to apply analysis, facts, and calculations to a design, the more the evidence seemed to converge on a single design and immediately reject the others. I guess that’s basically what Six Sigma wants to accomplish, if I’m reading NURB’s articles correctly. As an engineer, that’s what I’d want; for a designer, it feels like the opposite.

As for what your manager was asking for, it seems odd to ask for performance data when the item isn’t out in the market yet. It’s not like there’s data showing that symmetrical heart packages are bought X% more than asymmetrical ones, or that people are more likely to buy items with a bow. Sales data is probably the closest thing I can think of, but then factors like branding and advertising get involved and weaken the data.

Nailed it. I feel the same.

I also think this realization is in part the reason why large companies are hiring Chief Design Officers or Chief Creative Officers. The quality of design is subjective (or, in the preferred parlance, qualitative). As such, designs need to be JUDGED, not MEASURED. Judgment is passed, hopefully, by an expert arbiter.

Agreed. Though I also think good design strategy can include quantitative analysis of cost, price, and line planning. The combination of both can be amazing. I helped a company increase profits 170% by analyzing such metrics and slimming down SKUs, making a cohesive line plan, and actually introducing higher-priced special-edition models that generated more revenue than the high-volume, low-price stuff sales was always pushing for. More info = more power.


Richard: That’s another problem with design: where do we stop? A lot of people would say that a designer has no responsibility for planning or deciding what SKUs a company needs. Many would say that’s marketing’s job. I’m confident there are many companies where your annual review would be negative because you didn’t “stick to design”.