TaguchiLossFunction
Taguchi, as introduced by Robert Cringely:
https://web.archive.org/web/20040213194723/http://www.pbs.org/cringely/pulpit/pulpit20030925.html
Cringely:
The short version is that however they work, the Taguchi Methods can take a project with thousands, even millions of combinations of variables, and quickly reduce it to a couple dozen simple experiments that can be run simultaneously and will determine the cheapest way to achieve a goal. Instead of considering one variable at a time, Taguchi is able to test many variables at once, which is why the number of tests can be so small. It's a bloody miracle.
Strikes me as pretty miraculous too. There must be some information-theoretic limit on the number of bits of data that can be squeezed out of an experiment, however much fancy statistics you apply to it, no? Maybe compare with NetworkEpistemology?
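
The machinery behind the "couple dozen simple experiments" claim is fractional factorial design via orthogonal arrays: pick a small set of runs in which every pair of factor settings appears equally often, so main effects can be estimated independently without running the full grid. A minimal sketch in Python (the construction below is standard textbook material, not taken from the linked articles; it builds an 8-run array of the same size and balance property as Taguchi's L8 table):

```python
from itertools import combinations

def l8_orthogonal_array():
    """8-run design for 7 two-level factors.

    Row i, column m (m = 1..7): parity of the bitwise AND of i and m.
    Any two columns then contain each level pair equally often.
    """
    return [[bin(row & mask).count("1") % 2 for mask in range(1, 8)]
            for row in range(8)]

def is_pairwise_balanced(array):
    """Check that every pair of columns shows (0,0),(0,1),(1,0),(1,1) equally often."""
    cols = list(zip(*array))
    for a, b in combinations(cols, 2):
        counts = {}
        for pair in zip(a, b):
            counts[pair] = counts.get(pair, 0) + 1
        if len(counts) != 4 or len(set(counts.values())) != 1:
            return False
    return True

if __name__ == "__main__":
    oa = l8_orthogonal_array()
    for run in oa:
        print(run)                                  # 8 runs instead of 2**7 = 128
    print("balanced:", is_pairwise_balanced(oa))    # True
```

So 7 two-level variables get screened in 8 runs rather than 128, which is the kind of reduction Cringely is describing; the price is that interactions between factors are partly confounded with main effects.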
More detailed intro https://web.archive.org/web/20001209104600/http://www.mv.com/ipusers/rm/loss.htm
Design of experiments : https://web.archive.org/web/20021202033628/http://www.wtec.org/loyola/polymers/c7_s6.htm
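
The loss function itself (the subject of the first link above) is the quadratic loss L(y) = k(y - m)^2, where m is the target value and k a cost constant: loss grows continuously with deviation from target, rather than being zero inside a tolerance band and fixed outside it. A rough Python sketch of the contrast; the target, tolerance and scrap-cost numbers are invented for illustration:

```python
def taguchi_loss(y, target, k):
    """Quadratic loss: cost grows with the square of the deviation from target."""
    return k * (y - target) ** 2

def go_no_go_loss(y, target, tolerance, scrap_cost):
    """Traditional view: zero loss inside spec, full scrap cost outside."""
    return 0.0 if abs(y - target) <= tolerance else scrap_cost

if __name__ == "__main__":
    target, tolerance, scrap_cost = 10.0, 0.5, 20.0
    # Choose k so the quadratic loss equals the scrap cost at the spec limit.
    k = scrap_cost / tolerance ** 2
    for y in (10.0, 10.2, 10.49, 10.51, 11.0):
        print(f"y={y:5.2f}  taguchi={taguchi_loss(y, target, k):6.2f}"
              f"  go/no-go={go_no_go_loss(y, target, tolerance, scrap_cost):6.2f}")
```

The point of the quadratic form is that a part just inside the tolerance limit is treated as nearly as costly as one just outside it, which rewards reducing variation around the target rather than merely staying within spec.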
But if it has even some of the virtues claimed for it, what are the implications?
- scientifically monitored, more effective advertising.
- how could it be applied to computer science?
- to analysing markets? social networks? etc.
- interesting comparison with SettingPrices
See also QualityControl.