In 1975 Charles Goodhart, then a chief economic adviser at the Bank of England, observed: “When a measure becomes a target, it ceases to be a good measure.” This idea came to be known as Goodhart’s Law and is now recognized as a risk in defining key performance indicators (KPIs) and implementing analytics. Any metric applied to a competitive or adversarial system will change behavior in that system once participants perceive that the metric is used to make decisions affecting them. If your adversary has a good chance of figuring out your metric, how can you keep your system from being gamed?
This post first appeared on the Elder Research Data Science & Machine Learning Blog; please read the original post here.