
Statistics are Misleading 100% of the Time (Part 1)

Part Two and Part Three of this series are now available!

Numbers are everywhere.

4 out of 5 dentists agree that we should be buying more chewing gum. 55% of voters prefer Candidate X to Candidate Y. Including a statistic with an argument can lend credibility to a claim that it would otherwise lack. Perhaps we shouldn't be surprised, then, that numbers are used as weapons by pollsters, advertisers, journalists and all manner of "experts": a mathematical saber designed to drive that final point home to whatever part of our brain is responsible for agreeing, or purchasing, or pulling the lever in the voting booth.

After all, words are just somebody's opinion. Numbers are facts. You can't just make up numbers.

The discussion surrounding the use of lethal force by police is no exception to this rule. Articles about police shootings are rife with statistics, which is itself a bit surprising, given how unreliable the records surrounding police violence remain.

As part of a series of posts, The Puppycide Database Project will take a look at two of the most widely used statistics related to the killing of animals by police. We will discover where these numbers came from, who came up with them and how they became widely accepted. Each of the examples we will review has been repeated by news outlets and by respected activist groups across the country. Each of the statistics is cited with the utmost confidence, either as simple fact or as the result of painstaking scientific research.

But, despite frequent claims to the contrary, the statistics we will discuss were not devised through studies. And each of these statistics is completely wrong.

To be fair, some research was involved in creating and certainly in circulating these statistics, but only the sort of research just about anyone could do with a computer and an internet connection, not the sort of research that is done by people in white lab coats in a laboratory. Research is a tricky word: "My research was able to prove that quantum entanglement plays a role in the ongoing molecular stability of DNA" is just as valid as "My research on Groupon was able to prove that I can order an extra large pizza for under $12". When we hear the word research, sometimes we think quantum DNA when we should be thinking deep-dish or thin crust.

The creator of one of these statistics explained to a reporter that he deliberately fabricated the statistic, justifying his claim with the promise of future study. His excuse was the intellectual version of "I will gladly pay you Tuesday for a hamburger today", but somehow that did not prevent it from being repeated for years.

The second statistic is not the result of deception, but was rather caused by a failure to understand a correct statistic. That failure was repeated over and over again, without ever consulting the original source of the statistic, until the make-believe version was more widely accepted than the real version.

In this post, the Puppycide Database Project will identify the original sources for all of these numbers, and we will prove them to be false with a little research and critical thinking. The case against these statistics is airtight.

Despite the evidence we will provide here, it is almost guaranteed that these statistics will continue to be cited. The statistics are simply too compelling, too consistent with preconceived notions and too useful to those publishing them. In fact, this post will almost certainly be used as evidence that the Puppycide Database Project is biased (the exact nature and degree of our bias we leave as an exercise for the reader). After all, we don't agree with the statistics, and statistics are facts.

That is the power of numbers.

"A witty statesman said you might prove anything with figures." - Thomas Carlyle

At its core, the Puppycide Database Project is an exercise in basic arithmetic. The Project is a glorified abacus. We hope to count every time that a police officer in the United States has killed, or tried to kill, an animal. We seek to count how many of these officers belong to each police department. The results are then added together in our database, which is in reality a very fancy, computerized spreadsheet.

Each killing of an animal by a police officer is accompanied by a set of circumstances which must be added to the equation if we want to understand the why and how of puppycide instead of just the what. Was a human being hurt? Was a firearm used? Did police engage the animal in a public area (like a street), a private area (like the inside of a home) or somewhere in-between (like an apartment complex commons)? Was a warrant involved? Was a SWAT team used? Each of these circumstances is to be added, one after the other, to its own ledger.
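
To make the ledger idea concrete, here is a minimal sketch, in Python, of what a single incident record and a per-department tally might look like. The field names and structure are our own illustration for this post, not the Project's actual database schema.

    from collections import Counter
    from dataclasses import dataclass

    # A hypothetical incident record. Field names are illustrative only,
    # not the Project's actual schema.
    @dataclass
    class Incident:
        state: str            # e.g. "TX"
        department: str       # police department involved
        human_injured: bool   # was a human being hurt?
        firearm_used: bool    # was a firearm used?
        location_type: str    # "public", "private" or "semi-public"
        warrant_served: bool  # was a warrant involved?
        swat_involved: bool   # was a SWAT team used?

    def shootings_by_department(incidents: list[Incident]) -> Counter:
        # Tally incidents per police department -- the "glorified abacus".
        return Counter(i.department for i in incidents)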

Just counting 'puppycides' is important. Putting a number to the issue allows the public to appreciate the scale of the problem objectively. By keeping the count going over a period of time, we can determine whether the problem is getting better or worse.

Simple addition is the starting place of our efforts here at Puppycide Database Project, but we hope to do more. By using statistics, we can understand more than merely how many animals are killed. By reviewing the circumstances surrounding police/animal violence, we can determine the relationships between those circumstances. These relationships can suggest avenues for further inquiry. They can even suggest ways to prevent violence in the future.

Let's consider a few of these relationships and what can be gained by understanding them.

  • Are police departments that fire officers who kill animals more or less likely to be sued for killing animals? We can understand whether it is more expensive for police departments to retain police officers who are frequently involved in excessive force allegations.
  • Are police officers who kill animals more or less likely to engage in excessive force toward human beings? We can understand whether the killing of an animal is a 'warning sign' that an officer is likely to hurt human beings.
  • When police officers shoot an animal, how likely is it that one of their bullets will hit a human being? Is a human being shot during a 'puppycide' more or less likely to die than a human being who is attacked by, for instance, a dog? We can understand whether the police response to dangerous animals is more likely to injure or kill human beings than the animals themselves are.

As you can see from these examples, examining more nuanced relationships between quantities offers us the opportunity to address questions that we would not be able to answer otherwise. Because the topic of our inquiry is violence, answering these questions is critical. Both animals and human beings are dying as the result of our current law enforcement policies regarding animals. Understanding why is our best - perhaps our only - hope to reduce the amount of that violence.
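
As an illustration of the mechanics only (not a finding), the second relationship in the list above could be examined with a simple two-way tally: compare the share of officers with excessive-force complaints among those who have killed an animal against the share among those who have not. The records below are placeholders; in practice the tally would be run against the Project's database.

    from collections import Counter

    # Placeholder records: (killed an animal?, excessive-force complaint?)
    # These values are made up for illustration, not real Project data.
    records = [
        (True, True),
        (True, False),
        (False, False),
        # ... the real list would come from the database
    ]

    table = Counter(records)

    def complaint_rate(killed_animal: bool) -> float:
        # Share of officers in the group with an excessive-force complaint.
        with_complaint = table[(killed_animal, True)]
        total = with_complaint + table[(killed_animal, False)]
        return with_complaint / total if total else float("nan")

    # A large gap between the two rates suggests (but does not prove) a link.
    print(complaint_rate(True), complaint_rate(False))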

"It ain't so much the things we don't know that get us in trouble. It's the things we know that ain't so. - Artemus Ward

The Puppycide Database Project began in earnest some 18 months ago. What started with a single website attracted volunteers from different walks of life all over the country. People whose pets had been killed by police, or who had seen a killing in the news, provided the Project with details of incidents in all 50 states. We added each incident, one after the other, until the total was in the thousands.

Each Puppycide Database Project volunteer chooses how much time they would like to commit to the project. Some volunteers only have a few minutes to add a single record to the database. Other volunteers have read through hundreds of news reports as part of reviewing the information in our database. A few of those volunteers began to come across articles in newspapers that referenced the sorts of statistics that the Puppycide Database Project had hoped to uncover with further research. Each of the statistics related to national trends in police shootings of animals - how often police killed dogs across the country and how often police shot dogs as compared to how often police shot human beings.

The news was exciting, but also confusing. In order to provide reliable statistical information, researchers need information about either the entire population or a representative sample of that population. We hadn't heard of anyone other than the Puppycide Database Project involved in a large-scale research project on police shootings of animals. Perhaps a group at a university had been working on the project diligently and quietly, relying on private information requests to government agencies rather than the public crowd-sourcing requests that would have brought them to our attention. Or maybe the government had begun tracking use of force toward animals in a meaningful way.

Our excitement won out over our worries about where the data came from. After all, releasing the data was just the first part. With every shooting finally counted, or at least enough shootings to reliably determine national trends, further analysis would unlock answers to harder and more complex questions about police use of force that had remained conjectural. Which departments shot more dogs, and why? Did dog shooting rates tracked by geographic area correspond to the information we had about injuries caused by dog attacks? Was puppycide getting worse, and if so, why?

We hoped to reach out to the same people who were releasing these statistics. If other researchers were working from a sample, we could provide them with our own data to increase the information available. We kept our fingers crossed: would these researchers allow us to look at their data as well? If so, the Puppycide Database Project would have to completely switch gears. We would continue tracking new shootings, but most of our energy would be devoted to analyzing this new data instead of researching past animal killings.

There were two statistics that appeared regularly both in newspapers and in statements from respected activist groups like the ASPCA.

  • A dog is killed by a police officer every 98 minutes - which we explore in Part Two of this series
  • Half of all police shootings involve a dog - which we investigate in Part Three and Part Four of this series

We needed to find out who had devised the statistics as soon as possible. These were big claims. Somebody had done their homework to get those numbers, we thought.
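
Just how big becomes clear with some back-of-the-envelope arithmetic: taken literally, one dog every 98 minutes works out to roughly fifteen dogs per day, or well over five thousand dogs per year.

    # Back-of-the-envelope arithmetic on the "every 98 minutes" claim.
    MINUTES_PER_DAY = 24 * 60                # 1,440
    MINUTES_PER_YEAR = 365 * MINUTES_PER_DAY

    dogs_per_day = MINUTES_PER_DAY / 98      # roughly 14.7 per day
    dogs_per_year = MINUTES_PER_YEAR / 98    # roughly 5,363 per year

    print(round(dogs_per_day, 1), round(dogs_per_year))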

In Part Two of our series "Statistics are Misleading 100% of the Time", we will learn how a pair of filmmakers convinced the country that "A dog is killed by a police officer every 98 minutes". Continue reading the next issue from Puppycide Database Project's ground-breaking series. Or, take a look at Part Three of our series, in which we examine the claims of a highly respected Department of Justice policy paper and learn a few things about probability, selection bias and randomness - particularly how we can better evaluate the claims made by statistics.


