

Predictive Profiling: Who Needs Evidence?

The potential and danger of leveraging big data


Imagine if you could determine, on the basis of eye color, personal habits or clothing choices, whether or not somebody has committed a crime. Now imagine if you could actually predict, based on those qualities, a person’s likelihood of committing a crime in the future. This is predictive profiling, and while it may sound like science fiction, it’s exactly what Jim Adler, a former privacy officer for a background check firm, has developed, with reasonable success.

Predicting Crime

After running statistical analyses on big data sets of individuals in Kentucky, Adler was able to make connections between seemingly unimportant personal traits, such as eye, skin and hair color, and the likelihood that a person had a felony record. He then created software that can estimate the chances of someone committing a crime in the future, based on those personal traits.
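To make the idea concrete, here is a minimal, hypothetical sketch of the kind of analysis involved: fitting a simple logistic-regression classifier that relates categorical personal traits to a felony-record label. The traits, the synthetic data and the model choice are all assumptions for illustration; this is not Adler’s actual software or data.

```python
# Illustrative sketch only: a toy classifier correlating categorical
# personal traits with a SYNTHETIC felony-record label.
# Not Adler's method or data; traits and labels are made up so the example runs.
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

rng = np.random.default_rng(0)
n = 1000

# Synthetic records with a few categorical traits.
data = pd.DataFrame({
    "eye_color": rng.choice(["hazel", "brown", "blue"], size=n),
    "has_tattoo": rng.choice(["yes", "no"], size=n),
    "sex": rng.choice(["male", "female"], size=n),
})
# Synthetic labels, generated at random purely for illustration.
labels = rng.integers(0, 2, size=n)

# One-hot encode the traits, then fit a logistic-regression classifier.
model = Pipeline([
    ("encode", ColumnTransformer(
        [("onehot", OneHotEncoder(), ["eye_color", "has_tattoo", "sex"])])),
    ("classify", LogisticRegression()),
])
model.fit(data, labels)

# Estimated probability of a felony record for a new, hypothetical person.
person = pd.DataFrame([{"eye_color": "hazel", "has_tattoo": "yes", "sex": "male"}])
print(model.predict_proba(person)[0, 1])
```

The point of the sketch is simply that, given enough records, a standard classifier will happily assign a probability to anyone based on superficial traits, which is precisely what makes the practice both powerful and troubling.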


Dangerous Data

The implications of Adler’s project are alarming. As more and more personal information is collected from e-mail providers, social networks and financial institutions, the volume of available data about individuals can provide detailed insight into how we behave, and extrapolation from that data can actually be used to predict future behavior. Under these circumstances, the government’s collection and use of big data is a growing concern for many, especially in the wake of the NSA spying revelations earlier this year.

Profiling the Innocent

For an investigator who documents evidence, believes in the sanctity of proof and never assumes anything, the idea of predictive profiling can be alarming. In fact, any kind of profiling goes against the nature of impartial investigations. There’s an obvious danger that big data will be used to profile innocent people, even as American society already struggles with problems of discrimination and police profiling of minorities.

A Double-edged Sword

Adler created this program to bring forward discussions about how big data could be used to prevent crimes before they occur. But he also wanted to highlight the dangers of predictive technologies and profiling.

Some of the individual features Adler was able to correlate with having a felony record in Kentucky were:

  • being male
  • having hazel eyes
  • previous minor offenses beyond traffic tickets
  • having tattoos
  • being light-skinned

It turns out that in Kentucky, most of the inmate population is white, but statistics from other states would likely differ. Whatever the statistics, it raises some interesting ethical and practical questions about what society should do with knowledge that can be dangerous. Just because it's possible doesn't mean it's advisable.

What do you think? Does the possibility of preventing tragedies such as the Boston bombing justify using this kind of technology to profile and target people who are seen as likely to commit crimes, despite its potential to implicate innocent people?