The false positives syndrome

In the era of Big Data, false positives mean Big Money

11 May 2016

There is a funny side to false positives - Ben Stiller’s baffled expression when he discovers that the mysterious activities of his stepfather-to-be, sneaky Robert De Niro, camouflaged under the code name “Operation Koh-Samui”, in fact relate to the secret planning of his own honeymoon trip, for example.

Very often, however, false positives take a heavy human toll. Think of the healthy patient diagnosed with liver cancer: she sees her world crumble and falls into depression. Her physician’s apologies a few months later won’t change anything; she will never look at life the same way again.

In the business world, false positives mean money. In the era of Big Data, they even mean Big Money. Billions of dollars of missed sales for online merchants when systems reject credit card transactions from their lawful owners ($118 billion in 2014, according to some sources); millions of working hours wasted triaging erroneous malware alerts ($1.3 million per year per company, on average); millions more in unnecessary costs to social security systems because of wrong treatment prescriptions…

Credits: Nick Farnhill, Creative Commons

Banks are among the worst sufferers of the false positives syndrome. Large banks nowadays have no choice but to roll out the heavy artillery to keep their compliance monitoring abreast of the flood of regulations meant to prevent the next Libor scandal or foreign exchange manipulation. They pay for expensive software licenses and hire hundreds of people to monitor the flurry of alerts popping up on their screens each day, assessing whether each alarm is false or true. True alarms are a rare occurrence, but not one to be missed.
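The imbalance described above can be made concrete with a toy base-rate calculation. All numbers below are illustrative assumptions, not figures from this article; the point is only to show why, when genuine threats are rare, even a fairly accurate system buries analysts in false alarms:

```python
# Toy base-rate calculation: why analysts mostly see false alarms.
# Every number here is an illustrative assumption.
messages_per_day = 1_000_000            # messages screened daily
bad_message_rate = 1e-5                 # 1 in 100,000 messages is genuinely suspicious
recall = 0.95                           # the system flags 95% of the bad messages
false_positive_rate = 0.01              # ...and wrongly flags 1% of the innocent ones

bad = messages_per_day * bad_message_rate                      # 10 real threats
caught = bad * recall                                          # 9.5 true alerts
false_alarms = (messages_per_day - bad) * false_positive_rate  # ~10,000 false alerts

precision = caught / (caught + false_alarms)
print(f"Alerts per day: {caught + false_alarms:,.0f}")
print(f"Share of alerts that are real: {precision:.2%}")  # under 0.1%
```

Under these assumptions, roughly ten thousand alerts land on the analysts' screens every day, and fewer than one in a thousand of them points at a real threat.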

Note that the millions invested in improving their surveillance programs do not guarantee at all that they will one day find the one malevolent message threatening to drag them into federal court. Why? Because the heavy artillery - the so-called legacy systems - is based on a more or less sophisticated mixture of rules, statistics and human annotations. These systems require enormous amounts of time and processing power to perform their tasks, making it physically, as well as economically, impossible to process the millions of emails sent and received every day by hundreds of thousands of bank employees. As a result, even the largest banking corporations, despite their colossal budgets and manpower, are reduced to monitoring a sample of the messages - fingers crossed that their efforts will at least convince the court that, well, they did their very best.
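The risk of sampling can be quantified with a second toy model. Assuming (purely for illustration) that the bank can afford to review a fixed fraction of its messages, drawn uniformly at random, the chance of ever seeing a malicious message is simply the sampling fraction - and it grows only slowly with the number of malicious messages in circulation:

```python
# Toy model: probability that sampled surveillance catches a threat.
# The 5% review capacity is an illustrative assumption.
def p_catch_at_least_one(sample_fraction: float, n_bad_messages: int) -> float:
    """Chance that at least one of n_bad_messages lands in the reviewed sample,
    assuming messages are sampled independently and uniformly at random."""
    return 1 - (1 - sample_fraction) ** n_bad_messages

# A single malicious email is missed 95% of the time:
print(p_catch_at_least_one(0.05, 1))
# Even with ten malicious emails, the odds of spotting any of them stay around 40%:
print(p_catch_at_least_one(0.05, 10))
```

In other words, a bank reviewing one message in twenty is betting the courtroom outcome on a coin that comes up heads far less than half the time.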

It’s like shooting at a swarm of tsetse flies with a bazooka: you might kill dozens of the beasts, but get bitten by a single survivor and die.

Now, imagine that, instead of a bazooka, you put in the hands of the compliance patrol a loose net, large enough to capture the whole swarm, flexible enough to follow its movements and adapt to the changes in its shape. The threads of that net are so tightly woven that not one tsetse fly can pass through them and bite you. On top of that, imagine that the net is an intelligent device, capable of recognizing the shape, the odor, the color and the sound of a tsetse fly and of adapting its aperture to let harmless insects like dragonflies and bees escape.

We call this adaptive, intelligent net the Retina engine. Because it is intrinsically efficient, the Retina engine enables the surveillance team to monitor all of the messages and unveil any real threat - with a fraction of the computing power required by legacy systems. Because it is highly accurate, it drastically reduces false positive alerts, freeing human resources for more demanding tasks than the tedious screening of false alarms.

As I see it, the Retina engine is the cure for the false positives syndrome.

Author:
Marie-Pierre Garnier,
VP Marketing & Communications
