Streaming media has taken over from network TV. Among the many shows we’ve binged on Netflix, one of our favorites is Black Mirror.
Each episode is a standalone story, much like The Twilight Zone of decades past. Uniquely, though, Black Mirror's stories are each cautionary tales about technology in our lives – the risks of misuse, the loss of privacy, the loss of intimacy.
One episode, for example, follows a mother who tracks her daughter through an implant and a tablet app that provides real-time geolocation and vitals, displays what her daughter sees, and even blocks disturbing content from her vision. Other episodes extend the reach of today's technology to fictionalize uncontrollable security robots, intrusive virtual dating apps, and other scenarios that dwell on the dark side of 'future' technology adoption by consumers. In nearly every episode, the focus is on consumer devices – phones, tablets, sensors – and the massive amounts of machine data spewing from them, used occasionally for the better but usually to detrimental effect on the individual.
In reality, even with today's technology – devices, software, analytics and machine learning – we face these same ethical dilemmas. My kids, both millennials, give their data freely and expect to gain advantages from its mining. And having worked at Splunk and seen the potential of 'big data' analytics and artificial intelligence, I am of like mind. Sharing freely, with the attendant benefits, outweighs security concerns – the exception being behaviors that can lead directly to identity theft.
A recent news show segment featured a British security expert explaining what data we share via Fitbit and similar devices, how our whereabouts and travels could be shown on a heat map, and what implications that has for military personnel. Yet the benefits of using a Fitbit and openly sharing your geolocation and vitals are well established.

Another positive example of using analytics and AI to mine data for its potential was highlighted in a story about Chicago police, social workers, and clergy who have teamed up to mine data collected on potential felons in order to predict criminal behavior by these individuals (yes, without the imprisoned beings depicted in Minority Report!). Once they have a list of high-risk subjects, a police officer, a social worker, or a member of the clergy actually visits the subject at home and tries to convince them to enter counseling, job training, and other programs. The acceptance rate isn't even 50%, but every point on that graph matters, and lives are saved. These examples offer some light to go with what is often assumed to be the darker path of big data. And the implications for running a better business are endless!
Doug Harr