Reflections on Black Mirror – Cautionary Tales about Tech in Our Lives

Streaming media has taken over from network TV. Among the many shows we’ve binged on Netflix, one of our favorites is Black Mirror.

Each episode is a standalone story, much like The Twilight Zone of decades past. What sets Black Mirror apart is that each story is a cautionary tale about technology in our lives – the risks of misuse, loss of privacy, loss of intimacy.

One episode, for example, follows a mother who tracks her daughter through an implant and a tablet app that provide real-time geolocation and vitals, but also display what her daughter sees and even block disturbing content from her vision. Other episodes extend the reach of today’s technology to fictionalize uncontrollable security robots, intrusive virtual dating apps, and other scenarios that focus on the dark side of ‘future’ technology adoption by consumers. Nearly every episode centers on consumer devices – phones, tablets, sensors – and on the massive amounts of machine data spewing from them, put to uses that occasionally benefit, but usually harm, the individual.

In reality, even with the technology we have today – devices, software, analytics, and machine learning – we face these ethical dilemmas. My kids, both millennials, give their data freely and expect to gain advantages from its mining. And having worked at Splunk, where I came to understand the potential of “big data” analytics and artificial intelligence, I am of like mind. For me, the benefits of sharing freely outweigh the security concerns – the exception being behaviors that can directly lead to identity theft.

A recent news segment featured a British security expert explaining what data we share via Fitbit and similar devices, how our whereabouts and travels could be shown on a heat map, what implications that has for military personnel, and so on. Yet the benefits of using a Fitbit and openly sharing geolocation and vitals are well established. Another positive example of using analytics and AI to mine data for its potential was highlighted in a show about Chicago police, social workers, and clergy who have teamed up to mine data collected on potential felons in order to predict criminal behavior by these individuals (yes, without the imprisoned beings depicted in Minority Report!). Once they have a list of high-risk subjects, a police officer, a social worker, or a member of the clergy actually visits the subject at home and tries to convince them to enter counseling, job training, and other programs. The acceptance rate isn’t even 50%, but every point on that graph matters, and lives are saved. These examples offer some light to go with what is often assumed to be a darker path for big data. And the implications for running a better business are endless!

Doug Harr

Owning All Clouds

By Doug Harr

For the last dozen-plus years of my career as an IT executive, I’ve led several companies through a process of migrating their business application portfolios to the cloud. At Portal Software, that meant deploying SuccessFactors for HR performance reviews and OpenAir for professional services automation. At Ingres, it meant deploying Intacct for financials, Salesforce for CRM, and many other cloud solutions. The approach reached its zenith for me at Splunk, where we had a 100% cloud business application portfolio and where 50% of our compute and storage capacity was at Amazon. With so much functionality in the cloud, the question of roles and responsibilities became a focus for the company. In this very cloud-friendly shop, what should IT’s focus be? What level of administration of these solutions could actually be owned and delivered by departmental owners, such as Sales Operations, Customer Support Operations, or HR administration?

As one example, both at VMware, where I was program manager for the Salesforce implementation, and at Splunk, where I was the CIO, we had very strong sales operations teams and fairly complex Salesforce environments. In those environments, Sales Ops began to take ownership of more functionality in the Salesforce suite. This included user administration, assignment of roles to users and territories to reps, and just about all reporting. It grew to include modifying page layouts and other configuration capabilities normally owned and controlled by IT. In my view, enabling the Sales Ops team was attractive for several reasons: (1) they wanted the power to do these things; (2) they were no longer waiting for IT on the things they felt were high priority; (3) they were closer to the sales teams who actually worked inside the tool, so they were good at interpreting issues and acting – certainly as good as an IT business analyst, or even someone with fairly strong technical skills. In these scenarios, it freed IT to work on deeper technical issues: level 3 incidents, environment management, integration, reliability, and so on.

In another example, at Splunk we made wide use of Amazon EC2 for compute and storage capacity. In these cases, IT System Admins were not needed – environments were spun up and used directly by personnel in Engineering and Customer Support. This was an amazing success, and it freed IT to work on monitoring usage, working deals on cost, and managing the overall vendor relationship.

Not every department has a team or individual ready to own or take a major role in managing a SaaS or IaaS platform. For every HR department that manages Workday, there’s a finance department that does not manage NetSuite. It depends on the tool and the personnel. I’ve found it can also depend on the CFO and the management of a business function – some execs are happy to have these resources placed in the business, while others fear “shadow IT spend,” or get caught up suggesting that IT can’t deliver and that granting this power is a cop-out. I actually had a moment like this at Splunk, where I had not adequately updated two peer execs on our intent to hire deeper IT skills into Sales Ops, and had to sort that one out to make sure everyone understood this was not a shadow operation! So there can be bumps in the road, but in my view adopting this approach is inevitable: software platforms and micro apps are becoming widespread, and so is the ability and desire of departmental teams to be more involved in how those tools, platforms, and apps are rolled out and used.

All this speaks to the future role of IT, and I for one have lived that future, at least in part. It’s one where IT is more strategic, focused on vendor and portfolio management, integration, and security. To be sure, some functions that are broadly used across all departments, and some that are task specific, still accrue to IT in most cases, or to partners that offer elements of typical IT as a service (think help desk). But done well, each department owns more of its technology and feels more in charge of its future, its technology adoption, and its responsible use, along with other benefits. And IT focuses less on being everything to everybody and maintaining disparate queues of backlogged work, and more on higher-level matters: transforming the business for the digital age and delivering more complex technical solutions.

Right where we should want to be.

@douglasharr

Big Data That Supports Key Business Results

By Doug Harr

CIOs have a tremendous opportunity to harness Big Data. But CIOs are also wary of buzzwords and heavily marketed trends, which often lead to pursuits that are secondary rather than aligned to key results. And while it may not be clear to everyone in the executive ranks, CIOs are keenly aware that all systems (not just business systems) in an organization spew out data, much of which can be mined for useful information. When I was CIO at Splunk, we called this system-generated data “machine data,” and I had the chance to witness just how many brilliant things can be done by harnessing it. So when and where does it make sense for CIOs to embark on data-driven projects? How can a CIO choose where to focus efforts?

In a typical corporation, CIOs look after everything from business applications, operations, and infrastructure to security and the systems that support the company’s web presence. Looking across this vast portfolio of services, a CIO’s primary concern is to properly implement capabilities and then manage them in such a way that the business is effectively and efficiently supported. Taking on analytics becomes the next layer to tackle once each fundamental service is in place. Where the rubber meets the road is when you can use machine/big data to determine more than just the status of your infrastructure – that is, when you can see the opportunity to mine data for services that support the portfolio and, ultimately, the corporation’s key results.

Getting Started

Select a Use Case: Focus on high-value use cases first. External-customer facing use cases are particularly well suited as first forays into data mining programs. Making the customer experience as compelling as possible is key for all organizations. Developing deeper insights into this experience has enormous potential and will garner support from your marketing team and other internal customers.

Work with Your Internal Business Partners: Meet with your internal team, and departments such as marketing and engineering, to select a use case they care about. Choose a project that will impact their external customers—typically the customers of your company. While internally focused use cases for Finance, HR, Sales or other teams can be instructive, prioritize programs that address the company’s core product or service and customer experience.

Put the Technology in Place: Don’t place all your bets on one solution. Consider your approach and look at real-time products (such as Splunk), cloud offerings, and batch-oriented systems (such as Hadoop). Before you make any purchases, do a proof-of-concept. Ensure you have support staff from the vendor working with you and try a sample set of your data in their engine.

Review the Reports: Step back and review reports from the solutions you are considering. Analyze the insights, both qualitative and quantitative. For example, if you use a customer support system for your proof-of-concept, ask questions like these:

  • How long does it take a customer to get through the online sales cycle? How much time elapses from engagement to first customer support call?
  • How long are customers spending in our systems?
  • How many orders are placed per month? What’s the typical amount of time it takes to book an order? How long does it take to book an order at month end?
  • Does it appear anyone is trying to infiltrate our systems?

Demonstrate What You Can Produce: Share your proof-of-concept results with your internal team. There’s no greater fun than giving your sales and marketing customers something they didn’t have before, something that helps them make better decisions more quickly. Note that there are some use cases you will never be able to share widely. For example, security use cases can only be shared with security personnel and auditors.

Delivering Value

Bringing Big Data programs into your company is worth the effort. This data can tell you things about your business and systems you can’t learn any other way. Chosen and managed carefully, these programs can improve customer service (internal and external), provide a qualitative view into the customer experience, offer clearer insight into your products and services, and even enable a company to better understand its own employees.

Doug Harr is a partner at StrataFusion. He has more than 25 years of technology leadership experience both as an executive-level technology practitioner and in senior leadership roles for professional services organizations. Contact him at dharr@stratafusion.com; follow Doug at twitter.com/douglasharr.

More BIG Data

Not Just a Buzz Word for CIOs

Doug Harr

What do CIOs do with Big/Machine Data?

In 2010, most of us were deleting machine log data from our systems as soon as it was clear that processes had survived the night – this data was being tossed in the trash daily. Now, a short four years later, we’ve all learned that there is information in that data, and that by saving it and using search and analytics to mine it, an amazing number of things become possible.

As CIO at Splunk (a rapidly growing company that makes a platform aimed at making machine data available, usable, and valuable for everyone), the first use of the solution I saw within the company itself was related to the go-to-market model. Splunk had, and has, a “freemium” model in which customers and prospects can download Splunk software to their PC/Mac or host, then run machine data into it to search and analyze. We were “splunking” those downloads – taking, for example, the Apache web log from the Splunk web site, contact feeds from our CRM system (Salesforce) as a lookup table, and the communications that Splunk itself sends back to our site once it is up and running. With just these three types of machine data records – one being a “lookup” table to enrich the data – we were able to produce an amazing array of analytics and reporting used by IT, product management, marketing, and others to measure the download experience, uptime, and capacity, and also to understand the actual sales pipeline and the company’s prospects.
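To make that enrichment pattern concrete, here is a minimal sketch in Python. This is just the concept, not Splunk’s actual search language – the log format, IP addresses, field names, and regions are all hypothetical:

```python
# Sketch of enriching web-log events with a CRM lookup table.
# All data, fields, and regions below are made up for illustration.
import re
from collections import Counter

# A few Apache-style access-log lines recording software downloads
access_log = [
    '10.0.0.1 - - [01/Mar/2014:10:00:00] "GET /download/splunk-linux.tgz HTTP/1.1" 200',
    '10.0.0.2 - - [01/Mar/2014:10:05:00] "GET /download/splunk-mac.dmg HTTP/1.1" 200',
    '10.0.0.1 - - [01/Mar/2014:11:00:00] "GET /download/splunk-linux.tgz HTTP/1.1" 200',
]

# Lookup table exported from the CRM: source IP -> sales region
crm_lookup = {"10.0.0.1": "AMER", "10.0.0.2": "EMEA"}

pattern = re.compile(r'^(?P<ip>\S+) .* "GET (?P<path>\S+) HTTP')

def enrich(lines, lookup):
    """Parse each log line into fields and add the CRM region."""
    events = []
    for line in lines:
        m = pattern.match(line)
        if not m:
            continue  # skip unparseable lines
        event = m.groupdict()
        event["region"] = lookup.get(event["ip"], "unknown")
        events.append(event)
    return events

events = enrich(access_log, crm_lookup)
downloads_by_region = Counter(e["region"] for e in events)
print(downloads_by_region)  # e.g. Counter({'AMER': 2, 'EMEA': 1})
```

The lookup table is the key move: a single join against CRM data turns anonymous web traffic into region- and account-level reporting.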

Download Experience – Visualized

Since IT was responsible for making sure that the free Splunk software download function was operating properly, we were interested in the download experience – things like average minutes per download, and how that differed by platform.
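A stat like average minutes per download by platform is a straightforward aggregation once the log events are parsed. A sketch, with entirely hypothetical figures:

```python
# Sketch: average download time by platform from parsed log events.
# Platform names and durations are hypothetical, for illustration only.
from collections import defaultdict

# (platform, minutes_to_complete) pairs extracted from download logs
downloads = [
    ("linux", 3.0), ("mac", 4.5), ("windows", 6.0),
    ("linux", 5.0), ("mac", 3.5),
]

# Group durations by platform, then average each group
by_platform = defaultdict(list)
for platform, minutes in downloads:
    by_platform[platform].append(minutes)

avg_by_platform = {p: sum(m) / len(m) for p, m in by_platform.items()}
print(avg_by_platform)  # {'linux': 4.0, 'mac': 4.0, 'windows': 6.0}
```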

We also liked seeing activity via geo-mapping, and other dashboard visualizations, as shown below:

Downloads by CRM Region

Real-time Data – Driving Business Excellence

Over the years, the use of Splunk internally expanded to address needs for both IT and business constituents – providing customer insight, protecting against intrusion and malware, enhancing operations effectiveness, and more – falling into these categories:

  • Monitor and manage infrastructure – capacity, uptime, project delivery
  • Deliver application management – health of business apps, usage statistics, even some missing reporting
  • Provide analytics on security posture – identify and eradicate malware, APTs (advanced persistent threats), and other threats
  • Provide business analytics – most of these derived by departments – people in sales, marketing, and engineering analyzing business trends, product delivery, customer support and more
  • Internet of Things – we even “splunked” our headquarters building to review temperature and CO2 levels

These examples roughly match the broad spectrum of what can be done when ingesting and analyzing machine data in real time. Stay tuned for more examples in posts to come. Now with StrataFusion, I will be consulting and teaching more on these topics!

Big Data

Not Just a Buzz Word for CIOs

Doug Harr

I spent four wonderful years at Splunk as CIO. Splunk? A rapidly growing company that makes a platform aimed at making machine data available, usable, and valuable for everyone. While there, I built the IT and Real Estate/Facilities teams and solidified an “all cloud” business applications portfolio. This advanced my knowledge of all things cloud, this time including the appropriate use of Amazon EC2 (Amazon Elastic Compute Cloud*) for compute and storage needs. At Splunk, everything but Engineering applications was delivered via cloud subscriptions, and half of the compute and storage needed for Engineering came from EC2. More on that in future posts.

Harness Opportunity

The most impactful thing I learned at Splunk is the tremendous opportunity CIOs have to harness what the market calls “Big Data” and what Splunk refers to as “machine data.” In this context, “machine data” can be thought of as system logs, sensor readings, and the results of polling and measuring machine behavior. Every computer system, storage device, web server, app, and database spews forth machine data – much of it delivered as a constant, real-time stream from the machine, and almost all of it in text format. The original application of Splunk was data center management. What was built worked equally well for application management, security, business and web analytics, and more recently, for monitoring and analyzing devices connected as “the Internet of Things.” Results come from searching through the data and formulating analytics from its content – ranging from “Are the machines up? Are there signs of imminent failure? Are there attempts to infiltrate and hack the system?” to “Has Joe taken his heart monitor off?” Uses are limited only by the imagination. What can you do with your data? Learn more in my next post, or visit me at StrataFusion.

*Amazon Elastic Compute Cloud (Amazon EC2) provides scalable computing capacity in the Amazon Web Services (AWS) cloud.
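Because machine data is almost all text, the searches described above are, at bottom, pattern matching over streams of log lines. A toy sketch of the “are there attempts to infiltrate the system?” question, with made-up syslog entries and an arbitrary threshold:

```python
# Sketch: searching raw machine data (text log lines) for a security signal.
# The log lines and the threshold of 3 are hypothetical, for illustration.
syslog = [
    "Mar 01 10:00:01 host1 sshd: Accepted password for doug",
    "Mar 01 10:00:02 host1 sshd: Failed password for root",
    "Mar 01 10:00:03 host1 sshd: Failed password for root",
    "Mar 01 10:00:04 host1 sshd: Failed password for root",
]

# Step 1: search - keep only the lines matching the pattern of interest
failed = [line for line in syslog if "Failed password" in line]
print(len(failed))  # 3

# Step 2: analytics - turn the matches into a crude signal
if len(failed) >= 3:
    print("possible brute-force attempt")
```

Real platforms index the data so such searches run over billions of events in near real time, but the conceptual pipeline – ingest text, search it, compute on the matches – is the same.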