Tuesday, December 01, 2009

The Top 10 Trends for 2010 in Analytics, Business Intelligence, and Performance Management

In the wake of the long-running, massive consolidation in the Enterprise Software industry that reached its zenith with the 2007 acquisitions of Business Intelligence market leaders Hyperion, Cognos, and Business Objects, one could certainly have been forgiven for being less than optimistic about the prospects for innovation in the Analytics, Business Intelligence, and Performance Management markets.  This is especially true given the dozens of innovative companies that each of these large best of breed vendors had themselves acquired before being acquired in turn.  And while the pace of innovation at the large vendors has slowed to a crawl as they digest the former best of breed market leaders, thankfully for the health of the industry, the market overall tells a very different story.  It has in fact shown itself to be very vibrant, with a resurgence of innovative offerings springing up in the wake of the fall of the largest best of breed vendors.

So what are the trends and where do I see the industry evolving to?  Few of these are mutually exclusive, but in order to provide some categorization to the discussion, they have been broken down as follows:

1.  We will witness the emergence of packaged strategy-driven execution applications. As we discussed in Driven to Perform: Risk-Aware Performance Management From Strategy Through Execution (Nenshad Bardoliwalla, Stephanie Buscemi, and Denise Broady, New York, NY, Evolved Technologist Press, 2009), the end state for next-generation business applications is not merely to align the transactional execution processes contained in applications like ERP, CRM, and SCM with the strategic analytics of performance and risk management of the organization, but for those strategic analytics to literally drive execution.  We called this “Strategy-Driven Execution”, the complete fusion of goals, initiatives, plans, forecasts, risks, controls, performance monitoring, and optimization with transactional processes.  Visionary applications such as those provided by Workday and SalesForce.com with embedded real-time contextual reporting available directly in the application (not as a bolt-on), and Oracle’s entire Fusion suite layering Essbase and OBIEE capabilities tightly into the applications' logic, clearly portend the increasing fusion of analytic and transactional capability in the context of business processes and this will only increase.

2.  The holy grail of the predictive, real-time enterprise will start to deliver on its promises.  While classic analytic tools and applications have always done a good job of helping users understand what has happened and then analyze the root causes behind this performance, the value of this information is often stale before it reaches its intended audience.  The holy grail of analytic technologies has always been the promise of being able to predict future outcomes by sensing and responding, with minimal latency between event and decision point.  This has manifested itself in the resurgence of interest in event-driven architectures that leverage a technology known as Complex Event Processing alongside predictive analytics.  The predictive capabilities appear to be on their way to breakout market acceptance, as evidenced by IBM’s significant investment in setting up its Business Analytics and Optimization practice with 4,000 dedicated consultants, combined with the massive product portfolio of the Cognos and recently acquired SPSS assets.  Similarly, Complex Event Processing capabilities, a staple of extremely data-intensive, algorithmically sophisticated industries such as financial services, have also become interesting to a number of other industries that otherwise cannot cope with the amount of real-time data being generated and need to be able to capture value and decide instantaneously.  Combining these capabilities will lead to new classes of applications for business management that were unimaginable a decade ago.
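The core Complex Event Processing pattern is easy to sketch: conditions are evaluated over a sliding window of events in motion rather than over rows at rest. The following is a minimal illustration, not any vendor's engine, and the fraud-style rule and its thresholds are invented for the example:

```python
from collections import deque

class SlidingWindowRule:
    """Flags an account when its purchase total within the last
    `window_secs` seconds exceeds `threshold` -- a classic CEP pattern."""

    def __init__(self, window_secs=60, threshold=1000.0):
        self.window_secs = window_secs
        self.threshold = threshold
        self.events = {}  # account -> deque of (timestamp, amount)

    def on_event(self, account, timestamp, amount):
        q = self.events.setdefault(account, deque())
        q.append((timestamp, amount))
        # Evict events that have fallen out of the time window
        while q and q[0][0] <= timestamp - self.window_secs:
            q.popleft()
        total = sum(amount for _, amount in q)
        return total > self.threshold  # True -> raise an alert

rule = SlidingWindowRule(window_secs=60, threshold=1000.0)
print(rule.on_event("acct1", 0, 400))    # False: only 400 in the window
print(rule.on_event("acct1", 30, 700))   # True: 1100 within 60 seconds
print(rule.on_event("acct1", 120, 200))  # False: earlier events evicted
```

The point of the pattern is that the decision happens as the event arrives, with no round trip to a warehouse; production engines add query languages and distribution on top of exactly this windowed evaluation.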

3.  The industry will put reporting and slice-and-dice capabilities in their appropriate places and return to its decision-centric roots with a healthy dose of Web 2.0 style collaboration.  It was clear to the pioneers of this industry, beginning as early as H.P. Luhn's brilliant visionary piece A Business Intelligence System from 1958, that the goal of these technologies was to support business decision-making activities, and we can trace the roots of modern analytics, business intelligence, and performance management to the decision-support notion of decades earlier.  But somewhere along the way, business intelligence became synonymous with reporting and slicing-and-dicing, a metaphor that suits analysts, but not the average end-user.  This has contributed to the paltry BI adoption rates of approximately 25% bandied about in the industry, despite the fact that investment in BI, and its priority for companies, has never been higher over the last five years.  Making report production cheaper to the point of nearly being free, something SaaS BI is poised to do (see below), is still unlikely to improve this situation much.  Instead, we will see a resurgence in collaborative, decision-centric business intelligence offerings that make decisions the central focus.  From an operational perspective, this is certainly in evidence with the proliferation of rules-based approaches that can automate thousands of operational decisions with little human intervention.  However, for more tactical and strategic decisions, mash-ups will allow users to assemble all of the relevant data for making a decision, social capabilities will allow users to discuss this relevant data to generate “crowdsourced” wisdom, and explicit decisions, along with automated inferences, will be captured and correlated against outcomes.
This will allow decision-centric business intelligence to make recommendations within process contexts for what the appropriate next action should be, along with confidence intervals for the expected outcome, as well as being able to tell the user what the risks of her decisions are and how it will impact both the company’s and her own personal performance.
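The rules-based automation of operational decisions described above reduces to a simple pattern: each rule pairs a predicate with a decision, rules are evaluated in priority order, and every decision is logged so it can later be correlated against outcomes. A toy sketch follows; the rule names, thresholds, and actions are all invented for illustration:

```python
decision_log = []  # decisions captured for later correlation with outcomes

RULES = [
    # (name, predicate, decision) -- evaluated in order, first match wins
    ("high_value_at_risk",
     lambda c: c["value"] > 10_000 and c["churn_score"] > 0.7,
     "escalate_to_rep"),
    ("likely_churner",
     lambda c: c["churn_score"] > 0.7,
     "send_retention_offer"),
    ("upsell_candidate",
     lambda c: c["value"] > 5_000,
     "recommend_upsell"),
]

def decide(customer):
    for name, predicate, decision in RULES:
        if predicate(customer):
            # Capture the explicit decision for outcome correlation
            decision_log.append(
                {"customer": customer["id"], "rule": name, "decision": decision})
            return decision
    return "no_action"

print(decide({"id": 1, "value": 12_000, "churn_score": 0.9}))  # escalate_to_rep
print(decide({"id": 2, "value": 2_000, "churn_score": 0.8}))   # send_retention_offer
print(decide({"id": 3, "value": 1_000, "churn_score": 0.1}))   # no_action
```

The logged decisions are what make the "correlated against outcomes" step possible: once outcomes arrive, each rule's hit rate can be measured and the rule set tuned.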

4.  Performance, risk, and compliance management will continue to become unified in a process-based framework and make the leap out of the CFO’s office.  The disciplines of performance, risk, and compliance management have been considered separate for a long time, but the walls are breaking down, as we documented thoroughly in Driven to Perform.  Performance management begins with the goals that the organization is trying to achieve, and as risk management has evolved from its siloed roots into Enterprise Risk Management, it has become clear that risks must be identified and assessed in light of this same goal context.  Similarly, in the wake of Sarbanes-Oxley, as compliance has become an extremely thorny and expensive issue for companies of all sizes, modern approaches suggest that compliance is ineffective when cast as a process of signing off on thousands of individual checklist items, and should instead be based on an organization’s risks.  All three of these disciplines need to become unified in a process-based framework that allows for effective organizational governance.  And while financial performance, risk, and compliance management are clearly the areas of most significant investment for most companies, it is clear that these concerns are now finally becoming enterprise-level plays that are escaping the confines of the Office of the CFO.  We will continue to witness significant investment in sales and marketing performance management, with vendors like Right90 continuing to gain traction in improving the sales forecasting process and vendors like Varicent receiving a hefty $35 million venture round this year, no doubt thanks to over 100% year-over-year growth in the burgeoning Sales Performance Management category.  My former Siebel colleague, Bruce Cleveland, now a partner at Interwest, makes the case for this market expansion of performance management into the front office rather convincingly and has invested correspondingly.
  
5.  SaaS / Cloud BI Tools will steal significant revenue from on-premise vendors but also fight for limited oxygen amongst themselves.  By many accounts, this was the year that SaaS-based offerings hit the mainstream due to their numerous advantages over on-premise offerings, and this certainly was in evidence with the significant uptick in investment and market visibility of SaaS BI vendors.  Although much was made of the folding of LucidEra, one of the original pioneers in the space, and while other vendors like BlinkLogic folded as well, vendors like Birst, PivotLink, Good Data, Indicee and others continue to announce wins at a fair clip, along with innovations at a fraction of the cost of their on-premise brethren.  From a functionality perspective, these tools offer great usability, some collaboration features, strong visualization capabilities, and an ease-of-use not seen with their on-premise equivalents, whereby users are able to manage the system in a self-sufficient fashion without the need for significant IT involvement.  I have long argued that basic reporting and analysis is now a commodity, so there is little reason for any customer to invest in on-premise capabilities at the price/performance ratio that the SaaS vendors are offering (see BI SaaS Vendors Are Not Created Equal).  We should thus expect to see continued diminution of the on-premise vendors’ BI revenue streams as the SaaS BI value proposition goes mainstream, although it wouldn’t be surprising to see acquisitions by the large vendors to stem the tide.  However, with so many small players in the market offering largely similar capabilities, the SaaS BI tools vendors may wind up starving themselves for oxygen as they put price pressure on each other to gain new customers.
Only vendors whose offerings were designed from the beginning for cloud-scale architecture, and thus whose marginal cost per additional user approaches zero, will succeed in such a commodity pricing environment; alternatively, these vendors can go upstream and try to compete in the enterprise, where the risks and rewards of competition are much higher.  On the other hand, packaged SaaS BI applications such as those offered by Host Analytics, Adaptive Planning, and new entrant Anaplan, while showing promising growth, have yet to reach mainstream adoption, but are poised to do so in the coming years.  As with all SaaS applications, addressing key integration and security concerns will remain crucial to driving adoption.

6.  The undeniable arrival of the era of big data will lead to further proliferation in data management alternatives.  While analytic-centric OLAP databases such as Oracle Express, Hyperion Essbase, and Microsoft Analysis Services have been around for decades, they have never held the dominant market share, from an applications consumption perspective, that the RDBMS vendors have enjoyed over the last few decades.  No matter what the application type, the RDBMS seemed to be the answer.  However, we have witnessed an explosion of exciting data management offerings in the last few years that have reinvigorated the information management sector of the industry.  The largest web players such as Google (BigTable), Yahoo (Hadoop), Amazon (Dynamo), and Facebook (Cassandra) have built their own solutions to handle their incredible data volumes, with the open source Hadoop ecosystem and commercial offerings like Cloudera leading the charge in broad awareness.  Additionally, a whole new industry of DBMSs dedicated to analytic workloads has sprung up, with flagship vendors like Netezza, Greenplum, Vertica, Aster Data, and the like delivering significant innovations in in-memory processing, exploitation of parallelism, columnar storage options, and more.  We are already starting to see hybrid approaches between the Hadoop players and the ADBMS players, and even the largest vendors, like Oracle with their Exadata offering, are excited enough to make significant investments in this space.  Additionally, significant opportunities to push application processing into the databases themselves are manifesting themselves.  There have never been so many choices available, and new entrants to the market seem to crop up weekly.  Visionary applications of this technology in areas with massive data volumes, like meteorological forecasting and genomic sequencing, will become possible at hitherto unimaginable price points.
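The programming model behind Hadoop and its siblings is simple enough to sketch in-process: a map phase emits key/value pairs, a shuffle groups them by key, and a reduce phase aggregates each group. Real frameworks distribute exactly these three steps across machines; the word-count example below is the canonical toy case, run here on a single process:

```python
from collections import defaultdict

def map_phase(records):
    # Emit (key, value) pairs -- here, (word, 1) for each word seen
    for line in records:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    # Group all values by key, as the framework's shuffle/sort would
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Aggregate each key's values independently -- trivially parallelizable
    return {key: sum(values) for key, values in groups.items()}

lines = ["big data big deal", "data at rest data in motion"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts["data"])  # 3
print(counts["big"])   # 2
```

Because each reduce group is independent, the same code scales out by partitioning keys across workers, which is the essence of why this model handles web-scale data volumes.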

7.  Advanced Visualization will continue to increase in depth and relevance to broader audiences.  Visionary vendors like Tableau, QlikTech, and Spotfire (now Tibco) made their mark by providing significantly differentiated visualization capabilities compared with the trite bar and pie charts of most BI players' reporting tools.  The latest advances in state-of-the-art UI technologies such as Microsoft’s Silverlight, Adobe Flex, and AJAX via frameworks like Google’s Web Toolkit augur a revolution in state-of-the-art visualization capabilities.  With consumers broadly aware of the power of capabilities like Google Maps or the tactile manipulations possible on the iPhone, these capabilities will find their way into enterprise offerings at rapid speed, lest the gap between the consumer and enterprise realms grow too large and lead to large-scale adoption revolts as a younger generation enters the workforce having never known the green screens of yore.


8.  Open Source offerings will continue to make in-roads against on-premise offerings.  Much as SaaS BI offerings are doing, Open Source offerings in the larger BI market are disrupting the incumbent, closed-source, on-premise vendors.  Vendors like Pentaho and JasperSoft are really starting to hit their stride, with growth percentages well above the industry average, offering complete end-to-end BI stacks at a fraction of the cost of their competitors and thus seeing good bottom-up adoption rates.  This is no doubt in part a function of the brutal economic times companies find themselves experiencing.  Individual parts of the stacks can also be assembled into compelling offerings and receive valuable innovations from both corporate entities as well as dedicated committers:  JFreeChart for charting, Actuate's BIRT for reporting, Mondrian and Jedox's Palo for OLAP servers, DynamoBI's LucidDB for ADBMS, Revolution Computing's R for statistical manipulation, Cloudera's enterprise Hadoop for massive data, EsperTech for CEP, Talend for Data Integration / Data Quality / MDM, and the list goes on.  These offerings have absolutely reached a level of maturity where they are capable of being deployed in the enterprise right alongside any other commercial closed-source vendor offering.

9.  Data Quality, Data Integration, and Data Virtualization will merge with Master Data Management to form a unified Information Management Platform for structured and unstructured data.  Data quality has been the bane of information systems for as long as they have existed, causing many an IT analyst to obsess over it, and data quality issues contribute to significant losses in system adoption, productivity, and time spent addressing them.  Increasingly, data quality and data integration will be interlocked hand-in-hand to ensure the right, cleansed data is moved to downstream sources by attacking the problem at its root.  Vendors including SAP BusinessObjects, SAS, Informatica, and Talend are all providing these capabilities to some degree today.  Of course, with the number of relevant data sources exploding in the enterprise and no way to integrate them all into a single physical location while maintaining agility, vendors like Composite Software are providing data virtualization capabilities, whereby canonical information models can be overlaid on top of information assets regardless of where they are located, capable of addressing the federation of batch, real-time, and event data sources.  These disparate data sources will need to be harmonized by strong Master Data Management capabilities, whereby the definitions of key entities in the enterprise, like customers, suppliers, and products, can be used to provide semantic unification over these distributed data sources.  Finally, structured, semi-structured, and unstructured information will all be able to be extracted, transformed, loaded, and queried from this ubiquitous information management platform by leveraging text analytics capabilities, which continue to grow in importance, and combining them with data virtualization capabilities.
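The interplay of data virtualization and master data management sketched above amounts to mapping each source's local identifiers onto a canonical master key at query time, then federating across the live sources without physically consolidating them. A toy illustration follows; the source names, fields, and master index are invented for the example:

```python
# Master data: local IDs in each source mapped to one canonical customer key
MASTER_INDEX = {
    ("crm",     "C-100"): "CUST-1",
    ("billing", "9934"):  "CUST-1",   # same real-world customer as C-100
    ("crm",     "C-200"): "CUST-2",
}

# Two "live" sources that are never physically merged
crm_rows     = [{"id": "C-100", "name": "Acme Corp"},
                {"id": "C-200", "name": "Globex"}]
billing_rows = [{"acct": "9934", "balance": 1250.0}]

def federated_customer_view():
    """Join the two sources through the master index at query time."""
    view = {}
    for row in crm_rows:
        key = MASTER_INDEX[("crm", row["id"])]
        view.setdefault(key, {})["name"] = row["name"]
    for row in billing_rows:
        key = MASTER_INDEX[("billing", row["acct"])]
        view.setdefault(key, {})["balance"] = row["balance"]
    return view

print(federated_customer_view()["CUST-1"])
# {'name': 'Acme Corp', 'balance': 1250.0}
```

The master index is the MDM contribution (semantic unification of "the same customer"), while building the view lazily from live sources, rather than copying them into a warehouse, is the virtualization contribution.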

10. Excel will continue to provide the dominant paradigm for end-user BI consumption.  Excel, by far the number one analytic tool, with a home on hundreds of millions of personal desktops, has seen significant investment from Microsoft to ensure its continued viability as it moves past its second decade of existence, and its adoption shows absolutely no sign of abating any time soon.  With Excel 2010's arrival, this includes significantly enhanced charting capabilities, a server-based mode first released in 2007 called Excel Services, first-class citizenship in SharePoint, and the biggest disruptor, the launch of PowerPivot, an extremely fast, scalable, in-memory analytic engine that allows Excel analysis on millions of rows of data at sub-second speeds.  While many vendors have tried in vain to displace Excel from the desktops of the business user for more than two decades, none will be any closer to succeeding any time soon.  Microsoft will continue to make sure of that.

And so ends my list of prognostications for Analytics, Business Intelligence, and Performance Management in 2010! What are yours? I welcome your feedback on this list and look forward to hearing your own views on the topic.

Tuesday, November 24, 2009

Perspectives from DreamForce ’09: On the current state of SaaS in the Large Enterprise with a focus on current SaaS BI trends

I had the pleasure of conducting an interview at the DreamForce '09 event with Dennis Howlett, who writes the well-written and widely-read Irregular Enterprise blog on ZDNet.  The video of the interview is embedded immediately below and a transcript of our discussion follows.




NDB: Hi. I’m Nenshad Bardoliwalla and I’m the author of Driven to Perform and formerly an executive with SAP.

DAH: Ok.  So, Nenshad, what I’m interested in is how you see the difference between the cloud operators that are presenting at events like SalesForce (DreamForce '09) compared to the more traditional events that we see around the place.  So what are the real differences you are seeing?

NDB: One of the key things we saw in yesterday’s keynote as well as today’s is the pace of innovation.  What you are seeing with Salesforce.com, with their Sales Cloud, Service Cloud, Custom Cloud, and then with the introduction of Chatter, is that they are able to introduce large pieces of functionality to their customer base in very quick deployment cycles, in a way that is almost impossible for the on-premise vendors to match.  So, releasing major pieces of functionality in 6-month to 1-year intervals that they can deploy to their entire installed base, versus the 12-18 month cycles of the on-premise vendors that require an upgrade, is a very significant differentiator.

DAH:  So what does that mean in real terms for people who are buying technology today and going in to the future?

NDB:  I think that if I look at the amount of money that customers are spending on the existing on-premise applications that they have versus the amount of money they spend on the cloud (based offerings), they are able to realize value much quicker than they could with the on-premise technologies they have.  Because the cloud vendors are able to distribute the cost of the platform across a very, very large economic base, the economies-of-scale efficiencies get passed to the customer.  So you wind up paying a lot less up front, on a very consistent, metered basis with the cloud offerings, and you are also able to absorb the innovation much quicker and deploy it to your business.

DAH:  It’s very obvious to me that start-ups and young companies are going to be immediate targets for this kind of thing, but what about companies that have been around 20, 30, 50, 100 years?  Do you see those (companies) making a move into this area any time soon, and if so, what sort of areas would you suggest?

NDB:  I think what we’re seeing in the large enterprise space is that there has certainly already been considerable penetration of human capital management solutions, as well as in the CRM area.  I’ve heard fairly consistently at SalesForce.com (at the DreamForce ’09 event) of 5,000, 10,000, even 12,000 user deployments.  I think even flagship, marquee customers of the on-premise vendors, like Siemens adopting 420,000 seats of a SaaS vendor, suggest that the large enterprise is definitely willing to absorb the innovations of the cloud computing model.  That being said, I think you find areas like financials, an area that you (DAH) and I both share a keen interest and expertise in, coming up alongside the human capital management and sales force areas, and now starting to take hold with vendors like Workday, but also others like Intacct, FinancialForce (which you wrote about earlier), etc.

DAH:  Your interest obviously tends to be in the BI area.  What sort of impact do you see these kinds of technologies having in that area and what benefits will customers see as we move forward?

NDB:  In the business intelligence area you have seen an explosion, especially in the last year, in terms of investment in companies who are providing SaaS BI capabilities; companies like Birst, PivotLink, Good Data, Indicee.  This suggests there is a very, very healthy market among those who have wanted business intelligence capabilities but could never afford them, first in small and medium-sized businesses, but also in the larger enterprise.  I know from my 10 years of experience in the large enterprise, though not in SaaS, that customers were not able to change their systems quickly enough, which for (reporting and) analysis is a very key differentiator.  Having the ability to do that with the SaaS vendors is a key step up, I think, for business intelligence technologies.

DAH:  It’s a very fast moving area.  What do you see in the immediate future, say in the next 1 to 3 years (for the BI space)?

NDB:  I think right now we are seeing a very strong focus on tools, and as you know, Dennis, in most markets, things that start off as tools eventually become packaged, move upstream, and become applications.  I think we will see in the next 1 to 3 years a couple of key marquee application players in the BI space, people who are doing Corporate Performance Management, people who are doing Sales Performance Management, or even Supply Chain Performance Management.  We'll have real established application vendors for BI applications that can run alongside the SalesForce.coms, the Workdays, and other marquee Tier 1 SaaS platforms.

DAH:  And they would, in that sense, need to be pretty affordable, I would guess, as compared to what we see at the moment with the traditional vendors, yes?

NDB:  Absolutely.  I think affordability is one of the key areas where there will be differentiation, but also another is “legs and regs” (legislation and regulations, especially important in Financials).  I think we are seeing now a fairly high degree of volatility in terms of the move to IFRS, people publishing via XBRL, other types of standards where people can absorb that and let their vendor take care of that for them instead of them reconfiguring their chart of accounts, them trying to map to XBRL taxonomies, etc.

DAH:  Great.  Thanks very much, indeed.

NDB:  Thank you.

Tuesday, November 17, 2009

The Road to Strategy-Driven Execution - Part II - History Lesson – From Embedded Analytics to Strategy-Driven Execution


At Siebel Systems, where I worked from 2000-2005, after we purchased nQuire in 2001 (which has since become Oracle Business Intelligence Enterprise Edition), we progressed through four phases of linking the strategic (analytic applications) and execution (sales force automation, call center) systems.


 

  • In the first phase, we enabled users to go to Siebel Analytics directly from tabs in Siebel Call Center, a very minimal IFRAME based integration. 
  • In the second, we created a concept of an “action link”, which allowed the user from a Siebel Analytics screen to navigate directly from a result record (e.g. the top opportunity in a top 10 list) to the actual opportunity in the Siebel Sales Force Automation application, where the user could then take immediate action.  This was a very powerful concept that remains key to the Oracle Fusion Applications value proposition.
  • In the third phase, based on technology that I along with my colleague developed, we provided customers the ability to embed contextual analytics from Siebel Analytics directly into the SFA or Call Center application.  For example, the top part of the screen would have a list of the 10 opportunities assigned to a sales rep, and in the bottom part, a set of contextual analytics about that opportunity.  As each new opportunity was highlighted, the analytics would update without the user ever leaving their context.  This opened up a whole new range of possibilities.





  • In the fourth phase, leveraging technology from Sigma Dynamics and now called Oracle Real-Time Decisions, using a combination of predictive analytics and business rules (see James Taylor’s excellent blog for a wealth of information on the topic), we were able to prescribe at the moment of contact with the customer a recommendation for how to treat them (e.g. upsell, free offer, etc.) based on a combination of their current context (for example what they pressed in the IVR tree) along with their historical data from the data warehouse.  Here then was the combination of real-time, predictive analytics enacted at the moment of insight that we had all been waiting for in the industry, delivered in a packaged fashion.
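This fourth-phase pattern, a predictive score evaluated at the moment of contact and then gated by business rules, can be sketched in a few lines. The scoring logic, offer names, and thresholds below are all invented for illustration and stand in for a real trained model and rule repository:

```python
def upsell_propensity(context, history):
    """Stand-in for a real predictive model: combines a live signal
    (what the caller pressed in the IVR) with warehoused history."""
    score = 0.2
    if context.get("ivr_selection") == "billing":
        score += 0.3
    if history.get("purchases_last_year", 0) >= 3:
        score += 0.4
    return score

def recommend(context, history):
    score = upsell_propensity(context, history)
    # Business rules constrain what the model is allowed to recommend
    if history.get("open_complaint"):
        return "service_recovery"      # never upsell an unhappy customer
    if score >= 0.6:
        return "premium_upsell"
    if score >= 0.4:
        return "free_trial_offer"
    return "standard_service"

print(recommend({"ivr_selection": "billing"}, {"purchases_last_year": 4}))
# premium_upsell
print(recommend({"ivr_selection": "billing"},
                {"purchases_last_year": 4, "open_complaint": True}))
# service_recovery
```

The division of labor is the important part: the model estimates likelihoods from current context plus historical data, while the rules encode policy that must override any score.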


From a technology perspective, what was great about the Siebel Analytics offering was its rich common information model combined with a very rich web layer.  Through a combination of internal development as well as the purchase of Informatica’s analytical applications IP, we were able to develop a comprehensive package of analytic applications that spanned many roles, processes, and metrics, from sales to service to supply chain to financials to human resources, which Oracle still sells quite successfully today.  However, it wasn’t until I went to Hyperion Solutions that I understood just how rich the analytical world could be.  The reporting, dashboards, and packaged analysis capabilities of Siebel Analytics were just a small part of a much larger world that contained scorecarding, initiative management, business modeling, activity-based costing, planning, budgeting, and forecasting, and financial consolidation and reporting.  But Hyperion played almost exclusively in finance, and my immediate thought at the time was, “how do we layer the Hyperion scorecarding, planning, and modeling solutions across the entire enterprise value chain just like we had done with Siebel Analytics?”  If a company could do that, and link to the transactional systems, I believed it would be industry changing.


Sure enough, right around that time, SAP announced their xApp Analytic Applications, and I was stunned.  This was a family of analytic applications, built using a visionary model-driven tool called Visual Composer, that was designed from the outset to integrate analytics and transactions using web services.  Even more interesting was that the applications had an incredible Flex-based user experience that was easily competitive with all the leading BI vendors.









I was convinced at that time that if any vendor could deliver on the vision of Strategy-Driven Execution, it had to be SAP.  They owned the largest transactional applications installed base in the world, understood processes across many horizontal and vertical slices, were service-enabling all of those processes, and now had an exciting analytic capability.  I jumped at the chance to be a part of this in 2005.

In short order, things changed very dramatically in the technology landscape.  Oracle acquired PeopleSoft, Siebel, and then Hyperion, thereby also amassing an incredible set of strategic and execution assets, and SAP acquired Virsa, Pilot Software, OutlookSoft, and Business Objects.  The pieces were now all in place for SAP and/or Oracle to realize the vision of Strategy-Driven Execution in their comprehensive product portfolios.

Tuesday, November 10, 2009

Palladium 2009 Americas Summit - How To Take Kaplan and Norton's Management System from "Execution Premium" to the Next Level with "Driven to Perform"

It is impossible to be a student of the performance management discipline without knowing the names of Robert Kaplan & David Norton, whose multiple best-selling books, such as The Balanced Scorecard and Strategy Maps, are required reading for anyone in this industry.  Together, Kaplan & Norton have made numerous seminal contributions, including the balanced scorecard, strategy maps, and time-driven activity-based costing.  I had the honor of attending Professor Kaplan's executive education course at Harvard Business School, called Driving Corporate Performance, with co-author Denise Broady in 2007, and it was a wonderful experience.  During the course, I started to see how all the pieces Professor Kaplan had articulated could be synthesized into a unified framework, and I asked him about this.  He smiled and told me that this would be the topic of his next book, which was Execution Premium, released in the middle of 2008.  I view this book as truly the crowning synthesis of his work with Norton.  It is no secret that my co-authors and I began writing Driven to Perform: Risk-Aware Performance Management From Strategy Through Execution right around the time the book came out, and speaking for myself only, their book was certainly influential because of this synthesis, just as the others were.

On the eve of the Palladium 2009 Americas Summit, which promises to be a phenomenal event where both Kaplan and Norton will be giving keynotes, I wanted to point out, with the greatest humility possible, how  Driven to Perform builds on the ideas of Execution Premium and takes it to the next level. What are the three big ideas in Driven to Perform that take the Management System from Execution Premium to the Next Level?

- Driven to Perform unifies Performance, Risk, and Compliance Management in a process-based framework.  At today's opening sessions at the event, Professor Kaplan noted that risk management was an area he hoped to focus on in the future as it's an even bigger challenge than ABC and the Balanced Scorecard.  In Driven to Perform, performance management, risk management, and compliance management are woven together into a single strategic management process as shown in the diagram below:

 - Driven to Perform shows how this unified Performance, Risk, and Compliance process-based framework applies to nine different areas of the value chain:  Sales, Marketing, Service, Supply Chain, Product Development, Procurement, Finance, HR, and IT.  We include the processes, roles, metrics, collaboration points, and maturity models of each different line of business and their interlinkages, as you can see in the diagram below:

 -  Driven to Perform shows you how to literally drive execution with your strategy using the actual transactional business processes of the modern corporation to enable what we call strategy-driven execution.  For example, we show this transactional order process where we overlay the goals, risks, forecasts, and controls directly on top to show how they should intelligently drive the process directly:


My co-author Stephanie Buscemi will be at the event signing books.  Make sure you get a copy or you can order yours online!

A Recipe for Guaranteeing Failure - The Misalignment of People, Process, and Projects with Performance

I had the pleasure of recording a podcast led by Michael Krigsman with Naomi Bloom entitled Enterprise unplugged: Riffing on failure and performance.  Michael Krigsman is CEO of Asuret, Inc., a software and consulting company dedicated to reducing software implementation failures and writes the popular ZDNet blog on IT Project Failures. I have been most impressed by his steadfast dedication to cataloging in great detail the root causes of IT project failures in an effort to ensure a higher success rate.  If you are a practitioner in any aspect of project implementation, including Analytics, Business Intelligence, and Performance Management, his blog is a must read.

Naomi Bloom is a top consultant, analyst, writer, and thought leader throughout the HRM delivery system (HRMDS) industry.  She just launched a great site at In Full Bloom that I am certain will become one of the de facto destinations for HRM related information in short order.  To launch her blog, she wrote two fantastic posts on HRM measurement, The Road From HRM To Business Results Is Littered With Misguided Metrics Part I and Part II, that honestly could have come directly from Driven to Perform.  I sincerely respect her integrity, no-nonsense approach, and phenomenal amount of domain knowledge in her space.

Given our three areas of expertise, we converged fairly quickly on the theme "A Recipe for Guaranteeing Failure - The Misalignment of People, Process, and Projects with Performance".  Every successful initiative in the enterprise must start from the outcomes a business is trying to achieve, whether it's a new series of HR initiatives, IT projects, or Business Intelligence initiatives.  Without constant, focused, and diligent effort to ensure alignment between people, process, projects, and performance, failure is all but guaranteed, a message I internalized from my time working closely with Jonathan Becher, former CEO of Pilot Software and now SVP of Enterprise Solution Marketing at SAP.  We also discuss some really exciting ideas about the visual metaphors by which performance management can evolve into a discipline that truly touches every person in a company, by helping them understand exactly how one change in the enterprise impacts all the others.

Please read Michael's blog post and listen to the podcast; as always, your feedback is welcome in the comments.  Enjoy!

Thursday, November 05, 2009

Is Enterprise 2.0 a Savior or a Charlatan? How Strategy-Driven Execution can pave the path to proving legitimate business value

I have followed the evolution of this topic since Andrew McAfee coined the phrase "Enterprise 2.0" in his spring 2006 Sloan Management Review article to describe the use of Web 2.0 tools and approaches by businesses.  I was also really excited to attend the Enterprise 2.0 2009 conference in San Francisco for the first time.  In this post, I want to describe what I saw at the conference, what I believe to be the missing components of the full Enterprise 2.0 picture, and how becoming "Driven to Perform" by understanding Strategy-Driven Execution is the best way to justify the value of Enterprise 2.0 in your organization.


It Starts With The Seminal Definition of Enterprise 2.0

First, we should start with a definition of Enterprise 2.0.  There are as many definitions as there are pundits, and I think it's important to stick to a definition that has been widely adopted from a reputable authority.  Therefore I will use Andrew McAfee's definition, which he provided here and which most closely resonates with my own:

Enterprise 2.0 is the use of emergent social software platforms within companies, or between companies and their partners or customers.

Social software enables people to rendezvous, connect or collaborate through computer-mediated communication and to form online communities. (Wikipedia’s definition).

Platforms are digital environments in which contributions and interactions are globally visible and persistent over time.

Emergent means that the software is freeform, and that it contains mechanisms to let the patterns and structure inherent in people’s interactions become visible over time.

Freeform means that the software is most or all of the following:
  • Optional
  • Free of up-front workflow
  • Egalitarian, or indifferent to formal organizational identities
  • Accepting of many types of data
The organizers of the Enterprise 2.0 Conference delineate the difference between 1.0 and 2.0 as below:

Enterprise 1.0 → Enterprise 2.0
Hierarchy → Flat organization
Friction → Ease of organizational flow
Bureaucracy → Agility
Inflexibility → Flexibility
IT-driven technology / lack of user control → User-driven technology
Top down → Bottom up
Centralized → Distributed
Teams are in one building / one time zone → Teams are global
Silos and boundaries → Fuzzy boundaries, open borders
Need to know → Transparency
Information systems are structured and dictated → Information systems are emergent
Taxonomies → Folksonomies
Overly complex → Simple
Closed / proprietary standards → Open
Scheduled → On demand
Long time-to-market cycles → Short time-to-market cycles

The false dichotomy between the definitions of Enterprise 1.0 and Enterprise 2.0

These definitions of Enterprise 2.0 and their juxtaposition against the definitions of Enterprise 1.0 are misguided.  I am certain based on my experience that the free form emergent world depicted as Enterprise 2.0 is NOT an evolution from the structured world of Enterprise 1.0, but rather, the two will exist in an intertwined tapestry that defines the full breadth of what today's enterprises need to look like.  It's extremely unhealthy for our industry to pit these two worlds against each other because they will perpetually co-exist.

I believe a significant part of the problem that crops up in the Enterprise 2.0 value discussions stems from the fact that the champions of Enterprise 2.0 significantly underweight the complexity and pervasiveness of the existing information technologies in the enterprise and the reasons why these technologies evolved. Earlier this week, Miko Matsumura wrote an excellent blog entry entitled Top 5 Definitions of Enterprise: focusing on the Enterprise in "Enterprise 2.0" that really went to the heart of the matter, prompting Michael Krigsman to also reproduce it in his own blog entry to underscore its importance.

The modern information technology environment in the enterprise looks a lot more like the diagram below from Driven to Perform than like the collection of e-mail, blogs, wikis, and the like that many Enterprise 2.0 advocates overemphasize, although those tools are clearly part of the fabric woven together in what is commonly referred to as information work.



The Components of a Modern Automation and Information Infrastructure


The rich repository of highly structured and semi-structured business processes, and the associated applications and infrastructure necessary to run a modern business, cannot and likely will not ever migrate to the Enterprise 2.0 definition, nor should they; but they will increasingly be augmented by it.  We discuss many of them in Driven to Perform and orient them in a cross-enterprise value chain based on Michael Porter's seminal work.


The Integrated Business Processes of the Modern Enterprise

The key activity steps of enterprise business processes embodied in today's ERP, CRM, SCM, and other software, such as order-to-cash, procure-to-pay, hire-to-retire, or record-to-report, need to be highly structured for a variety of reasons, not least efficiency, their primary reason for being, but also the significant compliance concerns they address.  I don't foresee a point in the near future where enterprises will apply Enterprise 2.0 principles to the core of accounting, payroll, or order management, because doing so poses serious risks to a business.  These enterprise business processes are complicated enough without the unstructured processes surrounding them, as you can see in this offer creation process, which we diagrammed in Driven to Perform in our chapter on Risk-Aware Marketing Performance Management:



Creating the Optimal Offer

However, what you notice in any enterprise business process like the one above is that there is a lot of white space.  The white space is where people act as human integrators: where the various people from marketing, the contact center, sales, and operations fill in the gaps that their enterprise software does a poor job of addressing today.  It is in these process contexts that wikis, blogs, instant messaging, and the like can perform a brilliant and valuable service, and for certain processes, form the entire substrate upon which the enterprise process can manifest.  Thus, ultimately, the real Enterprise 2.0, the weaving together of the structured and unstructured worlds, really looks a lot more like this:


The real Enterprise 2.0:  A combination of structured and unstructured processes 
and technologies woven together to achieve desired business outcomes

Enterprise 2.0 technology is only starting to become really enterprise ready

Because of a lack of understanding of how modern enterprise infrastructure works, my belief is that many of the existing E2.0 offerings have gaping holes that must be addressed if they are to survive long term in the environment they wish to play in, beyond the proverbial "server under someone's desk".  I was surprised, for example, when the speaker during the Google Wave keynote described the desire to put as few restrictions in the security model as possible to ensure the maximum degree of collaboration.  Examples like these, among many others, show that there is a long way to go before these tools can be used pervasively in the enterprise without serious repercussions.  I am certain that regulatory requirements around archiving, audit, document retention, and privacy, along with technical requirements like delegated authentication and encryption, cannot be adequately addressed by many of these tools in their current ungoverned state, and this will be a liability to their adoption until it is addressed.

It should be noted that the more sophisticated vendors absolutely understand what they need to do to be viable in a truly enterprise context but they are decidedly in the minority.  Linden Lab, creators of the Second Life 3D virtual world, had a major announcement at the conference in unveiling Second Life Enterprise that had nothing to do with sexier avatars, but instead decidedly focused on the unsexy topics like providing a private and secure virtual environment with enterprise manageability capabilities.  Similarly, Novell, long a networking and infrastructure stalwart from Enterprise 1.0, unveiled Novell Pulse, with a set of enterprise class capabilities on top of Google Wave.  These vendors realize that completely unstructured capabilities that do not bolt into the enterprise mechanisms of governance have little chance for broadscale adoption.

Conversely, leading enterprise application players like Workday are starting from the robust ERP and HCM process perspective of the so-called "Enterprise 1.0" world and layering many social constructs such as tagging, inline collaboration, etc. into their applications to deliver on this converged Enterprise 2.0 notion I describe above.  This movement from the process-based applications to embrace the unstructured social world is well summarized by R Ray Wang's notion of Social Enterprise Apps, which combine the process and social worlds:


Source: Software Insider's Point of View - 10 Elements Of Social Enterprise Business Solutions and Platforms


The advocates and skeptics talk past each other when it comes to Enterprise 2.0

Unfortunately, the current Enterprise 2.0 dialogue has much room for improvement, because we are all talking past each other.  When I was asked by Jennifer Leggio to contribute to her blog entry entitled 2010 Predictions: Will social media reach ubiquity?, I responded:
In 2009, a tremendous amount of noise in the marketplace surrounding social media has reached a fever pitch and this threatens to drown out its potential effect to be transformative in the enterprise. Those projects and vendors that customers were willing to experiment with in 2009 will need to tie their efforts to concrete performance improvements in order to remain viable as social media’s sheen of being the new kid on the block wears off.


I realize that Jennifer's request was about social media, and social media does not equal Enterprise 2.0, but I think it's fair to equate the usage of social media technologies within the Enterprise as a loose approximation of the term, and that is what spurred my response.


To be certain, there is a raging debate in response to this hype in the enterprise and as supporters become more zealous the detractors become more incendiary.  Dennis Howlett kicked things off with his August 26, 2009 post Enterprise 2.0: what a crock, arguing that:

Like it or not, large enterprises - the big name brands - have to work in structures and hierarchies that most E2.0 mavens ridicule but can’t come up with alternatives that make any sort of corporate sense. Therein lies the Big Lie. Enterprise 2.0 pre-supposes that you can upend hierarchies for the benefit of all. Yet none of that thinking has a credible use case you can generalize back to business types - except: knowledge based businesses such as legal, accounting, architects etc. Even then - where are the use cases? I’d like to know.

Needless to say, it received a lot of attention.  In response, an entire panel discussion was convened at the Enterprise 2.0 Conference, literally entitled "Is Enterprise 2.0 A Crock?".  In the panel, an excellent summary of which you can read here from my ex-SAP colleague Timo Elliott, it was absolutely clear that the customers, represented by marquee brand names like EMC, Eli Lilly, and Alcatel-Lucent, saw clear value in their deployments, but it was also clear that this value was very hard to quantify.  What was most disappointing, however, was that there was no debate: all the participants were predisposed to believing Enterprise 2.0 was not a crock, which defeated the point of such a session.  This spurred yet another round of salvos, such as Enterprise 2.0 - the non-debate from Dennis Howlett, where I was quoted about my disappointment in the lack of debate, and Miko Matsumura's balanced perspective in The Enterprise 2.0 Crock.  This prompted Susan Scrupski, who has done a remarkable job in a short period of time with the 2.0 Adoption Council, to fire back with her own post entitled "Checkmate", which listed a veritable who's who of incredible brand names that are all part of the council and clearly see the value in Enterprise 2.0.


Who's right? They are actually both legitimate perspectives, and I am convinced they can be reconciled.  Fortunately, I am not the only one.


Strategy-Driven Execution is the path to legitimizing Enterprise 2.0 business value

Sameer Patel, who writes the excellent Pretzel Logic blog, and Oliver Marks, whose Collaboration 2.0 blog is very popular, are both highly regarded thought leaders in the Enterprise 2.0 space, as reflected in their nominations to the conference's advisory board.  What I appreciate about both of their perspectives is that they clearly understand the enterprise in Enterprise 2.0 and take a very pragmatic approach to championing adoption, driven by prioritizing business value, which aligns exactly with my way of thinking.  This business-value-first mindset was clear even in the session names of their track, such as "Selling the Case for Accelerating Business Performance with Enterprise Collaboration and 2.0 Technologies", "Collaboration at Scale", "Lowering Customer Service Costs Via Social Tools", and "Launching Winning Products in the Market: How Social Software Improves Your Odds".  Notice that the sessions explicitly talk about how collaboration facilitates existing enterprise business processes.  With that as a lens, it suddenly does become possible to quantify the value of Enterprise 2.0 tools.

In Driven to Perform, we looked at dozens of business processes and described the key roles, collaboration points, and metrics that surround those processes to drive business performance.  Using this framework, we can start to ask questions that put serious metrics on top of Enterprise 2.0 investments.  Let's return to the example of the offer management process.  In the original "creating the optimal offer" business process diagram above, there is a lot of white space, because the existing structured marketing enterprise systems do not address the entire end-to-end offer management process.  I argued above that this white space could be filled by collaborative tools.  What are the metrics that a head of marketing might care about surrounding offer management?  She would certainly care about the speed of campaign development, as measured by the time to develop new offerings.  Today, it might take her organization 6 weeks to develop a full offering, which is much too slow in her company's competitive environment.  Can an Enterprise 2.0 tool like a wiki significantly reduce the offer creation time, perhaps even by half, by opening up the process and allowing people to collaborate to generate better offer ideas faster?  Absolutely.  Could an Enterprise 2.0 community forum significantly reduce the time to get responses to various offers from trusted customers?  Absolutely.  CRM thought leaders like Esteban Kolsky on his CRM Intelligence and Strategy blog and Paul Greenberg on his blog Social CRM: The Conversation would tell you the same thing in their excellent treatises on Social CRM.  And there is the business value: hard, quantified ROI.  You can only get that once you understand the goals of your organization, the business processes set in place to achieve them, and the metrics used to instrument those processes.
That's what I call Strategy-Driven Execution, and you will need to understand it if you want Enterprise 2.0, the real Enterprise 2.0, to flourish in the enterprise.
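To make the shape of this quantification concrete, here is a minimal back-of-the-envelope sketch in Python.  Every figure in it (offer counts, dollar values, tool costs) is a hypothetical placeholder I made up for illustration, not a benchmark from the book; the point is only that once you benchmark a process metric like offer-creation time, the ROI calculation itself is simple arithmetic.

```python
# Hypothetical figures for illustration only -- benchmark your own process.
baseline_weeks = 6.0           # current time to develop a full offer
improved_weeks = 3.0           # measured cycle time after, say, a wiki rollout
offers_per_year = 20           # offers the marketing team ships annually
value_per_week_saved = 5000.0  # assumed value of launching one week earlier
tool_cost_per_year = 40000.0   # assumed licensing plus rollout cost

# Total weeks of cycle time recovered across all offers in a year
weeks_saved = (baseline_weeks - improved_weeks) * offers_per_year

# Translate the recovered time into dollars, then net out the tool cost
benefit = weeks_saved * value_per_week_saved
roi = (benefit - tool_cost_per_year) / tool_cost_per_year

print(f"Weeks saved per year: {weeks_saved:.0f}")   # 60
print(f"Annual benefit: ${benefit:,.0f}")           # $300,000
print(f"ROI: {roi:.0%}")                            # 650%
```

The inputs are guesses; the discipline is in the benchmarking.  If you cannot fill in `baseline_weeks` and `improved_weeks` from your own measurements, you do not yet have a business case.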


Driven to Perform is filled with literally hundreds of examples of the ways Enterprise 2.0 tools can drive quantified business value by combining the unstructured and structured processes that drive business performance, but that is not the way Enterprise 2.0 tools are being sold today.  If an Enterprise 2.0 tool can increase the average deal size, reduce cost to serve, increase customer loyalty, decrease new product development time, and so on, then there is a legitimate business use case and a hard ROI associated with it. So my advice to both the zealots and naysayers of Enterprise 2.0 would be to take an existing, legitimate pain point, like offer creation, product development, or customer service, and start by benchmarking your current metrics.  If an Enterprise 2.0 tool can move those metrics in the right direction in a provable way, you will have real, hard ROI. If the tool doesn't contribute to moving those process metrics in the way you hoped, then you might have a problem with your executive sponsor.


So, Is Enterprise 2.0 a Savior or a Charlatan?

It could be either.  The keys to answering this question lie in your understanding of your goals, your current enterprise processes, and their associated metrics.  Without knowing this, we'll all continue talking past each other to our mutual detriment.

 

Thursday, October 29, 2009

"Driven to Perform" Podcast with Jon Reed - EPM, GRC, and the Future of SAP in a SaaS World

Jon Reed is a guru on the subject of SAP implementation skills and runs a great site called Jon Reed on SAP Consulting.  He has been deservedly recognized by SAP for his contributions by being nominated as an SAP Mentor.  We got together to do a podcast on a wide variety of subjects, including the motivation and methodology behind Driven to Perform: Risk-Aware Performance Management From Strategy Through Execution, how SAP's EPM and GRC offerings fulfill the vision behind the book, and the skills necessary to implement the products.  The conversation then turned to a topic that is hot on everyone's mind: the hype versus reality of SaaS and how SAP is responding to this important trend.  You can download the podcast here, or go to this page to read the fairly thorough transcript of the discussion that Jon was kind enough to create, which I would encourage anyone interested in these subjects to do.  I hope you enjoy it!  Let me know your feedback in the comments.

Tuesday, October 27, 2009

PivotLink Blog - Is Reporting Overrated? YES! Drilling down even further

I had the chance to meet with Ajay Dawar a couple of weeks ago.  He is an old friend and someone I looked up to when we were both at Siebel Systems, and I am glad to still be in touch with him now.  Ajay was an expert on Siebel's Marketing Analytics, among numerous other areas, and was one of the early employees of LucidEra, one of the earliest pioneers in SaaS BI.  There are few people in the industry who know more about SaaS BI than he does.  Ajay now works for PivotLink, a leading vendor in the On Demand or SaaS Business Intelligence world, with a recently revamped and now very strong management team.

After our conversation, Ajay posted an intriguing blog post on PivotLink's blog entitled "Is Reporting Overrated?" that has already got a great comments thread running including James Taylor (his excellent blog here) and Jerome Pineau (his excellent blog here):

Late last week I had lunch with Nenshad Bardoliwalla, an ex-VP from SAP’s Business Intelligence group. He has written a great book on Corporate Performance Management. He said (and I paraphrase) that “users don’t really know what to look for in a report and that information is useless without context. So even if you gave all the required reports to a customer they wouldn’t know all the right things to look for.” His point was that the BI industry needs much more than reports. Customers need guidance on what to look for.
If you've read this blog or Driven to Perform: Risk-Aware Performance Management From Strategy Through Execution, you'll know that reporting is one small part of a very rich set of capabilities needed to manage a business effectively:  goal setting, risk management, compliance management, initiative management, planning, budgeting, and forecasting, predictive analytics, data mining, simulation and other types of modeling, and optimization. 

But as to reporting specifically: for as long as I've been in this industry, and from what I can tell, for as long as this industry has been around, we have not been able to get more than 20% of the users in an organization to use query, reporting, and analysis tools, despite continuous attempts.  We've made the reporting tools significantly easier to use, with attractive options from SAP BusinessObjects, Oracle, and IBM Cognos having been available for years.  We now have a new generation of SaaS BI players like PivotLink, Birst, and Good Data that do a credible job of providing the functionality the on-premise vendors do, but with the significant TCO advantages that SaaS can provide: a much lower time to implement, compelling UIs, and nowhere near the manageability headaches of their on-premise counterparts.  Will these newer BI tools be able to break the barrier and increase the adoption of BI technologies in the enterprise?

While I have good friends at every single vendor above and wish all of them nothing but the utmost success, unfortunately, I don't believe so.  I think even vendors with very low costs and very compelling options, like the aforementioned SaaS BI vendors, will still hit the 20% adoption wall, because although their reporting capabilities are excellent, the metaphor of the report is not what end users want.  Reports, the kind that have a grid with a lot of numbers, a number of dimensions, a few metrics, and so on, require too much work to create and too much work to interpret.  If I'm a sales manager in the middle of a task and I have to look at the sales pipeline report, I have to expend a lot of cognitive effort to figure out whether the right filters have been applied, what this number means versus another, and so on.  It's a very specific metaphor whose rightful place is on the analyst's desk, not on the desktop of every user in the enterprise.  Those end users definitely need the information contained within those reports, but delivering it in the report format is unlikely to work, no matter how easy it is to create.  That said, I readily acknowledge that it is too early to tell.  I am an empiricist and always welcome contrarian opinions, and more importantly, data that refutes my hypotheses.

For those users in the enterprise who DO need reports, such as analysts, I think the SaaS BI offerings like PivotLink's and Good Data's are very compelling from a value perspective compared to their on-premise brethren, and all have their sweet spots in terms of differentiation.  But that post is for another day!