Friday, July 12, 2013

The Once and Future Prototyping Tool of Choice

When I first started using Tableau in 2006, Big BI was in its heyday: business data analysis centered on building data warehouses and implementing the BI platforms that fronted them, from which developers (frequently in another country, where they worked for less than local 'resources') would program reports for the business data consumers.

My initial attraction to Tableau was its ability to instantly connect me with, organize, and analyze the data I needed to understand without getting elbow-deep in database internals or SQL coding. I was looking for then-modern tools that could replicate my experience with FOCUS, the original 4GL designed as a replacement for COBOL programming for data analysis reports. FOCUS and Tableau share the design principle of presenting the fundamental data-analytical operations of organizing and quantifying as first-class entities. In FOCUS this takes the form of a programming language of English terms; Tableau models these operations as UI objects within an organizing framework its creators termed VizQL.

Prototype or Real?

With FOCUS, this code:

TABLE FILE EMPLOYEE
BY DEPARTMENT
BY JOB_CODE
BY JOB_DESC
SUM SALARY
END

when run, organizes and sums the Employee data, just as it reads:
 
Department   Job Code   Job Description     Salary
IT           MM1        Manager             $227,062.00
             MS1        Admin Assistant     $49,000.00
             TITD2      Developer           $318,480.00
             TITDBA     DBA                 $138,200.00
             TITSA      Systems Analyst     $121,780.00
Operations   MM1        Manager             $179,700.00
             MMA1       Assistant Manager   $126,862.00
             MS1        Admin Assistant     $117,000.00
             OF         Line Foreman        $121,120.00
             OP1        Punch Operator      $189,500.00

As good as FOCUS was (and it was the best of its kind, the Tableau of its day), it was still a programming language. I was looking for something modern, something elegantly designed to move the data-analytical operations to the cognitive surface, providing direct-action affordances that minimized the concessions I had to make to the machine in order to get the results I was after.

In 2006 Tableau was the best of the bunch. It wasn't perfect, but it was leaps and bounds ahead of its ancestors and simpler and cleaner than its peers. And in its sweet spot it's still the best of the bunch, even as the space gets crowded with new entrants.

In Tableau, the configuration and the data-analytical presentation are consolidated into a single coherent whole.

In both cases the outcome is the same: a real, live data analysis.

Whether the analysis is a prototype depends upon factors extrinsic to whether it provides the appropriate information.

Sometimes it's as simple as the perspective of whoever is considering the situation, e.g. the belief that anything 'real' needs to be built from the ground up as a software development project.

Sometimes the fact that something can be done quickly and easily makes it seem somehow flimsy and insubstantial.

And sometimes there are desirable contextual considerations other than the core data presentation.

Tableau as Data Analysis Tool.

Tableau excels at data discovery—finding interesting, relevant, and valuable aspects of the data. It was so far ahead of anything that had come before that the world took a fair bit of time to even realize things had changed, and that easy, effective, efficient analysis of data in its native state was possible. (Too many of the old guard still hold that analysis of data in its native state is pointless and of no real value; they're wrong, but they're losing.)

As a BI consultant I had Tableau as my secret weapon, my competitive advantage. With Tableau in my toolbox I could access and understand my clients' data faster, easier, and better than they believed possible. In many cases they were so conditioned to the idea that they'd have to wait until the developers could get around to implementing their reports that they couldn't even comprehend that I was able to shed light into their data darkness. So I used Tableau wherever and whenever I could, and for a time had a thriving practice rescuing traditional Big BI projects that had gone off the rails by employing Tableau across the full spectrum of project activities.

Sharing one's discoveries was possible either by creating static representations (PDFs, images, etc.) or by packaging the data and analytics together so that Tableau Reader could be used to view them.

Which brings us around to this post's topic: Tableau's value as a prototyping tool, used for rapid data analysis and identification of those analyses that convey meaningful information, after which the 'real' analytics would be re-implemented with a more appropriate tool or technology.

Tableau as Prototyper - Round 1.

As it became more widely known in corporate circles, Tableau's rapid analytical powers were seen as a way to facilitate traditional BI. In this role Tableau was used to create preliminary analytics, taking advantage of its ability to work against data not yet incorporated into the corporate data warehouse and the speed with which analyses could be created. These analyses would then be used as the models for the Cognos, Business Objects, or other platform reports that would be the "real" BI outputs. One of my projects had me working as the sole Tableau consultant for a Fortune 50 company in this fashion: the company was launching a new web site and had XXX (a mega-big IT vendor) come in to analyze the information requirements, resulting in a Word doc with some Visio sketches of various charts (most of which were less than useful). Working directly with the business stakeholders, I developed a set of Tableau dashboards that were then turned over to the XXX offshore Cognos team for re-implementation.

All in all, this was a natural progression in the life cycle of a disruptive technology. Even the forward corporate (vs. individual) thinkers and early adopters had to try to find a way to integrate Tableau into their existing way of doing things, if for no other reason than that the enormous investments in the existing BI approach couldn't be seen as somehow misguided.

Tableau was very successful in this prototyping role, as long as its relationship to Big BI was handled properly: it couldn't be perceived as a threat, or it would get squashed, ignored, or sidelined out of the mainstream as a pretty toy, not something fundamentally or strategically valuable.

Tableau as Prototyper - Round 2.

The introduction of Tableau Server made it possible to share one's Tableau analytics without the muss and fuss of distributing PDFs, packaged workbooks with data snapshots, or any of the other means. And yet Tableau Server wasn't ready for prime time, so the dashboards published to it became the prototype candidates for re-implementation, with Tableau Server seen as a faster, more efficient way of distributing them for review and comment.

Round 3 - More than Prototyping.

Tableau Software kept busy adding functionality, particularly to Tableau Server, aiming it directly at corporate interests in an effort to expand its range of applicability and increase its attractiveness to that market. As we now know, Tableau Software's primary goal was always to maximize its market value, and the corporate market, particularly the American corporate business market, is an almost bottomless well of riches to be tapped.

As Tableau became more and more suited not only to finding the interesting and valuable information and insights in an organization's data, but also to being the end-to-end delivery vehicle for the analytics that deliver them, it was embraced by more and more organizations as a viable BI platform, with Tableau Server at the center of a constellation of Tableau Desktop users creating the vital dashboards that the organization's decision makers would rely upon.

This model has worked splendidly for Tableau Software. So well that Tableau made the big time: Gartner's Magic Quadrant and Forrester's Wave. That increased recognition in the wider business community, along with aggressive corporate marketing, led to the recent IPO (I'll be publishing my thoughts on this soon).

Tableau has emerged as a bona fide BI solution capable of supporting the full range of corporate data analytical needs. At least from the corporate perspective.

As Tableau has expanded the reach of Desktop and Server by adding attractive corporate features, it has presented itself more and more as a one-stop shop for the full spectrum of business data analysis and communication needs. With Tableau 8 one can now connect to managed corporate data, analyze it effectively and efficiently, and publish dashboards that convey interesting and valuable data-based information.

And yet...

I've been working overtime for the past few months helping my primary client integrate Tableau into their data analysis and communication work. They have large investments in pretty much all of the major tools, technologies, and approaches to business data analysis. There is a large staff of highly educated and skilled professional researchers (many of them PhD economists and statisticians) whose interest is in using whatever best helps them discover and communicate meaningful information from their data. There is a corporate BI group charged with the usual responsibilities of managing the corporate data and ensuring that it's provisioned and employed properly and well; it's important to recognize that these two groups' data interests only partially overlap. And there's a data research group, where I work, whose mission is to support the organization and provide public access to the full range of the organization's public-policy-related data. The client is an internationally chartered organization charged with improving living conditions worldwide and holds decades of data of all sorts related to economic, political, social, and human welfare.

I've been able to work with stakeholders across the organization to help them achieve whatever benefits Tableau can provide. Sometimes this means helping them understand differential poverty measures among population subgroups in Eastern Europe in order to identify which policy recommendations can have the greatest poverty-reduction impact. Sometimes it's helping them build Tableau skills so that they can use it effectively. As the Tableau Server administrator, I'm responsible for everything from installation and upgrades to content management and problem resolution.

I've spent a lot of time building increasingly detailed and interactive workbooks that contain multiple interrelated dashboards designed to provide a coherent exposition of data sets more complicated than simple single-grained record sets.

Round 4 - Tableau as Prototyper Redux

There are circumstances where Tableau isn't well suited to being the end-to-end solution. These usually lie outside of Tableau's sweet spot, where it's possible to use Tableau but, for one reason or another, it isn't the best tool for the job. They include, but aren't necessarily limited to (and with overlaps):

  • things that Tableau doesn't do;
  • things that Tableau doesn't do well enough;
  • when the effort to implement the overall solution in Tableau is excessive, and implementing it in another technology would be easier, cheaper, or quicker;
  • when the Tableau skills required aren't available;
  • when the Tableau implementation requires too many tricks, workarounds, and pieces of special-case knowledge. Particularly with respect to the previous two items, there are too many dark corners in Tableau, and the further into the darkness one goes to produce a specific effect, the less likely it is that whoever follows will be able to find the path.

Examples of things better created with something other than Tableau

  • highly interactive 'applications', with the kinds of fine-grained engagement and complex inter-relationships between elements found in web applications;
  • replications of existing tabular reports, with cell-wise control over content formatting
    – one of my client's current projects is creating a Tableau replica of an existing HTML table, sourced from an MS Analysis Services cube, that formats a single measure value differently depending upon the value of one of the cube's hierarchy members, e.g. "99.9%", "99.9", or "99h 99m". There is no good Tableau solution to this; the best I've been able to come up with is a Frankensteinian monstrosity that clumsily cobbles together multiple worksheets into a single dashboard, and this wouldn't be so bad if the dashboard layout manager were up to industry standards (but that's another old dead horse story). See the sketch after this list for the kind of formatting rule involved;
  • specifically branded user interfaces, with particulars of layout, modes of interaction, etc., that lie outside of Tableau's design envelope.
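
For a sense of why the cube report above is trivial outside Tableau, here's a minimal Ruby sketch of the kind of cell-wise formatting rule it requires. The hierarchy member names and format choices are hypothetical illustrations, not the client's actual specification:

# Pick a display format for a single measure value based on which
# hierarchy member it belongs to. Member names here are made up.
def format_cell(value, member)
  case member
  when "Availability" then format("%.1f%%", value * 100)  # "99.9%"
  when "Score"        then format("%.1f", value)          # "99.9"
  when "Duration"                                         # "99h 59m"
    hours, minutes = value.divmod(60)
    format("%dh %02dm", hours, minutes)
  else
    value.to_s
  end
end

puts format_cell(0.999, "Availability")  # => 99.9%
puts format_cell(99.9,  "Score")         # => 99.9
puts format_cell(5999,  "Duration")      # => 99h 59m

A few lines of conditional logic; in Tableau the equivalent takes a separate worksheet per format, hence the multi-worksheet dashboard mentioned above.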

What to do when Tableau won't do.

This can be an awkward situation. When an organization commits to Tableau because of what it does really, really well, there's an almost inevitable sense of buyer's remorse when the realization sinks in that it's not the right tool for all of their data analysis and communication needs. This is causing some real consternation where I work; the sense of disappointment is palpable, and there's real questioning of whether the eggs are in the right basket. The manager of the data group, who brought Tableau in and has championed it to real success, noted yesterday that the aforementioned MS-cube-based tabular report could have been constructed in ten minutes using the existing data-access web application instead of the much larger effort it has taken in Tableau. She's actively considering the consequences, which include the highly undesirable complications of having multiple data delivery paradigms under one umbrella.

If Tableau Software really wants to be a significant player in the corporate business data analysis game, and everything they've done has pointed them that way, they need to do everything they can to ensure that Tableau fills as much of the enterprise data delivery space as possible. If Tableau is seen as a partial contributor it will remain a niche product at best. If it's seen as fracturing the existing environment it will have a much harder row to hoe to achieve real growth and acceptance.

The changing landscape of data visualization and delivery.

This is a topic for another post (most likely a series). Consider this a brief introduction.

Tableau succeeded because it subverted the existing programming-development-technology-platform Big BI paradigm. It introduced a WYSIWYG space in which no programming was required to conduct efficient, effective, high quality data analysis. This was huge because, before Tableau, one either had to put up with the horribleness of Excel or be technically proficient at programming some low-level technology environment.

In the past decade there have been a number of developments that have fundamentally changed the landscape for the better.

Tableau and its cousins demonstrated that valuable, immediate data analysis—intimate analytics—is viable, works, and delivers huge benefits.

Flexible, high quality programming languages have proven their worth and become mainstream. Although not all that new, languages such as Ruby and Python have proven time and again that they are suitable for much more than simple throw-away scripts. Java, C++, C, C#, and the other traditional programming languages and platforms still have their place, but they're not the only game in town. One of the most important developments in this space is the evolution and emergence of JavaScript as a viable tool for creating modern web-based apps of infinitely varied form and function. From my perspective: I originally wrote TWIS in Java, but have been using Ruby to build my Tableau tools for over a year. Ruby provides a close-to-the-surface programming paradigm that lets me work very close to the domain of the problem I'm solving, combing through Tableau workbooks, very much like Tableau provides a surface for combing through data.
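
For a concrete flavor of that combing, here's a minimal Ruby sketch that lists a workbook's data sources and worksheets. A .twb file is XML; the element and attribute names used here match the workbooks I've processed, but treat the structure as an assumption rather than a published schema:

require 'nokogiri'

# Read a Tableau workbook (.twb is XML) and list its contents.
# Element/attribute names are assumptions based on observed files.
twb = Nokogiri::XML(File.read("MyWorkbook.twb"))

puts "Data sources:"
twb.xpath("//datasources/datasource").each do |ds|
  puts "  #{ds['caption'] || ds['name']}"
end

puts "Worksheets:"
twb.xpath("//worksheets/worksheet").each do |ws|
  puts "  #{ws['name']}"
end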

Visualization technologies have evolved and matured, especially for the Web, e.g. SVG, the Canvas API, WebGL, CSS, JavaScript, and HTML5 video and audio. See The Graphical Web conference site or the W3C SVG site for more information. I've been using SVG for diagramming for years and have used it in some of my Tableau tools. The richness and elegance of web-based information presentation and user interfaces no longer takes a back seat to traditional proprietary platforms.
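
As a taste of how approachable this has become, here's a minimal sketch that emits a simple SVG bar chart from Ruby, the sort of lightweight diagramming mentioned above. The figures echo the FOCUS example earlier; the layout numbers are arbitrary:

# Emit a bare-bones horizontal bar chart as an SVG file.
data = { "IT" => 227_062, "Operations" => 179_700 }
max  = data.values.max.to_f

bars = data.each_with_index.map do |(label, value), i|
  y = i * 30
  w = (value / max * 300).round
  %(<text x="0" y="#{y + 15}">#{label}</text>) +
    %(<rect x="100" y="#{y}" width="#{w}" height="20" fill="steelblue"/>)
end

svg = %(<svg xmlns="http://www.w3.org/2000/svg" width="420" height="#{data.size * 30}">\n) +
      bars.join("\n") + "\n</svg>"

File.write("salaries.svg", svg)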

Data analysis and visualization toolkits and technologies have emerged, and are rapidly evolving, that dramatically reduce the distance between people and data, albeit in a different way than Tableau does. jQuery, Raphael, D3, and others provide data handling and visualization capabilities that go well beyond Tableau's, although they require development effort. But that effort is constantly becoming less onerous as the approaches mature.

Tableau in the new data analysis world.

It's not inconceivable, and seems almost inevitable, that the ongoing richly dynamic evolution of the entire data analytical space will result in a next generation of tools that, like Tableau, bring the basic data analytical operations to the surface as first-order user interface objects that can be composed into useful, meaningful, highly valuable data analytics. It may be that Tableau learns from history that standing pat with its initial brilliant insight won't be enough, and will adopt the best of what the ongoing evolution of data analysis has to offer. Or Tableau may not see the value in this, and a new disruptive product will arise and provide users with an even simpler, cleaner, more consistent, more broadly functional, and extensible solution.

Or maybe Tableau will retain its advantage of ease and speed in its sweet spot, creating those analytics that best suit its original design paradigm, and once these analyses are created they'll be used as the gold-standard requirements for re-implementation in one of the new generation of web technologies. This isn't the time- and resource-black-hole approach of past Big BI, but one that can effectively meld the best of what Tableau and the new tools have to offer while minimizing their deficiencies.

Time will tell, and Tableau has a lot of friction to overcome.

6 comments:

  1. Chris,

    Very nice post. While reading it, I had flashbacks to my own experiences. I sometimes do corporate-level training for big companies that are beginning to use Tableau. I might have 15 or 20 people in the training that come from different departments, all offering me different types of data (all in different formats) that they want to load into Tableau. Some of them don't understand their own data but have an existing system(s) they use to create their reports. If I can't show how to immediately load their data into Tableau and reproduce their existing output, I can sense panic and dread! Since I have a pretty big toolbox of tricks and techniques, I usually get the job done, but there are times when Tableau falls short when trying to do production work. I guess those shortcomings could be elaborated in a blog post. Maybe we can collaborate on a post about current deficiencies!

    Ken

  2. This is great. One of the problems many companies face nowadays is flaws in their products when they're placed on the market. The damage is irreversible once the items have been bought by consumers, and it has a bad effect on your reputation. That's why rapid prototyping is so popular.

  3. Very insightful. Incidentally, Chris, I also came upon you through Jeff L, who works with me at Bank of America. We are remote workers who sat right by each other (pardon the pun) just two days ago. I didn't know at the time that it was you, even though the name rang a bell because I had just stumbled upon your blog a few days earlier. And yes, I am fairly new to Tableau (a month-old infant at that) and trying to get the hang of it. I very soon came to the realization that if your data isn't clean enough, there is not much Tableau can do either, as awesome a product as it seems to be. Again, glad to have come across you and your blog. Thanks!

  4. Prev comment by Karthi.

  5. Well, it was an informative post and you have explained it well by dividing it into different rounds.

  6. I have read your post. Thanks for sharing such a great post; it's really informative. Tableau Software helps people see and understand data, but it falls short in places: it is not organizationally equipped to serve enterprise customers, and it lacks robust enterprise-class security.
