
Six Principles of Communicating Data: A Checklist

2014 July 20
by Ben Jones

In a section of the first chapter of Communicating Data with Tableau (O’Reilly, 2014) I lay out six principles of communicating data: 1) know your goal, 2) use the right data, 3) select suitable visualizations, 4) design for aesthetics, 5) choose an effective medium and channel, and 6) check the results.

These principles address more than the visualization step alone; they address the whole process from crafting a message to delivering it and actually affecting another person or group of people in a meaningful way. They involve self-awareness (what am I trying to accomplish here?) as well as empathy (how will this impact my audience?). It’s not just a “numbers game”; it’s also about words, images and emotions.

I converted these six principles into a simple checklist that I could use to remind myself of all the important ingredients that go into a successful communication effort, including many that I often gloss over or forget entirely. Download it as a PDF, use the interactive version below, or view it as its own tab. I hope you find it useful.

Download the PDF file.

I don’t claim that this checklist is either exhaustive or revolutionary. There are many other principles that could be articulated, and most of them are common sense. I’d love to hear yours. If there’s anything sophisticated in these six, it’s principle #3, which is based on the work of Jock Mackinlay, my esteemed colleague here at Tableau.

We’re privileged to live in a world that comes after the previous half century, in which the work of pioneering researchers like Jock and others (Shneiderman, Card and Bertin, to name a few) established the foundation of effective visual encoding of numerical information – or how we interpret charts and graphs.

Let me know what you think by leaving a comment below, and thanks for stopping by,

Now Available: Communicating Data with Tableau

2014 June 16
by Ben Jones

I’m excited to announce that my first book, Communicating Data with Tableau, has been published by O’Reilly Media and is now available to purchase in ebook or print (in full color) at the O’Reilly online store or Amazon. Many thanks to my editor Julie Steele for working with me throughout the past year of writing, and to my family – my wife Sarah and my two boys Aaron and Simon – for dealing with my insane sleeping hours and sporadic moodiness over the past twelve months.

Of course a million thanks to all of the ingenious and incredibly generous members of the data visualization community – Tableau users and employees in particular. You’ve taught me much of what I included in this book.

What is it about?

This book is my attempt to show 1) how to communicate data well, and 2) how to use Tableau to do so. It’s not intended to be a comprehensive Tableau user manual, so not every feature is covered (for that type of resource, I’d recommend Tableau Your Data! by Dan Murray, as well as the helpful online tutorials available at the Tableau Software website).

Who is it for?

I wrote this book for anyone who needs to get a quantitative message across to an audience – analysts, journalists, engineers, marketers, students and researchers. Anyone with a modern browser can view and interact with the example projects that have been published to Tableau Public (free application available here), and readers will need Tableau Desktop 8.1 or 8.2 (free 14-day trial available here) to open the accompanying Tableau workbook files. There aren’t any examples that deal with the features that are new to version 8.2, which was not available to the public when the book went to print earlier this month. Expect a revision with 8.2 examples to follow later in the year or early next year.

How is it Organized?

The book is organized into 14 chapters that each deal with a different aspect of communicating data. After covering general principles and the Tableau user interface in the first two chapters, the book touches on both traditional and creative examples of communicating data: simple numerical comparisons; rates, ratios, proportions and percentages; central tendency and variation; multiple variables; time series data; positional data; and combining multiple visualizations in dashboards.

In What Style is it Written?

While there are occasional philosophical musings (mostly in chapter 1) and even a fictional interlude into the world of “Chesslandia” in chapter 7, the book is primarily practical in nature. In 334 pages there are over 45 examples and loads of useful tips and tricks. If you’ve read my blog posts at this site, you’ll find the tone familiar. Some of the content is pulled from blog posts I’ve written over the past three years, but most of it is brand new.

For example, the book culminates with this multi-Sheet, multi-Dashboard workbook about the global expansion of the internet, as captured by World Bank data:


What was it like writing it?

Wow, it was at times elating, at times excruciating. It took many times longer to write than I originally scheduled (no surprise there), and the review and production processes, while affording me many learning opportunities, were not exactly painless. As I mentioned, I felt very lucky throughout the writing process that O’Reilly and Julie Steele were willing to work with me, a first-time author. I’m sure many of my multiple “new deadline resolution” emails made Julie chuckle, but I never felt rushed or pressured to write anything other than the book I wanted to write. I can’t say how much I appreciate that.

Since my day job is as Sr. Manager of Tableau Public, and considering my blog regularly features Tableau Public “vizzes”, it’s a testament to the sheer joy of the software (and my love for working with data) that I didn’t O.D. on everything somewhere along the way. By the way, this book wasn’t sponsored in any official way by Tableau Software the company – it’s my personal project entirely. That being said, I credit my colleagues at Tableau, and none more so than my VP Ellie Fields along with the entire Tableau Public team, with providing me moral support and inspiration all along the way. Also, a special thanks to Andy Cotgreave for saying a few words on the back cover. Andy, you’re a class act, and I look forward to returning the favor. Without a doubt, I’ve found my tribe.

I’d like to hear from you!

If you buy it and read it, first, thanks for considering what I had to say. It’s a great honor to me that you would devote your time to hearing me out. Second, please let me know what you think! Email me (benjones at dataremixed dot com) or tweet me (@DataRemixed), and if you write an online review, I’ll be forever in your debt for the input and feedback – both positive and constructive. I fully intend to write more books in the future, so the more I hear about what worked for you and what didn’t work for you, the better my next books will be.

Thanks for stopping by,

In Defense of Intuition

2014 June 15
by Ben Jones

Intuition is under fire

Two months ago I saw a television commercial for Business Intelligence software, and in the commercial a customer being interviewed had the following to say:

“We used to use intuition; now we use analytics.”

In other words, we’re being asked to believe that the business owner was able to make progress by replacing decision-making using intuition with decision-making using analytics.

The statement didn’t sit well with me, so I immediately tweeted the following:

Working in the BI industry, I’ve heard similar attacks on human intuition from many different sides, and I don’t agree with them at all. I had a chance to say more about my objection to this notion at the eyeo festival last week in a brief talk entitled “Intuition Still Matters: Using Tableau to Think with Your Data”. You can see the slides to this presentation here.

In this blog post I’d like to explain why I feel that in a world awash with data, human intuition is actually more valuable than ever. Briefly, human intuition is the spark plug that makes the analytics engine run.

Intuition wasn’t always a byword

Contrast the commercial’s negative attitude toward human intuition with Albert Einstein’s rather glowing appraisal:


Without a doubt, it would be hard to come up with a more positive statement about intuition than this. So which is it? Is human intuition a faulty and antiquated decision-making tool, in dire need of replacement with something better, or is it the only valuable thing there is?

Before we go any further, we should define the terms.

What is intuition?

The Oxford dictionary defines intuition as follows:

“The ability to understand something immediately, without the need for conscious reasoning”

It comes from the Latin root intuērī, meaning “to look at” or “gaze upon.” Thus the etymology of the word links it with the human visual system. Sight and intuition both occur instantaneously and effortlessly. Both can also mislead, of which more later. With intuition as with sight, the awareness comes before any logical explanation.

The link between intuition and sight is often a very literal one. In social situations, we intuitively sense other people’s emotions when we first lay eyes on their facial expressions:

Fig 1. Basic facial expressions (source)

And with abstract representations of data, we spot the marks that have certain unusual attributes in an intuitive way – that is, we notice them without having to think about it. We call these attributes “preattentive”. Noticing them doesn’t take effort – it’s as if it happens to us. Here are two examples: what do you immediately notice about them?

Fig 2. Two preattentive attributes (source)

Similarly, we can feel a compelling sense of confidence about what’s going to happen in the future, and what we should do about it. This is what is commonly meant when someone says a person has a great intuition about a specific field.

Intuition is commonly contrasted with reason, “the power of the mind to think, understand, and form judgments by a process of logic.” Logic, in turn, involves “strict principles of validity”. And analytics is “information resulting from the systematic analysis of data or statistics.”

To make the best decisions in business and in life, we need to be adept at many different forms of thinking, including intuition, and we need to know how to incorporate many different types of inputs, including numerical data and statistics (analytics). Intuition and analytics don’t have to be seen as mutually exclusive at all. In fact, they can be viewed as complementary.

Let me give some examples of how intuition provides the spark for the analytical process.

Five Reasons Why Intuition Still Matters

  1. Knowing what to measure in the first place

    Any process has an almost infinite number of variables that could be tracked and analyzed. On which should we spend our time? Knowing where to start can be a problem, especially if we know very little about the subject we’re dealing with.

    One school of thought goes something like this: collect data on everything and let an algorithm tell you which variables to pay attention to.

    Sorry, I don’t buy it.

    • First, not even the NSA collects data on “everything”. I guarantee you a filter has been applied to narrow the set of inputs. God may have counted every hair on your head, but I seriously doubt anyone else has.
    • Second, while data mining algorithms can discover notable patterns in huge data sets, only human intuition can discern between the useful patterns and the useless ones. Patterns get their very “usefulness” from our goals and values.
  2. Knowing what the data is telling us (and what it’s not telling us)

    Once we pick data to collect and metrics to analyze, what do the numbers tell us? We talked about preattentive attributes briefly – via data visualization, our intuition can be put to good use interpreting the important 1’s and 0’s in the databases we’ve meticulously built.

    Using intuition in this way isn’t a perfect process though. Just as we might recoil from a garden hose that our instincts tell us is a snake, we can see signals in data that aren’t really there. Alternatively, we can miss really important signals that are there. Just because intuition doesn’t work perfectly, though, doesn’t mean it should be discarded. We just need to hone our intuition for working with numbers, and we need to distrust it somewhat.

  3. Knowing where to look next

    Jonas Salk, the American medical researcher who developed the first polio vaccine, had the following to say about intuition in his book Anatomy of Reality: Merging of Intuition and Reason:

    Fig 3. From Jonas Salk’s Anatomy of Reality: Merging of Intuition and Reason

    He made a discovery that has saved the lives of countless people in the world, and he chalked up an important part of his success to intuition. Often the best outcome of an interaction with data is that we sense another, even better question to ask. And the process iterates. The realization of the next place to look can form in our mind like an intuitive spark. The light bulb analogy applies.

  4. Knowing when to stop looking and take action

    For many types of questions or problems, we could continue to search for a solution ad nauseam. Think of a chess game. What’s the “best move” to make at a given point in the game? Russian chess Grandmaster Garry Kasparov knew something about this question, and here’s how he understood it, as stated in his book How Life Imitates Chess:

    Fig 4. From Kasparov’s How Life Imitates Chess

    There comes a point in time when it’s best to stop analyzing and make a move. Knowing when we’ve arrived at this point is a function of intuition. If we don’t have this intuitive switch, we can suffer from “analysis paralysis”, and then we go nowhere. We’ve all been there.

  5. Knowing how to get our message across

    A key part of the data discovery process is communicating our findings with others. We can use our intuition to choose the best message, channel, venue, visualization types, aesthetic elements, timing, tone, pace, etc. If we have a deep understanding of our audience, we will intuitively know what will get through to them, and what will fall on deaf ears. When we get it right, it can be a wonder to behold. Think Hans Rosling.

    Crafting the communication is a creative process, and human intuition will need to be tapped to do it well.


For the reasons outlined above, I don’t believe that human intuition will ever be rendered obsolete. No matter how smart our algorithms get, no matter how sophisticated our tools or methods, the intuitive “spark” in the human mind will always be the key element in our thoughts, in our decisions and in our discoveries. Data and analytics can fuel these sparks, and they can provide a way to make sure we’re headed in the right direction, but they can’t replace human intuition. Not the way I understand it, anyway.

I don’t think the creators of the commercial would disagree with this point of view, so it likely comes down to semantics. Maybe the business owner in the commercial should have said: “We used to rely on intuition alone; now we combine it with analytics to make even better decisions.” Slightly less snappy, I know. But at least intuition doesn’t get thrown under the bus.

Dimension Line Charts: a (slight) variation on arrow charts

2014 April 18
by Ben Jones

Arrow charts are an effective way of showing how values changed from one point in time to another (like slopegraphs), and they have been touted by Naomi Robbins of NBR Graphs. They have also been created using Tableau before by zen master Joe Mako. I really like arrow charts – as a mechanical engineer, I understand the language that they’re speaking.

But there’s a way that they can speak even more clearly to me. It’s more of an accent, really.

Here’s a version of what I’ll call a “dimension line chart” (“dimension” as in GD&T, not Tableau’s dimensions and measures) that I made to show which NBA players improved the most over last season, and which players’ performance regressed the most:

How is it different from traditional arrow charts?
It’s a very subtle change that has four parts to it:

  1. There are “extension lines” added via reference lines to show the starting and stopping points of each line
  2. The arrowheads are custom shapes with tips that end at the reference line instead of just beyond it
  3. The direction (+ or -) and the magnitude of the change is shown as a dimension in the middle of the line
  4. I added a light row banding to make it easier to see which arrow applies to which player

That’s pretty much it. It’s a slight dialect of the language of arrow charts, really. Translation: it’s just a fancy arrow chart.
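The four parts above boil down to a little geometry. Here’s a minimal sketch in Python (a hypothetical helper, not anything from the actual workbook) that computes the pieces of one dimension-line row from a start and end value:

```python
def dimension_line(start, end):
    """Compute the pieces of one dimension-line row (hypothetical helper)."""
    return {
        "extension_lines": (start, end),  # reference lines at the start and stop values
        "arrow": (start, end),            # arrow tip ends AT the reference line, not past it
        "label": f"{end - start:+.1f}",   # direction (+ or -) and magnitude of the change
        "label_x": (start + end) / 2,     # dimension label centered on the line
    }

# A player whose metric rose from 12.4 to 18.1:
print(dimension_line(12.4, 18.1)["label"])  # → +5.7
```

The signed label is the key piece: like a dimension in a technical drawing, it states the change explicitly rather than leaving the reader to estimate it from the arrow’s length.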

Here’s what the traditional arrow chart version looks like:

How do the changes help?
I always felt that traditional arrow charts seemed to imply movement beyond the end of the arrow, like a flow diagram (showing direction of wind, magnetic field, or water flow) or a traffic sign telling me which way to go, not where to stop. But, literally speaking, the data stops at the very end of the arrow. And with Tableau, if you use a default filled shape for the arrowhead, the data technically stops in the middle of the arrowhead, since that’s the center of the shape (more a detail about Tableau, not arrow charts themselves):


Using custom arrowhead shapes whose tips end at the middle of the image file, rather than at its edge, corrects this small inaccuracy:


Nitpicky? A little, maybe. Click here to get the four centered arrowhead png files I created for this project.

What inspired this variation?
I enjoyed my drafting class in high school, and I went on to study mechanical engineering in college. Drafters and mechanical engineers draw diagrams of physical objects and indicate their dimensions so that machinists can make them in the real world.

There’s a parallel to what we’re attempting to do with data visualizations: we’re communicating the relative sizes of measurable quantities. Instead of feet or inches, the units can be dollars or population or whatever.

We’re giving a “blueprint” to our audience and asking them to build an understanding in their mind.

Here’s an example of a technical drawing that the dimension line chart draws from:


Issues and Opportunities for Improvement
One weakness of this variation is that it doesn’t handle slight deltas very well. If the reference lines are very close together, the arrow doesn’t look very much like an arrow at all. If we remove the rank filter and look farther down the list, here’s what we get:


Let me know what you think, I’d like to hear your opinion. If you can think of a way to improve dimension line charts, leave a comment. Some ideas I have are to give the numerical values a white box background to break up the dimension lines and to change the styling of the arrowheads to be in line with GD&T standards. Also, I’ll be doing a “how-to” write-up on the Tableau Public blog as part of #TableauTipsMonth, so be on the lookout for that in the next few days.

Thanks for stopping by,

Earthquakes, Los Angeles, and Being Shallow

2014 March 29
by Ben Jones

Last night I was analyzing earthquake data from the USGS Earthquake Hazards Program. When I woke up this morning, I was assaulted with tweets about the 5.1 magnitude earthquake in La Habra. I spent most of my life in L.A. Coincidence? I think not. Okay, yes, it was absolutely a coincidence.

{A tangential rant: Obviously I’m conflating correlation and causation, but as dumb as that line of reasoning is, it underlies a huge chunk of human thinking – superstition, astrology, plenty of religious thought, ENTIRE BRANCHES OF MEDICINE. In a word: pseudoscience. Not good, people.}

Back to earthquakes: I was updating a viz about the history of recorded earthquakes around the world since 1900. Here it is:

It clearly shows the clustering of earthquakes along trenches and fault lines, a fact that we take for granted now but was not known by humanity prior to the 1960s.

It also shows what seems to be a dramatic rise in earthquakes around the world. Of course what we are looking at here is recorded earthquakes, not actual earthquakes. A minor point? Not really. The number and type of seismographs changed throughout the course of the 20th century. Simply: they got better, and more were installed. So this data set can’t really answer the question: “Are actual earthquakes increasing?” It’s only really the number of recorded 6.0 – 6.9 magnitude earthquakes that increased. We got better at detecting and measuring earthquakes in this range and below. It’s not easy to miss an 8.0.

Los Angeles
All the talk today is about earthquakes and Los Angeles. Here are the 6.0+ earthquakes in the Los Angeles area since 1900:

I remember the 1994 Northridge earthquake very well. It was pretty incredible. For all the talk of “data storytelling”, the little orange dot on the map doesn’t really tell the story of the Northridge earthquake, or any of the other ones for that matter. That’s good perspective for all of us.

I’m not saying data can’t give a human side of the story – I believe it can (I help run a conference about it). But if you want to know how people felt during an earthquake, you’d probably have to talk to them, and they’d each say something different. Data is great, but it doesn’t tell the whole story.

“Wait, the earthquake in Los Angeles was shallow…?” I mean, come on, what a total set-up, right? Even I had to love the tweets about the supposed relationship between the depth of the earthquake and the residents of L.A.

Seriously, though, what about the frequency of earthquakes by magnitude and depth? These dot plots show how common larger earthquakes are, and how far below the surface of the earth they occur:

The moral of the story? I guess it’s not just in Los Angeles where you’ll find shallow ones that cause a relatively big stir…

Thanks for reading,

Visualizing History

2014 February 2
by Ben Jones

{Update, Feb 6th, 2014: I’ll be showing how to make the U.S. president dashboard below in a brief webinar on Friday, Feb 7th at 9:30am PT. Register for free here}

When studying history, we ask questions of the past, seeking to understand what happened in the lives of the people who have gone before us, and why. A data visualization of history suggests and answers a thousand questions. Sometimes, the value in a chart or graph of history is that it proposes new questions to ask of the past, questions that we wouldn’t have thought to ask unless the information were presented to us in a visual way.

Visualizing Presidential History
Take the history of the United States presidency. Every high school student in the U.S. learns the names of a growing list of former presidents. They learn about the founding fathers, and the various wars and movements that occurred over the course of time. They memorize the names and dates, at least until the exam is over. Wikipedia provides a lot of information about U.S. presidents in table form.

But what if the string of U.S. presidents were presented as a gantt bar chart, as shown below:

Update, Feb 18th 2014: Pretty awesome to see versions showing Denmark’s presidents (by Kasper Ulvedal) as well as Sweden’s prime ministers (by Martin Lagnerö) and now India’s Prime Ministers (by Rachana Parmar) showing up since this was published. I’m thinking of crowd-sourcing a global version…

Observations from Data
Looking at the information in this way, rather than in table format, we can very easily see a number of interesting facts and patterns, such as:

  • Barack Obama is the first president who was born during the Civil Rights Movement
  • No presidents were born during the Civil War, but two died during the internal conflict (Martin Van Buren and John Tyler)
  • The first 9 presidents represented 5 different political parties
  • Ever since Franklin Pierce took office in 1853, the presidency has been occupied by either a Republican or a Democrat
  • It’s pretty easy to spot the eight presidents who died while in office
  • No president has lived longer after leaving office than Jimmy Carter (33 years and counting)

A whole host of observations can be made by looking at how the bars line up and overlap with each other.
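The overlap-spotting that a gantt chart does visually can also be sketched as a few lines of code. Here’s a minimal Python example (the birth and death dates are real; the interval check is the standard one) that recovers the “two presidents died during the Civil War” observation from the list above:

```python
from datetime import date

# Lifespans as (born, died) intervals
lifespans = {
    "Martin Van Buren": (date(1782, 12, 5), date(1862, 7, 24)),
    "John Tyler": (date(1790, 3, 29), date(1862, 1, 18)),
    "James Buchanan": (date(1791, 4, 23), date(1868, 6, 1)),
}
civil_war = (date(1861, 4, 12), date(1865, 4, 9))

# Which lifespans end inside the war's interval?
died_during_war = sorted(
    name for name, (born, died) in lifespans.items()
    if civil_war[0] <= died <= civil_war[1]
)
print(died_during_war)  # → ['John Tyler', 'Martin Van Buren']
```

The gantt chart answers this kind of question at a glance, with no query at all: the bars that stop inside the shaded era simply jump out.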

Visualizing the History of Civilizations
The gantt chart was the format used by the 18th-century theologian and chemist Joseph Priestley, one of the founding fathers of data visualization, in order to visualize the past. Priestley felt that an appreciation of human history would lead to human advancement and progress, and so he sought to educate the public about the civilizations that had existed prior to his day using this type of gantt chart:


Visualizing Biblical History
One remarkable (or fantastical, as some would say) aspect of the Old Testament is that prior to the account of the flood, the characters in the pages of Genesis are purported to have lived hundreds and hundreds of years – over 900 in fact. According to the Biblical account, these patriarchs lived the vast majority of their lifetimes after having children. After the flood, the lifespans of successive generations of people in the lineage of Noah get shorter and shorter.

If true, this would have resulted in a situation far different from the one we experience. Today, 3 or at most 4 generations occupy the globe at the same time. If the record is true, Old Testament figures were acquainted with their family tree going back as many as ten generations. This observation is never directly mentioned in the pages of the Bible, but it suggests itself rather readily from a simple visualization of the lifespans using the gantt chart:


Visualizing Scandalous Political History
Last year, the team at La Nacion in Buenos Aires, Argentina, won the GEN Data Journalism Award for a series of reports outlining the suspicious expenses of Argentina’s Vice President Boudou. A “smoking gun” moment was when they presented Boudou’s expenses over time using a gantt chart. What became immediately obvious was that somehow he managed to incur reimbursable expenses in 6 different cities on the same day, as evidenced by the overlapping bars in the chart below:

Visualizing Personal History
This last week, I had the pleasure of having lunch with a fellow Seattleite and Tableau Public blogger, Troy Heerwagen. Troy created a visual resume in which his various professional roles are presented as gantt bars, something that I’ve done in the past, as has another Tableau Public author, Anya A’Hearn. Here is Troy’s version:

In Conclusion
It’s pretty simple: visualizing history using a gantt bar chart is a useful way to spot overlap in time – overlap of people with events, people with key dates, and people with other people. It’s also a useful way to spot patterns and identify impossibilities in a list of dates.

Update, Feb 6th, 2014: I saw this tweet today, and since it fits the theme of this post perfectly, I just had to add it:

Thanks for stopping by,

A Better Periodic Table?

2014 January 10
by Ben Jones

I went to SeaVis after work today, and I spent some time looking at something I haven’t looked at much since college: The Periodic Table.

We were considering the periodic table because there’s a periodic table of visualization methods out there. Turns out people have made periodic tables out of just about everything. There’s a periodic table of beer, Pokemon, The Empire Strikes Back, you name it. You know it’s way out of hand when there’s a Periodic Table of Periodic Tables. I kid you not.

Anyway, Robert Kosara gave a scathing (yet coherent) rant on why these flavors of periodic table totally miss the point. The arrangement of the elements into groups and periods actually means something in the periodic table of elements. Position doesn’t have any meaning with the goofy ones, so why put it into the periodic table format at all? You can read more about Kosara’s thoughts in his 2009 blog post.

My take on the whole thing is that yes, they’re lame, but they’re probably mostly harmless. I guess I just ignore them, really. What I WAS interested in, though, was how the actual periodic table could be improved upon: by adding the ability to size and color each elemental “square” in the grid by various physical parameters like atomic weight, atomic radius and density. I found these variables on a couple different Wikipedia pages, and I was off and running. Here’s what I made:

I like it because each square in the periodic table carries with it the ability to encode quantitative information about each element into the size and color of the square. Why not take advantage of these parameters to gain an appreciation of how the elements compare in the physical realm?

That’s what I set out to do, and I think I’ve accomplished it. Is it “better” than the original? I know, that’s probably a blasphemous notion to many, but there certainly are aspects of the interactive version which are better than the original static version. If you want to find all primordially occurring elements that are solids at standard temperature and pressure (STP), it would take a while searching the various color encodings of the original static table. With the interactive version, you must actually interact with the table, but with two simple clicks you can immediately see which elements meet those criteria, and how they compare in size or weight.
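Those “two simple clicks” amount to a filter on two categorical fields. Here’s what that filter looks like as a sketch in code, using a tiny hypothetical subset of the element table:

```python
# A tiny, hypothetical subset of the element table
elements = [
    {"symbol": "Fe", "occurrence": "primordial", "state": "solid"},
    {"symbol": "Hg", "occurrence": "primordial", "state": "liquid"},
    {"symbol": "Tc", "occurrence": "from decay", "state": "solid"},
]

# Click one: occurrence = primordial. Click two: state at STP = solid.
primordial_solids = [
    e["symbol"] for e in elements
    if e["occurrence"] == "primordial" and e["state"] == "solid"
]
print(primordial_solids)  # → ['Fe']
```

The static table forces you to run this query in your head, cross-referencing color legends; the interactive version runs it for you and shows the survivors, still sized and colored by whatever physical parameter you’ve chosen.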

The color scheme of the original is the one thing this version lacks. What it gains is the ability to visualize relative size and weight. It’s a trade-off, I suppose. Here is a version that colors the squares not by increasing density but categorically, based on description:

Here’s yet another version that makes use of the fact that the shape of the element mark can be used to encode either the Occurrence or State at STP characteristic. The shapes don’t all have to be squares. The advantage here is that the table can toggle between two modes which let you scan the entire table and see all different occurrence types or states at STP:

Of the three versions above, which do you prefer? Of the three interactive versions vs. the original static version, which do you prefer? Obviously we won’t always have the advantage of being able to interact, so the static version has a clear value, but the time it takes to answer a variety of questions can be reduced with interactivity.

Well, regardless of whether or not either of my fancy interactive periodic tables is “better” than the legendary original static version, I’m pretty sure that this little project would have made my 10th grade chemistry teacher proud. Here’s to you, Mr. Galanda from Thousand Oaks High AP Chemistry, wherever you may be. I always think of you on Mole Day, 10/23 at 6:02am.

Thanks for stopping by,

An NBA Fantasy Dashboard

2014 January 6
by Ben Jones

2013 was an amazing year for me, in many ways. One thing that happened to me was I got sucked into fantasy sports. Bad. I always knew I was susceptible to it: sports, data, sports and data together. I suppose I was preconditioned in my early years by the countless hours spent poring over and memorizing the stats on the back of my baseball cards. For some reason, I hadn’t taken the plunge into the world of fantasy sports (addiction) until a fellow little league father invited me and my oldest son Aaron to join a father-son fantasy football league.

My sincerest apologies to my wife, Sarah, for the handful of weekends this past fall that were utterly ruined by a certain someone in the house incessantly checking his ESPN app and getting grumpier by the hour. It wasn’t Aaron.

To make things even worse, I started an NBA fantasy league in November. A handful of my friends & family from L.A., some new ones from Seattle, and an Aussie joined, and we were off and running. We call our league the “Shot Callers”, and my team is known humbly as “Dunk on You”. It’s all fun, really. When I win, that is.

Anyhow, I noticed that fantasy sports is mostly tracked via tables. Endless tables of numbers on the web. If you’ve read this blog at all for the last two and a half years, you know that tables of numbers on the web are like a Chris Paul alley-oop pass to me. I made this NBA fantasy dashboard, and yes, I even shared it with my league mates:

It’s a labor of love, really. It took a while of playing the game to even know what should be on the dashboard, and it requires constant updating. Thanks to the folks at for doing the hard work of providing each day’s stats in such a timely fashion. I love it that former UCLA Bruin Kevin Love is leading the pack. Your patience is appreciated as I continue to find and add the player profile photos, as many aren’t included as of yet.

Lastly, if you, too, are a fantasy sports addict like me, I’d love to hear your tips, tricks and even maybe some recovery methods you’ve found helpful. Into basketball fantasy sports? How could this dashboard be even more useful?

Thanks for stopping by,

Slopegraphs in Tableau

2013 December 11
by Ben Jones

Andy Kirk of Visualising Data recently blogged and tweeted about his addiction to slopegraphs. In a show of support, I re-created his Barclay’s Premier League comparison chart, followed by a quick how-to tutorial below. Using the slopegraph with an added drop-down selection filter, it’s easy to figure out why Manchester United has performed so poorly in comparison to their first 15 games of last season: it’s their anemic offense.

Change the Select drop-down to “Goals For” and “Goals Against” and see which one changes the most:

Now, here’s how to make it:

Step 1: Get the data

The Barclay’s Premier League site includes league tables for each season up to a chosen game number:

The teams’ results up to game 15 for both the 2012/13 and 2013/14 seasons were copied and pasted into an Excel spreadsheet, with an added column for Year:


Notice that this spreadsheet is structured differently than Andy’s. Andy had one column for 2012/13 results and another column for 2013/14 results. I’ve structured the spreadsheet this way so that I can use Year as a Measure in Tableau.
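In other words, each club gets one row per season rather than one column per season. The layout looks something like this (the stats shown here are illustrative placeholders, not actual league results):

```
CLUB        Year   Points   Goals For   Goals Against
Arsenal     2012       26          31              16
Arsenal     2013       33          28              12
Chelsea     2012       28          34              15
Chelsea     2013       30          29              11
...
```

This “long” shape is what lets a single Year field drive both ends of each slope line.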

Step 2: Connect to the data

This is a very straightforward step: Open Tableau, click Connect to Data, and find your Barclay’s results spreadsheet.

Step 3: Create a parameter and matching calculated field

Before creating the slopegraph, let’s make a Parameter that will allow users to choose which stat to chart.

Right click anywhere in the Dimensions or Measures panel to the left and select “Create New Parameter”

Fill out the dialog box as shown below:


Click OK, and then right-click the newly created Parameter in the area to the bottom left and select “Show Parameter Control”.

You’ll see a drop-down selector appear in the upper right. You can use this to change the value of the Parameter.

We now need to create a Calculated Field to link to the different team stats based on the user’s choice. Right-click on the Parameter and select “Create Calculated Field” and fill out the dialog box as shown below:
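The gist of that calculated field is a CASE statement on the parameter. A rough sketch is below; the parameter name ([Select]) and the stat field names are my assumptions, so substitute whatever your parameter and columns are actually called:

```
// "Selected" — returns whichever stat the user picked from the parameter
// (parameter and field names here are assumed, not taken from the workbook)
CASE [Select]
    WHEN "Points" THEN [Points]
    WHEN "Goals For" THEN [Goals For]
    WHEN "Goals Against" THEN [Goals Against]
END
```

With this in place, every chart built on [Selected] re-draws itself whenever the user changes the parameter control.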


Step 4: Create the basic slopegraph

Now that we have this “Selected” data field mapped to the Parameter, we can use it to create our basic slopegraph as follows:

  1. Drag “Year” to the Columns shelf, and change it to Discrete (blue pill) by clicking the down arrow and selecting “Discrete”
  2. Drag the “Selected” calculated field to the Rows shelf
  3. Change the Marks type from “Automatic” to Line
  4. Drag the “CLUB” Dimension to the Detail card and resize the view (making it wider)
  5. Drag another instance of the “CLUB” Dimension to the Label card, and then click on Label and select “Line Ends” in the “Marks to Label” area

Step 4, #1-5 are shown below:


Step 5: Add line coloring and thickness

In order to make the lines one color for increasing values and another color for decreasing values, and to change their thickness based on the magnitude of the change, we’ll need to create three more calculated fields as shown below:

“Delta”: The first calculated field computes the change in value of the selected statistic from one year to the next:


“Direction”: The second calculated field gives one string for increasing values and another for decreasing values. This will be useful for coloring the lines:


{UPDATE} A comment from Jay below resulted in a change from “Direction” to “Better or Worse”, in which the color is dependent on whether a team got better or worse, not whether the chosen statistic increased or decreased:


“Magnitude”: This final calculated field yields the absolute value of the change, or the magnitude. This will be helpful for making lines thicker or thinner based on the magnitude of the change:
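In case the screenshots of these calculations don’t come through, here are rough sketches of all three (plus the updated “Better or Worse” variant). The field names and year values are assumptions based on the spreadsheet described in Step 1, so adjust them to match your data:

```
// "Delta" — change in the selected stat between seasons (years assumed)
SUM(IF [Year] = 2013 THEN [Selected] END)
  - SUM(IF [Year] = 2012 THEN [Selected] END)

// "Direction" — string used to color the lines
IF [Delta] >= 0 THEN "Increasing" ELSE "Decreasing" END

// "Better or Worse" — per the update above: for Goals Against, a decrease is good
IF ([Select] = "Goals Against" AND [Delta] < 0)
   OR ([Select] != "Goals Against" AND [Delta] > 0)
THEN "Better" ELSE "Worse" END

// "Magnitude" — absolute size of the change, used for line thickness
ABS([Delta])
```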


Now that these fields are created, let’s do the following to complete the slopegraph itself:

  1. Drag “Direction” to Color
  2. Drag “Magnitude” to Size
  3. Drag “Selected” to Label and change the Label so that the Club name and the value are in line with a comma separating them
  4. Filter out the Clubs that were either promoted or relegated after the 2012/13 season
  5. Clean up the fonts (change them all to Gill Sans MT)

Step 5, #1-4 are shown below:


I’ve also formatted the tooltips to yield a nice result when mousing over any of the line ends, and I’ve hidden Marks that were placed in awkward positions on the slopegraph that I couldn’t adjust.

Step 6: Design the dashboard

Now that the Slopegraph itself has been created, I prefer to place it on a Dashboard, add the Parameter control and a Drop-Down filter for Clubs as floating dashboard objects, and add a title and data source / reference information at the bottom:


With this view, we can do a whole lot more than find out what’s behind Manchester United’s poor performance; we can also notice other big changes, such as Liverpool’s suddenly prolific offense (select “Goals For”), or Southampton’s dramatic improvement in defense (“Goals Against” through 15 games drops from 32 last year to only 14 so far this year).

This is the value of the slopegraph: it allows us to make a whole host of point-to-point comparisons, and the largest changes practically jump off the page.

In closing, many thanks to Andy for this awesome work and for coming clean about his love affair with this neat little chart type.


Tapestry 2014 Announced

2013 November 6
by Ben Jones

One of the best things about joining Tableau at the beginning of the year was that I got to be a part of the inaugural Tapestry Conference in Nashville. For those of you who don’t know, the goal of Tapestry is to bring together a handful of journalists, academics and practitioners who are all interested in this emerging thing called “data storytelling”.

Tapestry’s rookie year was a big success, and I’m excited that we just announced it’s coming back for a sophomore season.

Tapestry, part deux, will be held on February 26th, 2014 at the Historic Inns of Annapolis. It’s a one-day conference with a nominal $100 fee to help partially defray the costs of the conference. Attendance is invitation-only since the venue is so small – more on that below…


What was so great about Tapestry 2013?

Here’s what I liked most about last year’s event:

  • Great speakers meant great presentations. Watch the videos and see the slides on the Tapestry blog. My favorite was Nigel Holmes on “Why 29 is such a stunning number“.
  • Small group meant one experience. Everyone in the same room, all experiencing the same conference. No professional tracks. No decisions about where to go. Well within Dunbar’s magic number, so you actually get to know people a little.
  • Boutique venue meant intimate experience. This isn’t your standard convention center experience. No herds, no vendors scanning badges, no forklifts hauling in boxes of swag. Last year we hung out in a converted train station. Seemed like an appropriate place to discuss storytelling.
  • One of the best aspects is that many attendees also presented demos, and this year there will also be a poster session.

What excites me most about Tapestry 2014?

  • We’re building on the momentum of last year. Tapestry was an unknown thing in 2013. This time it’s on people’s radar.
  • All three of the above still apply to this year’s conference.
    • Keynotes by Alberto Cairo, Aron Pilhofer and Jake Porway. That says it all, no?
    • Keeping it real…small. Dunbar’s number won’t be broken for a 2nd straight year.
    • Next year we’ll be staying at the Historic Inns of Annapolis. Check out the website to get a sense of the setting.
  • There’s been a lot of focus on “Data storytelling” in the past year, so there should be even more material, tools and ideas to discuss.

How can you get involved?

It’ll be a little bit like Willy Wonka’s Golden Ticket due to the limited space (100 will technically fit in the meeting room, but only barely), so if you want to go, submit a request for an invitation on the Tapestry website and keep your fingers crossed. Bonus points if you have something to contribute in either the demo session or the poster session. Tweet about it using the hashtag #tapestryconf, and get involved in the extended conversation.

Hope to see you there!