Here are 10 rules for designing great, simple, efficient, adaptive dashboards.
- Decide which metrics should be included in your dashboard. Some metrics, such as the impact of sharing and likes on email marketing campaigns, are not easy to estimate; out-of-dashboard statistical modeling might be best for these sophisticated predictions. Other metrics are easy to estimate with basic models and can be integrated into the dashboard, using in-dashboard analytics and data science for predictions. An example that comes to mind is the Google AdWords dashboard: a click count forecast per ad group, based on the keywords purchased and on your average CPC.
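An in-dashboard forecast of this kind can be very simple. Here is a minimal sketch (hypothetical function name, numbers, and a deliberately simplified model - not Google's actual formula) of a per-ad-group click forecast driven by budget, CPC, and estimated demand:

```python
def forecast_clicks(daily_budget, avg_cpc, est_ctr, est_impressions):
    """Forecast daily clicks for an ad group: the smaller of what the
    budget can pay for and what impressions x CTR can deliver."""
    budget_capped = daily_budget / avg_cpc      # clicks the budget can buy
    demand_capped = est_impressions * est_ctr   # clicks the market can deliver
    return min(budget_capped, demand_capped)

# Hypothetical ad group: $50/day budget, $0.40 avg CPC, 2% CTR, 8,000 impressions
print(forecast_clicks(50, 0.40, 0.02, 8000))  # -> 125.0
```

Even a crude model like this gives users a baseline to compare actual numbers against, which is all the next rule requires.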
- Provide potential explanations / recommended actions when forecasted numbers are very different from actual numbers. Strangely, Google AdWords predictions (see item #1) are either quite accurate or totally wrong: some new ad groups supposed to produce thousands of clicks (according to the dashboard's forecasted numbers) produce fewer than 5 clicks. We suspect that some factors - maybe impression fraud, keywords / ads / landing pages blocked by Google, or too little historical data resulting in poor machine learning performance - cause some ad groups to fail. Google's AdWords dashboard should provide explanations / recommendations (e.g. wait a few more days) when the discrepancy between forecasted and actual traffic volume is huge; there is a lot of potential revenue to be made by Google, should they improve the AdWords dashboard accordingly. Note that this issue mostly impacts new ad groups that have little history.
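The rule above amounts to a threshold check plus a rule-based explanation. A minimal sketch (hypothetical function name, tolerance ratio, and diagnostic rules of my own choosing):

```python
def check_forecast(ad_group, forecast, actual, days_live, ratio=10):
    """Flag ad groups whose actual clicks fall far short of the forecast,
    and suggest a likely explanation (illustrative rules only)."""
    if actual * ratio >= forecast:
        return None  # within tolerance, no alert
    if days_live < 7:
        return f"{ad_group}: too little history - wait a few more days"
    return f"{ad_group}: check for blocked keywords/ads or impression fraud"

print(check_forecast("widgets", forecast=2000, actual=4, days_live=3))
# -> widgets: too little history - wait a few more days
```

The exact tolerance and rules would be tuned to the business, but the point is that the dashboard, not the user, should run this check.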
- Become an expert user of your dashboard - outsmart the dashboard architects regarding how their dashboard can be used. In our case, we designed ISP segments in our VerticalResponse (VR) dashboard to optimize our email marketing campaigns, creating a feature not available in VR. This was the best feature we managed to get out of our VR dashboard. With Google Analytics, we found a way to get traffic trends by region (as we are trying to increase US traffic, decrease traffic from Asia, and measure our success) despite the fact that no such report is available from the dashboard; another solution is to hire an AdWords expert to extract the data in question. Another problem we faced was comparing trends for two 2nd-tier traffic sources, say LinkedIn vs. Twitter: because traffic referral charts include all referrals, 2nd-tier sources are dwarfed by the top 3 referrals and the chart is useless. A workaround consists in downloading the data in Excel format (from the dashboard) and creating your own charts.
- Prioritize user requests for new reports. In our case, we tried to get click counts as well as the number of unique users clicking on our VR campaigns, to identify outlier users who generate 1,000+ clicks from a single account. The purpose was to identify fake traffic generated by a bot, or very popular users sharing our newsletter and generating thousands of clicks that can be attributed to a few popular subscribers. This feature (tracking the number of unique users clicking) was not available in VR (unless you download very granular reports - something business analysts are not good at), but discussing this issue with VR (and the fact that competitors such as iContact offer this feature on their dashboard) helped improve the VR dashboard.
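If you do end up with the granular report, the outlier detection itself is trivial. A minimal sketch, on a made-up click log (one entry per click, identifying the subscriber account):

```python
from collections import Counter

# Hypothetical click log: most users click a few times, one account floods it
click_log = ["alice", "bob", "alice", "carol"] + ["bot42"] * 1500

clicks_per_user = Counter(click_log)
outliers = {user: n for user, n in clicks_per_user.items() if n >= 1000}
print(outliers)  # -> {'bot42': 1500}
```

A dashboard that exposed both total clicks and clicks-per-unique-user would make this a one-glance check instead of a download-and-script exercise.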
- Decide on the optimum refresh rate for each report. Some users like real-time - I do, because it helps me detect right away whether a marketing campaign or blog post is going to work. But if producing real-time reports is very expensive, maybe end-of-day is good enough. In my case, I enjoy Google Analytics real-time reports, but they are a nice perk: I can do equally well (even better, by not wasting time stuck on true real-time stats) by looking at daily or weekly stats. Real time offers some value, such as identifying which blog posts I need to promote, but is it worth the price? In the case of Google Analytics, the answer is yes because it is free, though (as an executive) I feel I waste too much time on these reports, given the relative value they provide.
- Try to identify great, buried reports available in your dashboard. In our case, the IP addresses of new members, provided by our Ning dashboard, have proved to be a very detailed yet powerful, buried report that helps us eliminate spammers and scammers signing up on DSC. It also involves merging data from external vendors such as StopForumSpam or SpamHaus. This brings up a new rule: build a meta-dashboard that blends both internal and external data, rather than working with isolated (silo) dashboards.
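The internal/external blend described above boils down to a set-membership check. A minimal sketch (hypothetical member records and a made-up blocklist standing in for a StopForumSpam or SpamHaus download):

```python
# Hypothetical blocklist, e.g. exported from StopForumSpam or SpamHaus
blocklist = {"198.51.100.7", "203.0.113.99"}

# Hypothetical new-member report, as exposed by the Ning dashboard
new_members = [
    {"name": "alice",   "ip": "192.0.2.10"},
    {"name": "spammer", "ip": "198.51.100.7"},
]

flagged = [m["name"] for m in new_members if m["ip"] in blocklist]
print(flagged)  # -> ['spammer']
```

A meta-dashboard would run this join automatically on each signup batch, instead of requiring a manual cross-check.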
- Customize reports, priority and user access. Allow users to design their own reports; display reports that are associated with top KPIs and related to 50% or more of the revenue (nobody cares about the bottom ten IP addresses visiting your website). Give access to reports based on needs and user security clearance.
- Create a centralized report center. Merge silos: have a centralized dashboard that accesses data from various sources, both internal and external. In our case, Google AdWords, AdSense and Google Analytics are three separate dashboards that do not communicate. Fix issues like this one.
- Send email alerts to clients (internal or external). Allow clients to choose which reports they want to receive, as well as the frequency (daily, weekly, or based on urgency). Prioritize your email alerts depending on the recipient: urgent, high priority or not urgent. Train users to create specific email folders for your email alerts. Do A/B testing to see what kind of alert system (frequency, type of reports) is most useful to your company.
- Create actionable dashboards offering automated actions. For instance, we'd like to have our most popular tweets automatically advertised on our Twitter advertising account. This would be a win-win for both us and Twitter, but currently the process is still very manual, because the Twitter advertising dashboard does not seem to provide a solution. Maybe it does, but if users can't find the magic button, it is useless. Training users (providing online help, but also offering and reading user feedback) is a great way to make your dashboard successful.
- Fast retrieval of information. One-time reports created on the fly by a user (as opposed to reports automatically populated every day) should return results very fast. I've seen Brio (a browser-based dashboard for creating SQL queries) take 30 minutes to return less than a megabyte of data, even though no SQL join was involved. When this happens, discuss it with your sysadmin, or use another tool: in my case, I trained the business analyst to write queries directly on the Oracle server, via a template Perl script accepting the SQL query as input and returning the data as a tab-separated text file.
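The query-template workaround above is easy to reproduce. Here is a minimal sketch in Python with sqlite3 standing in for the Oracle/Perl setup (function name and database are my own; a real deployment would use an Oracle driver instead):

```python
import csv
import sqlite3

def run_query_to_tsv(db_path, sql, out_path):
    """Run an ad-hoc SQL query and dump the result as tab-separated text,
    mirroring the query-template script described above (sqlite3 stand-in)."""
    conn = sqlite3.connect(db_path)
    cur = conn.execute(sql)
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f, delimiter="\t")
        writer.writerow([col[0] for col in cur.description])  # header row
        writer.writerows(cur)                                 # data rows
    conn.close()
```

An analyst only ever touches the SQL string; the plumbing (connection, cursor, TSV output) stays fixed, which is what made the template approach teachable.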
Originally posted on Data Science Central