Guest blog post by Vincent Granville
Questions to ask when considering visualization tools:
- How do you define and measure the quality of a chart?
- Which tools allow you to produce interactive graphs or maps?
- Which tools do you recommend for big data visualization?
- Which visualization tools can be accessed via an API, in batch mode? (for instance, to update earthquake maps every 5 minutes, or stock prices every second)
- What do you think of Excel? And Python or Perl graph libraries? And R?
- Are there any tools that allow you to easily produce videos of your data (e.g. to show how fraud cases or diseases spread over time)?
- In Excel, when you update your data, your model and charts get updated right away. Are there any alternatives to Excel that offer the same feature, but with much better data-modeling capabilities?
- How do you produce nice graph structures - e.g. to visually display Facebook connections?
- What is a heat map? When does it make sense to use one?
- How do you draw "force-directed graphs"?
- Good tools for raster images? for vector images? for graphs? for decision trees? for fractals? for time series? for stock prices? for maps? for spatial data?
- How can you integrate R with other graphical packages?
- How do you represent 5 dimensions (e.g. time, volume, category, price, location) in a simple 2-dimensional graph? Or is it better to represent fewer dimensions if your goal is to communicate a message to executives?
- Why are the visualization tools used by mathematicians and operations research practitioners (e.g. Matlab) not the same as those used by data scientists? Is it because of the type of data, or just for historical reasons?
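To make the heat-map question above concrete, here is a minimal sketch in Python, assuming numpy and matplotlib are installed. It renders a correlation matrix of some synthetic data as a heat map; the variable names and color map are arbitrary choices, not prescriptions.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display required
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
data = rng.standard_normal((100, 5))    # 100 observations of 5 variables
corr = np.corrcoef(data, rowvar=False)  # 5x5 correlation matrix

fig, ax = plt.subplots()
im = ax.imshow(corr, cmap="coolwarm", vmin=-1, vmax=1)
fig.colorbar(im, ax=ax, label="correlation")
ax.set_title("Correlation heat map")
fig.savefig("heatmap.png")
```

A heat map like this makes sense whenever the data is naturally a matrix (correlations, confusion matrices, geographic grids) and the message is in the overall pattern rather than in individual values.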
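For the force-directed-graph question, one common route (an assumption here, not the only tool) is networkx, whose `spring_layout` implements the Fruchterman-Reingold force-directed algorithm: nodes repel each other while edges act as springs, which is also how many Facebook-connection visualizations are laid out.

```python
import networkx as nx

# A classic small social network bundled with networkx
G = nx.karate_club_graph()

# Force-directed (Fruchterman-Reingold) layout: returns a dict
# mapping each node to its (x, y) position
pos = nx.spring_layout(G, seed=42)
```

The resulting `pos` dict can then be passed to `nx.draw(G, pos)` with matplotlib, or exported to an interactive front end.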
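And for the five-dimensions question, one hedged sketch (matplotlib assumed, data synthetic) maps each dimension onto a distinct visual channel of a single 2-D scatter plot: x = time, y = price, marker size = volume, color = location, marker shape = category.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display required
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
n = 60
time = np.arange(n)
price = 100 + rng.standard_normal(n).cumsum()  # synthetic price series
volume = rng.integers(10, 200, size=n)         # -> marker size
location = rng.integers(0, 4, size=n)          # 4 locations -> color
category = rng.integers(0, 2, size=n)          # 2 categories -> marker shape

fig, ax = plt.subplots()
for cat, marker in [(0, "o"), (1, "^")]:
    m = category == cat
    sc = ax.scatter(time[m], price[m], s=volume[m], c=location[m],
                    marker=marker, cmap="viridis", vmin=0, vmax=3,
                    alpha=0.7, label=f"category {cat}")
fig.colorbar(sc, ax=ax, label="location")
ax.set_xlabel("time")
ax.set_ylabel("price")
ax.legend()
fig.savefig("five_dims.png")
```

Whether all five channels survive the trip to an executive audience is exactly the trade-off the question raises: size and color are easy to read, marker shape much less so.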