[Ieee_vis] Call for papers: IVI special issue on Visualization Evaluation

Tobias Isenberg isenberg at cs.rug.nl
Tue Dec 4 13:55:55 CET 2012


Visualization Evaluation
========================

A Special Issue of Information Visualization

(Please note that, while this is a call for the "Information 
Visualization" journal, we also accept and specifically encourage 
submissions that are closer to the SciVis or Visual Analytics areas.)

Guest Editors: Enrico Bertini, Petra Isenberg, Tobias Isenberg, Heidi 
Lam, Adam Perer.

Visualization has recently gained much relevance for its ability to 
support complex data analysis and communication. Because it can convey 
complex concepts efficiently, it has been adopted in many contexts, 
from scientific laboratories to newspapers.

While the overall use of visualization is accelerating, the growth of 
techniques for evaluating these systems has been slow. How do we 
measure the quality of a visualization? How do we decide between 
competing designs? How do we know whether a visualization meets its 
goal? These are only some of the questions researchers and 
practitioners are confronted with when designing visualization systems.

To understand how, when, and why visualization works, evaluation efforts 
should be targeted at the component level, the system level, and the 
work-environment level. Commonly used evaluation metrics such as task 
completion time and number of errors, when used in isolation, appear 
insufficient to capture the complex quality and usage of a 
visualization system; therefore, new metrics and methods to better 
evaluate visualizations are needed.

This special issue calls for innovative ideas on how to evaluate 
visualization and for reflections on the state of the art. We call for 
papers dealing with evaluation in the fields of scientific 
visualization, information visualization, and visual analytics. We 
explicitly discourage submissions that exclusively describe the process 
and outcome of a given evaluation, unless the methodology itself is 
innovative.

Topics include, but are not limited to:

     Evaluation in the visualization development lifecycle
     Utility characterization
     Evaluation metrics
     Insight characterization
     Synthetic data sets and benchmarks
     Taxonomy of tasks
     Computational evaluation
     Benchmark development and repositories
     Methods for longitudinal studies and adoption
     Evaluation of early prototypes
     Evaluation heuristics and guidelines
     Evaluation of storytelling visualization
     Evaluation of visualization as art

Paper Submission

Submissions due: Feb 8, 2013
Acceptance notices: Apr 15, 2013
Final revisions due: May 13, 2013
Publication: late 2013

Enquiries should be made to the guest editors by sending an email to 
chairs at beliv.org

Manuscripts should be submitted electronically in PDF via the online 
submission system and formatted according to the journal's standards. 
For details on the submission process, please visit the journal's 
instruction website at:

http://ivi.sagepub.com/
