Abstract
Evaluation initiatives have been widely credited with contributing significantly to the development and advancement of information access systems, by providing a sustainable platform for conducting the very demanding activity of comparable experimental evaluation at a large scale. Measuring the impact of such benchmarking activities is crucial for assessing which of their aspects have been successful, which activities should be continued, reinforced, or discontinued, and which research paths should be pursued further in the future. This work introduces a framework for modeling the data produced by evaluation campaigns, a methodology for measuring their scholarly impact, and tools exploiting visual analytics to analyze the outcomes.
| Original language | English |
|---|---|
| Journal | Procedia Computer Science |
| Volume | 38 |
| Pages (from-to) | 133-137 |
| Number of pages | 5 |
| ISSN | 1877-0509 |
| DOI | |
| Status | Published - 2014 |
| Event | Italian Research Conference on Digital Libraries - Padua, Italy. Duration: 30 Jan 2014 → 31 Jan 2014. Conference number: 10 |
Conference

| Conference | Italian Research Conference on Digital Libraries |
|---|---|
| Number | 10 |
| Country/Territory | Italy |
| City | Padua |
| Period | 30/01/2014 → 31/01/2014 |