Measuring and Analyzing the Scholarly Impact of Experimental Evaluation Initiatives

Marco Angelini, Nicola Ferro, Birger Larsen, Henning Müller, Giuseppe Santucci, Gianmaria Silvello, Theodora Tsikrika

Research output: Contribution to journal › Conference article in Journal › Research › peer-review

4 Citations (Scopus)
225 Downloads (Pure)

Abstract

Evaluation initiatives have been widely credited with contributing significantly to the development and advancement of information access systems by providing a sustainable platform for conducting the demanding activity of comparative experimental evaluation on a large scale. Measuring the impact of such benchmarking activities is crucial for assessing which of their aspects have been successful, which activities should be continued, reinforced, or discontinued, and which research paths should be pursued further. This work introduces a framework for modeling the data produced by evaluation campaigns, a methodology for measuring their scholarly impact, and tools exploiting visual analytics to analyze the outcomes.
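As a purely illustrative sketch of what "measuring scholarly impact" of an evaluation campaign can involve, the toy Python snippet below aggregates citation counts over papers grouped by campaign. The data model (dicts with `campaign` and `citations` keys) and the metric (a plain citation sum) are hypothetical and are not the framework or methodology proposed in the paper.

```python
# Toy scholarly-impact aggregation: sum citation counts per evaluation
# campaign. Data model and metric are illustrative assumptions only.
from collections import defaultdict

def impact_by_campaign(papers):
    """Sum the citation counts of papers, grouped by campaign."""
    totals = defaultdict(int)
    for paper in papers:
        totals[paper["campaign"]] += paper["citations"]
    return dict(totals)

# Hypothetical sample records for two campaigns.
papers = [
    {"campaign": "CLEF 2010", "citations": 12},
    {"campaign": "CLEF 2010", "citations": 7},
    {"campaign": "TREC 2009", "citations": 30},
]
print(impact_by_campaign(papers))  # {'CLEF 2010': 19, 'TREC 2009': 30}
```

In a real analysis such per-campaign aggregates would feed the kind of visual-analytics tooling the abstract mentions, rather than being reported as raw totals.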
Original language: English
Journal: Procedia Computer Science
Volume: 38
Pages (from-to): 133-137
Number of pages: 5
ISSN: 1877-0509
DOIs: 10.1016/j.procs.2014.10.022
Publication status: Published - 2014
Event: Italian Research Conference on Digital Libraries - Padua, Italy
Duration: 30 Jan 2014 - 31 Jan 2014
Conference number: 10

Conference

Conference: Italian Research Conference on Digital Libraries
Number: 10
Country: Italy
City: Padua
Period: 30/01/2014 - 31/01/2014


Keywords

  • Scholarly Impact
  • Experimental Evaluation
  • Experimental Data
  • Visual Analytics

Cite this

Angelini, Marco ; Ferro, Nicola ; Larsen, Birger ; Müller, Henning ; Santucci, Giuseppe ; Silvello, Gianmaria ; Tsikrika, Theodora. / Measuring and Analyzing the Scholarly Impact of Experimental Evaluation Initiatives. In: Procedia Computer Science. 2014 ; Vol. 38. pp. 133-137.
@inproceedings{8466da54d0fb4e2abd46b27212aa8af2,
title = "Measuring and Analyzing the Scholarly Impact of Experimental Evaluation Initiatives",
abstract = "Evaluation initiatives have been widely credited with contributing significantly to the development and advancement of information access systems by providing a sustainable platform for conducting the demanding activity of comparative experimental evaluation on a large scale. Measuring the impact of such benchmarking activities is crucial for assessing which of their aspects have been successful, which activities should be continued, reinforced, or discontinued, and which research paths should be pursued further. This work introduces a framework for modeling the data produced by evaluation campaigns, a methodology for measuring their scholarly impact, and tools exploiting visual analytics to analyze the outcomes.",
keywords = "Scholarly Impact, Experimental Evaluation, Experimental Data, Visual Analytics",
author = "Marco Angelini and Nicola Ferro and Birger Larsen and Henning M{\"u}ller and Giuseppe Santucci and Gianmaria Silvello and Theodora Tsikrika",
year = "2014",
doi = "10.1016/j.procs.2014.10.022",
language = "English",
volume = "38",
pages = "133--137",
journal = "Procedia Computer Science",
issn = "1877-0509",
publisher = "Elsevier",

}

Angelini, M, Ferro, N, Larsen, B, Müller, H, Santucci, G, Silvello, G & Tsikrika, T 2014, 'Measuring and Analyzing the Scholarly Impact of Experimental Evaluation Initiatives', Procedia Computer Science, vol. 38, pp. 133-137. https://doi.org/10.1016/j.procs.2014.10.022

Measuring and Analyzing the Scholarly Impact of Experimental Evaluation Initiatives. / Angelini, Marco; Ferro, Nicola; Larsen, Birger; Müller, Henning; Santucci, Giuseppe; Silvello, Gianmaria; Tsikrika, Theodora.

In: Procedia Computer Science, Vol. 38, 2014, p. 133-137.


TY - GEN

T1 - Measuring and Analyzing the Scholarly Impact of Experimental Evaluation Initiatives

AU - Angelini, Marco

AU - Ferro, Nicola

AU - Larsen, Birger

AU - Müller, Henning

AU - Santucci, Giuseppe

AU - Silvello, Gianmaria

AU - Tsikrika, Theodora

PY - 2014

Y1 - 2014

N2 - Evaluation initiatives have been widely credited with contributing significantly to the development and advancement of information access systems by providing a sustainable platform for conducting the demanding activity of comparative experimental evaluation on a large scale. Measuring the impact of such benchmarking activities is crucial for assessing which of their aspects have been successful, which activities should be continued, reinforced, or discontinued, and which research paths should be pursued further. This work introduces a framework for modeling the data produced by evaluation campaigns, a methodology for measuring their scholarly impact, and tools exploiting visual analytics to analyze the outcomes.

AB - Evaluation initiatives have been widely credited with contributing significantly to the development and advancement of information access systems by providing a sustainable platform for conducting the demanding activity of comparative experimental evaluation on a large scale. Measuring the impact of such benchmarking activities is crucial for assessing which of their aspects have been successful, which activities should be continued, reinforced, or discontinued, and which research paths should be pursued further. This work introduces a framework for modeling the data produced by evaluation campaigns, a methodology for measuring their scholarly impact, and tools exploiting visual analytics to analyze the outcomes.

KW - Scholarly Impact

KW - Experimental Evaluation

KW - Experimental Data

KW - Visual Analytics

U2 - 10.1016/j.procs.2014.10.022

DO - 10.1016/j.procs.2014.10.022

M3 - Conference article in Journal

VL - 38

SP - 133

EP - 137

JO - Procedia Computer Science

JF - Procedia Computer Science

SN - 1877-0509

ER -