Workshop Publication

Designing the Evaluation of Operator-Enabled Interactive Data Exploration in VALIDE

AUTHORS:
Yogendra Patil, CNRS, France
Sihem Amer-Yahia, CNRS, France
Srividya Subramanian, MPE, Germany
PUBLISHED IN:
Accepted at HILDA 2022: Workshop on Human-In-the-Loop Data Analytics
CURRENT STATUS:
Yet to be published
DATE:   
May 3, 2022

Interactive Data Exploration (IDE) systems facilitate the understanding of large datasets by providing high-level, easy-to-use operators. Compared to traditional querying systems, where users must express each query explicitly, IDE systems allow users to perform expressive data exploration following a click-select-execute paradigm. Today, there exists no full-fledged evaluation framework for operator-enabled IDE. Most previous work relies either on implicitly logging user actions to compute quantitative metrics or on running user studies to collect explicit feedback. Hence, there is a pressing need for an evaluation framework that collects and compares quantitative human feedback alongside system- and data-centric evaluations. In this paper, we develop VALIDE, a preliminary design of a unified framework consisting of a methodology and metrics for IDE systems. VALIDE combines research from database benchmarking and human-computer interaction and will be demonstrated with a real IDE system.
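To make the two evaluation sources concrete, the Python sketch below illustrates one way an IDE session could pair implicitly logged operator actions (yielding quantitative metrics such as action count and session duration) with an explicit user rating from a study. It is a minimal illustration under assumed names (Session, Action, log_action, summarize), not the VALIDE implementation.

from dataclasses import dataclass, field
from time import time
from typing import List, Optional

@dataclass
class Action:
    operator: str      # e.g. "facet" or "drill-down", one click-select-execute step
    timestamp: float   # wall-clock time the operator was executed

@dataclass
class Session:
    actions: List[Action] = field(default_factory=list)
    explicit_rating: Optional[int] = None  # e.g. a 1-5 rating collected in a user study

    def log_action(self, operator: str) -> None:
        # Implicitly record each operator the user executes.
        self.actions.append(Action(operator, time()))

    def summarize(self) -> dict:
        # Derive quantitative metrics and pair them with the explicit feedback.
        duration = (self.actions[-1].timestamp - self.actions[0].timestamp) if self.actions else 0.0
        return {
            "num_actions": len(self.actions),
            "duration_s": round(duration, 2),
            "rating": self.explicit_rating,
        }

# Example usage: one short exploration session followed by an explicit rating.
session = Session()
session.log_action("facet")
session.log_action("drill-down")
session.explicit_rating = 4
print(session.summarize())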

The full article will be available for download soon.
