Interactive Data Exploration (IDE) systems are technologies that facilitate the understanding of large datasets by providing high-level, easy-to-use operators. Compared to traditional querying systems, where users must express each query explicitly, IDE systems allow users to perform expressive data exploration following the click-select-execute paradigm. Today, there exists no full-fledged evaluation framework for operator-enabled IDE. Most previous work relies either on implicitly logging user actions to compute quantitative metrics or on running user studies to collect explicit feedback. Hence, there is a pressing need to articulate an evaluation framework that collects and compares quantitative human feedback alongside system- and data-centric evaluations. In this paper, we develop VALIDE, a preliminary design of a unified framework consisting of a methodology and metrics for IDE systems. VALIDE combines research from database benchmarking and human-computer interaction and will be demonstrated with a real IDE system.