
V+Ü Empirical Evaluation in Informatics (Empirische Bewertung in der Informatik) SS 2012

This is the homepage of the lecture (Vorlesung) "Empirische Bewertung in der Informatik" (Empirical Evaluation in Informatics) and its corresponding tutorial (Übung).

Description

As an engineering discipline, Informatics is constantly developing new artifacts such as methods, languages/notations, or concrete software systems. In most cases, the functional efficiency and effectiveness of these solutions for their intended purpose are not obvious -- especially not in comparison to other, already existing solutions for the same or a similar purpose.

For this reason, methods for evaluating the efficacy of these solutions must be a routine part of Informatics -- a fact which unfortunately has only slowly become recognized. Evaluation is needed not only by those who create new solutions (that is, in research and development), but also by their users, who need to assess the expected efficacy specifically for their own situation. These evaluations need to be empirical (that is, based on observation), because the problems are nearly always too complicated for an analytical (that is, purely thought-based) approach.

This lecture presents the most important empirical evaluation methods and explains, using examples, where they have been used and should be used, how to use them, and what to consider when doing so.


Administration

Lecturers

Requirements/target group, classification, credit points etc.

see entry in the KVV course syllabus

Registration

  • All participants need to be members of the mailing list SE_V_EMPIR. (Please enter both your given name and your family name.) Important information and announcements will be sent via this list. Please sign up individually.

  • KVV (course syllabus): All participants need to have registered in the KVV.

  • For the tutorials, every participant needs to have registered in the Blackboard.
    • Subscribe to »Empirische Bewertung in der Informatik« (MATHINF_Ue_19541a_12S Empirische Bewertung in der Informatik).

Dates

  • The lecture is held Mondays from 10:15 to 11:45 in room 049, Takustr. 9.
  • The tutorial takes place Mondays from 12:15 to 13:45 in room 049, Takustr. 9.
  • Written exam: Monday, 2012-07-16, 11:59, room 005, Takustr. 9.
  • Post-exam review (Klausureinsicht): Wednesday, 2012-10-17, 15:59 until at least 16:30, room 006, Takustr. 9.

Examination modalities

Necessary criteria for obtaining the credit points:
  • Completion of at least 80% of the tasks on the practice sheets
  • Active participation in the tutorial
  • Passing the written examination


Content

Some of the linked documents can only be accessed from within the FU network (from outside, you receive 403/Forbidden: "You don't have permission to access ...").

Attention: The practice sheets can now be found in the separate section Practice Sheets.

Lecture topics

The lecture is divided into three sections:
  • Introduction (3 weeks): Introduces the basic ideas of empiricism and discusses quality characteristics for empirical studies (lectures 1 to 3).
  • Methods (7 weeks): Presents basic aspects of and approaches to various empirical methods and illustrates them with concrete examples from the scientific literature.
  • Data analysis (2 weeks): Empirical studies always generate raw data first, which may be partly qualitative and partly quantitative in nature. Research results only arise from the analysis and interpretation of these data. The analysis of quantitative data is a topic so comprehensive that an entire degree (statistics) can be dedicated to it.
    This section gives a first introduction to the analysis of quantitative data. (The completely different analysis of qualitative data is beyond the scope of this lecture.)

The individual lectures:
  1. (16.4.2012) Introduction - The role of empiricism:
    • Term "empirical evaluation"; theory, construction, empiricism; status of empiricism in Informatics
    • Hypothetical examples of use
    • quality criteria: reliability, relevance
    • Note: scale types
  2. (23.4.2012) The scientific method:
    • Science and methods for gaining insights; classification of Informatics
    • The scientific method; variables, hypotheses, control; internal and external validity; validity, reliability, relevance
  3. (30.4.2012) How to lie with statistics:
    • When looking at somebody else's conclusions from data: What is actually meant? What specifically? How can they know it? What is not said?
    • Does the measurement distort the meaning? Is the sample biased?, etc.
    • Material: book on the topic; Study on alternative ink; article with arguments against hypothesis testing: "The earth is round (p < 0.05)".

  4. (7.5.2012) Empirical approach:
    • steps: formulate aim and question; select method and design study; create study situation; collect data; evaluate findings; interpret results.
    • example: N-version programming (article, reply to the criticisms against it)
  5. (14.5.2012) Survey:
    • example: relevance of different topics in Informatics education (article)
    • method: selection of aims; selection of group to be interviewed; design and validation of the questionnaire; execution of the survey; evaluation; interpretation
  6. (21.5.2012) Controlled experiment:
    • example 1: flow charts versus pseudo-code (article, criticized prior work)
    • method: control and constancy; problems with reaching constancy; techniques for reaching constancy
    • example 2: use of design pattern documentation (article)
  7. (4.6.2012) Quasi-experiment:
    • example 1: comparison of 7 programming languages (article, detailed technical report)
    • method: like controlled experiment, but with incomplete control (mostly: no randomization)
    • example 2: influence of work place conditions on productivity (article)
  8. (11.6.2012) Benchmarking:
    • example 1: SPEC CPU2000 (article)
    • Benchmark = measurement + task + comparison; problems (costs, task selection, overfitting); quality characteristics (accessibility, effort, clarity, portability, scalability, relevance) (article)
    • example 2: TREC (article)

  9. (18.6.2012) Data analysis - basic terminology:
  10. (25.6.2012) Data analysis - techniques:
    • Samples and populations; average value; variability; comparison of samples: significance test, confidence interval; bootstrap; relations between variables: plots, linear models, correlations, local models (loess)
    • Article: "A tour through the visualization zoo"

  11. (2.7.2012) Case study:
    • example 1: Familiarization with a software team (article)
    • method: characteristics of case studies; what is the 'case'?; use of many data types; triangulation; validity dimensions
    • example 2: An unconventional method for requirements inspections (article)
  12. (9.7.2012) Other methods:
    • The method landscape; simulation; software archeology (studies on the basis of existing data); literature study;
    • example simulation: scaling of P2P file sharing (article)
    • example software archeology: code decline (article)
    • example literature study: a model of the effectiveness of reviews (article)

  13. (oops, term is already over!) Summary and advice:
    • Role of empiricism; quality criteria; generic method; advantages and disadvantages of the methods; practical advice (for data analysis; for conclusion-drawing; for final presentation); outlook
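The comparison-of-samples topic from lecture 10 (significance tests, confidence intervals, bootstrap) can be illustrated in a few lines. The tutorials use R for this kind of analysis; the following is merely a minimal sketch in Python, with invented task-completion times standing in for real experimental data:

```python
import random
import statistics

# Hypothetical data: task completion times (minutes) for two groups,
# e.g. from a small controlled experiment (numbers invented for illustration).
group_a = [31, 28, 35, 40, 29, 33, 38, 30, 27, 34]
group_b = [36, 41, 33, 45, 39, 37, 44, 40, 35, 42]

observed_diff = statistics.mean(group_b) - statistics.mean(group_a)

# Bootstrap: resample each group with replacement many times and
# recompute the difference of means to estimate its variability.
random.seed(1)
diffs = []
for _ in range(10_000):
    resample_a = random.choices(group_a, k=len(group_a))
    resample_b = random.choices(group_b, k=len(group_b))
    diffs.append(statistics.mean(resample_b) - statistics.mean(resample_a))

diffs.sort()
# Percentile-bootstrap 95% confidence interval for the mean difference.
lower = diffs[int(0.025 * len(diffs))]
upper = diffs[int(0.975 * len(diffs))]

print(f"observed difference of means: {observed_diff:.2f}")
print(f"95% bootstrap CI: [{lower:.2f}, {upper:.2f}]")
```

If the resulting interval excludes zero, the observed difference is unlikely to be a pure sampling artifact -- the same question a significance test answers, but with an explicit estimate of the effect's size.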

Aims of the tutorials

  • Tutorials 1 to 3 (concerning R)
    • To get to know the possibilities of a free, comprehensive, and modern statistics package and to gain basic skills with it.
    • To get to know a new way of thinking for programming ("programming with data") and to practice it.
    • To realize how enlightening a data analysis may be in some cases and how useless in others.
  • Tutorials 4 to 9 (project: empirical study)
    • To have gone through the design process of an empirical study oneself and to realize how many aspects must be considered.
    • To experience how many good ideas you may have, and how many others may still be missing.
    • To realize how important it is to work accurately (because correcting mistakes is often impossible and usually causes a huge amount of extra work).
    • To have had the gee-whiz experience of analysing data which nobody else in the world has seen so far.
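The "programming with data" style mentioned for tutorials 1 to 3 means expressing an analysis as operations on whole data sets (filter, transform, summarize) rather than as element-by-element bookkeeping. The tutorials practice this in R; the following is only an analogous sketch in Python, with invented lines-of-code figures:

```python
import statistics

# Invented example data: lines of code per module in a small project.
loc = [120, 340, 85, 410, 230, 95, 510, 60]

# "Programming with data": operate on the whole data set at once.
large_modules = [x for x in loc if x > 200]  # filter: modules over 200 LOC

summary = {
    "n": len(loc),
    "mean": statistics.mean(loc),
    "median": statistics.median(loc),
    "stdev": round(statistics.stdev(loc), 1),
}

print(summary)
print(f"{len(large_modules)} of {len(loc)} modules exceed 200 LOC")
```

In R the same analysis would be a handful of vectorized expressions; the point of the tutorials is getting comfortable with that data-first way of thinking.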

Practice sheets

(These links will be added continuously as the course proceeds.)

Changes over the years

  • 2004: Lecture first held.
  • 2005: Lecture: only minor changes. Tutorial: broader choice of topics for the surveys.
  • 2010: Lecture and tutorial both held in English.

Literature

