Real Time Interaction - 30876

Carles Fernández DTIC
Sergi Jordà DTIC
Brief Summary: 

We analyse real-time stimuli-generation techniques, as well as the system requirements needed to offer the user, in each possible modality, an interaction with near-zero perceived delay.

Detailed Description: 

This module focuses on the study of real-time interaction from several perspectives, both conceptual and technological.

The conceptual part starts by discussing the concept of real time, showing how relative and subjective it can be. It then concentrates on real-time musical interaction because, for millennia, musical performance has constituted (and can still be considered) the paradigm of rich and complex real-time human-machine interaction. From this musical perspective the concepts of 'controller device' and 'mapping' are studied in depth, and both concepts are also extended to non-musical contexts and situations. The musical context remains important for studying expert interaction, analyzing concepts such as playability, explorability, non-linearity, control, expressiveness or virtuosic interaction. These concepts are then translated to a more general, non-musical domain, where multi-user, multipoint, multidimensional and continuous interaction is studied, with a special focus on exploratory search. To conclude this part, tabletop interfaces are introduced as a type of interface that favors all the previously studied concepts.

The technological part of the course starts by defining and studying the most important technical concepts and aspects of real-time interaction and implementations, such as time resolution, latency, jitter, synchronous vs. asynchronous models, single-threaded vs. multithreaded architectures, polling vs. interrupts, etc. After that, different programming language paradigms (such as visual data flow languages and scripting languages) and different real-time communication protocols between applications (MIDI, UDP, TCP, OSC…) are studied. This part concludes with the study of real-time programming on portable devices such as PDAs or mobile phones.

The technological part is complemented with some practical exercises which cover hands-on studies on programming languages and an in-depth exercise on mapping issues.

Part I: Theory and concepts

Introduction to real-time and complex Interaction

  • What is real time? How real is it? How instantaneous must it be?
  • The perception of time from the different senses - Study of human input and output senses
  • Data generation, transmission and interpretation. A controller-proc
  • Generative vs. selective interaction
  • Abstract and multimodal vs. “realistic” interaction
  • Degrees of freedom, multidimensional and continuous interaction
  • Communication bandwidth, human bandwidth and ergonomics

Musical Interaction

  • A historical and taxonomical study on musical instruments
  • The digital musical instrument
  • Interactive music historical overview
  • Interactive music metaphors
  • The multithreaded digital musical instrument and shared musical control
  • Multi-user musical instruments

Controllers and mappings

  • Controllers: taxonomies, examples and case-studies
  • Controllers: resolution, dimension coupling, time and space multiplexing
  • Divergent, convergent and many-to-many mappings
  • Mappings: Derivative, integrative, memory, non-linearity
  • Controllers and Feedback: Haptic, Visual, Audio
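The mapping transforms listed above can be sketched in code. The following Python class is a minimal, hypothetical illustration (the class name and coefficients are not part of the course material): derivation turns a position controller into a velocity one, integration accumulates gestures, a one-pole filter adds memory, and a power curve adds non-linearity.

```python
class MappingTransforms:
    """Hypothetical sketch of basic one-to-one mapping transforms."""

    def __init__(self, dt=0.01, smooth=0.9):
        self.dt = dt          # sampling period in seconds
        self.prev = 0.0       # last input, for the derivative
        self.acc = 0.0        # running sum, for the integral
        self.state = 0.0      # filter memory, for smoothing
        self.smooth = smooth  # one-pole coefficient (0 = no memory)

    def derivative(self, x):
        """Rate of change: turns a position controller into a velocity one."""
        d = (x - self.prev) / self.dt
        self.prev = x
        return d

    def integrate(self, x):
        """Accumulate input over time: small gestures build up a large value."""
        self.acc += x * self.dt
        return self.acc

    def smoothed(self, x):
        """One-pole low-pass: output remembers past inputs (adding some latency)."""
        self.state = self.smooth * self.state + (1 - self.smooth) * x
        return self.state

    @staticmethod
    def nonlinear(x, exponent=2.0):
        """Power-curve response: fine control near 0, coarse control near 1."""
        return max(0.0, min(1.0, x)) ** exponent
```

Note that the smoothing transform trades responsiveness for stability: the higher the coefficient, the more "memory" the mapping has.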

Towards expert interaction: expressivity, non-linearity, control and virtuosity

  • Interaction efficiency
  • The frustration-boredom equilibrium
  • Playability, explorability, progression and learnability
  • Expressive and virtuosic interaction

Complex non-musical interaction

  • Multipoint and multi-user interaction
  • Multidimensional and continuous interaction
  • Interfaces for complex interaction
  • Exploratory search: a case study

Tabletop interfaces and the Reactable: a case study

  • Tabletop interfaces: multiuser, multipoint and tangible interaction
  • Tabletop interfaces, multimodalism and bandwidth maximization
  • The Reactable as an interface for shared multithreaded control
  • The importance of feedback

Part II: Technology and practice

Real-time systems technical concepts and architectures

  • Feedback loop, periodicity and stability
  • Sampling frequency, time resolution, latency and jitter
  • Filter techniques, data smoothing
  • Synchronous vs. asynchronous models
  • Single-threaded vs. multithreaded architectures
  • Models of synchronicity in multithreaded architectures
  • Polling and interrupts
  • Timing, clocks and callback functions
  • Examples of synchronicity management in several programming languages (including visual-data flow languages)
  • Event based programming
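As a small illustration of time resolution, latency and jitter, the following Python sketch runs a periodic task with sleep-based scheduling and measures how far each wake-up drifts from its ideal deadline. It is a measurement demo, not a solution: ordinary `time.sleep` gives no real-time guarantees.

```python
import time

def measure_jitter(period=0.01, ticks=50):
    """Return the per-tick deviation (in seconds) from an ideal periodic schedule."""
    start = time.perf_counter()
    deviations = []
    for i in range(1, ticks + 1):
        deadline = start + i * period                      # ideal wake-up time
        time.sleep(max(0.0, deadline - time.perf_counter()))
        deviations.append(time.perf_counter() - deadline)  # latency of this tick
    return deviations

devs = measure_jitter()
print("worst-case latency: %.3f ms" % (max(devs) * 1000))
print("jitter (spread):    %.3f ms" % ((max(devs) - min(devs)) * 1000))
```

Scheduling against absolute deadlines (`start + i * period`) rather than sleeping a fixed amount each pass keeps per-tick latency from accumulating into long-term drift.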

Real-time programming, examples and paradigms

  • Issues in real-time programming: garbage collection, memory management, performance
  • Visual data flow languages for real-time programming: Pure Data and Max/MSP
  • High-level languages and real-time programming: Java
  • Scripting languages and real-time programming: Python, Ruby, Lua
  • Communication protocols between applications: MIDI, UDP vs. TCP, OSC, low-latency streaming
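As a concrete taste of these protocols, here is a minimal OSC 1.0 message encoder sent over UDP in Python. The address, host and port are placeholders; adjust them to your own receiver (e.g. a Pd patch listening with [netreceive]/[oscparse]).

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    """OSC strings are null-terminated and padded to a multiple of 4 bytes."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Build an OSC 1.0 message whose arguments are all 32-bit big-endian floats."""
    typetags = "," + "f" * len(floats)
    msg = osc_pad(address.encode()) + osc_pad(typetags.encode())
    for f in floats:
        msg += struct.pack(">f", f)   # OSC numeric arguments are big-endian
    return msg

packet = osc_message("/synth/freq", 440.0)   # placeholder address and value

# UDP is connectionless: one sendto(), no handshake, no delivery guarantee.
# That trade-off against TCP is exactly why OSC usually rides on UDP.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 9001))     # assumed host and port
sock.close()
```

A lost or reordered datagram is usually preferable to the head-of-line blocking that a TCP retransmission would cause in a low-latency control stream.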

Real-time programming issues on handheld devices: mobiles and PDAs

  • Controllers and sensors in handheld devices
  • Screens and output on small devices
  • Technologies (Java, Symbian, Windows Mobile)
  • Performance issues
  • Practical examples of real-time interaction on handheld devices (iPod shaker, usage of advanced sensors, beyond the keyboard-screen approach, audio input)

Practical exercises

Introduction to Visual data flow programming languages

Hands-on work with Pure Data. Exercises on data flow, event-based programming and scheduling. Pure Data sports two different programming approaches in one system.

One is stream-based dataflow (for audio calculations); the other is event-based. Most of the issues in multithreaded real-time programming can be exemplified with an easy-to-use system.

Introduction to scripting languages for real-time

Event handling in a purely visual language can get complicated. We show ways to use scripting languages in order to react in a timely manner to any kind of event. Exercises on data flow, event-based programming and scheduling.


Given several multidimensional and continuous input devices (e.g. joystick, P5 data glove, webcam…) and a predefined synthesis engine (e.g. a VST plug-in), implement and experiment with different mappings, applying concepts such as convergence, divergence, many-to-many, feedback, memory or hysteresis, derivation and integration, using a given programming environment such as Pd or Python.
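One possible skeleton for the mapping layer of this exercise, sketched in Python (the parameter names, matrix weights and smoothing coefficient below are made-up placeholders): a many-to-many mapping is simply a weighted matrix applied to the input dimensions, and a one-pole filter adds memory between input and synthesis.

```python
NAMES = ["cutoff", "resonance", "gain"]   # hypothetical synth parameters

# Rows = synth parameters, columns = input dimensions (e.g. glove x, y, bend).
MATRIX = [
    [0.7, 0.3, 0.0],   # cutoff    <- mostly x, a little y   (convergent)
    [0.0, 1.0, 0.0],   # resonance <- y only                 (one-to-one)
    [0.2, 0.2, 0.6],   # gain      <- all three              (many-to-many)
]

SMOOTH = 0.8                          # one-pole coefficient: the mapping's memory
state = {name: 0.0 for name in NAMES}

def map_inputs(inputs):
    """Map normalised input dimensions in [0, 1] to smoothed, clamped synth parameters."""
    out = {}
    for name, row in zip(NAMES, MATRIX):
        raw = sum(w * x for w, x in zip(row, inputs))
        state[name] = SMOOTH * state[name] + (1 - SMOOTH) * raw
        out[name] = max(0.0, min(1.0, state[name]))
    return out

# Feed each frame of device data through the mapping:
params = map_inputs([0.5, 0.5, 1.0])
```

Replacing the smoothing line with a derivative or an integral, or making `MATRIX` depend on the current state, covers the other concepts listed in the exercise (derivation/integration, hysteresis, feedback).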

Evaluation Criteria: 

The course is divided into two parts, a theoretical one and a practical one. Both parts are evaluated separately, and the final grade will be the combination of both parts.

In the theoretical part, students will be required to write an essay or paper on a relevant topic chosen in consultation with the professor. Complementarily, students will be required to study the course literature and to participate actively in the discussions.

In the practical part, students will be required to conceive and develop a short practical project.

Course Structure: 

Week 1: Theory class: Introduction to "Interaction"

In this first class we will start from scratch, discussing the concepts of "Interaction" and "Interactivity".

For that, I would recommend the following three readings.

If you still want more, you may also read this one :)

As you will hopefully notice, opinions and points of view can be quite different, even on such an apparently basic topic. So, please, do not follow or trust anything blindly (especially academic papers!).

Think for yourself. Think about these readings, be (constructively) critical, and try to build your own ideas.

Try also to formulate questions that could be later discussed in class.

Think also perhaps about this one: "how much of what is typically considered as interactive - sometimes even considered as the paradigm of interactivity -, is really interactive?"

For next theory class (week 3), prepare the questions of slides #17, 18 and 19.

What do you understand by "real-time" and "real-time interaction"? (you can do some Googling...)


Week 2: Practice class

Material for the session (including homework!)

Download pd-extended  ( mirror )


Week 3: Theory class: "Real-time Interaction"


Week 4: Practice class:



Assignment: Create the input part so that it uses input methods other than Pd controllers (like sliders and such).

The devices must be useful for controlling the logic part, while respecting the limitation of 3 outlets with a range of 0-100. You can change the logic part if you want. You can use multiple devices or just one. Experiment for a while to see how it feels, and choose the mapping that you are most comfortable with.
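The rescaling that this 0-100 contract implies can be sketched in Python (the function name and example device ranges are hypothetical):

```python
def to_outlet(value, lo, hi):
    """Linearly rescale value from the device range [lo, hi] to the 0-100 contract."""
    if hi == lo:
        return 0.0                            # degenerate range: nothing to scale
    scaled = (value - lo) / (hi - lo) * 100.0
    return max(0.0, min(100.0, scaled))       # clamp to respect the 0-100 limit

# e.g. a webcam blob at x=320 in a 640-pixel frame, a tilt sensor in [-1, 1],
# and a 0-10 knob, each squeezed into one of the three outlets:
outlets = [to_outlet(320, 0, 640), to_outlet(-0.2, -1.0, 1.0), to_outlet(7, 0, 10)]
```

In Pd the same normalisation would typically be done with a chain of [-], [/], [* 100] and [clip] objects in front of each outlet.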

Send it to me before 30 Oct.

Instructions to use Kinect (via OSCeleton)


Week 5: Theory class: "Real-time Musical Interaction / Digital music performance / NIME"

The next class we will cover a very wide topic, that of New Interfaces for Musical Expression (NIME). These are some recommended readings:

This is probably the first paper that introduces the idea of "Interactive Music" and "Interactive Composition". It is rather concrete and specific, but some of its general ideas are still completely relevant. 

This is an academic paper that does not look like one. It covers different aspects of the design and conception of musical interfaces, and it is mostly based on personal and empirical experiences, rather than on systematic research. That said, it is probably very hard to do systematic research along these lines, and in that sense the paper is full of very useful and relevant information that every NIME designer should know and take into account. Things haven't changed as much as we might think in the last 20 years!

This paper complements the previous one very well. It covers roughly the same aspects, but from a more systematic point of view. Also highly recommended for every NIME designer.

This is a book chapter I wrote for the "Cambridge Companion to Electronic Music". It provides a very basic and introductory overview of the field. Here are the references of the whole book (including those of the current chapter).

If you want to read more, you can find some guidelines (together with hundreds of references) on different subtopics in my PhD thesis, especially in the first chapters.


The assignment (for Monday 12th October) can be done in two alternative ways:

  1. Imagine an interactive musical system or device you would like to see/play/conceive/design... Justify why you would like it, how you imagine it should work, etc. After that, try to find something related (there are MANY chances that what you have imagined has already been done). Don't cheat! I.e.: first imagine, then do your research.
  2. Analyse an existing interactive musical system you especially like. Try not to focus on commercial products (which are rather limited), but look at some alternative/academic production instead.

Where to find information / Some links:

Send me a PDF by e-mail before Monday November 12th at 14.00.

It doesn't have to be a scientific paper! It can just be a draft with ideas, drawings, links, etc.


Week 6: Practice class:



Please complete the third part of your interactive system: output. Try to integrate the three different parts.

You will have to present your work in class, so if you need extra material to do so, bring it with you.

Also, start thinking about your final project!



Week 7: Theory class: Interactive Music


For next class, you can refine & resubmit the "interactive music proposal" you have already submitted, based on some of the topics we have discussed in class.

Week 8: Practice class:

Presentations of your work!


Remix your project with parts of other groups.

Proposed exercises:

  • Use your Input and output parts with the logic of another group.
  • Combine 2 Logic parts together (Input -> Logic1 -> Logic2 -> Output)
  • Combine your input with somebody's output and logic.

What I want from you:

Take some of the most interesting combinations and describe the resulting program. How are YOUR parts supposed to behave, and how are they behaving now? What did you expect, and what did you find?

Write a short report and send it to me in two weeks.

Complete projects: B C D E F G H J K L N


Week 9: Theory class: Interactivity & Control

Class slides

Additional information about mMTCF (Pd programming on Android). This topic will not be covered in class; it is included here just in case anyone is interested.

Before the last theory class (Monday 10th December at 14:00), remember to mail me a draft for a Mobile App proposal. This app should make extensive use of motion sensing (i.e. not only multitouch!) and should not be a musical-control or videogame app, nor a "real world" app (i.e. no Augmented Reality, no geolocalization, etc.). It is probably not an easy task, so if you don't come up with any idea outside these application domains, try again being a bit less restrictive (but indicate this in your document). For the draft file, send me a PDF with any format and content you want/think will be useful. Drawings, photos, mock-ups, hand-written notations, etc. will be especially welcome.

Week 10: Practice class:

Please tell us your availability for the final presentation: