Expressive Interfaces

Dr Charles Martin

Announcements

Plan for the class

  • conceptualising expressive interactions
  • drawing interactions
  • music interactions
  • dance interactions
  • installed interactions
  • playful interactions
  • human-AI creative interactions

Conceptualising Expressive Interactions

why? expressive, artistic experiences can be drivers for HCI.

First head-mounted AR display system (Sutherland, 1968).
Virtual Reality in 1987: The Sound of One Hand performance (Lanier, 1993), video at Moogfest 2016.

Supporting Creativity

Movement towards studying tools to support creativity.

“there is a move from routine work and productive concerns to human and creative ones.” (Edmonds, 2018)

Expression and creativity let us get inside the process of an interaction. People are interested in expressive experiences, leading to critique and understanding.

Iamascope as shown in Edmonds (2018)

Principles for Creativity Support Tools (CSTs)

Shneiderman (2007) posits principles for developing CSTs

  1. Support exploration.
  2. Low threshold, high ceiling, and wide walls.
  3. Support many paths and many styles.
  4. Support collaboration.
  5. Support open interchange.
  6. Make it as simple as possible—and maybe even simpler.
  7. Choose black boxes carefully.
  8. Invent things that you would want to use yourself.
  9. Balance user suggestions with observation and participatory processes.
  10. Iterate, iterate—then iterate again.
  11. Design for designers.
  12. Evaluate your tools.

Artists as Power Users

Artists are “creative power users” (Linda Candy in Shneiderman (2007)).

Artists show us the boundaries of human-computer interaction

Studying interactive art gives us insight into the potential of creativity support tools in the hands of experts.

This translates into findings about “everyday creativity” (Edmonds, 2018)

there are three levels of design: standard spec, military spec and artist spec… the third, artist spec, is the hardest (and most important) (Buxton, 1997)

What is an expressive interaction?

Mapping sensed gestures to an expressive output that is fed back to the user.

  • gestures: the use of motions by the limbs or body as a means of expression
  • can be unintentional, control, or ancillary gestures
  • from non-human actors (e.g., the movement of leaves on the branch of a tree)
  • “any sort of motion, that may be understood as an expression of something”

The interaction itself is expressive, and the output is an expression as well. We consult Composing Interactions (Baalman, 2022) as a resource.

Sensing movement and touch to create music, Atau Tanaka performing in 2010 (Tanaka, 2010)

Mapping from Gesture to Output

Why is mapping an important consideration?

Consider a performer making gestures on a stage: the gestures effect changes in the output medium of sound, which can be heard by the performer and audience in the real, physical environment.

The connection between a gesture in the environment to output media (Baalman, 2022).

Steps in Mappings

Baalman (2022) expands the mapping process into a cycle.

  • A gesture is performed in the environment;
  • This is captured by a sensor; the sensor signal is then processed by an electronic circuit, often to digitise it;
  • Next, the signal enters some sort of computational model that translates the data to parameters;
  • These parameters control an output medium such as sound, light, video, or mechatronics.

How is the output of one step connected to the input of the next? What happens in each step?

Steps in the mapping process (Baalman, 2022)
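The cycle above can be sketched as a minimal signal chain in code. Everything concrete here (the 10-bit sensor, the smoothing model, the pitch range) is an illustrative assumption, not a detail from Baalman (2022):

```python
def sense(gesture_angle):
    """Sensor + circuit: digitise a physical gesture (0-180 degrees) to a 10-bit reading."""
    return int(gesture_angle / 180.0 * 1023)

def model(reading, state, alpha=0.2):
    """Computational model: smooth the raw signal into a 0-1 control parameter."""
    state["level"] = alpha * (reading / 1023.0) + (1 - alpha) * state["level"]
    return state["level"]

def to_output(level):
    """Map the parameter to an output medium, here a synthesiser pitch in Hz."""
    return 220.0 + level * 660.0  # 220 Hz (A3) up to 880 Hz (A5)

state = {"level": 0.0}
for angle in [0, 45, 90, 135, 180]:  # a gesture unfolding over time
    pitch = to_output(model(sense(angle), state))
    print(f"angle={angle:3d}  pitch={pitch:6.1f} Hz")
```

The smoothing step stands in for the "computational model": even in this toy version, the mapping is a process over time, not a one-shot lookup.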

Drawing Interaction

MicroJam drawing/music app in 2018 (Martin & Torresen, 2020)

Surface Drawing System

Drawing with the hands; moving, scaling, and erasing with tangible tools such as a pair of kitchen tongs (Schkolne et al., 2001)

Hands define a stroke.
Kitchen tongs move a drawing.
A variety of shapes created with Surface Drawing (Schkolne et al., 2001).

AirPens

  • AirPens: Musical Doodling
  • Make music with mark-making.
  • Use IMU sensors to convert movement into sound; explore different mappings between movement and sound.
AirPen at NIME25.

MicroJam

App for making short (5s) musical performances with a sketch (video and info). (Martin & Torresen, 2020)

  • Uses touch location and movement to create sound.
  • Replay performances by rewinding the sketch and viewing it again.
  • Social media interface: view other people’s drawings and add layers as a musical “reply”
  • Analysed 1600 tiny jams to understand musical drawing behaviour.
Analysing the kinds of drawings
Analysing the speed of swipes
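As a toy illustration of touch-to-sound mapping in this style, the sketch below maps x position to a pentatonic pitch and swipe speed to loudness. The scale, ranges, and function names are assumptions for illustration, not MicroJam's actual mapping (which Martin & Torresen, 2020 describe in detail):

```python
import math

PENTATONIC = [0, 2, 4, 7, 9]  # major pentatonic scale degrees in semitones

def touch_to_note(x):
    """Map a touch x position in [0, 1] to a MIDI note on a pentatonic scale."""
    degree = min(int(x * 15), 14)               # 15 playable scale degrees
    octave, step = divmod(degree, 5)
    return 60 + 12 * octave + PENTATONIC[step]  # middle C upwards

def swipe_velocity(p0, p1, dt):
    """Map swipe speed between two touch points to a MIDI velocity (0-127)."""
    speed = math.hypot(p1[0] - p0[0], p1[1] - p0[1]) / dt
    return min(int(speed * 127), 127)

print(touch_to_note(0.5), swipe_velocity((0.1, 0.1), (0.4, 0.5), dt=0.1))
```

Snapping positions to a scale is one common design choice for touchscreen instruments: it trades fine pitch control for guaranteed consonance, which suits casual users.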

Music Interaction

The New Interface for Musical Expression (NIME) research community.

  • Research into musical instrument design that explores how technological innovation can enable new musical expression, enhance performer control and intimacy, and shape the musician-instrument relationship.
  • Digital Musical Instruments (DMIs): digital piano, drum pad.
  • Augmented instruments: the magnetic resonator piano (a grand piano with electromagnetically actuated strings).
  • Novel instruments: the Lady’s Glove, the magnetic AI instrument Thales (Privato et al., 2023), the percussive instrument PhaseRings (C. Martin, 2018), the AR instrument cube (Wang & Martin, 2022).
Magnetic Resonator Piano

PhaseRings: natural gestures on big touchscreens

How can we perform music on big touchscreens?

  • Rather than re-implement music production apps, this work looked at natural gestures in long-term artistic practice.
  • Lots of apps from 2011–2014 and lots of performances, e.g., Martin (2016)
  • Initially, sought to understand how percussionists would use touchscreens (C. Martin et al., 2014)
  • Then explored networked connections in ensemble performance (C. Martin et al., 2015)
  • Then compared how networked ensembles were supported by the interface (C. Martin et al., 2016)
PhaseRings played by visitors to a gallery workshop (2015)

Hyperinstruments: non-guitar guitars

  • When is a Guitar not a Guitar? (Harrison et al., 2018): novel instruments and controllers resemble traditional instruments.
  • Four designs examining variation in form (held vs tabletop) and interaction (strings vs touch sensor)
  • Guitarists preferred the technical familiarity of the stringed instruments
  • Non-musicians preferred the touch interface: easier to use, and free of the cultural load of the guitar form
Four “guitar-like” instruments from Harrison et al. (2018): strings vs no strings, held vs tabletop.

Authentic musical instruments for AR

cube system: authentic design for a head-mounted AR musical instrument (Wang & Martin, 2022)

Autobiographical design (Desjardins et al., 2021) to compare three interface candidates

  1. Physical interface: slow but accurate manipulation due to hand tracking.
  2. Spatial interaction: emerges from bodily movement, allowing ease of use.
  3. Flexible freehand interaction: allows multiple notes to be played simultaneously, taking advantage of the full-hand tracking affordance of the AR headset.

Gesture to Sound Mappings in Music

The traditional DMI model separates a musician’s input action, captured by the controller, from the sound engine, with a mapping engine bridging the input interface and the sound engine (Magnusson, 2010).

Dance Interaction

From gestures to body (embodied) movements.

CO/DA System (Françoise et al., 2022)
  • Support real-time manipulation of continuous streams of the dancers’ motion data for interactive sound synthesis.
  • Enable novel dance improvisations through live coding.
  • Live coding: interactively programming musical or visual processes as performance.

Co/da system

  • Movements are measured using motion sensors, and the live coder processes motion signals to generate feedback in real-time.
  • Enable a multitude of feedback loops: sound feedback -> movement improvisation -> the coder alters the relationships between movement and sound.
  • Dynamic improvisation that stimulates exploration of novel movements.
Co/Da’s mapping diagram (Françoise et al., 2022)
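The live-coding idea (re-mapping movement to sound while motion data keeps streaming) can be sketched as swapping a mapping function at runtime. This is a schematic analogy only; CO/DA's actual implementation (Françoise et al., 2022) differs, and all names below are invented:

```python
# Sketch of the live-coding idea: the coder swaps the movement-to-sound
# mapping while motion data keeps streaming. All names are illustrative.

class LiveMapper:
    def __init__(self, mapping):
        self.mapping = mapping  # current movement -> sound rule

    def recode(self, new_mapping):
        """Live-coding step: replace the mapping without stopping the stream."""
        self.mapping = new_mapping

    def process(self, accel):
        return self.mapping(accel)

# Initial rule: acceleration magnitude drives amplitude.
mapper = LiveMapper(lambda a: {"amp": min(abs(a), 1.0)})
print(mapper.process(0.4))  # {'amp': 0.4}

# Mid-performance, the coder re-maps movement to a filter cutoff instead.
mapper.recode(lambda a: {"cutoff_hz": 200 + abs(a) * 4000})
print(mapper.process(0.4))  # {'cutoff_hz': 1800.0}
```

The point of the sketch is the feedback loop: because the mapping is just data, the coder can alter the movement-sound relationship in response to what the dancer is doing.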

Installed Interactions

Putting expressive interactions into public places.

Bellyhorn by Dianne Verdonk (image: Charles Martin)

Dinosaur Choir

Dinosaur Choir: Adult Corythosaurus

  • Singing dinosaur skull musical instruments.
  • Experience dinosaur vocalisation: the vocal anatomy is imagined from a bird syrinx (the dinosaurs’ actual vocal structure is an open question in palaeontology)
  • A microphone captures the user’s voice, which drives a computational vocal model; the sound resonates through a 3D-printed replica of the dinosaur’s nasal cavities and skull
  • Change the pitch and timbre of the vocalisation by changing the shape of the mouth, like a trumpet player
Dinosaur Choir at NIME25.

Illumicube (in Canberra!)

official website

  • Interactive sculpture in Canberra by Kerry Simpson (1988)
  • Glass and sound (now movement) activated lighting
  • Location: Ainslie Avenue, Canberra

Playful installed interaction can lead to unwanted behaviour! Noise from folks exiting Civic pubs!

Illumicube (link) CC BY-NC 2.0

Playful Interaction

Can silly or playful ideas turn into interesting interactions?

Can we use play to examine more serious HCI concepts?

It’s fun to make the world more fun.

The Phox Ears listening helmet by Kleinberger et al. (2015)

Breath Controlled Amusement Ride

Can an amusement ride be controlled by breath?

  • Inspired by robotic technologies for control of individual seats on rollercoasters and other thrill rides.
  • The Broncomatic is a bucking bronco game (mechanical horse riding; you try to stay on!)
  • The twist: it is controlled by the rider’s breath
  • The ride kicks you around, and if you lose control it kicks harder (until you fall off)

“Breath Control of Amusement Rides” (Marshall et al., 2011)

Falling off the Broncomatic (Marshall et al., 2011)

More on the Broncomatic

  • A straightforward mapping: the rider’s breathing to the horizontal rotation of the ride.
    • Inhale: spins clockwise; exhale: spins anti-clockwise.
    • Breathing speed controls rotation speed: fast breathing, faster spin; holding breath stops spinning.
    • Difficulty levels.
  • The program is a game in which the player scores more points the more that they breathe: a physical challenge vs reward dynamic.
    • To score high you must breathe more, but more breathing makes the ride faster and harder to stay on.
Falling off the Broncomatic (Marshall et al., 2011)
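The mapping described above can be sketched as a signed transfer function from airflow to rotation. The deadband, units, and scoring factor below are illustrative guesses, not the Broncomatic's actual parameters:

```python
def spin_rate(flow, max_rate=90.0, deadband=0.05):
    """Map breath airflow (-1.0 full exhale .. +1.0 full inhale) to deg/s.

    Positive is clockwise (inhale), negative anti-clockwise (exhale);
    near-zero flow (holding breath) stops the spin.
    """
    if abs(flow) < deadband:
        return 0.0
    return max(-max_rate, min(flow * max_rate, max_rate))

def score(breath_volume):
    """More breathing scores more points: the challenge/reward dynamic."""
    return int(breath_volume * 10)

print(spin_rate(0.8), spin_rate(-0.5), spin_rate(0.01))
```

Note how the challenge/reward tension lives in the pair of functions: the same input (more breathing) raises the score and makes the ride harder to stay on.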

Human-AI Creative Interaction

Interest in incorporating AI into creative interaction since computing began.

Recent work often focuses on current genAI breakthroughs, e.g., the Autolume visual generator system and the Reprising Elements performance (2023)

Musicians performing with a genAI visualisation (link)

Why introduce AI into expressive interaction?

Computational creativity helps create new ideas in three ways (Boden, 1998)

  • Produce novel combinations of familiar ideas;
  • Explore the potential of conceptual spaces;
  • Make transformations that enable the generation of previously impossible ideas.

Creativity and technology: a sociotechnological perspective (Bown, 2021).

  • The social nature of human behaviour.
  • Artistic behaviour is social in nature (is it?)
Beyond the Creative Species Making Machines That Make Art and Music. (Bown, 2021) open-access link

Cobbie: co-creative robots

Cobbie system (Lin et al., 2020)

  • Motivation: a co-creative partner can reason about the user’s intention and reliably present novel ideas while preserving the user’s initiative; in human teams these qualities can be hindered by social loafing or an overly resolute partner.
  • Human and robot take turns to draw ideas; the robot gives the dominant position to the user and communicates through movements and sound feedback.
  • Three human-robot interactions: your turn; pause and draw again; progressing with feedback

Holographic dancing ghost

  • Co-creative public dancer (Long et al., 2019; Trajkova et al., 2023, 2024)
  • Explore the design of the modular AI agent to creatively collaborate with a dancer.
  • A Kinect motion capture device detects the user’s motion, which is visualised as a virtual shadow on a projection screen.
  • The humanoid agent shadow dances by analysing the user’s movement and responding with a movement that it deems to be similar in terms of parameters such as energy, tempo, or size.
  • Study results showed in-the-moment influences from self, partner, and environment (Trajkova et al., 2024).
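One way to read "responding with a movement that it deems to be similar" is nearest-neighbour matching on movement features. The sketch below is a hypothetical illustration with an invented movement library, not LuminAI's actual algorithm:

```python
import math

# Hypothetical movement library with (energy, tempo, size) features in [0, 1].
LIBRARY = {
    "sway": (0.2, 0.3, 0.4),
    "spin": (0.7, 0.8, 0.5),
    "leap": (0.9, 0.6, 0.9),
}

def respond(user_features):
    """Pick the library movement nearest the user's (energy, tempo, size)."""
    return min(LIBRARY, key=lambda name: math.dist(user_features, LIBRARY[name]))

print(respond((0.8, 0.7, 0.6)))  # an energetic, quick, medium-size movement
```

"Similar but not identical" responses like this are what make the agent read as a partner rather than a mirror: it matches the user's movement qualities without copying the movement itself.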

Recording a video in PowerPoint

For the final project you need to record and upload a presentation video. The specification is:

  • must be no longer than 5.5 minutes (330 seconds)
  • must be no larger than 1920 x 1080 pixels.
  • must be narrated with your voice
  • must show video of you speaking

You can do this easily with PowerPoint, so let’s give it a try.

Recording a presentation video in PowerPoint

Questions: Who has a question?


  • I can take catchbox questions up until 2:55
  • For after class questions: meet me outside the classroom at the bar (for 30 minutes)
  • Feel free to ask about any aspect of the course
  • Also feel free to ask about any aspect of computing at ANU! I may not be able to help, but I can listen.
Meet you at the bar for questions. 🍸🥤🫖☕️ Unfortunately no drinks served! 🙃

References

Baalman, M. (2022). Composing interactions: An artist’s guide to building expressive interactive systems. V2_Publishing.
Boden, M. A. (1998). Creativity and artificial intelligence. Artificial Intelligence, 103(1), 347–356. https://doi.org/10.1016/S0004-3702(98)00055-1
Bown, O. (2021). Beyond the creative species: Making machines that make art and music. The MIT Press. https://mitpress.mit.edu/9780262045018/beyond-the-creative-species/
Buxton, B. (1997). Artists and the art of the luthier. SIGGRAPH Comput. Graph., 31(1), 10–11. https://doi.org/10.1145/248307.248315
Desjardins, A., Tomico, O., Lucero, A., Cecchinato, M. E., & Neustaedter, C. (2021). Introduction to the special issue on first-person methods in HCI. ACM Trans. Comput.-Hum. Interact., 28(6). https://doi.org/10.1145/3492342
Edmonds, E. (2018). The art of interaction: What HCI can learn from interactive art. Morgan & Claypool Publishers. https://doi.org/10.2200/S00825ED1V01Y201802HCI039
Françoise, J., Fdili Alaoui, S., & Candau, Y. (2022). CO/DA: Live-coding movement-sound interactions for dance improvisation. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems. https://doi.org/10.1145/3491102.3501916
Harrison, J., Jack, R. H., Morreale, F., & McPherson, A. P. (2018). When is a guitar not a guitar? Cultural form, input modality and expertise. In L. Dahl, D. Bowman, & T. Martin (Eds.), Proceedings of the international conference on new interfaces for musical expression (pp. 299–304). Virginia Tech. https://doi.org/10.5281/zenodo.1302589
Kleinberger, R., Dublon, G., Paradiso, J. A., & Machover, T. (2015). PHOX ears: A parabolic, head-mounted, orientable, eXtrasensory listening device. In E. Berdahl & J. Allison (Eds.), Proceedings of the international conference on new interfaces for musical expression (pp. 30–31). Louisiana State University. https://doi.org/10.5281/zenodo.1179106
Lanier, J. (1993). The sound of one hand. Whole Earth Review.
Lin, Y., Guo, J., Chen, Y., Yao, C., & Ying, F. (2020). It is your turn: Collaborative ideation with a co-creative robot through sketch. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1–14. https://doi.org/10.1145/3313831.3376258
Long, D., Jacob, M., & Magerko, B. (2019). Designing co-creative AI for public spaces. Proceedings of the 2019 Conference on Creativity and Cognition, 271–284. https://doi.org/10.1145/3325480.3325504
Magnusson, T. (2010). Designing constraints: Composing and performing with digital musical systems. Computer Music Journal, 34(4), 62–73. http://www.jstor.org/stable/40962941
Marshall, J., Rowland, D., Rennick Egglestone, S., Benford, S., Walker, B., & McAuley, D. (2011). Breath control of amusement rides. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 73–82. https://doi.org/10.1145/1978942.1978955
Martin, C. (2018). PhaseRings (Version v1.3) [Computer software]. Zenodo. https://doi.org/10.5281/zenodo.1237701
Martin, C. P. (2016). PhaseRings for iPad ensemble and ensemble director agent [Musical Performance]. Musical Program of the International Conference on Auditory Display, 232–233. http://www.icad.org/icad2016/proceedings/concert/ICAD2016_paper_99.pdf
Martin, C. P., & Torresen, J. (2020). Data driven analysis of tiny touchscreen performance with MicroJam. Computer Music Journal, 43(4), 41–57. https://doi.org/10.1162/COMJ_a_00536
Martin, C., Gardner, H., & Swift, B. (2014). Exploring percussive gesture on iPads with ensemble metatone. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1025–1028. https://doi.org/10.1145/2556288.2557226
Martin, C., Gardner, H., & Swift, B. (2015). Tracking ensemble performance on touch-screens with gesture classification and transition matrices. In E. Berdahl & J. Allison (Eds.), Proceedings of the international conference on new interfaces for musical expression (pp. 359–364). Louisiana State University. https://doi.org/10.5281/zenodo.1179130
Martin, C., Gardner, H., Swift, B., & Martin, M. (2016). Intelligent agents and networked buttons improve free-improvised ensemble music-making on touch-screens. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, 2295–2306. https://doi.org/10.1145/2858036.2858269
McPherson, A., Morrison, L., Davison, M., & Wanderley, M. M. (2024). On mapping as a technoscientific practice in digital musical instruments. Journal of New Music Research, 53(1-2), 110–125. https://doi.org/10.1080/09298215.2024.2442356
Miranda, E. R., & Wanderley, M. (2006). New digital musical instruments: Control and interaction beyond the keyboard (computer music and digital audio series). A-R Editions, Inc.
Pigrem, J., & McPherson, A. P. (2018). Do we speak sensor? Cultural constraints of embodied interaction. In L. Dahl, D. Bowman, & T. Martin (Eds.), Proceedings of the international conference on new interfaces for musical expression (pp. 382–385). Virginia Tech. https://doi.org/10.5281/zenodo.1302633
Privato, N., Magnusson, T., & Einarsson, E. T. (2023). Magnetic interactions as a somatosensory interface. In M. Ortiz & A. Marquez-Borbon (Eds.), Proceedings of the international conference on new interfaces for musical expression (pp. 387–393). https://doi.org/10.5281/zenodo.11189218
Schkolne, S., Pruett, M., & Schröder, P. (2001). Surface drawing: Creating organic 3D shapes with the hand and tangible tools. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 261–268. https://doi.org/10.1145/365024.365114
Shneiderman, B. (2007). Creativity support tools: Accelerating discovery and innovation. Commun. ACM, 50(12), 20–32. https://doi.org/10.1145/1323688.1323689
Sutherland, I. E. (1968). A head-mounted three dimensional display. Proceedings of the December 9-11, 1968, Fall Joint Computer Conference, Part i, 757–764. https://doi.org/10.1145/1476589.1476686
Tanaka, A. (2010). Mapping out instruments, affordances, and mobiles. Proceedings of the International Conference on New Interfaces for Musical Expression, 88–93. https://doi.org/10.5281/zenodo.1177903
Trajkova, M., Deshpande, M., Knowlton, A., Monden, C., Long, D., & Magerko, B. (2023). AI meets holographic pepper’s ghost: A co-creative public dance experience. Companion Publication of the 2023 ACM Designing Interactive Systems Conference, 274–278. https://doi.org/10.1145/3563703.3596658
Trajkova, M., Long, D., Deshpande, M., Knowlton, A., & Magerko, B. (2024). Exploring collaborative movement improvisation towards the design of LuminAI—a co-creative AI dance partner. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems. https://doi.org/10.1145/3613904.3642677
Wang, Y., & Martin, C. P. (2022). Cubing sound: Designing a NIME for head-mounted augmented reality. Proceedings of the international conference on new interfaces for musical expression.
Wessel, D., & Wright, M. (2002). Problems and prospects for intimate musical control of computers. Comput. Music J., 26(3), 11–22. https://doi.org/10.1162/014892602320582945