ISIM 2015

Here is a zip file that contains PDFs of the slides from my presentation at ISIM 2015, along with the Max patches for the instruments I demonstrated during the presentation. The instruments have no documentation, so feel free to contact me at jvalbert@loyno.edu if you need more information.

Complete package of ISIM 2015 files here.

Stayin’ Alive: Preserving Electroacoustic Music | NewMusicBox

“What on earth is going to happen to compositions that are painstakingly crafted for effective live performance at the time of their creation, but which become increasingly difficult to mount live, simply due to the march of time?”

It is a question we all must face when we make piece-specific software.

Fall 2013 Conference Presentations

I have a busy fall, with four presentations dealing with aspects of my Interactive Musical Partner (IMP) research.

On Friday, September 6, I’ll present Valued Features of Improvised Musical Interactions (or What I learned from my computerized improvisation partner) at the Guelph Jazz Festival Colloquium in Guelph, Ontario, Canada.

On Monday, October 14, I’ll give a demonstration entitled Interactive Musical Partner: A Demonstration of Musical Personality Settings for Influencing the Behavior of an Interactive Musical Generation System at the 2nd International Workshop on Musical Metacreation (MUME), which is part of the Ninth Annual Conference on Artificial Intelligence and Interactive Digital Entertainment (AIIDE), at Northeastern University in Boston, Massachusetts. The proceedings of MUME and AIIDE will be published by the Association for the Advancement of Artificial Intelligence (AAAI).

On Thursday, October 17, I will present a poster/paper entitled Interactive Musical Partner: A Modular Human/Computer Duo Improvisation System at the 10th International Symposium on Computer Music Multidisciplinary Research (CMMR), at the Laboratoire de Mécanique et d’Acoustique in Marseille, France. The CMMR proceedings will be published by Springer-Verlag in their Lecture Notes in Computer Science series.

Finally, on November 7-9, I will present a performance demonstration entitled Interactive Musical Partner: A Look at the Components of a Human/Computer Duo Improvisation System at the Electroacoustic Barn Dance 2013, held at the University of Mary Washington in Fredericksburg, VA.

Symposium on Laptop Ensembles & Orchestras

SLEO is taking place at LSU, April 15-17, 2012. From the SLEO website:

an international workshop on music performance using mobile devices and laptop computers. The symposium, April 15-17, 2012, will offer workshops for those wishing to learn about mobile and laptop ensembles, as well as peer-reviewed papers and panels to discuss the state-of-the-art and best practices of this exciting new genre of music performance and music technology.

I am chairing a panel discussion titled “Play Something Crazy…Now: Improvisation as a Tool in Composing for Laptop Orchestras.” It will be Tuesday at 1 pm, and the panelists are Paula Matthusen (Wesleyan University), Christopher Burns (UW-Milwaukee), Scott Hewitt (University of Huddersfield), and J. Corey Knoll (LSU). We will discuss improvisation as both a means and an end in composing for LEOs, with some attention given to composing for ensembles of unknown instrumentation.

Thoughts on computer based instrument paradigms

Note: This was originally posted on Scratch My Brain, but seemed appropriate to this space, so I have posted it here as well.

Over the past couple of years, I have been thinking about computer music instrument design, or how to turn my laptop into a musical instrument. Much of this is due to my participation in the Laptop Orchestra of Louisiana (the LOLs). Writing a piece for the LOLs often involves designing an instrument, and in my thinking on the subject I have been sorting these instruments into two broad categories. Direct control instruments map an action of the performer directly to a sound from the instrument: pull the trigger and sound comes out, push the joystick forward and the pitch changes, and so on. Code/process controlled instruments, by contrast, produce sound through a process that the performer merely launches (or possibly live codes); the performer does not control individual musical events once the process is set in motion.
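To make the distinction concrete, here is a minimal sketch of the two paradigms in Python (not one of my actual Max patches); play_pitch is a hypothetical stand-in for whatever actually produces sound.

```python
import random
import time

def play_pitch(midi_pitch: int) -> None:
    # Hypothetical stand-in for real sound output (e.g., a message to a synth).
    print(f"note on: {midi_pitch}")

# Direct control: every sonic event is a performer action.
def direct_control(joystick_y: float) -> None:
    """Map a joystick position (0.0-1.0) directly to one pitch and play it."""
    midi_pitch = int(48 + joystick_y * 36)  # 0.0 -> C3, 1.0 -> C6
    play_pitch(midi_pitch)

# Process control: the performer's only action is starting the process;
# the algorithm, not the performer, chooses the individual events.
def process_control(duration_s: float = 10.0) -> None:
    """Run an unattended random-walk melody for duration_s seconds."""
    pitch = 60
    end_time = time.time() + duration_s
    while time.time() < end_time:
        pitch = max(36, min(84, pitch + random.choice([-2, -1, 1, 2])))
        play_pitch(pitch)
        time.sleep(random.uniform(0.1, 0.4))
```

In the first case the performer is responsible for every note; in the second, once the process is launched, the performer is a listener like everyone else.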

I have tended toward direct control instruments in my own work, largely due to my trombone-player DNA: I am used to playing an acoustic instrument (direct control), and much of my performance worldview has been formed by that experience. One of the difficulties with designing new direct control instruments is that it often takes a significant amount of time to learn to play them well. Like any instrument, one must spend time with it to develop technique or a sense of musical connection.

On the other hand, process controlled instruments allow for the creation of highly complex musical expressions with little or no time spent learning technique, but they lack the intimacy of control, especially in terms of timing, that one gets from direct control.

Tonight I was reading a 1991 article by David Wessel called “Improvisation with Highly Interactive Real-Time Performance Systems.” In it, he describes what seems to be a direct process control system: he launches processes (I use the term process to be consistent with my categories; I don’t know that he would use that word) from a direct control instrument. This returns low-level timing control to the performer while still letting the performer take advantage of what the computer processes have to offer. He also talks about mapping expressive gestures to entire phrases rather than to single notes.
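Here is a hypothetical sketch of how I read that hybrid: a single performer gesture supplies the timing and an expressive parameter, and the computer fills in the notes at the phrase level. The names (on_gesture, energy) are mine, not Wessel’s, and this illustrates the idea rather than his actual system.

```python
import random

def play_pitch(midi_pitch: int) -> None:
    # Hypothetical stand-in for real sound output.
    print(f"note on: {midi_pitch}")

def generate_phrase(root: int, energy: float) -> list[int]:
    """Build a short phrase; energy (0.0-1.0) widens the interval range."""
    span = int(2 + energy * 10)
    phrase = [root]
    for _ in range(random.randint(3, 7)):
        next_pitch = phrase[-1] + random.randint(-span, span)
        phrase.append(max(36, min(96, next_pitch)))  # clamp to a playable range
    return phrase

def on_gesture(root: int, energy: float) -> None:
    """Called at the moment of a performer gesture (e.g., a key strike):
    the gesture's timing becomes the phrase's timing, its weight its contour."""
    for pitch in generate_phrase(root, energy):
        play_pitch(pitch)

# Example: a soft gesture on middle C, then a harder one a fifth higher.
on_gesture(60, 0.2)
on_gesture(67, 0.9)
```

The performer decides when each phrase begins and how it is shaped; the computer fills in the rest.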

These ideas have started some wheels turning about my next computer instrument.

I love it when I discover that someone solved my current dilemma twenty years ago. That’s why we should always be attentive in history class.