15th NIME 2015: Baton Rouge, Louisiana, USA
- Edgar Berdahl, Jesse T. Allison: 15th International Conference on New Interfaces for Musical Expression, NIME 2015, Baton Rouge, Louisiana, USA, May 31 - June 3, 2015. nime.org 2015
- Javier Jaimovich, R. Benjamin Knapp: Creating biosignal algorithms for musical applications from an extensive physiological database. 1-4
- Courtney Brown, Sharif Razzaque, Garth Paine: Rawr! a study in sonic skulls: embodied natural history. 5-10
- Eric Sheffield, Michael Gurevich: Distributed mechanical actuation of percussion instruments. 11-15
- Alexander Refsum Jensenius: Microinteraction in music/dance performance. 16-19
- Andreas Bergsland, Robert Wechsler: Composing interactive dance pieces for the motioncomposer, a device for persons with disabilities. 20-23
- Brendan McCloskey, Brian Bridges, Frank Lyons: Accessibility and dimensionality: enhanced real-time creative independence for digital musicians with quadriplegic cerebral palsy. 24-27
- Andrew Mercer-Taylor, Jaan Altosaar: Sonification of fish movement using pitch mesh pairs. 28-29
- Rébecca Kleinberger, Gershon Dublon, Joseph A. Paradiso, Tod Machover: Phox ears: a parabolic, head-mounted, orientable, extrasensory listening device. 30-31
- Edgar Berdahl, Denis Huber: The haptic hand. 32-33
- Tommy Feldt, Sarah Freilich, Shaun Mendonsa, Daniel Molin, Andreas Rau: Puff, puff, play: a sip-and-puff remote control for music playback. 34-35
- Mikkel Helleberg Jørgensen, Aske Sønderby Knudsen, Thomas Michael Wilmot, Kasper Duemose Lund, Stefania Serafin, Hendrik Purwins: A mobile music museum experience for children. 36-37
- Hsin-Ming Lin, Chin-Ming Lin: Harmonic intonation trainer: an open implementation in Pure Data. 38-39
- Antonio Deusany de Carvalho Junior: Indoor localization during installations using WiFi. 40-41
- Dianne Verdonk: Visible excitation methods: energy and expressiveness in electronic music performance. 42-43
- Brennon Bortz, Javier Jaimovich, R. Benjamin Knapp: Emotion in motion: a reimagined framework for biomusical/emotional interaction. 44-49
- Duncan Menzies, Andrew P. McPherson: Highland piping ornament recognition using dynamic time warping. 50-53
- Dario Cazzani: Posture identification of musicians using non-intrusive low-cost resistive pressure sensors. 54-57
- Masami Hirabayashi, Kazuomi Eshima: Sense of space: the audience participation music performance with high-frequency sound ID. 58-60
- Asbjørn Blokkum Flø, Hans Wilmers: Doppelgänger: a solenoid-based large scale sound installation. 61-64
- Sang Won Lee, Georg Essl: Web-based temporal typography for musical expression and performance. 65-69
- Jiffer Harriman: Start 'em young: digital music instrument for education. 70-73
- Arvid Jense, Hans Leeuw: Wambam: a case study in design for an electronic musical instrument for severely intellectually disabled users. 74-77
- Rhushabh Bhandari, Avinash Parnandi, Eva Shipp, Beena Ahmed, Ricardo Gutierrez-Osuna: Music-based respiratory biofeedback in visually-demanding tasks. 78-82
- Warren Enström, Joshua Dennis, Brian Lynch, Kevin Schlei: Musical notation for multi-touch interfaces. 83-86
- Aura Pon, Johnty Wang, Laurie Radford, Sheelagh Carpendale: Womba: a musical instrument for an unborn child. 87-90
- J. Cecilia Wu, Yoo Hsiu Yeh, Romain Michon, Nathan Weitzner, Jonathan S. Abel, Matthew Wright: Tibetan singing prayer wheel: a hybrid musical-spiritual instrument using gestural control. 91-94
- Dan Ringwalt, Roger B. Dannenberg, Andrew Russell: Optical music recognition for interactive score display. 95-98
- Özgür Izmirli: Framework for exploration of performance space. 99-102
- Ricky Graham, Brian Bridges: Managing musical complexity with embodied metaphors. 103-106
- Sair Sinan Kestelli: Motor imagery: what does it offer for new digital musical instruments? 107-110
- Benjamin Knichel, Holger Reckter, Peter Kiefer: Resonate - a social musical installation which integrates tangible multiuser interaction. 111-115
- Florent Berthaut, Diego Martínez Plasencia, Martin Hachet, Sriram Subramanian: Reflets: combining and revealing spaces for musical performances. 116-120
- Abram Hindle: Orchestrating your cloud orchestra. 121-125
- Charles Roberts, Matthew Wright, JoAnn Kuchera-Morin: Beyond editing: extended interaction with textual code fragments. 126-131
- Koray Tahiroglu, Thomas Svedström, Valtteri Wikström: Musical engagement that is predicated on intentional activity of the performer with noisa instruments. 132-135
- Beste F. Yuksel, Daniel Afergan, Evan M. Peck, Garth Griffin, Lane Harrison, Nick W. B. Chen, Remco Chang, Robert J. K. Jacob: Braahms: a novel adaptive musical interface based on users' cognitive state. 136-139
- William Marley, Nicholas Ward: Gestroviser: toward collaborative agency in digital musical instruments. 140-143
- Muhammad Hafiz Wan Rosli, Karl Yerkes, Matthew Wright, Timothy Wood, Hannah Wolfe, Charlie Roberts, Anis Haron, Fernando Rincón Estrada: Ensemble feedback instruments. 144-149
- James Leonard, Claude Cadoz: Physical modelling concepts for a collection of multisensory virtual musical instruments. 150-155
- Jerônimo Barbosa, Joseph Malloch, Marcelo M. Wanderley, Stéphane Huot: What does 'evaluation' mean for the NIME community? 156-161
- Andrew P. McPherson, Victor Zappi: Exposing the scaffolding of digital instruments with hardware-software feedback loops. 162-167
- Si Waite: Reimagining the computer keyboard as a musical interface. 168-169
- Alberto Novello, Antony Raijekoff: A prototype for pitched gestural sonification of surfaces using two contact microphones. 170-173
- Simon Alexander-Adams, Michael Gurevich: A flexible platform for tangible graphic scores. 174-175
- Peter D. Bennett, Jarrod Knibbe, Florent Berthaut, Kirsten Cater: Resonant bits: controlling digital musical instruments with resonance and the ideomotor effect. 176-177
- Jiffer Harriman: Feedback lapsteel: exploring tactile transducers as string actuators. 178-179
- Matthew Blessing, Edgar Berdahl: Textural crossfader. 180-181
- Mikko Myllykoski, Kai Tuuri, Esa Viirret, Jukka Louhivuori: Prototyping hand-based wearable music education technology. 182-183
- Jeff Snyder, Ryan Luke Johns, Charlie Avis, Gene Kogan, Axel Kilian: Machine yearning: an industrial robotic arm as a performance instrument. 184-186
- Jingyin He, Ajay Kapur, Dale A. Carnegie: Developing a physical gesture acquisition system for guqin performance. 187-190
- Natasha Barrett: Creating tangible spatial-musical images from physical performance gestures. 191-194
- Jérôme Villeneuve, Claude Cadoz, Nicolas Castagné: Visual representation in genesis as a tool for physical modeling, sound synthesis and musical composition. 195-200
- Ian Hattwick, Marcelo M. Wanderley: Interactive lighting in the pearl: considerations and implementation. 201-204
- Kazuhiko Yamamoto, Takeo Igarashi: Livo: sing a song with a vowel keyboard. 205-208
- Robin Hayward: The Hayward tuning vine: an interface for just intonation. 209-214
- Kristian Nymoen, Mari Romarheim Haugen, Alexander Refsum Jensenius: Mumyo - evaluating and exploring the Myo armband for musical interaction. 215-218
- Rhys Duindam, Diemo Schwarz, Hans Leeuw: Tingle: a digital music controller re-capturing the acoustic instrument experience. 219-222
- Ivan Franco, Marcelo M. Wanderley: Practical evaluation of synthesis performance on the BeagleBone Black. 223-226
- Andrew Piepenbrink, Matthew Wright: The bistable resonator cymbal: an actuated acoustic instrument displaying physical audio effects. 227-230
- Eric Sheffield, Sile O'Modhrain, Michael Gould, R. Brent Gillespie: The pneumatic practice pad. 231-234
- Stefano Papetti, Sébastien Schiesser, Martin Fröhlich: Multi-point vibrotactile feedback for an expressive musical interface. 235-240
- Richard Graham, John Harding: Septar: audio breakout design for multichannel guitar. 241-244
- Ali Momeni: Caress: an electro-acoustic percussive instrument for caressing sounds. 245-250
- David B. Ramsay, Joseph A. Paradiso: Grouploop: a collaborative, network-enabled audio feedback instrument. 251-254
- Nicolas D'Alessandro, Joëlle Tilmanne, Ambroise Moreau, Antonin Puleo: Airpiano: a multi-touch keyboard with hovering control. 255-258
- Guangyu Xia, Roger B. Dannenberg: Duet interaction: learning musicianship for automatic accompaniment. 259-264
- Jamie Bullock, Ali Momeni: ml.lib: robust, cross-platform, open-source machine learning for Max and Pure Data. 265-270
- Palle Dahlstedt: Mapping strategies and sound engine design for an augmented hybrid piano. 271-276
- Jerônimo Barbosa, Filipe Calegario, João Tragtenberg, Giordano Cabral, Geber L. Ramalho, Marcelo M. Wanderley: Designing DMIs for popular music in the Brazilian Northeast: lessons learned. 277-280
- Tim Shaw, Sébastien Piquemal, John Bowers: Fields: an exploration into the use of mobile devices as a medium for sound diffusion. 281-284
- Basheer Tome, Donald Derek Haddad, Tod Machover, Joseph A. Paradiso: Mmodm: massively multiplayer online drum machine. 285-288
- Timothy J. Barraclough, Dale A. Carnegie, Ajay Kapur: Musical instrument design process for mobile technology. 289-292
- Zeyu Jin, Reid Oda, Adam Finkelstein, Rebecca Fiebrink: Mallo: a distributed synchronized musical instrument designed for internet performance. 293-298
- Lauren Hayes: Enacting musical worlds: common approaches to using NIMEs within both performance and person-centred arts practices. 299-302
- Steve Benford, Adrian Hazzard, Alan Chamberlain, Liming Xu: Augmenting a guitar with its digital footprint. 303-306
- Adnan Marquez-Borbon, Paul Stapleton: Fourteen years of NIME: the value and meaning of 'community' in interactive music research. 307-312
- Mohammad Akbari, Howard Cheng: Clavision: visual automatic piano music transcription. 313-314
- Roger B. Dannenberg, Andrew Russell: Arrangements: flexibly adapting music data for live performance. 315-316
- Palle Dahlstedt, Per Anders Nilsson, Gino Robair: The bucket system - a computer mediated signalling system for group improvisation. 317-318
- Nuno N. Correia, Atau Tanaka: Prototyping audiovisual performance tools: a hackathon approach. 319-321
- Chris Korda: Chordease: a MIDI remapper for intuitive performance of non-modal music. 322-324
- Ethan Benjamin, Jaan Altosaar: Musicmapper: interactive 2D representations of music samples for in-browser remixing and exploration. 325-326
- Jeff Gregorio, David S. Rosen, Michael N. Caro, Youngmoo E. Kim: Descriptors for perception of quality in jazz piano improvisation. 327-328
- Robert Van Rooyen, Andrew Schloss, George Tzanetakis: Snare drum motion capture dataset. 329-330
- Jiffer Harriman: Pd poems and teaching tools. 331-334
- Adnan Marquez-Borbon: But does it float? reflections on a sound art ecological intervention. 335-338
- Robert Van Rooyen, George Tzanetakis: Pragmatic drum motion capture system. 339-342
- Steven Gelineck, Dannie Korsgaard, Morten Büchert: Stage- vs. channel-strip metaphor - comparing performance when adjusting volume and panning of a single channel in a stereo mix. 343-346
- Jan C. Schacher, Chikashi Miyama, Daniel Bisig: Gestural electronic music using machine learning as generative device. 347-350
- Simon Waloschek, Aristotelis Hadjakos: Sensors on stage: conquering the requirements of artistic experiments and live performances. 351-354
- Andrés Cabrera: Serverless and peer-to-peer distributed interfaces for musical control. 355-358
- Charles Martin, Henry J. Gardner, Ben Swift: Tracking ensemble performance on touch-screens with gesture classification and transition matrices. 359-364
- Hans Anderson, Kin Wah Edward Lin, Natalie Agus, Simon Lui: Major thirds: a better way to tune your iPad. 365-368
- Qi Yang, Georg Essl: Representation-plurality in multi-touch mobile visual programming for music. 369-373
- Simon Lui: Generate expressive music from picture with a handmade multi-touch music table. 374-377
- Adrian Hazzard, Steve Benford, Alan Chamberlain, Chris Greenhalgh: Considering musical structure in location-based experiences. 378-381
- Florent Berthaut, David Coyle, James W. Moore, Hannah Limerick: Liveness through the lens of agency and causality. 382-386
- Ajit Nath, Samson Young: Vesball: a ball-shaped instrument for music therapy. 387-391
- Thomas Resch: Rwa - a game engine for real world audio games. 392-395
- Romain Michon, Julius O. Smith III, Yann Orlarey: Mobilefaust: a set of tools to make musical mobile applications with the Faust programming language. 396-399
- Michael Krzyzaniak, Garth Paine: Realtime classification of hand-drum strokes. 400-403
- Jason Long, Jim W. Murphy, Ajay Kapur, Dale A. Carnegie: A methodology for evaluating robotic striking mechanisms for musical contexts. 404-407
- Troy Rogers, Steven T. Kemper, Scott Barton: Marie: monochord-aerophone robotic instrument ensemble. 408-411
