Dr Charles Patrick Martin
Areas of expertise
- Computer-Human Interaction 080602
- Performing Arts and Creative Writing 1904
- Neural, Evolutionary and Fuzzy Computation 080108
- Music Performance 190407
Research interests
- new interfaces for musical performance
- computational creativity
- smartphone/tablet musical instruments
- collaborative performance
- co-creative interfaces
- improvisation
- percussive approaches to computer music
Biography
Charles Martin is an Australian specialist in percussion, music technology, and musical AI. He links percussion with electroacoustic music and other media through new technologies. He is the author of the musical iPad app PhaseRings, and founded the touchscreen ensemble Ensemble Metatone, the percussion group Ensemble Evolution, and the cross-artform group Last Man to Die. Charles' doctoral research involved developing intelligent agents that mediate ensemble performance.
From 2016 to 2019, Charles was a postdoctoral fellow at the University of Oslo in the Engineering Predictability with Embodied Cognition (EPEC) project and the RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion, where he developed new ways to predict musical intentions and performances in smartphone apps and on embedded devices.
In 2019, Charles returned to the Australian National University as a lecturer in computer science.
Researcher's projects
- MicroJam (2016-) Exploring Tiny Performances and Prediction with Smartphones at the University of Oslo.
- Andromeda is Coming (2015-) Improvised music and media duo with Alec Hunter.
- Metatone (2012-2016) Research project extending ensemble improvisation with new music-making iPad apps, gesture recognition and machine learning.
- Sticks and Tones (2012-) Mallet percussion duo performing music from the ragtime era, classic films and video games!
- Nordlig Vinter (2011-2014) A suite of compositions for percussion and iOS devices created by Charles Martin while living in Piteå, near the Arctic Circle in northern Sweden.
- Ensemble Evolution (2010-2013) An international ensemble exploring the future of percussion through composition, education and technology.
- Last Man to Die (2008-2010) Cross-artform group connecting acting, drawing, and percussion through technology in installation/performances.
- Strike on Stage (2009-2010) Percussion and multimedia performance using computer vision and augmented reality.
Available student projects
- developing predictive musical instruments
- machine learning of musical style
- musical AI
- computer support for collaborative musical expression
- new interfaces for musical expression (NIME)
- applying ML/AI in creative practices
Current student projects
I supervise and co-supervise PhD and Master's students at the ANU and at the University of Oslo.
PhD Students (co-supervisor):
- Tønnes Nygaard (http://robotikk.net/) is studying evolutionary robotics at the University of Oslo, Department of Informatics. He has created an open-source quadruped robotics platform with mechanically extensible legs, allowing control systems and robot morphology to be evolved simultaneously during real-world operation.
- Benedikte Wallace (https://www.hf.uio.no/ritmo/english/people/phd-fellows/benediwa/) is studying machine learning models of sound-related movement and dance, among other creative applications of artificial intelligence, at the RITMO Centre of Excellence, University of Oslo.
Master's projects
- Physical Intelligent Instrument using Neural Networks (Torgrim Næss) 2018-2019
Past student projects
Master's Student Theses:
- Viktoria Røsjø: Variational Autoencoders with Mixture Density Networks for Sequence Prediction in Algorithmic Composition - A Musical World Model
- Benedikte Wallace: Predictive songwriting with concatenative accompaniment
- Henrik Brustad: Digital Audio Generation with Neural Networks
Graduate Student Collaborations:
- Mathias Ciarlo Thorstensen: Visualization of Robotic Sensor Data with Augmented Reality
- Preben Ødegård Aas: Getting a Grip on Musical Interaction - An exploratory study of embodied sound design through a grid-based system
Publications
- Nygaard, T, Martin, C, Torresen, J et al. 2021, 'Real-world embodied AI through a morphologically adaptive quadruped robot', Nature Machine Intelligence, vol. 3, pp. 410-419.
- Nygaard, T, Martin, C, Howard, D et al. 2021, 'Environmental adaptation of robot morphology and control through real-world evolution', Evolutionary Computation, vol. 29, no. 4, pp. 441-461.
- Wallace, B, Martin, C, Torresen, J et al. 2021, 'Exploring the Effect of Sampling Strategy on Movement Generation with Generative Neural Networks', 10th International Conference on Artificial Intelligence in Music, Sound, Art and Design, ed. Juan Romero, Tiago Martins, Nereida Rodríguez-Fernández, Springer Nature Switzerland AG, Switzerland, pp. 344-359.
- Wallace, B, Martin, C, Torresen, J et al. 2021, 'Learning Embodied Sound-Motion Mappings: Evaluating AI-Generated Dance Improvisation', C&C 21: Creativity and Cognition, Association for Computing Machinery (ACM), New York, NY, United States.
- McArthur, R & Martin, C 2021, 'An Application for Evolutionary Music Composition Using Autoencoders', 10th International Conference on Artificial Intelligence in Music, Sound, Art and Design, ed. Juan Romero, Tiago Martins, Nereida Rodriguez-Fernandez, Springer Nature Switzerland AG, Switzerland, pp. 443-458.
- Jolly, M, Hunter, A, Martin, C et al. 2020, Magic lantern performance, Suburban Apparitions, for ACT Historic Places at Calthorpes House.
- Martin, C, Glette, K, Nygaard, T et al. 2020, 'Understanding Musical Predictions With an Embodied Interface for Musical Machine Learning', Frontiers in Artificial Intelligence, vol. 3, no. 6, pp. 1-14.
- Erdem, C, Lan, Q, Fuhrer, J et al. 2020, 'Towards playing in the 'Air': Modeling motion-sound energy relationships in electric guitar performance using deep neural networks', 17th Sound and Music Computing Conference, SMC 2020, ed. Simone Spagnol and Andrea Valle, SMC Publishing Inc, Italy, pp. 177-184.
- Wallace, B, Martin, C, Torresen, J et al. 2020, 'Towards Movement Generation with Audio Features', 11th International Conference on Computational Creativity, ed. F. Amílcar Cardoso, Penousal Machado, Tony Veale and João Miguel Cunha, Association for Computational Creativity, Coimbra, Portugal, pp. 284-287.
- Proctor, R & Martin, C 2020, 'A Laptop Ensemble Performance System using Recurrent Neural Networks', International Conference on New Interfaces for Musical Expression, ed. Romain Michon and Franziska Schroeder, New Interfaces for Musical Expression, Birmingham, UK, pp. 43-48.
- Martin, C, Liu, Z, Wang, Y et al. 2020, 'Sonic Sculpture: Activating Engagement with Head-Mounted Augmented Reality', International Conference on New Interfaces for Musical Expression, ed. Romain Michon and Franziska Schroeder, New Interfaces for Musical Expression, Birmingham, UK, pp. 39-42.
- Martin, C & Torresen, J 2020, 'Data-driven analysis of tiny touchscreen performance with MicroJam', Computer Music Journal, vol. 43, no. 4, pp. 41-57.
- Li, S & Martin, C 2020, 'Comparing Three Data Representations for Music with a Sequence-to-Sequence Model', 33rd Australasian Joint Conference on Artificial Intelligence, ed. Marcus Gallagher, Nour Moustafa, Erandi Lakshika, Springer Nature Switzerland AG, Switzerland, pp. 16-28.
- Hopgood, C, Martin, C & Grétarsson, G 2019, 'Synesthetic: Composing works for Marimba and Automated Lighting', Australasian Computer Music Conference, Australasian Computer Music Association, Melbourne, Australia, pp. 23-27.
- Swift, B, Martin, C & Hunter, A 2019, 'Two Perspectives on Rebooting Computer Music Education: Composition and Computer Science', Seeing the inaudible, hearing the invisible, Australasian Computer Music Association, Australia, pp. 53-57.
- Martin, C 2017, 'Percussionist-Centred Design for Touchscreen Digital Musical Instruments', Contemporary Music Review, vol. 36, no. 1-2, pp. 64-85.
- Martin, C & Gardner, H 2016, 'Can Machine-Learning Apply to Musical Ensembles?', Human Centred Machine Learning at CHI 2016, Association for Computing Machinery (ACM), USA, pp. 1-5.
- Martin, C, Gardner, H, Swift, B et al. 2016, 'Intelligent Agents and Networked Buttons Improve Free-Improvised Ensemble Music-Making on Touch-Screens', 2016 CHI Conference on Human Factors in Computing Systems, Association for Computing Machinery (ACM), USA, pp. 2295-2306.
- Martin, C & Gardner, H 2015, 'That Syncing Feeling: Networked Strategies for Enabling Ensemble Creativity in iPad Musicians', CreateWorld 2015, Apple University Consortium, Tasmania, pp. 1-7.
- Martin, C, Gardner, H, Swift, B et al. 2015, 'Music of 18 Performances: Evaluating Apps and Agents with Free Improvisation', Australasian Computer Music Conference 2015 ACMC2015-MAKE!, Australasian Computer Music Association, Australia, pp. 85-94.
- Martin, C 2014, 'Making Improvised Music for iPad and Percussion with Ensemble Metatone', Australasian Computer Music Conference 2014, ed. Timothy Opie, Australasian Computer Music Association, Fitzroy, pp. 115-118.
- Martin, C, Gardner, H & Swift, B 2014, 'MetaTravels and MetaLonsdale: iPad apps for percussive improvisation', 32nd Annual ACM Conference on Human Factors in Computing Systems, CHI 2014, Association for Computing Machinery (ACM), Toronto Canada, pp. 547-550.
- Martin, C, Gardner, H & Swift, B 2014, 'Exploring percussive gesture on iPads with ensemble metatone', 32nd Annual ACM Conference on Human Factors in Computing Systems, CHI 2014, Association for Computing Machinery (ACM), Toronto Canada, pp. 1025-1028.
- Martin, C 2013, 'Integrating mobile music with percussion performance practice', 39th International Computer Music Conference, ICMC 2013, Michigan Publishing, University of Michigan Library, Perth, WA, pp. 437-440.
Projects and Grants
- Next Gen AI Grad Program Round 2 - Human-AI Interaction in the Metaverse (Secondary Investigator)
- AI for Decision Making: Open-Form Music Composition for Synchronised and Coordinated Action (Primary Investigator)
- The Augmented Web (Secondary Investigator)