Tom Mitchell - music/code/digital arts

Projects

Outside Interactions project image: a hand with a wearable playing a strange instrument. A dancer and a person lying on the floor, one wearing a VR headset. Drawing of the Bridge building.
mi.mu gloves image
More soon

Publications

Child, L., Mitchell, T., & Ford, N. (2023) A systematic review of reverberation and accessibility for B/blind users in virtual environments, In: Proceedings of AES 2023 International Conference on Spatial and Immersive Audio
Simmons, J., Bown, A., Bremner, P., McIntosh, V., and Mitchell, T. J. (2023) Hear here: Sonification as a design strategy for robot teleoperation using virtual reality, In: Proceedings of VAM-HRI Virtual, Augmented, and Mixed-Reality for Human-Robot Interactions at HRI
Aynsley, H., Mitchell, T. J., and Meckin, D. (2023) Participatory conceptual design of accessible digital musical instruments using generative AI, In: Proceedings of New Interfaces for Musical Expression
Bremner, P., Mitchell, T. J., and McIntosh, V. (2022) The impact of data sonification in virtual reality robot teleoperation, In: Frontiers in Virtual Reality, 3, Article 904820
Renney, N., Renney, H., Mitchell, T. J., & Gaster, B. R. (2022) Studying how digital luthiers choose their tools, In: Proceedings of the CHI Conference on Human Factors in Computing Systems
Renney, H., Willemsen, S., Gaster, B. R., and Mitchell, T. J. (2022) HyperModels - A framework for GPU accelerated physical modelling sound synthesis, In: Proceedings of New Interfaces for Musical Expression
Renney, H., Gaster, B., and Mitchell, T. J. (2022) Survival of the synthesis—GPU accelerating evolutionary sound matching, In: Concurrency and Computation: Practice and Experience 34(10), Article e6824
Bolarinwa, J., Eimontaite, I., Mitchell, T., Dogramadzi, S., and Caleb-Solly, P. (2021) Assessing the role of gaze tracking in optimizing humans-in-the-loop telerobotic operation using multimodal feedback, In: Frontiers in Robotics and AI 8, Article 578596
Brown, D., Nash, C., and Mitchell, T. J. (2020) Was that me?: Exploring the effects of error in gestural digital musical instruments, In: Proceedings of the 15th International Conference on Audio Mostly
Mitchell, T. J., Jones, A. J., O’Connor, M. B., Wonnacott, M. D., Glowacki, D. R., and Hyde, J. (2020) Towards molecular musical instruments: Interactive sonifications of 17-alanine, graphene and carbon nanotubes, In: Proceedings of the 15th International Conference on Audio Mostly
Renney, H., Gaster, B. R., and Mitchell, T. J. (2020) There and back again: The practicality of GPU accelerated digital audio, In: Proceedings of the 20th International Conference on New Interfaces for Musical Expression
Hunt, S. J., Mitchell, T. J. and Nash, C. P. (2020) Composing computer generated music, an observational study using IGME: the Interactive Generative Music Environment, In: Proceedings of NIME 2020
Hunt, S. J., Mitchell, T. J. and Nash, C. P. (2019) Automating algorithmic representations of musical structure using IGME: The Interactive Generative Music Environment, In: Innovation In Music 2019
Bolarinwa, J., Eimontaite, I., Dogramadzi, S., Mitchell, T., & Caleb-Solly, P. (2019) The use of different feedback modalities and verbal collaboration in tele-robotic assistance, In: Proceedings of the IEEE International Symposium on Robotic and Sensors Environments (ROSE)
Mitchell, T. J., Thom, J., Pountney, M., Hyde, J. (2019) The alchemy of chaos: A sound art sonification of a year of Tourette’s episodes, In: Proceedings of the International Conference on Auditory Display
O'Connor, M. B., Bennie, S. J., Deeks, H. M., Jamieson-Binnie, A., Jones, A. J., Shannon, R. J., Walters, R., Mitchell, T. J., Mulholland, A. J., and Glowacki, D. R. (2019) Interactive molecular dynamics in virtual reality from quantum chemistry to drug binding: An open-source multi-person framework, In: Journal of Chemical Physics, 150(22), ISSN: 1089-7690
Renney, H., Gaster, B. R., & Mitchell, T. (2019) OpenCL vs: Accelerated finite-difference digital synthesis, In: Proceedings of the International Workshop on OpenCL, DOI: 10.1145/3318170.3318172
Brown, D., Nash, C. and Mitchell, T. (2018) Simple mappings, expressive movement: A qualitative investigation into the end-user mapping design of experienced mid-air musicians, Digital Creativity, 29:2-3, ISSN: 1462-6268, DOI: 10.1080/14626268.2018.1510841
Gaster, B. R., Renney, N. and Mitchell, T. (2018) Outside the block syndicate: Translating Faust's algebra of blocks to the arrows framework, In: International Faust Conference
Brown, D., Nash, C. and Mitchell, T. (2018) Understanding user-defined mapping design in mid-air musical performance, In: International Conference on Movement Computing, DOI: 10.1145/3212721.3212810
Arbon, R. E., Jones, A. J., Bratholm, L. A., Mitchell, T. and Glowacki, D. R. (2018) Sonifying stochastic walks on biomolecular energy landscapes, In: International Conference On Auditory Display
Hunt, S., Mitchell, T. and Nash, C. (2018) A cognitive dimensions approach for the design of an interactive generative score editor, In: International Conference on Technologies for Music Notation and Representation
Renney, N., Gaster, B. and Mitchell, T. (2018) Return to temperament (In digital systems), In: Audio Mostly
Hunt, S., Mitchell, T. and Nash, C. (2017) Thoughts on interactive generative music composition, In: Computer Simulation of Musical Creativity
Brown, D., Nash, C. and Mitchell, T. (2017) A user experience review of music interaction evaluations, In: International Conference on New Interfaces for Musical Expression, DOI: 10.5281/zenodo.1176286
Hunt, S., Mitchell, T. and Nash, C. (2017) How can music visualisation techniques reveal different perspectives on musical structure?, In: International Conference on Technologies for Music Notation and Representation
van den Berg, C., Heap, I., Stark, A. and Mitchell, T. (2017) Expressive gestural personality, In: Push Turn Move: Interface Design in Electronic Music, ISBN: 978-87-9999995-0-7
Brown, D., Nash, C. and Mitchell, T. (2016) GestureChords: Transparency in gesturally controlled digital musical instruments through iconicity and conceptual metaphor, In: International Conference on Sound and Music Computing, DOI: 10.5281/zenodo.851193
Brown, D., Renney, N., Stark, A., Nash, C. and Mitchell, T. (2016) Leimu: Gloveless music interaction using a wrist mounted leap motion, In: International Conference On New Interfaces for Musical Expression, DOI: 10.5281/zenodo.1176000
Mitchell, T., Bennett, P., Tew, P., Davies, E. and Madgwick, S. (2016) Tangible interfaces for interactive evolutionary computation, In: ACM Conference on Human Factors in Computing Systems, Extended Abstracts, DOI: 10.1145/2851581.2892405
Davies, E., Tew, P., Glowacki, D., Smith, J. and Mitchell, T. (2016) Evolving atomic aesthetics and dynamics, In: International Conference on Evolutionary and Biologically Inspired Music, Sound, Art and Design (Nominated for best paper)
Mitchell, T., Hyde, J., Tew, P. and Glowacki, D. (2016) danceroom Spectroscopy: At the frontiers of physics, performance, interactive art and technology, In: Leonardo, 49 (2), DOI: 10.1162/LEON_a_00924 (Cover article)
Madgwick, S. O. H., Mitchell, T. J., Barreto, C. and Freed, A. (2015) Simple synchronisation for open sound control, In: International Computer Music Conference (Nominated for best paper)
Hyde, J. I., Mitchell, T. J., Tew, P. and Glowacki, D. R. (2014) Molecular music: Repurposing a mixed quantum-classical atomic dynamics model as an audiovisual instrument, In: Generative Art Conference
Rutter, E. K., Nash, C. and Mitchell, T. J. (2014) Turnector: Tangible control widgets for capacitive touchscreen devices, In: International Computer Music Conference (ICMC) and the 11th Sound & Music Computing Conference
Mitchell, T. J., Madgwick, S., Rankine, S., Hilton, G., Freed, A. and Nix, A. (2014) Making the most of Wi-Fi: Optimisations for robust wireless live music performance, In: International Conference on New Interfaces for Musical Expression, DOI: 10.5281/zenodo.1178875
Place, A., Lacey, L. and Mitchell, T. (2014) AlphaSphere: From prototype to product, In: International Conference on New Interfaces for Musical Expression, DOI: 10.5281/zenodo.1178903
Serafin, S., Trento, S., Grani, F., Perner-Wilson, H., Madgwick, S. and Mitchell, T. J. (2014) Controlling physically based virtual musical instruments using the gloves, In: International Conference on New Interfaces for Musical Expression, DOI: 10.5281/zenodo.1178937
Glowacki, D. R., O'Connor, M., Calabro, G., Price, J., Tew, P., Mitchell, T., Hyde, J., Tew, D., Coughtrie, D. J. and McIntosh-Smith, S. (2014) A GPU-accelerated immersive audiovisual framework for interaction with molecular dynamics using consumer depth sensors, Faraday Discussions 169, ISSN 1359-6640, DOI: 10.1039/C4FD00008K
Glowacki, D., Tew, P., Hyde, J., Kriefman, L., Mitchell, T., Price, J. and McIntosh-Smith, S. (2013) Using human energy fields to sculpt real-time molecular dynamics. Molecular Aesthetics. MIT Press, ISBN 9780262018784
Madgwick, S. and Mitchell, T. J. (2013) x-OSC: A versatile wireless I/O device for creative/music applications, In: International Conference on Sound and Music Computing, DOI: 10.5281/zenodo.850439
Place, A., Lacey, L. and Mitchell, T. (2013) AlphaSphere, In: International Conference on New Interfaces for Musical Expression, DOI: 10.5281/zenodo.1178642
Glowacki, D., Tew, P., Mitchell, T. J., Price, J. and McIntosh-Smith, S. (2012) danceroom Spectroscopy: Interactive quantum molecular dynamics accelerated on GPU architectures using OpenCL, In: UK Many-Core Developer Conference
Mitchell, T. J. (2012) Automated evolutionary synthesis matching: Advanced evolutionary algorithms for difficult sound matching problems, Soft Computing 16 (12), DOI: 10.1007/s00500-012-0873-x
Mitchell, T. J., Madgwick, S. and Heap, I. (2012) Musical interaction with hand posture and orientation: A toolbox of gestural control mechanisms, In: International Conference on New Interfaces for Musical Expression, DOI: 10.5281/zenodo.1178111
Mitchell, T. J. and Heap, I. (2011) SoundGrasp: A gestural interface for the performance of live music, In: International Conference on New Interfaces for Musical Expression
Mitchell, T. J. (2010) An exploration of evolutionary computation applied to frequency modulation audio synthesis parameter optimisation. PhD thesis, University of the West of England
Mitchell, T. J., Creasey, D. P. (2007) Evolutionary sound matching: A test methodology and comparative study, In: Proceedings of the International Conference on Machine Learning and Applications, DOI: 10.1109/ICMLA.2007.34
Mitchell, T. J., Pipe, A. G. (2006) A comparison of evolution-strategy based methods for frequency modulated musical tone timbre matching. In: Proceedings of the International Conference on Adaptive Computing in Design and Manufacture. Institute for People-centred Computation (IP-CC), ISBN 978-0955288500
Mitchell, T. J., Sullivan, J. C. W. (2005) Frequency modulation tone matching using a fuzzy clustering evolution strategy. In: Proceedings of the Audio Engineering Society 118th Convention. AES
Mitchell, T. J., Pipe, A. G. (2005) Convergence synthesis of dynamic frequency modulation tones using an evolution strategy. In: Applications of Evolutionary Computing, EvoMUSART 2005, Lecture Notes in Computer Science, vol 3449. DOI: 10.1007/978-3-540-32003-6_54

About

Hi, I'm Tom (he/him) and I am a Professor of Audio and Music Interaction in the Department of Computing and Creative Technologies at UWE Bristol. I am also a UKRI Future Leaders Fellow working on the project “Sensing Music Interactions from the Outside-In”. For more information, see the project website: www.micalab.org.

I lead the Creative Technologies Laboratory and co-lead the Bridge, a £3M creative technology facility at UWE funded by the AHRC and WECA. I am also an associate member of the Computer Science Research Centre and the Bristol Robotics Laboratory, and a resident at the Watershed's amazing Pervasive Media Studio.

I am a co-founder of mi.mu gloves, a technology company that enables musicians to compose and perform through movement and gesture. The project began as a research collaboration between the inimitable Imogen Heap and myself back in 2011.

I am also an experienced software developer and C++ programmer, and I am very familiar with the JUCE library. Over the years I have contributed to a range of interactive media and audio projects for Tracktion, Spitfire Audio, May Productions, Interactive Scientific, x-io Technologies, mi.mu gloves, and Phona, to name a few. I have also worked on a range of interdisciplinary research projects combining art and science, including Soma, danceroom Spectroscopy and Transmission.

Email me

CVs (LaTeX files)