Hrishik Mishra
Musical performances, especially in electronic music, have grown more interactive and immersive over the years. Using standard protocols like MIDI (Musical Instrument Digital Interface), musicians can manipulate, improvise, record and process their performances, because every event, such as a note on, note off, knob rotation or key velocity, is a digital value in a computer system. Most synthesizers, guitar processors, computers and other music equipment today include a MIDI interface to provide accurate clocking and control options. For example, a standard MIDI fader can be used to control the volume output of a musical instrument.
The objective of this project was to develop an application that allows a musician to employ sensors, a smart UI (user interface) and other programmable controllers to control parameters such as cut-off frequency and resonance using smart devices like tablets, mobile phones and phablets. The concept can also be extended to smart wearable devices, allowing the musician's motion signals to affect a performance. At the same time, products like Google Glass can be used for real-time information display.
In short, the project is primarily intended to give a musician effective tools on a smart device to communicate with music equipment and make tasks easier and more involving. Additionally, having such a facility on a generic smart device makes it an economical option.
A similar, though not identical, project was demonstrated by Peter Brinkmann of Google at Droidcon Berlin. The video is posted at the following link:
http://de.droidcon.com/2013/sessnio/music-mobile-devices-midi-and-libpd
MIDI is a standard comprising a data format, cables and connectors that allows musical instruments to communicate information for control and processing. For example, hitting a key on the keyboard of a synthesizer triggers a NOTE ON event carrying the note number and the velocity with which the key was struck. Similarly, a control such as a fader or a knob can transmit MIDI data to change patches, increase volume and so on. Together, these make up the MIDI IN/OUT capability of a standard MIDI interface.
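As an illustration of the data format, the sketch below builds the raw bytes of a few common MIDI messages. The channel, note and controller numbers are only examples; controller 7 is the conventional channel-volume controller, as in the fader example above.

```java
// Raw MIDI messages are short byte sequences: a status byte followed by data bytes.
// Channel, note and controller numbers here are illustrative, not fixed by the project.
public final class MidiMessages {

    // Note On: status 0x90 | channel, then note number and velocity (0-127 each).
    public static byte[] noteOn(int channel, int note, int velocity) {
        return new byte[] { (byte) (0x90 | (channel & 0x0F)), (byte) (note & 0x7F), (byte) (velocity & 0x7F) };
    }

    // Note Off: status 0x80 | channel, then note number and release velocity.
    public static byte[] noteOff(int channel, int note, int velocity) {
        return new byte[] { (byte) (0x80 | (channel & 0x0F)), (byte) (note & 0x7F), (byte) (velocity & 0x7F) };
    }

    // Control Change: status 0xB0 | channel, then controller number and value.
    // Controller 7 is the standard channel-volume controller.
    public static byte[] controlChange(int channel, int controller, int value) {
        return new byte[] { (byte) (0xB0 | (channel & 0x0F)), (byte) (controller & 0x7F), (byte) (value & 0x7F) };
    }
}
```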
The project started from the ground up with the development of a MIDI-over-USB layer on the Android platform, which allows a standard Android-based smart device to receive MIDI messages as input and transmit them as output.
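A minimal sketch of what such a transport layer might look like is shown below, assuming the connected instrument enumerates as a USB-MIDI class device with a bulk OUT endpoint. Device discovery, permission handling and the class name `MidiUsbSender` are assumptions for illustration, not the project's actual implementation.

```java
import android.hardware.usb.UsbConstants;
import android.hardware.usb.UsbDeviceConnection;
import android.hardware.usb.UsbEndpoint;
import android.hardware.usb.UsbInterface;

// Hypothetical helper: wraps an already-opened connection to a USB-MIDI device.
// Enumeration and permission requests are assumed to have happened elsewhere;
// only the outgoing transfer is sketched here.
public class MidiUsbSender {
    private final UsbDeviceConnection connection;
    private final UsbEndpoint outEndpoint;

    public MidiUsbSender(UsbDeviceConnection connection, UsbInterface midiInterface) {
        this.connection = connection;
        connection.claimInterface(midiInterface, true);
        UsbEndpoint out = null;
        for (int i = 0; i < midiInterface.getEndpointCount(); i++) {
            UsbEndpoint ep = midiInterface.getEndpoint(i);
            if (ep.getDirection() == UsbConstants.USB_DIR_OUT
                    && ep.getType() == UsbConstants.USB_ENDPOINT_XFER_BULK) {
                out = ep;   // first bulk OUT endpoint is assumed to carry MIDI data
            }
        }
        this.outEndpoint = out;
    }

    // USB-MIDI wraps each 3-byte MIDI message in a 4-byte event packet: the first
    // byte carries the cable number (high nibble, 0 here) and a code index number
    // (low nibble), which for channel voice messages equals the status high nibble
    // (e.g. 0x9 for Note On, 0xB for Control Change).
    public void send(byte[] midiMessage) {
        byte codeIndex = (byte) ((midiMessage[0] >> 4) & 0x0F);
        byte[] packet = { codeIndex, midiMessage[0], midiMessage[1], midiMessage[2] };
        connection.bulkTransfer(outEndpoint, packet, packet.length, 100);
    }
}
```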
Four types of controllers will operate on top of this layer:
- UI-based controls: e.g. an XY pad, which lets the user control up to 4 parameters at once, in contrast to conventional sliders, knobs, etc.
- Sensor-based controls: motion sensors, proximity sensors, etc. can be mapped to a MIDI output so that parameters are controlled by body motion, e.g. raising the hand with a smart device strapped to it raises the volume (see the sketch after this list).
- LFO (low-frequency oscillator): based on the tempo of the performance, standard LFO shapes such as sine, square and triangle can be routed to a particular control to create effects like panning and other stereo effects.
- Recorded automation: the musician will be able to record control gestures such as hand movements and knob rotations, combine them, and execute them as an automation with a single button press.
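As referenced in the sensor-based controls item above, the sketch below maps one accelerometer axis to a MIDI control change. The axis, the scaling range and the controller number are assumptions for illustration, and it reuses the hypothetical `MidiUsbSender` and `MidiMessages` helpers from the earlier sketches.

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

// Hypothetical sensor controller: maps one accelerometer axis to MIDI CC 7
// (channel volume). The +/-10 m/s^2 range and the controller number are
// illustrative choices, not values fixed by the project.
public class TiltToVolumeController implements SensorEventListener {
    private final MidiUsbSender sender;   // transport sketch from above
    private int lastValue = -1;

    public TiltToVolumeController(MidiUsbSender sender) {
        this.sender = sender;
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) return;
        float y = event.values[1];                       // tilt along the device's Y axis
        int value = Math.round((y + 10f) / 20f * 127f);  // scale roughly -10..10 m/s^2 to 0..127
        value = Math.max(0, Math.min(127, value));
        if (value != lastValue) {                        // send only on change to limit MIDI traffic
            sender.send(MidiMessages.controlChange(0, 7, value));
            lastValue = value;
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```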
Apart from this, the application is also planned to include a sequencer that reads MIDI files (.mid) on the Android device and routes them to the musical instrument, so that the musician can record and play parts that loop while he focuses on other aspects of the performance.
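A first step for such a sequencer is reading the header chunk of a Standard MIDI File to learn its format, track count and time division. The sketch below shows only that step, with track parsing, tempo handling and event scheduling left out.

```java
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

// Minimal sketch of the first step of the planned sequencer: reading the header
// chunk ("MThd") of a Standard MIDI File (.mid).
public final class SmfHeader {
    public final int format;     // 0, 1 or 2
    public final int trackCount;
    public final int division;   // ticks per quarter note (when the top bit is 0)

    private SmfHeader(int format, int trackCount, int division) {
        this.format = format;
        this.trackCount = trackCount;
        this.division = division;
    }

    public static SmfHeader read(InputStream in) throws IOException {
        DataInputStream data = new DataInputStream(in);
        byte[] tag = new byte[4];
        data.readFully(tag);
        if (tag[0] != 'M' || tag[1] != 'T' || tag[2] != 'h' || tag[3] != 'd') {
            throw new IOException("Not a Standard MIDI File");
        }
        int length = data.readInt();            // header chunk length, normally 6
        int format = data.readUnsignedShort();
        int trackCount = data.readUnsignedShort();
        int division = data.readUnsignedShort();
        data.skipBytes(length - 6);             // tolerate an unexpectedly long header
        return new SmfHeader(format, trackCount, division);
    }
}
```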
As of now, incorporating smart devices like Google Glass and wearable gear into the application is only a vision, because this is an independent project without any funding.
All these features will enable the musician to get more involved and focus more on improvisation, using a simple smart device like a tablet, phablet or mobile phone.
I intend to develop this project as an open-source venture, and I request anyone willing to join to write to me.
An understanding of basic MIDI concepts would make it easier.
Musicians who want to test it on their own setup can bring their MIDI synthesizer or instrument to try out the application's performance.
I finished my graduation in Electrical Engineering three years ago and now work as a software developer. My educational background gives me an inclination towards systems understanding and control. I have been composing music on software as a hobbyist for the past 7 years. With an understanding of how MIDI works and how musical performances can be controlled through it, I started this project to simplify things for musicians and for myself.
I have done crazy things like connecting my old joystick to a synthesizer to control its resonance and cut-off frequency, and these experiments are now part of what I enjoy doing.
Since I have been on this path for some time and wish to contribute to this relatively unexplored application of smart devices to synthesis and performance, I would like to work with anyone willing to collaborate on this project.