Droidcon is India’s largest Android developer conference, and is part of the worldwide series of conferences held in London, Paris, Berlin, the Netherlands, Tunis, Ankara and Brussels. If you are doing anything with Android, you’d want to be here in Bangalore on Nov 28-30, 2013.
A WIP schedule will be up on the Droidcon India website and will be updated periodically.
Voting is open to attendees who have purchased event tickets. If there is a proposal you find notable, please vote for it and leave a comment to initiate discussions. Your vote will be reflected immediately, but will be counted towards selections only if you hold a ticket. Proposals will also be evaluated by a program committee, consisting of:
- Aravind Krishnaswamy of Levitum is the Program Chair
- Amrit Sanjeev of Intuit and Blrdroid will run the App Demos track.
- Kumar Rangarajan of LittleEyeLabs will run the Systems track.
- Rahul Gonsalves of UnCommon will run the UX track.
- Aravind Krishnaswamy of Levitum will run the Business track.
- Ravi Korukonda of PurpleTalk will run the Gaming track.
- Soham Mondal of Triveous and Blrdroid will run the Workshops.
Proposers must submit presentation drafts as part of the selection process to ensure that the talk is in line with the original proposal, and to help the program committee build a strong line-up for the event.
Final date for submission of proposals: Oct 18, 2013
First set of pre-confirmations: Oct 18, 2013
Submission of slide drafts: Oct 25, 2013
Second set of pre-confirmations: Oct 26, 2013
Schedule draft posted on site: Nov 4, 2013
Final confirmations: Nov 5, 2013
Final schedule: Nov 8, 2013
All speakers are requested to be available for office hours during the conference. This will be a scheduled 30-minute block of time during which attendees can meet you at a designated space for open Q&A offstage.
There is only one speaker per session. Attendance is free for selected speakers. HasGeek will cover your travel to and accommodation in Bangalore from anywhere in the world if you are delivering a full session (30 minutes or longer). As our budget is limited, we will prefer speakers from locations closer to home, but will do our best to cover anyone exceptional. If you are able to raise support for your trip, we will count that as speaker travel sponsorship.
HasGeek believes in open source as the binding force of our community. If you are describing a codebase for developers to work with, we’d like it to be available under a permissive open source license. If your software is commercially licensed or available under a combination of commercial and restrictive open source licenses (such as the various forms of the GPL), please consider picking up a sponsorship. We recognize that there are valid reasons for commercial licensing, but ask that you support us in return for giving you an audience. Your session will be marked on the schedule as a sponsored session.
If your proposal is accepted for a session > 30 minutes long, we will cover your event ticket.
If your proposal is not accepted, you can buy a ticket at the same rate as was available on the day you proposed. We’ll send you a code.
Immersive control of music performances using smart devices
Musical performances, especially in electronic music, have progressed over the years toward more interactive and immersive forms. Using standard protocols like MIDI (Musical Instrument Digital Interface), musicians can manipulate, improvise, record and process their performances, because every event (a note on, a note off, a knob rotation, the velocity of a hit, etc.) is a digital value in a computer system. Most synthesizers, guitar processors, computers and other music equipment today include a MIDI interface to provide accurate clocking and control options. For example, using a standard MIDI fader, the user may control the volume output of a musical instrument.
The objective of this project was to develop an application that allows a musician to employ sensors, a smart UI (user interface) and other programmable controllers to control parameters like cut-off frequency, resonance, etc. using smart devices like tablets, mobile phones and phablets. The concept can also be extended to smart wearable devices, allowing the musician’s motion to affect the musical performance. At the same time, products like Google Glass can be used for real-time information display.
In short, the project is primarily intended to provide a musician with effective tools on a smart device to communicate with music equipment, making tasks easier and more engaging. Additionally, the availability of such a facility on a generic smart device makes it an economical option.
A similar, though not identical, project was demonstrated by Peter Brinkmann of Google at Droidcon Berlin. The video is posted at the following link:
MIDI is a standard comprising a data format, cables and connectors that allows musical instruments to exchange information for control and processing. For example, when we hit a key on the keyboard of a synthesizer, it triggers a NOTE ON event for that note number, along with the velocity with which the key was hit. Similarly, a control like a fader or a knob can transmit MIDI data to change patches, increase volume, etc. on the synthesizer. This constitutes the MIDI IN/OUT capability of a standard MIDI interface.
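To illustrate how simple these messages are, here is a minimal sketch (not project code; just the three-byte channel messages defined by the MIDI 1.0 specification) that builds a NOTE ON, a NOTE OFF and a Control Change such as a fader movement:

```java
// Illustrative sketch: raw MIDI channel messages as byte arrays, the form in
// which they travel over a MIDI cable or a USB-MIDI connection.
public class MidiMessages {

    // NOTE ON: status byte 0x90 | channel, then note number and velocity (0-127).
    public static byte[] noteOn(int channel, int note, int velocity) {
        return new byte[] { (byte) (0x90 | (channel & 0x0F)),
                            (byte) (note & 0x7F),
                            (byte) (velocity & 0x7F) };
    }

    // NOTE OFF: status byte 0x80 | channel.
    public static byte[] noteOff(int channel, int note, int velocity) {
        return new byte[] { (byte) (0x80 | (channel & 0x0F)),
                            (byte) (note & 0x7F),
                            (byte) (velocity & 0x7F) };
    }

    // Control Change: status byte 0xB0 | channel; controller 7 is channel
    // volume, which is what a standard MIDI fader typically sends.
    public static byte[] controlChange(int channel, int controller, int value) {
        return new byte[] { (byte) (0xB0 | (channel & 0x0F)),
                            (byte) (controller & 0x7F),
                            (byte) (value & 0x7F) };
    }

    public static void main(String[] args) {
        // Middle C (note 60) struck on channel 1 at velocity 100:
        byte[] on = noteOn(0, 60, 100);
        System.out.printf("NOTE ON: %02X %02X %02X%n", on[0], on[1], on[2]);
        // A fader on channel 1 setting volume (CC 7) to maximum:
        byte[] cc = controlChange(0, 7, 127);
        System.out.printf("CC:      %02X %02X %02X%n", cc[0], cc[1], cc[2]);
    }
}
```

Everything the application does ultimately reduces to emitting byte triplets like these at the right time.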
The project started from the ground up with the development of a MIDI-over-USB layer on the Android platform, which allows a standard Android-based smart device to receive MIDI signals as input and transmit them as output. Four types of controllers will operate on top of this layer:
- UI-based controls: e.g. an XY pad, which allows the user to control up to 4 parameters at once, alongside conventional sliders, knobs, etc.
- Sensor-based controls: motion sensors, proximity sensors, etc. can be mapped to a MIDI output to control parameters with body motion; e.g. raising a hand with a smart device strapped to it will raise the volume.
- LFO (low-frequency oscillator): based on the tempo of the performance, standard LFO waveforms such as sine, square and triangle can be routed to a particular control to create panning and stereo effects.
- Recorded automation: the musician will be able to record control gestures such as hand movements and knob rotations, combine them, and replay them as an automation with a single button click.
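To make the LFO and sensor mappings above concrete, here is a minimal sketch (an assumed design, not the project’s actual code) of how a tempo-synced sine LFO, or a raw sensor reading, could be quantized to the 0-127 range of a MIDI Control Change value:

```java
// Illustrative sketch: mapping continuous signals onto the 7-bit CC range.
public class TempoLfo {

    // Sine LFO sampled at time t (seconds), synced to the song tempo so that
    // one full cycle spans `beatsPerCycle` beats at `bpm` beats per minute.
    public static int sineLfoCc(double t, double bpm, double beatsPerCycle) {
        double cycleSeconds = beatsPerCycle * 60.0 / bpm;
        double phase = (t % cycleSeconds) / cycleSeconds;   // 0.0 .. 1.0
        double s = Math.sin(2 * Math.PI * phase);           // -1.0 .. 1.0
        return (int) Math.round((s + 1.0) / 2.0 * 127.0);   // 0 .. 127
    }

    // The same scaling applies to a sensor reading: clamp the raw value of,
    // say, an accelerometer axis to a chosen range, then rescale to 0-127
    // before sending it as a CC value.
    public static int sensorToCc(double value, double min, double max) {
        double clamped = Math.max(min, Math.min(max, value));
        return (int) Math.round((clamped - min) / (max - min) * 127.0);
    }

    public static void main(String[] args) {
        // At 120 BPM a 4-beat cycle lasts 2 seconds; at t = 0.5 s the sine
        // peaks, giving the maximum CC value.
        System.out.println(sineLfoCc(0.5, 120.0, 4.0));   // 127
        System.out.println(sensorToCc(5.0, 0.0, 10.0));   // 64 (mid-range)
    }
}
```

Each sampled value would then be wrapped in a Control Change message and sent out over the MIDI-USB layer.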
Apart from this, the application is also planned to include a sequencer that reads MIDI files (.mid) on the Android device and routes them to the musical instrument, so that the musician can record and loop parts while focusing on other aspects of the performance.
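The timing arithmetic such a sequencer needs is small. A Standard MIDI File’s header gives a division in ticks per quarter note, and tempo meta events give microseconds per quarter note; together they fix how long one tick lasts in real time. A sketch of the conversion (illustrative only, not project code):

```java
// Illustrative sketch: converting MIDI-file delta-times (ticks) to wall-clock
// time, the calculation at the heart of any software sequencer.
public class MidiTiming {

    // Microseconds per tick for a given tempo and division.
    public static double microsPerTick(int microsPerQuarter, int ticksPerQuarter) {
        return (double) microsPerQuarter / ticksPerQuarter;
    }

    // Convert a delta-time in ticks to milliseconds.
    public static double ticksToMillis(long ticks, int microsPerQuarter, int ticksPerQuarter) {
        return ticks * microsPerTick(microsPerQuarter, ticksPerQuarter) / 1000.0;
    }

    public static void main(String[] args) {
        // 120 BPM = 500,000 microseconds per quarter note; with the common
        // division of 480 ticks per quarter, one beat (480 ticks) is 500 ms.
        System.out.println(ticksToMillis(480, 500_000, 480)); // 500.0
    }
}
```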
As of now, incorporating smart devices like Google Glass and wearable gear into the application is only a vision, because this is an independent project without any funding.
All these features will enable the musician to get more involved and focus on improvisation using a simple smart device like a tablet, phablet or mobile phone.
I intend to develop this project as an open source venture, and I invite anyone willing to join to write to me.
An understanding of basic MIDI concepts will make the session easier to follow.
Musicians who want to test the application’s performance can bring their own MIDI synthesizer or instrument.
I graduated in Electrical Engineering three years ago and now work as a software developer. My educational background gives me an inclination towards systems and control. I have been composing music on software for the past 7 years as a hobbyist. With an understanding of how MIDI works and how musical performances can be controlled through it, I started this project to simplify things for musicians and myself.
I have done crazy things like connecting my old joystick to my synthesizer to control its resonance and cut-off frequency, and these experiments are now part of what I enjoy doing.
Having followed this path for some time, I wish to contribute to this less-explored application of smart devices to synthesis and performance, and I would like to work with anyone willing to collaborate on this project.