Orphion
Problems restoring Editor Upgrade Purchase in 1.6.1

Dear Orphion user,

I’m very sorry for this issue. If you have problems restoring your Editor Upgrade in-app purchase, please contact me.

I’m working hard to solve this issue as soon as possible.

It seems something is going wrong with the App Store receipt validation, which was working well during beta testing.

UPDATE: I was able to fix this issue and have submitted a new version to Apple for review. As soon as they approve it, buying or restoring the editor will be available again.

IMPORTANT NOTICE: It might be necessary to press the buy button again (you won’t be charged again, of course) in order to restore the upgrade.

Thank you for your help!

Bastus

Apple calling

A few days ago I received a phone call from Apple: they told me (nicely) that I have to remove the sensing of finger touch size from Orphion, or they will remove the app from the App Store. Orphion had been approved by app review seven times before, so this was a real surprise!

Within the next few days, an update will be released with a (crippled) version of Orphion without this feature. Anyone who wants the “original version” should get it now, while it’s still in the App Store.

What you can do:

BACK UP your current version to be able to restore it (Tutorial)

Please SUPPORT this online petition started by one of my users to convince Apple to make this feature officially usable. As soon as this happens, I will reintegrate it into Orphion.

UPDATE: In Orphion 1.5.2 you can activate the old articulation gesture again with this trick: go to the last page of the info window and double-tap the version number.

Why Orphion has only one sound

People sometimes ask me why there is only one sound in Orphion. The answer is very simple: it has been designed from the ground up as an integrated instrument consisting of interface and sound generation, just like every traditional musical instrument. Could you imagine someone asking their violin maker to integrate a trumpet sound into their next violin? Not really… There are so many digital instruments out there with thousands of different sounds, but without ideal interfaces to match them. For Orphion, the integrated design is the best solution and allows interactions that are impossible to control via MIDI. This is why I constantly work on improving Orphion’s interface and sound but won’t integrate new sounds. Since you can use it as a MIDI controller, you can play any sound you like with it, but probably not with the very special, natural feel of musical expression you get when you play its unique own sound.

Orphion - Concept and design of a touchscreen musical instrument

Using contemporary multi-touch devices such as Apple’s iPad as musical instruments requires focusing on the design of a special interface that compensates for the device’s missing kinesthetic feedback. To form a coherent instrument, this interface then has to be linked directly to the sound generation to create an intuitive feel of control. The Orphion uses virtual pads in a certain layout that represent individual voices. Pitch and timbre of each voice depend on the initial point of touch, the size and the variation of size of the touch point, and the variation of position after the initial touch. These parameters control a physical model for sound generation.

The development of musical instruments over the last centuries mainly aimed at improving sound generation rather than at finding new interfaces. Especially for electronic instruments, apart from exceptional inventions like the Theremin in 1920, the most commonly used interface is still the piano keyboard, an interface proven since the 18th century.
For the Orphion the goal was to develop an instrument that deliberately departs from that concept and takes advantage of the haptic and technological possibilities of multi-touch devices; its sound should be a direct representation of the actions the fingers perform on the pads. My aim was also to create an instrument that feels “natural” to someone familiar with the behavior of existing acoustic instruments like drums and string instruments.

Interfaces for musical instruments

Acoustic instruments have a natural coupling between interface and sound generation, defined by the materials used in their construction. For most electronic instruments these two components are separated. The requirements for the interface of any kind of musical instrument, however, can be defined in general terms; it should:

  • allow virtuosity and expression
  • allow intuitive playing
  • be traceable (for the audience)
  • be predictable (for the player)
  • give feedback (acoustic/visual)

Instrument vs. Controller

Since the mechanical structure and the materials used for building the enclosure of an electronic instrument normally do not contribute to its sound whilst played (which is especially true for software-based instruments), these instruments need a strong logical link between the actions of the player and the generated sound. This strong link between a very specific interface and the sound generation is what defines such a structure as an instrument.
Alternatively, the interface can be an open structure with different functions or sounds for different situations; it is then better seen as a controller than as an instrument. Such an interface can work very well in a variety of situations, but the properties of an instrument listed above are hard or impossible to achieve. Additional layers of complexity arise if the interface does not control the sound generation directly but drives, for example, a sequencer. Such a “sequencing instrument” points in a new direction but also marks the strongest possible departure from a traditional musical instrument. For my purposes I decided to focus on the idea of a strong coupling between interface and sound generation.

The Orphion

How does a touchscreen sound, and what gestures allow expressive playing whilst keeping control? Most applications for touchscreens use the finger or stylus as a replacement for the mouse to control knobs and buttons. Since multi-touch became established, new gestures have been created, for example pinching two fingers to resize objects, but in order to express musical ideas, more specific gestures and input models have to be developed.

Finding a logical interaction model, and thus a suggested way of playing, seems to be the main part of developing a new instrument. The Orphion should allow polyphonic playing of defined pitches with different articulations (staccato, legato) and timbres for each individual voice. When developing Orphion I had to take into account:

  • haptic properties of touchscreens (size and tactile or kinesthetic ways of interaction)
  • musical playability (recognition of the initial touch point and matching of pitches)
  • musical expression (dynamics, intonation, vibrato, timbre)
  • intuitive and natural feel
  • technical possibilities (precision of control data, processing power)

As guiding models for the behavior of the Orphion I looked at two types of instruments: drums and string instruments.

  • drums: a round playing area with different timbres; release time and damping depend on the velocity and duration of the touch
  • string instruments: multiple individually tuned strings, the ability to play the tuning by tapping the strings, and control of tone and articulation during the sustain phase (intonation/vibrato, damping)

Figure 1 shows how the interface is connected to the sound generation by different parameters:

Figure 1. Simplified diagram of interface and sound generation

Interface and sound

The interface of the Orphion presents virtual pads which, depending on the size of the touch point, sound either plucked like a guitar string or closer to a slap on a conga drum. The timbre changes when a pad is hit closer to the rim, like on a real drum, and its pitch is a function of the distance from the pad’s center, modeling something that comes close to pulling a string (the bend range varies with the size of the touch point). Every parameter is controlled by a single finger per note. The iPad currently supports up to eleven touch points; the internal polyphony of the instrument, however, is defined by the number of pads present on the touch screen.
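
To make this mapping concrete, here is a minimal sketch in Python of how a single touch could be translated into synthesis parameters. The names and value ranges are illustrative assumptions of mine, not Orphion’s actual code:

    from dataclasses import dataclass

    @dataclass
    class Touch:
        x: float     # position relative to the pad center, -1..1
        y: float
        size: float  # normalized touch size, 0 (fingertip) .. 1 (flat finger)

    def touch_to_params(touch: Touch, base_pitch_hz: float) -> dict:
        """Map one touch on a pad to synthesis parameters.

        The ranges are guesses: a small touch point plucks like a string,
        a large one damps like a slap; moving away from the center bends
        the pitch and brightens the timbre towards the rim.
        """
        distance = min(1.0, (touch.x ** 2 + touch.y ** 2) ** 0.5)
        damping = touch.size  # large touch -> muted, slap-like attack
        # the bend range shrinks for large touches, as described above
        bend_semitones = distance * 0.5 * (1.0 - 0.5 * touch.size)
        pitch_hz = base_pitch_hz * 2.0 ** (bend_semitones / 12.0)
        brightness = distance  # timbre change towards the rim
        return {"pitch": pitch_hz, "damping": damping, "brightness": brightness}
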
Visually, the interface is represented by a defined set of pads drawn as circles of variable size and position. The different layout sets allow the instrument to be adapted to multiple musical situations and genres, and to match the virtuosity of the player. Arrangements with symmetrical intervals on each axis (e.g. fig. 2) can be used by advanced musicians to find new harmonic structures; pad layouts with only pentatonic tone material or other simplified musical concepts (e.g. fig. 3) also make it interesting for musical beginners. Layouts with fewer pads can give the feel of a percussion instrument played with a fixed assignment between finger and pad (e.g. fig. 4). (A small sketch of the fig. 2 interval structure follows the figures below.)

 
  Figure 2. Symmetrical major 3rds horizontally, 
  minor 3rds and semitones vertically, 4ths and 5ths diagonally


  Figure 4. Blues-scale layout

 
  Figure 5. Five-finger layout, e.g. as tuned bass drums
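
As promised above, the interval structure of a layout like the one in fig. 2 can be reconstructed programmatically. The sketch below is my simplified rectangular reading of the caption, not Orphion’s layout code; a plain grid cannot reproduce every interval named in the caption, so treat it purely as an illustration of the symmetric-interval idea: each step to the right adds a major third (4 semitones) and each step up adds a minor third (3 semitones), so one diagonal yields fifths (4 + 3 = 7) and the other yields semitones (4 - 3 = 1):

    NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

    def symmetric_layout(rows: int, cols: int, root_midi: int = 48) -> list[list[str]]:
        """Grid of note names: +4 semitones per column, +3 per row (bottom up)."""
        grid = []
        for r in range(rows):
            row = []
            for c in range(cols):
                midi = root_midi + 4 * c + 3 * r
                row.append(NOTE_NAMES[midi % 12] + str(midi // 12 - 1))
            grid.append(row)
        return list(reversed(grid))  # top row first for printing

    for row in symmetric_layout(4, 5):
        print("  ".join(f"{n:>4}" for n in row))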

Sound synthesis

The sound synthesis is based on a physical model that simulates a string (the Karplus-Strong algorithm). I use a combination of a pulse of filtered noise and a sustained excitation sound created by a two-operator FM synthesis structure. The low-pass filtering of the feedback path is controlled in real time for a lively articulation of the sound after the initial touch. The complex excitation model allows a wide spectrum of different sounds, from gently plucked strings to xylophone-like hits or the damped attack of muted drums. As long as a finger is touching the surface of a pad, the distance from the pad’s center controls slight detuning (intonation) and a variation of timbre towards the rim.
A very natural feel is achieved by dynamically adjusting parameters within a single synthesis model rather than switching between different models for different playing situations, as would be the case with sample-based instruments.
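
As a rough illustration of this synthesis idea, here is a minimal Karplus-Strong string in Python. It is a sketch of the general technique, not Orphion’s actual DSP: the excitation here is only a filtered noise burst (the FM excitation component is omitted), and a single damping parameter stands in for the touch-size-controlled low-pass in the feedback path:

    import numpy as np

    def pluck(freq_hz: float, duration_s: float, damping: float,
              sr: int = 44100) -> np.ndarray:
        """Minimal Karplus-Strong string.

        damping in 0..1: 0 rings like an open string, 1 mutes quickly,
        mimicking how a larger touch damps the sound.
        """
        n = int(sr * duration_s)
        delay = int(sr / freq_hz)  # the delay line length sets the pitch
        # excitation: a noise burst, lightly low-pass filtered for a softer pluck
        buf = np.convolve(np.random.uniform(-1, 1, delay), [0.5, 0.5], mode="same")
        out = np.zeros(n)
        gain = 0.5 + 0.5 * (1.0 - damping)  # feedback gain, < 1 so the tone decays
        for i in range(n):
            out[i] = buf[i % delay]
            # low-pass in the feedback loop: average two adjacent samples
            buf[i % delay] = gain * 0.5 * (buf[i % delay] + buf[(i + 1) % delay])
        return out

    # example: a lightly damped A at 220 Hz, 1.5 seconds of raw samples
    samples = pluck(220.0, 1.5, damping=0.2)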

The visual representation of the pads on the iPad is straightforward and functional: a pad is defined as an outlined blue circle with its note name written in the center. When touched, the circle is filled with a color ranging from red to yellow depending on the touch size, thus indicating the amount of damping.
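
The fill color itself is a simple interpolation. A tiny sketch follows (the exact colors and mapping direction are my guesses; the text only says the color indicates the amount of damping): since red is (1, 0, 0) and yellow is (1, 1, 0) in RGB, only the green channel has to change:

    def pad_fill_color(touch_size: float) -> tuple[float, float, float]:
        """RGB fill for a touched pad: small touch -> red, large touch -> yellow."""
        t = max(0.0, min(1.0, touch_size))  # clamp the normalized touch size
        return (1.0, t, 0.0)  # (1,0,0) = red ... (1,1,0) = yellow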

Past, present and future

In this blog I will give you a more detailed insight into where Orphion comes from, how it works, and what the plans for the future are. The first article is about how it all started…

MultiTouch Instrument

My first idea for a new musical instrument with multi-touch came up in early 2010. It wasn’t played on an iPad yet but used the MacBook’s trackpad for multi-touch input. Here I also made a first concept study of the sound synthesis model, all realised in Max/MSP.

This is a small demo of how the concept study instrument could be played, and how it sounds together with acoustic woodwind instruments:

As a result of my research on musical interfaces for touchscreens, I combined the interaction models of percussion instruments and string instruments like the guitar into virtual pads that can be modulated by pressure/touch size and by movement and speed of movement. The main problem was the missing detailed feedback about how hard you touched the trackpad, so the port to a touchscreen was just a matter of time.

Master thesis

I developed the concept further as my master thesis at the UdK Berlin, supervised by Robert Henke. At that time the iPad 1 had just come out and was too slow for the complex sound synthesis with 11 individual voices, so I had to find another solution: I used the great app Fantastick to remote-control a Max patch on my Mac, which did all the DSP and just sent graphical feedback back to the iPad. The following demo is played on this setup:
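
For readers curious what such a split looks like in practice, here is a hedged sketch of the general idea in Python, using the python-osc package: a thin client forwards touch data over the network to the machine doing the DSP. The address pattern, port, and message layout are invented for illustration; Fantastick and the original Max patch define their own protocol:

    # pip install python-osc
    from pythonosc.udp_client import SimpleUDPClient

    # host, port and message layout are illustrative assumptions
    client = SimpleUDPClient("192.168.0.10", 9000)  # the machine running the DSP patch

    def send_touch(pad_id: int, x: float, y: float, size: float) -> None:
        """Forward one touch event to the DSP host; all synthesis happens there."""
        client.send_message("/orphion/touch", [pad_id, x, y, size])

    send_touch(pad_id=3, x=0.1, y=-0.4, size=0.7)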

In the next post you can read how the story went on and how Orphion finally became an app on the App Store.