Humanizing MIDI input/parameters
First of all, I would like to thank you for some truly great and expressive instruments. I own all Sample Modeling instruments except the saxophones, and I rely solely on SM woodwinds and brass (in conjunction with VSL Dimension Strings) for my mock-up work. I am crossing my fingers that, in addition to a future violin and contrabass, we will also see development towards string-section setups from SM, as I find the realism and playability of these instruments unsurpassed in the virtual realm.
Seeing that some people (notably over at the VI-Control forum) complain about the "synthiness" of the viola and the cello, I was wondering whether Sample Modeling could perhaps employ some kind of humanizer engine à la Vienna Instruments Pro, either as a separate MIDI plugin or integrated directly into the SWAM engine. The ability to add a little "randomness" to various parameters (bow pressure, bow position, etc.) would probably go a long way towards adding realism to a performance. In VI Pro, there are also the micropitch envelopes which, in addition to the round robins, can lend a lot of vitality to the end result, particularly when used for individual instruments within an ensemble.
I know that a lot of the criticism of SM instruments probably comes from whatever bias people may have about sampling vs. physical modelling (even though SM instruments are sample-based at the core, I believe?), but some of it may come from the fact that the instruments need a lot of performance control in order to sound their best. Even with a TEC breath and bite controller freeing up both hands for other tasks, there are so many parameters available that have an impact on the final result. Some kind of system for adding a little erratic-ness would be great ;) If this functionality is already present within the SWAM engine, I would be grateful if someone could point me in the right direction.
Comments
This will create a much more human feeling than just using random variations. Humans act not "randomly" but "with a hidden purpose".
If you really want random variations, this can be done with appropriate plugins in the DAW.
I use Reaper, and there it is rather easy to write realtime scripts that generate random output and/or variations of controller data in a MIDI stream. The output of such a script can be routed to the MIDI CC inputs of the VSTi, as well as to its VST parameters in case the instrument does not provide a given parameter as a MIDI CC (e.g. "Style" with the "Flutes").
With such a script you also have full control over the nature of the randomness: e.g. the maximum steepness of the change, the distribution of the values (e.g. Gaussian or uniform), etc.
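For illustration, the core of such a script might look like the following sketch - written in Python purely to show the idea (a real Reaper JSFX would use EEL2, and all names and default values here are my own invention, not an existing plugin):

```python
import random

def humanize_cc(base_values, max_step=2, spread=4, dist="gauss", lo=0, hi=127):
    """Add slow, bounded pseudorandom drift to a stream of MIDI CC values.

    max_step limits how much the drift may change between events
    (the "maximum steepness"); spread limits how far it may wander
    from the original value; dist selects the distribution
    ("gauss" or uniform).
    """
    drift = 0.0
    out = []
    for v in base_values:
        if dist == "gauss":
            step = random.gauss(0, max_step / 2)
        else:
            step = random.uniform(-max_step, max_step)
        # clamp both the per-event change and the total drift
        step = max(-max_step, min(max_step, step))
        drift = max(-spread, min(spread, drift + step))
        out.append(max(lo, min(hi, round(v + drift))))
    return out

# Example: a steady bow-pressure CC of 64 picks up subtle variation
print(humanize_cc([64] * 8, max_step=2, spread=4))
```

Because the drift is a clamped random walk rather than independent per-event noise, consecutive values stay close to each other, which sounds more like a wandering hand than like digital jitter.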
-Michael
I am already employing a TEC B&B controller in addition to my MIDI keys, and I agree that "random" does not equal "human". Still, I think that subtle variations in bow position and pressure, as well as pitch intonation, could help in achieving a higher degree of "humanness" - I am not talking extremes here, just very subtle variations. If they could be linked to other input events (like the "Interactive Bow Pressure" parameter), that would probably be the best way.
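The input-linked idea described here could be sketched as follows - a hypothetical illustration in Python (the function and parameter names are mine, not part of SWAM): the depth of the random variation follows a performance gesture such as the expression controller, so intense playing gets more variation than quiet passages.

```python
import random

def gesture_linked_jitter(expression, base, depth_max=3.0, seed=None):
    """Scale subtle random variation by a performance gesture.

    Instead of constant randomness, the jitter depth follows the
    expression controller (e.g. CC11, 0-127): louder, more intense
    playing gets slightly larger bow-pressure variation, while quiet
    passages stay almost clean.
    """
    rng = random.Random(seed)
    out = []
    for expr, v in zip(expression, base):
        depth = depth_max * (expr / 127.0)   # variation depth tracks the gesture
        jitter = rng.gauss(0.0, depth / 2)
        out.append(max(0, min(127, round(v + jitter))))
    return out

# Quiet passage (expr=20) barely moves; loud passage (expr=120) varies more
quiet = gesture_linked_jitter([20] * 4, [64] * 4, seed=1)
loud = gesture_linked_jitter([120] * 4, [64] * 4, seed=1)
```

Coupling the randomness to an existing controller keeps it "on a hidden purpose": the variation follows the musical intent instead of running independently of it.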
The scripting route sounds very interesting, but unfortunately I am completely in the dark as to how I would accomplish such a thing :) Any pointers would be great. I am using Cubase, BTW.
But it would need Reaper to run in.
I decided to go with Reaper exactly because with Reaper you can do all the weird things you can think of. I need this to be able to use the SWAM (and other) instruments for live playing with master keyboards and the TEC BBC.
A friend of mine was a long-time Cubase user, and after he tried Reaper on my recommendation, he dumped his latest and greatest Cubase version for the much less expensive but much more versatile product.
So I recommend trying Reaper (a free, fully functional test drive is available -> http://www.reaper.fm/download.php ), so that we can work together on the "humanizing" issue.
-Michael
Here's my take: Having played guitar for thirty-some years, I recently bought a violin - just to get a feel for, and an understanding of, the instrument (I briefly played contrabass in a chamber orchestra in my youth). It is my belief that no matter how well trained the violinist, there have to be slight inconsistencies in fingering position from note to note, and this got me thinking that maybe this is a clue as to why some people find the sound of the SM Viola and Cello a little synthy. Personally, I find the sonic basis of both of these instruments absolutely fantastic, but they really need a lot of hands-on control to sound their best. Being able to assign some of these tasks to the SWAM engine or a MIDI plugin would free me up to concentrate on writing the music - and then later I could edit the MIDI performance by hand if/as needed.
Thank you very much for your suggestions, anyway! I'll keep googling and will report back if I find anything useful.
I will order the TEC BBC v2 during the next few days. In addition to breath and bite, it also detects nodding and head tilting. Routing those to appropriate SWAM parameters will hopefully add a "better than random" human touch.
Of course there could be VST plugins that are able to introduce some randomness into a MIDI stream. It's just a lot easier to write a Reaper "JSFX" plugin than to create a VST plugin. OTOH, there are means (provided for free by the friendly Reaper manufacturer) that allow JSFX plugins to be used as VST plugins in other DAWs.
See "ReaJS" -> http://www.reaper.fm/reaplugs/#reajs
But of course I have had no occasion to try that myself yet.
Moreover, I don't know whether it is possible in Cubase to remote-control a plugin's VST parameters (those not available via MIDI CCs) as can be done in Reaper.
-Michael
Thank you for pointing out these aspects. We know very well what you're talking about, and we completely agree. In fact, several years ago we introduced several "pseudorandom" elements into our Kontakt-based instruments to achieve a more natural sound. By "pseudorandom" we mean neither simple randomization of some parameters, nor fluctuations deriving from the player's gestures, which can indeed be reproduced with MIDI controllers. Rather, this term refers to fluctuations with certain spectral properties, which can be detected by analysis of real performances. We believe that this approach should be applied to all virtual instruments, and a general-purpose solution is under study.
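One way to read "fluctuations with certain spectral properties" - and this is my illustration, not Samplemodeling's actual method - is noise whose power falls off with frequency, such as 1/f ("pink") noise, which is often reported in analyses of human performance timing and dynamics. A minimal sketch using Paul Kellet's well-known three-pole filter approximation:

```python
import random

def pink_fluctuations(n, depth=1.0, seed=None):
    """Generate n samples of a 1/f-like ("pink") fluctuation signal.

    White noise is passed through three leaky integrators with
    different time constants (Paul Kellet's "economy" pink filter),
    giving a spectrum that falls off at roughly 3 dB per octave:
    slow wandering with small, fast jitter on top, unlike the flat
    spectrum of plain per-event randomization.
    """
    rng = random.Random(seed)
    b0 = b1 = b2 = 0.0
    out = []
    for _ in range(n):
        white = rng.uniform(-1.0, 1.0)
        b0 = 0.99765 * b0 + white * 0.0990460   # slow pole
        b1 = 0.96300 * b1 + white * 0.2965164   # medium pole
        b2 = 0.57000 * b2 + white * 1.0526913   # fast pole
        out.append(depth * (b0 + b1 + b2 + white * 0.1848))
    return out

# Example: modulate a bow-pressure CC of 64 by a few steps of pink noise
cc = [max(0, min(127, round(64 + f))) for f in pink_fluctuations(16, depth=3.0, seed=1)]
```

Because the spectral shape, not just the range, is controlled, such fluctuations drift the way measured performance data does, rather than buzzing like white noise.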
Samplemodeling branded, of course ;-)
Giorgio & Peter
Thanks,
-Michael