Creating MIDI effect plugins with VST.NET

Dec 8, 2008 at 9:52 PM
First off, thanks to Jacobi for the work on VST.NET.  The ability to write VST plugins in C# is going to be great.

The project I am starting to work on is a MIDI effect plugin.  I have written a few plugins in C++ back in the day, but this library handles things quite differently.  Basically this kind of plugin needs to be able to generate MIDI messages (primarily note on/off messages), and the host needs to be able to route those messages to a VSTi.  A good example of this is the Eckel plugin (see http://freemusicsoftware.org/1175).  In FLStudio, to take one example host, you load this plugin into an effect channel, set its MIDI output port to match the MIDI input port of a VSTi synth, and it will start sending messages to your synth.  This kind of plugin does not process audio at all (although it could just pass audio through, I suppose -- the Eckel plugin blocks audio and so needs to be inserted into an unused channel).  The plugin also does not process incoming MIDI at all.  It simply generates MIDI and sends it to the host.  It does need some info from the host, however, such as tempo and start/stop messages.

So my question is: does anyone have a template project for this kind of plugin?  I have looked over the MIDI note mapper sample project, and I believe it could be modified into this kind of plugin by simply removing the MIDI source stuff, but I'm still having some difficulty parsing the code.  If I remember right, in C++ you basically specify what kind of plugin you have by exposing a set of "can do" flags, and it looks like in VST.NET you do this by returning processor objects (like MIDIProcessor and AudioProcessor), but quite frankly I'm a bit lost.
Dec 9, 2008 at 4:50 AM
Well, I jumped the gun on this one.  I didn't expect it to be as easy as implementing an interface.  So I definitely need to keep the MIDI source interface, and I guess I can get rid of the MIDI processor interface since I don't need to process incoming MIDI.  And I see now that the MIDI is returned to the host via the AudioProcessor interface, in the Process call.  Couldn't be easier.

The only question I have, then, is whether there is any information on the VstMidiEvent class, specifically how to set the values other than the midiData byte array.  So for example, I want to send the message

0x90 0x3c 0x64

which goes in the byte array.  But what are the:

DeltaFrames
NoteLength
NoteOffset
Detune
NoteOffVelocity

values?  I always thought the note length was determined by when the note off message was sent following the note on message.  DeltaFrames I don't understand at all, and Google isn't being very helpful.  Any guidance would be greatly appreciated.
Coordinator
Dec 9, 2008 at 3:24 PM
No problem. Yes, all you have to do is implement the IVstPluginMidiSource interface (which is only one property) and the Framework will communicate to the host that you will be sending it VstMidiEvent instances. Refer to the original Steinberg VST SDK for information about the VstMidiEvent struct. The managed version in Core is exactly the same.

But basically these fields will provide you with extra information when receiving VstMidiEvent instances from the host, when that midi is played back from a recorded midi track. Most of these fields will not be filled when you receive live (real-time) midi data from the host.

I think only DeltaFrames is needed when outputting midi to the host. This field tells the host how to space the midi events in time. The comment in the VST C++ source file says "sample frames related to the current block start sample position". So for each call to Process() you start with DeltaFrames = 0 (when that Midi event occurs right at the start of that cycle).
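
To make that concrete: building a note-on for the message you listed could look something like this. Just a sketch - the constructor argument order here is from memory, so check Jacobi.Vst.Core.VstMidiEvent for the exact signature.

byte[] midiData = { 0x90, 0x3C, 0x64, 0x00 }; // note-on, middle C, velocity 100

VstMidiEvent noteOn = new VstMidiEvent(
    0,          // DeltaFrames: sample offset into the current Process() block
    0,          // NoteLength: optional hint; 0 when you send an explicit note-off yourself
    0,          // NoteOffset
    midiData,
    0,          // Detune
    0);         // NoteOffVelocity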
Dec 9, 2008 at 7:07 PM
Thanks for this.  Yes, I now understand that DeltaFrames tells you where, in the block of samples given to Process(), the midi event occurs.  So it looks like, when sending MIDI straight back to the host, all of this can be set to 0; but when an outgoing MIDI event is queued, Process() can check whether the event should be fired during the current block and set DeltaFrames so the timing works out properly.
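
Something like this is what I have in mind - just a rough sketch, all of the names are mine (not framework types) and the VstMidiEvent constructor arguments are guessed from the discussion above:

using Jacobi.Vst.Core;

// Rough sketch: a queued note keeps a counter of samples until it should fire,
// and each Process() call converts that into a DeltaFrames value if the note
// falls inside its block. All names here are illustrative, not framework types.
internal sealed class QueuedNote
{
    private int _samplesUntilFire;      // offset relative to the current block start
    private readonly byte[] _midiData;  // e.g. { 0x90, 0x3C, 0x64, 0x00 }

    public QueuedNote(int samplesUntilFire, byte[] midiData)
    {
        _samplesUntilFire = samplesUntilFire;
        _midiData = midiData;
    }

    public bool Fired { get; private set; }

    // Called once per AudioProcessor.Process(); blockLength = samples in this block.
    public VstMidiEvent TryFire(int blockLength)
    {
        if (!Fired && _samplesUntilFire < blockLength)
        {
            Fired = true;
            // DeltaFrames is simply the remaining offset within this block.
            // (VstMidiEvent constructor arguments are assumed, as above.)
            return new VstMidiEvent(_samplesUntilFire, 0, 0, _midiData, 0, 0);
        }

        _samplesUntilFire -= blockLength; // not due yet: re-base for the next block
        return null;
    }
}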
Dec 10, 2008 at 7:19 PM
One more question, if I may.  I am looking at the MidiNoteMapper example to see how MIDI event messages are sent to the host.  This happens in the AudioProcessor class, in the Process method:

First a reference to the host processor is retrieved:

if (_hostProcessor == null) {
    _hostProcessor = _plugin.Host.GetInstance<IVstMidiProcessor>();
}

Then the events are passed to the host's process function:

_hostProcessor.Process(_midiProcessor.Events);

The problem is, _hostProcessor is always null -- even after the GetInstance call -- so this line is never called.  I am testing in FLStudio.  I initially thought that perhaps the host didn't support receiving MIDI messages from the plugin, but I know the Eckel plugin I mentioned does just that.  And regardless, it looks like the host itself cannot be referenced at all.

Is this host specific (is _hostProcessor non-null in any host)? Or is there something missing from the code?
Coordinator
Dec 10, 2008 at 8:20 PM
Of course you may ;-)

The host object will only return the IVstMidiProcessor interface when the host application supports both the receiveEvents and receiveMidiEvents can-do's. It must answer both with an explicit "yes" (value 1). Perhaps FLStudio answers with a "don't know" (value 0)? Perhaps I should relax this condition to not being "no" (value -1)...?

(This code is located in Jacobi.Vst.Framework.Host.VstHostInterfaceManager.cs, the CreateMidiProcessor method.)

The idea is that once you retrieve a reference to the interface, you know for sure your Midi Events will arrive at the host application...
You should always code in such a way that the reference may be null.
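
For example, the Process call from the sample would then become something like this (the _hostProcessor, _plugin and _midiProcessor fields are the ones from the code you quoted):

// Sketch: only forward the events when the host actually handed out the interface.
if (_hostProcessor == null)
{
    _hostProcessor = _plugin.Host.GetInstance<IVstMidiProcessor>();
}

if (_hostProcessor != null)
{
    _hostProcessor.Process(_midiProcessor.Events);
}
// else: the host does not accept midi from the plugin; drop (or buffer) the events.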

Hope this helps
Dec 11, 2008 at 7:58 PM
Hmm, well I rebuilt from source with those two lines modified, but now I can't get plugins to load at all.  I'm sure I screwed something up somewhere.  I might have to wait for your next release.

In the meantime, I'm trying to get info on what FLStudio returns in response to those two queries, and I'm hunting for another simple host to use to test with.
Dec 11, 2008 at 11:14 PM
Ok I have some more info on this.  I ran the Eckel plugin and the MIDI mapper plugin through a plugin diagnostic app to see what functions and opcodes are firing for each.  Below are the functions called by each.  This is again in FLStudio:

Eckel (the MIDI plugin that works):

- AudioMaster OpCodes
--Version
--WantMidi
--CanDo

- Dispatcher OpCodes
--Open
--SetProgram
--SetSampleSize
--SetBlockSize
--MainsChanges
--EditGetRect
--EditOpen
--EditIdle
--GetInputProperties
--GetOutputProperties
--GetEffectName
--GetVendorString
--CanDo
--SetEditKnobMode

MIDI Mapper (the plugin that doesn't work):

- AudioMaster OpCodes
**** NONE!

- Dispatcher OpCodes
**** Same as Eckel, except GetInputProperties and GetOutputProperties are not called.

Now, without really knowing what the hell I'm talking about, I believe the problem is with the GetInputProperties, GetOutputProperties and WantMidi functions.  Those all sound like functions that would get called on initialization and which would relate to the MIDI capabilities of the plugin.  If those functions only get called once the host's MIDI interface has been created for the plugin, then I've not been helpful at all, but hopefully this tells something about why these plugins aren't working in FLStudio.

Dec 11, 2008 at 11:21 PM
One more bit of information: Eckel cannot be loaded as an instrument (which is correct, it should not ever be, it is a MIDI effect).  My dummy plugin and the Note Mapper plugin are, I assume, not alerting FLStudio that they are effects and should be loaded into effect channels, because I can load them up as instruments with no complaint from FL.  They still receive null MIDI processors from the host no matter where they get loaded, however.
Coordinator
Dec 12, 2008 at 6:01 PM
Right. That WantMidi thingy is probably why FLStudio doesn't let VST.NET send it midi events. It was deprecated in VST 2.4, and VST.NET is strictly 2.4, so it never sends it...

Perhaps I should support the deprecated master callback functions? Not sure yet. This seems to be a valid situation where you might want it...
Feb 12, 2009 at 8:08 AM
First of all, hi to all.
To Jacobi - thank you very much, man! This project is great. I'm a member of the "Indigo Children" project (indigo-children.promodj.ru - Russian only), and I'm now preparing a live set with Ableton Live 7.0.3.
To do that I have to write a little MIDI plugin that will trigger notes. I've read the posts, seen the samples and had a look at the docs, but I still have some questions, maybe because I'm new to VST development. Live 8 will include the Max/MSP system, which lets users create plugins themselves, but it's nothing compared to the VST.NET project :)

I want my plugin to work like an arpeggiator, but without the arpeggiation: I want it to repeatedly play the notes while I hold the keys on my MIDI keyboard, and I can't work out the strategy for this kind of plugin. I understand that the midi events should be stored in the MidiProcessor and sent out during the AudioProcessor.Process call.

I have an idea, but I'm not sure it's right. I think I have to store the state of the keyboard in a map, and during the Process call check which keys are pressed, then calculate the delta frames for each note based on the time position and generate the midi events. Is that a good idea, or...?
Sorry for the bad English.
Coordinator
Feb 12, 2009 at 6:07 PM
Edited Feb 12, 2009 at 6:11 PM
Hi Jinek,

If I understand you correctly you want to send midi note events to the host when you receive a specific midi note event.
I assume you have some way to generate, program or record these note-sequences.

I would store these sequences in a dictionary (or a map, like you said ;-) for each specific midi note event.
A note sequence would simply be an array of VstMidiEvent instances, one for each note in the note-sequence. You need a note number, a velocity (volume) and the timing (deltaFrames) for each note in the sequence.

You implement an (IVst)MidiProcessor in your plugin and also implement the IVstMidiSource interface to indicate that you can 'source' midi events to the Host (the framework takes care of communicating these capabilities to the Host). You implement the Process method of the MidiProcessor to look up the note-sequence in the dictionary when you receive a midi note-on, and you load that sequence into a Player object.

This Player class keeps track of how many notes of the note-sequence have been played back. Note that a long note-sequence can need multiple calls to Process (of the audio processor) before it's finished. The player object manages the midi playback in these multiple chunks. It also re-aligns the delta frames of the stored note-sequence events to the current call to AudioProcessor.Process.

In the following example, the first line shows the notes to be played and the time between the notes (-). The line below it shows the calls made to the AudioProcessor.Process method.

a--b--c----d--e--f-------g--a--b--c---
| cycle 1 | cycle 2 | cycle 3 | cycle 4 | cycle ?

So the first time Process is called (after the sequence has been triggered of course) you play the a--b--c- part. The player remembers the time it needs to wait for the second cycle. The next time the AudioProcessor.Process method is called you output --d--e-- and again the player remembers the timing. Etc, etc.
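
A bare-bones sketch of such a Player could look like the following. All of these names are made up for the example, the VstMidiEvent constructor arguments are from memory, and the sequence is stored here as plain note data (rather than VstMidiEvent instances) so the events can be created with their DeltaFrames re-based to the current block:

using System.Collections.Generic;
using Jacobi.Vst.Core;

// Illustrative Player: plays a stored note-sequence back in block-sized chunks,
// re-basing each note's DeltaFrames to the current AudioProcessor.Process() call.
internal sealed class SequencePlayer
{
    // One stored note: its position relative to the sequence start plus the raw message.
    public struct SequenceNote
    {
        public int OffsetInSamples;
        public byte[] MidiData; // e.g. { 0x90, 0x3C, 0x64, 0x00 }
    }

    private readonly IList<SequenceNote> _sequence;
    private int _nextIndex;      // next note to output
    private int _samplesPlayed;  // samples elapsed since the sequence was triggered

    public SequencePlayer(IList<SequenceNote> sequence)
    {
        _sequence = sequence;
    }

    public bool IsFinished
    {
        get { return _nextIndex >= _sequence.Count; }
    }

    // Call once per Process(); blockLength = number of samples in this block.
    public IEnumerable<VstMidiEvent> NextBlock(int blockLength)
    {
        var events = new List<VstMidiEvent>();

        while (_nextIndex < _sequence.Count &&
               _sequence[_nextIndex].OffsetInSamples < _samplesPlayed + blockLength)
        {
            SequenceNote note = _sequence[_nextIndex];

            // DeltaFrames is re-based: the offset within *this* block, not within the sequence.
            events.Add(new VstMidiEvent(
                note.OffsetInSamples - _samplesPlayed, 0, 0, note.MidiData, 0, 0));

            _nextIndex++;
        }

        _samplesPlayed += blockLength;
        return events;
    }
}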

It gets a little more complicated when multiple note-sequences can be played at the same time. Just have the Player class implement a private buffer where the midi notes from different sequences can be merged. Mind the timing during merging: a second sequence can start at any time while the first is already playing.

If you release the key on the keyboard, the midi note-off event again looks up the sequence and you have to remove any remaining notes from the player object. Again, when multiple note-sequences are playing at once this gets more complicated. You could derive your own MidiEvent class from VstMidiEvent to tag each midi note with a sequence Id. That way you would know exactly which notes to remove when a note-off is received.
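
That tagging could look roughly like this (assuming VstMidiEvent can be derived from; the base constructor arguments are again from memory):

using Jacobi.Vst.Core;

// Illustrative derived event that tags each note with the sequence it belongs to,
// so the matching notes can be removed from the player when a note-off arrives.
internal class TaggedMidiEvent : VstMidiEvent
{
    public int SequenceId { get; private set; }

    public TaggedMidiEvent(int sequenceId, int deltaFrames, byte[] midiData)
        : base(deltaFrames, 0, 0, midiData, 0, 0)
    {
        SequenceId = sequenceId;
    }
}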

Also look at the MidiNoteSampler sample. It does sort of the same thing, but for audio: it detects an incoming midi note event and checks the map to see if an audio recording is available. If not, it starts recording. If so, it starts playback.

Hope it helps.
Feb 12, 2009 at 9:33 PM
Edited Feb 12, 2009 at 9:39 PM
Yes, thank you very much, it helps. I think this discussion will be helpful for everyone who is new to midi VST fx development; the world will get a lot of cool plugins written with VST.NET :)

So... I don't want the plugin to play a stored note-sequence; I should explain it better.
The algorithm is like this:
the plugin has time-based parameters - Rate (1/16, 1/32... like an arpeggiator) and note length (ms);
the plugin receives a note-on event and starts repeating that note (at that velocity) once every Rate interval, and stops playing when it receives the note-off event.
It's like an arpeggiator, but applied to each held key: if I play a chord it should repeat the whole chord, not arpeggiate it.
------------------
By "map" I mean a map of the midi keyboard - it should store the state of the keys (pressed/unpressed and velocity). When I press an a-b-c-d chord, for example, it should end up with the state:

map["a"] = new State(States.Pressed, events[0].Velocity);
map["b"] = new State(States.Pressed, events[1].Velocity);
....

something like that...
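
In C# terms I'm picturing something like this - just my own sketch, nothing from the framework:

using System.Collections.Generic;

// My sketch of the keyboard map: note number -> state, updated from the incoming
// midi in the MidiProcessor and read by the AudioProcessor on each Rate tick.
internal sealed class KeyboardState
{
    public struct KeyState
    {
        public bool Pressed;
        public byte Velocity;
    }

    private readonly Dictionary<byte, KeyState> _keys = new Dictionary<byte, KeyState>();

    public void NoteOn(byte noteNumber, byte velocity)
    {
        _keys[noteNumber] = new KeyState { Pressed = true, Velocity = velocity };
    }

    public void NoteOff(byte noteNumber)
    {
        _keys.Remove(noteNumber);
    }

    // The AudioProcessor reads this to know which notes to repeat on each Rate tick.
    public IEnumerable<KeyValuePair<byte, KeyState>> PressedKeys
    {
        get { return _keys; }
    }
}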
And during the AudioProcessor.Process call, I think, it looks up the pressed keys in the map and generates only those events which must occur during the current buffer - it has to work this way because we don't know exactly when the plugin will receive the note-off message. Can I set the note length for the generated note-on message so the note stops by itself, or do I have to send a note-off message? (We know the note length value - it's a plugin parameter.)
And how do I align the delta frame value for each event? I mean, I need to know the delta frame value for each Rate tick in the current buffer. For example: the Rate parameter is 1/16, the buffer length is 64 samples, and 3 note on/offs should occur during the current buffer period, one on each 1/16 of the host's bar. What is the idea for calculating deltaFrames? As far as I can see, we must know the tempo, the sample rate and so on to calculate the bar length and the playback position of the first frame in the current buffer.
Coordinator
Feb 14, 2009 at 5:03 AM
The VstMidiEvent classes are immutable (read-only). Once they're constructed you can't change their properties. So you have to make a copy with the adjusted values.

It will not stop by itself. You send a note-on to start playing a note and you send a note-off to stop playing that note.

The 'BlockSize' is the number of frames you can expect in each call to AudioProcessor.Process. To calculate the number of frames you need for 1/4, 1/8, 1/16 etc. you need the information returned from GetTimeInfo (on the IVstHostSequencer interface). Search the net and you will even find the formula for it.
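
As a sketch of that formula (the tempo and the sample position of the block's first frame come from the VstTimeInfo data that GetTimeInfo returns; the method parameters below are placeholders for those values):

using System;

// Illustrative timing math: how many sample frames a 1/16 note lasts, and the
// DeltaFrames offset of the next 1/16 boundary inside the current block.
internal static class NoteTiming
{
    public static int SamplesPerSixteenth(double tempoBpm, double sampleRate)
    {
        double samplesPerQuarter = sampleRate * 60.0 / tempoBpm; // 22050 at 120 BPM, 44.1 kHz
        return (int)Math.Round(samplesPerQuarter / 4.0);         // a 1/16 is a quarter of that
    }

    // samplePosition = absolute sample position of the first frame in the current block.
    public static int DeltaFramesToNextTick(double samplePosition, int samplesPerSixteenth)
    {
        double nextTick = Math.Ceiling(samplePosition / samplesPerSixteenth) * samplesPerSixteenth;
        return (int)(nextTick - samplePosition);
    }
}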