There's a revision to this patch with source file download here: viewtopic.php?p=7600213#p7600213
I'll post the working patch in the sharing thread when I'm satisfied it's robust, and while I'll refrain from making suggestions, below please find a short narrative of my patch's (herein named MFELL) evolution.
Max/MSP "proof of concept"
I was interested in how one musical line could affect the note onsets and durations of another line, and put together a quick patch that would take two pulse trains derived from MIDI files and merge the onsets and durations in various ways. For example, two audio files like this:
Would be combined using a patch like this:
The audio pulse trains were derived from MIDI files and consist of 0 and 1 values corresponding to noteon/noteoff events. The two pulse trains are read into buffers k1 and k2, which are then played back through the k1 and k2 [wave~] objects. The [phasor~] is set to play back at "normal" speed, and the two number boxes going into the [phasor~] objects control their phase relationship. The two pulse trains are brought together via [bitand~], which creates merged onsets and durations according to the Boolean formula. There are, of course, many other ways to merge the trains at this juncture in the patch. The resulting train is sent to [edge~], which signals transitions from 0->1 and 1->0 out its two outlets. These transitions trigger messages--noteon/noteoff pairs--that are then formatted and output as MIDI.
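For readers who don't run Max, the merge-and-edge-detect stage can be sketched in Python (the function name and list representation are hypothetical, not part of the patch):

```python
def merge_and_detect(train_a, train_b):
    """Merge two 0/1 pulse trains with Boolean AND (like [bitand~])
    and report 0->1 / 1->0 transitions (like [edge~])."""
    events = []          # (sample index, "noteon" | "noteoff") pairs
    prev = 0             # previous merged sample
    for i, (a, b) in enumerate(zip(train_a, train_b)):
        merged = a & b   # Boolean AND of the two trains
        if merged == 1 and prev == 0:
            events.append((i, "noteon"))    # 0 -> 1 transition
        elif merged == 0 and prev == 1:
            events.append((i, "noteoff"))   # 1 -> 0 transition
        prev = merged
    return events

# A note sounds only where both trains are high:
# merge_and_detect([0, 1, 1, 1, 0], [1, 1, 0, 1, 1])
# -> [(1, "noteon"), (2, "noteoff"), (3, "noteon"), (4, "noteoff")]
```

Any other Boolean merge (OR, XOR, and so on) would simply replace the `a & b` line.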
MIDI version in Max
Pre-preparation of the MIDI files was a certain annoyance, and I also believe I found a defect in Max's [bitor~] object, so I moved to create a MIDI-only version of the above patch. The [seq] and [midiparse] objects form the "players" for two MIDI files, supported by a UI button and attendant "read", "print" and "stop" messages. I take the leftmost outlet of [midiparse] to obtain the pitch/velocity pairs of the noteons/noteoffs.
[zl] is Max's list processor, and [zl 2 nth 2] splits the pitch/velocity pair into its components. It does so with evaluation order swapped, so there's a corresponding [zl.swap 0 1] as the last operation to "unswap" the order right before [midiformat].
I take the velocity data and convert it from 0-127 to 0-1 via the "if then else" objects. This transformation then feeds two [&&] objects, which form the Boolean truth table. I use two objects so that the AND is performed whenever either MIDI sequence generates a noteon or noteoff: one sequence stores, but is triggered by a change in the other, and vice versa.
The outcome of the Boolean operation goes into a [change] object, which encapsulates some powerful functionality. The leftmost outlet produces results equivalent to Architect's [data thinner] object, where successive duplicate values of a data stream are tossed away. Additionally, the next two outlets produce the results of 0->X and X->0 transitions, and produce no output otherwise. So I use [change] to reduce the data stream to the merged noteon/noteoff transitions, and to split flow of control into those two streams. Given these streams now consist of 1s or nothing, it's a simple matter to subtract 1 ([-1]) to get 0 for the noteoff velocity, and to multiply ([*]) the 1 by the original MIDI note velocity to reintroduce the velocity data from the first MIDI sequence. These are [pack]-ed back with the original MIDI pitch data and [swap]-ed into the correct order for formatting into MIDI.
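The logic of the MIDI version can be sketched like so (the class and method names are hypothetical; for simplicity this sketch reuses the velocity of the triggering event rather than specifically the first sequence's):

```python
class MergeAnd:
    """Each incoming noteon/noteoff updates one side's gate
    (velocity > 0 -> 1, else 0); the AND of both gates is then
    re-evaluated, mirroring the two [&&] objects. Duplicate AND
    results are discarded, as [change]'s leftmost outlet does."""
    def __init__(self):
        self.gate = [0, 0]   # latest 0/1 state per sequence
        self.prev = 0        # previous AND result
    def event(self, seq, pitch, velocity):
        self.gate[seq] = 1 if velocity > 0 else 0
        merged = self.gate[0] and self.gate[1]
        if merged == self.prev:
            return None                      # duplicate: thinned away
        self.prev = merged
        out_vel = velocity if merged else 0  # 1 -> velocity, 0 -> noteoff
        return (pitch, out_vel)
```

A noteon is emitted only when the second sequence's noteon arrives while the first is already sounding, and a noteoff as soon as either side goes silent.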
First Architect version
The Architect version of the MFELL patch is conceived as a "filter", a plugin that would work on a Reaper track, and take advantage of all of Reaper's routing and MIDI playback/editing features. I have it set up as a Reaper folder of two tracks, which simply automates the routing of two separate tracks of MIDI into a third, which hosts the Architect MFELL filter and the VSTi that interprets the resulting MIDI stream. One MIDI track is presumed to use MIDI Channel 1, and the other Channel 2.
There was a lot of thrashing about yesterday, and I started afresh quite a few times, so the above is merely the last in a series of "broken" attempts at translating the patch into Architect objects. Retrospectively, I wasted a lot of time, having forgotten that [data thinner] was not the equivalent of Max's [change]. IIRC, years ago when I first started using Max/MSP, they were equivalent, but I used the newer features here, and then promptly forgot I had!
MIDI input from the Host is evaluated for its channel assignment: the channel range of 1->2 is transformed into 0->1 by subtraction, and these form the truth values for [branch], which then splits the MIDI stream into its channel components.
Retrospectively, I don't know why I preferred [MIDI to tuple] to [unpack MIDI]. The addition of the "noteon/noteoff" tags and the "true/false" uncoupling data seemed (probably falsely) to be extraneous to my project. In the end, [MIDI to tuple] did provide a simple debug string that I could print out rather than corralling all the unpacked data--but again--this is a trivial matter.
I didn't correctly recall my thinking in the above choice. There is no [unpack MIDI] in Architect AFAIK, only [unpack noteon] and [unpack noteoff]. Given I was going to process the velocity streams, it seemed an advantage to have the velocity data combined (although I should re-assess that). Plus the choice of [MIDI to tuple] took the object count down by two.
The AND logic has superficial differences from Max: the "if then else" construct becomes [!=] 0. The separate "call" inlet on [AND] enables the use of one object, but then there's the issue (in this case) of having to preload (false) data into the object to keep flow of control happy. The results go into a [data thinner], and this, as noted above, constituted my chief barrier to success. [data thinner] was creating a stream of filtered noteon/noteoff transitions, but it wasn't splitting execution control into two streams. [branch] is supposed to do this routing, but it can't distinguish between 0->1 and 1->0, which is the crux of the matter.
The original pitch and velocity data is fed into [pack noteon] along with a setting for MIDI Channel 3 and Architect "note uncoupling".
Latest Architect version
So this one finally is successful. After I belatedly recalled that [data thinner] isn't the equivalent of Max's [change], I wrote my own variant.
One small change is the move to [modulo] 2 to split the MIDI Channel streams. This flips the execution order so that visually, Channel 1 is on the left, Channel 2 is on the right of the screen.
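The [modulo] 2 split can be sketched as follows (the function name and tuple layout are hypothetical): channel 1 % 2 gives 1 and channel 2 % 2 gives 0, which is what flips the left/right ordering.

```python
def split_by_channel(events):
    """Split a (channel, pitch, velocity) stream into Channel 1 and
    Channel 2 lists using channel % 2, as the [modulo] 2 object does:
    channel 1 -> 1, channel 2 -> 0."""
    ch1, ch2 = [], []
    for channel, pitch, velocity in events:
        (ch1 if channel % 2 else ch2).append((pitch, velocity))
    return ch1, ch2
```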
Looking at the objects in the dark yellow box, I use [order] to gain control over the order of execution (although I feel this is a cop-out: it should be possible to write this using Architect's natural top-to-bottom, left-to-right order). The [latch] is read before it's updated, so it retains the previous value of the stream. A read of [latch] loads the [!=] comparison of the current value against the previous one. If [!=] is true, the current value from [order] flows through the [branch]; otherwise it's discarded. The resulting stream then consists of velocity values 0-127 representing the transitions to noteon or noteoff.
I realized that [pack noteon] also works as [pack noteoff]. After all, a MIDI noteoff is just a noteon with a velocity value of 0. So I eliminated the [pack noteoff] object as something that unnecessarily complicated my task.
As I was now producing a logical data stream of integers 0-127 instead of trues and falses for note control, I was moved to write my own AND logic that would eliminate a data-conversion step. In the "AND Logic" blue box, I use [clamp] to reduce the 0-127 velocity stream to 0-1 (any non-zero velocity becomes 1). Multiplying 0s and 1s together gives the Boolean AND truth table. The results of "AND Logic" and "Change" are then multiplied together to create the merged velocity, which is combined with the rest of the MIDI stream and sent to the VSTi.
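The clamp-and-multiply trick in the blue box can be sketched like this (function names hypothetical):

```python
def clamp(x, lo, hi):
    """[clamp]: constrain x to the range [lo, hi]."""
    return max(lo, min(hi, x))

def merged_velocity(vel_a, vel_b, changed_vel):
    """Sketch of the 'AND Logic' box: clamp each 0-127 velocity to
    0-1 (any non-zero velocity becomes 1), multiply for the Boolean
    AND, then multiply by the change-detected velocity to restore
    the 0-127 range."""
    gate = clamp(vel_a, 0, 1) * clamp(vel_b, 0, 1)   # AND truth table
    return gate * changed_vel

# merged_velocity(100, 90, 100) -> 100   (both sounding: note passes)
# merged_velocity(100, 0, 100)  -> 0     (one side silent: noteoff)
```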
So that's it! Thanks again for Architect!
All the best, Charles