Niseq tech. updates

let created = `Date (2021, 7, 1) in

Since my last post on the ideas behind niseq, I've been working on quite a lot of different things, and thinking back on it feels like a blur - so this post will be a nice summation of the last months' work for me, and hopefully you'll find it interesting.

Effects, composition and control

Last year, right before my performance at Bornhack, I implemented an effects system supporting shader effects like brightness/contrast, zooming, and fading the most recently shown images together. Since then I have implemented a bandpass effect, made effects compose when applying new effects on top of previously recorded image-sequences in niseq, and come up with a new way of toggling the control of continuous niseq parameters, including effects.

I named the bandpass effect after the common bandpass audio filter. It is a fragment shader that tests the luminosity of each pixel and filters away the pixels that are not within the specified range - a simple but expressive effect. The player of niseq controls the position and width of the band.
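The per-pixel test can be sketched like this - a minimal Python sketch of the logic, not the actual GLSL fragment shader; the Rec. 601 luma weights are my assumption, and the real shader may compute luminosity differently:

```python
def luminance(r, g, b):
    # Rec. 601 luma weights - an assumption; the actual shader
    # may weight the channels differently.
    return 0.299 * r + 0.587 * g + 0.114 * b

def bandpass(pixel, position, width):
    """Keep a pixel only if its luminance lies inside the band.

    `position` is the centre of the band and `width` its total
    width, both in [0, 1] - mirroring the two parameters the
    player controls."""
    r, g, b, a = pixel
    lum = luminance(r, g, b)
    lo, hi = position - width / 2, position + width / 2
    if lo <= lum <= hi:
        return (r, g, b, a)
    return (0.0, 0.0, 0.0, 0.0)  # filtered away (transparent)
```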

Composition of effects was an obvious feature, as a lot of my workflow with niseq is based on recursively recording improvised image-sequences and playing on top of previous recordings. The parameters of a previously recorded effect and a newly applied effect are basically added together. Note that niseq records image-sequences into its own format, which includes the specification of effects, so there is no loss in fidelity when applying effects recursively.
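The "added together" composition can be sketched as follows - the dictionary representation and the parameter names are hypothetical, not niseq's actual format:

```python
def compose_effects(recorded, applied):
    """Add the parameters of a newly applied effect onto the
    parameters recorded with a previous image-sequence.
    Both arguments map parameter names to floats; a parameter
    missing from one side defaults to 0."""
    names = set(recorded) | set(applied)
    return {n: recorded.get(n, 0.0) + applied.get(n, 0.0) for n in names}
```

Because the recording format keeps the effect specification rather than baked pixels, composing this way loses nothing - the sums can be re-evaluated on playback.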

The last major effects-related feature I implemented was an extension of niseq's keyboard-shortcut system, which now supports toggling options by holding down keys. This turned out to be a really powerful and intuitive way of toggling the control of continuous parameters, e.g. for effects. When implementing this, I found it very interesting that the FRP (functional reactive programming) implementation of the previous shortcut system could easily be extended without changing any of the existing code - i.e. this extension was implemented solely via new FRP values!

A fantastic side effect also emerged, partly because of this design: all keys held down are decoupled from all new keypresses, which enables control of all effect parameters, limited only by how many keys you can hold down at the same time.
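The hold-to-control behaviour can be sketched imperatively - niseq's real implementation is FRP, and the key bindings and parameter names here are made up:

```python
class HeldKeyRouter:
    """Route a continuous input (e.g. a fader or mouse movement)
    to every parameter whose key is currently held down."""

    def __init__(self, bindings):
        self.bindings = bindings  # key -> parameter name
        self.held = set()
        self.params = {}          # parameter name -> value

    def key_down(self, key):
        self.held.add(key)

    def key_up(self, key):
        self.held.discard(key)

    def continuous_input(self, delta):
        # Every held key's parameter receives the delta, so the
        # number of simultaneously controlled parameters is only
        # limited by how many keys you can hold down at once.
        for key in self.held:
            name = self.bindings.get(key)
            if name is not None:
                self.params[name] = self.params.get(name, 0.0) + delta
```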

Instrument versus GUI

Niseq has intentionally been designed without a GUI, as I think the artist should focus directly on the output of the instrument instead of becoming emotionally disconnected by interacting with a "middle-man" in the form of a GUI. Instead, the artist keeps the shortcut scheme in mind - like a cellist who intuitively knows how to move arms, fingers and body to express the intended music and emotions.

On the other hand, as niseq has become more complex, and as there is a lot of state related to the player's interaction with the instrument, it has become more and more useful to be able to observe the current state. This led me to design a client application I call n_modeline, which listens to a pub/sub socket of niseq and shows its state. This makes it feasible to forget the state of niseq and still be able to learn what it is at a later point.
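The shape of such a client can be sketched as follows - the wire format (one JSON object per line) and the state field names are my assumptions, not niseq's actual protocol:

```python
import json

def parse_state_update(line):
    """Parse one state update published by the main application.
    Assumed wire format: one JSON object per line."""
    return json.loads(line)

def render_modeline(state):
    """Render a one-line summary of the state, n_modeline style.
    The field names are hypothetical."""
    return " | ".join(f"{k}: {v}" for k, v in sorted(state.items()))

def listen(sock, on_state):
    """Read newline-delimited updates from a connected socket and
    hand each parsed state to a callback. The client only reads -
    it never sends, so its own state cannot leak back into niseq."""
    buf = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:
            break
        buf += chunk
        while b"\n" in buf:
            line, buf = buf.split(b"\n", 1)
            if line:
                on_state(parse_state_update(line))
```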

That n_modeline is a non-interactive GUI is, I find, a feature in comparison with typical GUIs:

  • Interactive GUI programs are horrible to write in general.
    • They make the codebase of the rest of the program horrible too, as they demand recursion between view and model - and they demand so many lines of code.
  • With a non-interactive client-UI, all state obviously lies in the main application.
    • So one can e.g. spin up an arbitrary number of UI clients that listen to the state of the main application, without the state of the UI clients interacting in unintended ways.

The only advantage I see of interactive GUI applications - which in many contexts is an essential advantage - is that most people can use a GUI without reading a manual or having used the application before.

That niseq is an instrument, and that it therefore shouldn't have an interactive GUI, is also a perfect match for the purely functional reactive framework niseq is written in - FRP. FRP leads to beautifully simple reactive code, where one even programs with complex time-semantics - but it is limited to very local recursion: recursion is only present inside fixed-point combinators or folds over events. And as typical interactive GUIs depend on whole-program recursion, they wouldn't be a good fit with pure FRP. Though, as I'll get into a bit later, niseq client applications can become interactive GUIs if need be.
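A fold over events can be sketched like this - a Python sketch over a finite list standing in for an event stream (FRP libraries such as OCaml's React expose this as an `E.fold`-style combinator over live events):

```python
def fold_events(step, init, events):
    """A fold over a stream of events: the only place state
    recursion lives in pure FRP code. Each occurrence produces
    the next state from the previous one, and the state never
    escapes the fold."""
    state = init
    history = []
    for occurrence in events:
        state = step(state, occurrence)
        history.append(state)
    return history

# Counting keypresses as a fold over a keypress event stream.
presses = ["a", "b", "a"]
counts = fold_events(lambda n, _key: n + 1, 0, presses)
```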

Synchronization with music

Another project I worked a lot on in the past couple of months was synchronizing improvisations on niseq with predefined music. This process has been messy - including coding up two different client applications that I initially implemented as standalone programs: one to synchronize with music and one to show live timelines of improvisations in a layered view.

It turned out that I could simplify the user's interaction with these applications a lot by merging them into one, which became my niseq client "n recipe".

Expansion/contraction workflow

I've wanted to keep the improvisational nature of niseq when working with more complex workflows, and there has always been an aspect of artistic expansion and artistic filtering in my workflows. A sidenote: I think the expansion/contraction process is a universal tendency of most systems that evolve - i.e. creative systems.

So I'll explain the expansion/contraction parts of the workflow related to n recipe. The expansion phase of the workflow is live improvisation, where I improvise on top of existing improvisations in several iterations - in the case of n recipe, always synchronized with music.

The contraction phase is where I rewatch an improvisation and tag sections with scores and comments (saved in recipe files). When I've done this for several improvisations, I merge the best parts of each into one - which I actually do live, so the timing of switching between improvisations is done by a mix of intuition and watching the scored timelines.

Technically, n recipe does the job of:

  • Loading niseq replays into niseq groups
  • Subscribing to the playback state of niseq
  • Sending synchronizing events to an audio application based on state of niseq
  • Reading 'recipe' files from disk and showing them as live-updated timelines
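What a recipe might hold can be sketched as follows - the section layout and field names are my assumptions, and where the real merging is done live by intuition, this sketch just picks the top-scored section mechanically:

```python
# A 'recipe' tags sections of one improvisation with scores and
# comments. A section here is (start_seconds, end_seconds, score,
# comment) - a hypothetical layout.

def best_source_at(recipes, t):
    """Given improvisation-name -> sections for several takes,
    return the name of the take with the highest-scored section
    covering time t (None if no tagged section covers t)."""
    best, best_score = None, float("-inf")
    for name, sections in recipes.items():
        for start, end, score, _comment in sections:
            if start <= t < end and score > best_score:
                best, best_score = name, score
    return best
```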

The following pieces were made recently with the new workflow of n recipe and iterative improvisations:

Will fall interpretation 02

Will fall interpretation 01

Future generative clients

An interesting technical aspect of n recipe is that it is my first niseq client application that both listens to the state of niseq and sends events to niseq. This makes it a client that binds a recursive loop around niseq, whereas niseq internally has minimal recursion. This design carries the danger of feedback loops, but for the same reason also allows for really interesting generative clients. The idea is that the generative clients control niseq in different ways via niseq's input-server, and are all patched together like the modules of a modular synthesizer. What makes this even more interesting to me is the potential of FRP in this context, where each synth-module can be a little elegant FRP application, seen as a black box from the outside. In my last blog post on the tech of niseq I also mentioned these modular-synth clients, so hopefully I'll get to work on them soon (:

The rest

I'm not going to elaborate much on the rest of what I worked on, but besides making music and experimenting with making art using niseq:

  • I made a bunch of other niseq clients - e.g. an interactive client for finding recordings that match - and did some earlier experiments with alternative workflows.
  • My query DSL for finding tagged clips was extended with more features, like negation.
  • I added more expressive features to loops within image-sequences - modification of movement and width, and drawing of the loop's waveform.
  • I added a new command-line DSL for clients that interact with groups in niseq, so the user can choose where to add new groups or modify existing ones.
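The tag-query negation mentioned above can be sketched as a tiny recursive evaluator - the query syntax here is entirely hypothetical, not niseq's actual DSL:

```python
def matches(query, tags):
    """Evaluate a tiny tag query against a set of tags.
    Supported forms (hypothetical syntax): a bare tag string,
    ("not", q) for negation, and ("and", ...) / ("or", ...)."""
    if isinstance(query, str):
        return query in tags
    op, *args = query
    if op == "not":
        return not matches(args[0], tags)
    if op == "and":
        return all(matches(q, tags) for q in args)
    if op == "or":
        return any(matches(q, tags) for q in args)
    raise ValueError(f"unknown operator: {op}")
```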

If you are an artist who's interested in collaborating on audio/visual jam-sessions - then feel free to contact me at rand at r7p5 dot earth.