Feature – GlowLDB: The Musical Household

This is probably one of my favourite pieces written about what we do, and more specifically where we used to live.
The interview was conducted by my good friend Shiba Mazaza, who also organised all the clothes and styled us with Thando Bawana.

Mohato Lekena, Robin Brink, Ross Dorkin

 

Besides the in-depth interview, what I loved about all of this was the amazing photography by the homie Donovan Marais, much of which I still use often today.

Robin and his drums

Ross Dorkin and the records

The full article can be seen here.

Massive thanks to Shiba, Donovan and Thando.

TEDxUCT Talk: Life Synthesis

As part of TEDxUCT I gave a talk covering my research, my personal work, and how the two are related.

It was a fun talk, and it ends with a short section of the TEDxStellenbosch performance we did a while ago.

I think the video was a good distillation of some of the broad concepts that I speak about, which focus on how one can synthesise seemingly disparate interests in their life into a singular project that is larger than the sum of its parts.

The video can be streamed below.

Dissertation Abstract

I recently handed in my dissertation for the completion of my Master’s degree in Computer Science at the University of Cape Town. While the entire dissertation will be uploaded on the department’s website once it has been marked, I thought I would put the abstract up to give people an idea of what the research was about.

You can see it below.

Technology has many times in human history redefined how we consume and create music. Technical innovations have led directly to new findings in music theory and to the birth of new genres and trends. When creating digital instruments became a reality, virtual musical instruments were born, changing the way composers worked. These innovations affected both sound synthesis, allowing for the generation of new and interesting sounds, and input control, allowing for new ways to write and perform music. Through the separation of these two musical tasks (musical control and sound synthesis), designers of new instruments were freed from having physical constraints dictate the direction of their new designs, and thus the range of stakeholders involved in the design of new instruments grew.

Despite all of this, many interfaces remained indebted to their physical predecessors: digital pianos often resembled physical ones, and drum sequencing applications followed design traditions that began with the design of analogue electric machines. This was despite research suggesting that digital music software can limit the forms of music we create when its interfaces directly copy those of the analogue instruments that came before. Through this copying, even some interfaces for the latest technology to enter the new-music field (mobile, multi-touch devices) still rely on interface analogies that are decades old when it comes to the task of drum sequencing.

In this study we report on a new multi-touch mobile interface that affords a completely new form of drum sequencing. The starting point for the design of this interface did not lie in already existing instruments; rather, design began from established HCI practices, techniques and knowledge, applying these to the task of drum sequencing. These ideas, together with ideas from avant-garde music and embodied interaction, were synthesised and then launched as a technology probe. This probe was then evaluated using a method consisting of semi-structured interviews and Discourse Analysis.

Twenty users in total participated in the evaluation sessions, split equally between people who considered themselves musicians and those who did not. We found that for users with no musical training, and for users with a large amount of musical training (together, Group A), the software did allow them to be more creative. These users chose to actively explore the possibilities that the interface presented, creating music that was both complex enough to surprise them and controlled enough to mimic and take inspiration from a set of given audio examples. However, users with limited training on existing sequencing software (Group B) found the new interface challenging, often ending the sessions early and not exploring the interface any further than required.

Race Constructions EP

Wildebeats - Race Constructions [Red Bull Music]

I recently got to spend a week at the Red Bull Studios Cape Town recording music which would end up in an EP I called Race Constructions (stream the EP at the end of this post).

While a week can be pretty short for recording an entire EP, there were just so many great pieces of hardware at the studio that it was incredibly easy to generate content. In fact, I had to come up with strategies to make sure I got to use all the gear I wanted to. In the end, I experimented with and used one piece of gear in depth per song, so in a sense each song was based on a separate synth.

The first track is based on the Jupiter-6, which I seriously had to wrestle with before it gave me anything. The second is based on the Moog Sub Phatty, an awesome and much more usable machine (due mainly to being much newer), and the third used mainly the software synths on the RB computers. The fourth song, which is in my opinion the most interesting, was made by sending MIDI from an Axiom 49 into the Nord Lead rackmount synth, while the last track was all Machinedrum.

For more on the project, read some writing I did on it here, and read a review of the project here.

Mohato Lekena: Selected Work

Mohato Lekena is a computer scientist, musician and interaction designer based in Cape Town’s CBD, currently pursuing an M.Sc. at the University of Cape Town. He has been named one of the Mail & Guardian’s 200 Young People who will influence the future of the country, and was a South African representative in the British Council’s Future Music Rising campaign.

This is a list of selected work he has produced and projects that he has been involved with.

Contents

01: Xen Mobile Drum Sequencer


Goal: To explore new ways of expressing rhythm and drum sequencing on a mobile multi-touch system
Technologies used: C# for Windows Phone, XNA, Visual Studio

Overview: Digital music software can limit the forms of music we create by using interfaces that directly copy those of the analogue instruments that came before. In this study, a novel multi-touch interface that affords a completely new form of drum sequencing was designed and implemented. Based on ideas from avant-garde music and embodied interaction, it was deployed as a technology probe and then evaluated by a wide range of users. The main interaction metaphor revolved around allowing users to draw loops on the mobile device and have these loops played out by the system. The speed of the loops could be altered using accelerometer-based interactions. It was found that for users with no musical training, and for users with a large amount of musical training, the software did allow them to be more creative. However, users with limited training on existing sequencing software found the new interface challenging.
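As a rough illustration of the loop metaphor, the playback logic might be sketched as below. This is a hypothetical Python sketch, not the actual C#/XNA code; the point rate, the tilt range and the drum-zone layout are all assumptions.

```python
# Hypothetical sketch of the Xen loop metaphor: a drawn loop is a closed
# polyline of touch points, a playhead cycles through them, and tilt
# (from the accelerometer) scales the playback rate.

def playback_index(num_points, elapsed_ms, tilt=0.0, base_rate=8.0):
    """Map elapsed time to the current point index on the loop.

    tilt is assumed to be in -1..1; base_rate is points per second at rest.
    """
    rate = base_rate * (1.0 + tilt)  # tilting speeds up or slows the loop
    return int(elapsed_ms / 1000.0 * rate) % num_points

def triggered_drum(loop, drum_zones, elapsed_ms, tilt=0.0):
    """Return the drum sound (if any) under the playhead's current point."""
    x, y = loop[playback_index(len(loop), elapsed_ms, tilt)]
    for name, (x0, y0, x1, y1) in drum_zones.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None
```

A loop drawn through a "kick" zone and a "snare" zone would then alternate the two sounds as the playhead cycles, faster or slower depending on how the device is tilted.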


02: Orbital Antics Android Game


Goal: Create a physics-based Android game
Technologies used: Android, Eclipse, jbox2d Library

Orbital Antics is a physics-based platform game for Android developed by a four-person team including Mohato Lekena. Movement of the player’s avatar, a ball, is controlled by the orientation of the user’s device, which tilts the axis of the game’s gravity. Beyond this, users can also control pinball-like flippers and a limited “boost” which thrusts the avatar forward. The game was pretty fun in the end.
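A minimal sketch of the tilt-to-gravity idea and the boost mechanic, in Python for illustration only (the game itself was written in Java against jbox2d; the impulse strength and fuel count here are invented):

```python
import math

def gravity_from_tilt(roll_deg, g=9.81):
    """Rotate the world's gravity vector to follow the device's tilt.

    In the game, a vector like this would be handed to the physics world
    each frame (e.g. via jbox2d's World.setGravity), so tilting the
    phone tilts "down" for the rolling ball.
    """
    a = math.radians(roll_deg)
    return (g * math.sin(a), -g * math.cos(a))

class Boost:
    """The limited 'boost' that thrusts the avatar forward."""

    def __init__(self, fuel=3):
        self.fuel = fuel

    def fire(self, vx, vy, strength=5.0):
        """Return an impulse along the current velocity, or None if spent."""
        if self.fuel <= 0:
            return None
        self.fuel -= 1
        speed = math.hypot(vx, vy) or 1.0  # avoid dividing by zero at rest
        return (vx / speed * strength, vy / speed * strength)
```

With the device flat, gravity points straight down; rolling it 90 degrees swings the vector fully sideways, which is what lets the ball "fall" across the level.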



03: Still Life Generative Sequencer

Goal: To create a generative, AI-based sound device that explores concepts of composition and ownership of creativity
Technologies used: Processing, MaxMSP, Ableton
Overview: Still Life is a generative, AI-based sound sequencing and manipulation interface. The interface revolves around a moving visualisation that explores a two-dimensional environment according to the rules of a primitive AI. The movements and behaviours of this visualisation are then used to inform various parameters of the sound produced.
The behaviours of the visualisation, and by extension the sound produced by the interface, also vary as the interface’s environment changes, cycling between three states. These states are visualised by a darkening and lightening of the background, and these changes make the visualisation’s movements more or less erratic. Interactivity is achieved by clicking anywhere on the two-dimensional space. Doing this “suggests” an area for the interface to explore, although this suggestion may be ignored.
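A toy version of such an agent might look like the following. This is a hypothetical Python sketch, not the actual Processing/MaxMSP patch; the three states mirror the environment states described above, but the erraticness values and the 70% chance of following a click are invented.

```python
import random

class WanderingAgent:
    """Toy sketch of a Still-Life-style agent: it random-walks a unit
    square, its jitter scaled by one of three environment states, and a
    click 'suggests' a target it may or may not follow."""

    ERRATICNESS = [0.2, 0.6, 1.0]  # jitter multiplier per environment state

    def __init__(self, x=0.5, y=0.5, seed=None):
        self.x, self.y = x, y
        self.state = 0
        self.suggestion = None
        self.rng = random.Random(seed)

    def click(self, x, y):
        self.suggestion = (x, y)  # a suggestion, not a command

    def cycle_state(self):
        self.state = (self.state + 1) % len(self.ERRATICNESS)

    def step(self):
        jitter = self.ERRATICNESS[self.state] * 0.05
        if self.suggestion and self.rng.random() < 0.7:  # may ignore the click
            tx, ty = self.suggestion
            self.x += (tx - self.x) * 0.1
            self.y += (ty - self.y) * 0.1
        self.x = min(1.0, max(0.0, self.x + self.rng.uniform(-jitter, jitter)))
        self.y = min(1.0, max(0.0, self.y + self.rng.uniform(-jitter, jitter)))
        return self.x, self.y
```

Each (x, y) pair returned by step() would then be mapped onto sound parameters, with cycle_state() making the wandering calmer or more erratic.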



04: iMobile data visualization project

iMobile interface

Goal: Explore alternative methods of representing user data on mobile devices
Technologies used: Android, Eclipse, SQLite
Overview: The contemporary desktop metaphor used on many modern PCs has been shown to be both inefficient and outdated in terms of usability. Despite this, many concepts associated with the desktop metaphor, such as hierarchical filing, are being used on mobile computing platforms. Since, at the PC level, various desktop replacement systems have been both suggested and built, this project sought to port an amalgamation of some of these systems’ concepts to the mobile platform. In doing so, the research questions explored revolved around (1) whether such a mobile system could be built to a high level of usability and (2) whether such a system would be preferable to the popular Android mobile interface. The main interaction metaphor revolved around disregarding file type entirely when displaying information (such as text messages, PDFs, or MP3s), instead always ordering items by time, using search extensively, and allowing users to save search queries. Technically, the project focussed on building databases of meta-information used for the searching, and on implementing drawn gestures for shortcuts. It was concluded from the study that, in terms of data organisation, a search-based user interface is more efficient and usable than the current application-based, hierarchically filed Android system.
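The core idea (one flat, time-ordered index searched regardless of item type) can be sketched with SQLite. The schema and sample rows below are invented for illustration and are not the project's actual database:

```python
import sqlite3

# A flat meta-information table: every item, whatever its type, gets a
# title, a kind, searchable text and a timestamp.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE items (title TEXT, kind TEXT, body TEXT, created INTEGER)")
conn.executemany(
    "INSERT INTO items VALUES (?, ?, ?, ?)",
    [("Lunch plans", "sms", "meet at noon?", 1001),
     ("demo.mp3", "mp3", "drum loop bounce", 1003),
     ("thesis.pdf", "pdf", "drum sequencing", 1005)])

def search(conn, query):
    """Return matching items of *any* type, newest first."""
    like = f"%{query}%"
    return [row[0] for row in conn.execute(
        "SELECT title FROM items WHERE title LIKE ? OR body LIKE ? "
        "ORDER BY created DESC", (like, like))]
```

A "saved search" is then just a stored query string that can be re-run; file type never enters the ordering, only the timestamp does.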



05: TEDxRemix

Goal: Present an audio-visual performance for TEDxStellenbosch
Technologies used: Processing, Pure Data, Adobe Premiere, Logic Pro
Overview: In early 2012 we (glowLDB) were invited to produce a remix for TEDxStellenbosch in collaboration with the TEDx Global Music Project. Our piece was to be based on material from the growing archive of TEDx musical performances that is the TEDx Global Music Project.
Our first goal was to expand on the remix concept by not simply creating a collage of found sounds (the conventional approach). This view led us to create an abstract sound world which involved extensive planning, processing and attention to detail. We decided early on to produce an audiovisual piece in which the sonic and visual elements were interdependent.
In order to compose the piece we chose a framework of two rules. Firstly we sought to explore the TED aesthetic in colour, movement and form. Secondly we worked exclusively with sound samples from TEDx musical performances. Our visual elements were produced in Processing and much of our sound design was generated in Pure Data. The final composition was arranged in Logic Pro and Adobe Premiere.



06: Cover Designs

Covers

Goal: Create digital covers for various Wildebeats projects
Overview: As part of my Wildebeats production projects, I’ve had to design a number of covers for singles that act as the music’s online image. All design here was completed in GIMP.
