The Tree of Life

Life can be represented in the shape of a tree. Each branch sprouts from a decision made, leading off down a different path. Observed from afar, the branches, depicting the countless possibilities those decisions open up, take the form of a tree. As entities traversing the tree, we get to experience only one path along the branches. This is what I set out to explore for the final exhibition at Queens Collective.

As a writer, it is natural to imagine the different possibilities that exist within a narrative, much like the diverging branches of a tree. A conventional story, however, requires all other possibilities to be dropped except the one that leads to the conclusion desired for the final draft. For a while I had wanted to create a program that would let a reader play out an interactive story, somewhat like a digital version of a Choose-Your-Own-Adventure book or a narrative-driven video game.

For the exhibition I started developing a Windows UWP app that allows a writer to retain the different possibilities and expand each one to its eventual conclusion. This changes a writer's approach to the narrative: they need to consider where they wish to offer the reader choices, which in turn causes a combinatorial explosion of paths and outcomes. This makes the creative and imaginative process of writing more cumbersome and demanding, yet at the same time exciting, for it stretches the limits of the imagination.
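A branching narrative like this is naturally modeled as a tree of nodes, each holding a passage and the choices that lead onward. The sketch below is purely illustrative (the class and field names are my own, not the app's actual code), but it shows why the number of paths explodes: every choice multiplies the endings a reader can reach.

```python
# Illustrative sketch of a branching-story tree; class and field
# names are hypothetical, not the app's actual implementation.

class StoryNode:
    def __init__(self, passage, choices=None):
        self.passage = passage        # text (or other media) shown to the reader
        self.choices = choices or {}  # choice label -> next StoryNode

    def is_ending(self):
        return not self.choices

def count_paths(node):
    """Number of distinct reader paths from this node to any ending."""
    if node.is_ending():
        return 1
    return sum(count_paths(child) for child in node.choices.values())

# A tiny two-level story: 2 choices, each leading to 2 endings -> 4 paths.
story = StoryNode("You wake in a dark souk.", {
    "Follow the drumming": StoryNode("The alley narrows.", {
        "Keep walking": StoryNode("You find the drummers."),
        "Turn back": StoryNode("You are lost."),
    }),
    "Wait for dawn": StoryNode("Light creeps in.", {
        "Explore": StoryNode("You map the medina."),
        "Sleep": StoryNode("You dream of trees."),
    }),
})

print(count_paths(story))  # -> 4
```

With just two choice points per level, the path count doubles at every level, which is the "combinatorial explosion" the writer has to manage.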

For the person reading the story, the element of choice can be exciting. At the same time, it may not be obvious that choosing actually makes a difference, since we are not accustomed to a narrative changing based on the choices we are offered. Once the reader knows that a choice can change the narrative, they may feel compelled to read the story again to see how it unfolds the next time around.

 
People reading the text for the exhibition piece, an aurally driven narrative created with the Windows UWP app.

The app is built on an extensible framework that allows different media to be used to create an immersive narrative. Currently it supports only sound, as the premise of the story I was writing relied chiefly on aural input. Eventually I plan to expand it so that text, images, and possibly even video can be used to create an immersive, dynamic and interactive narrative.
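One common way to make such a framework extensible is to give every media type a shared interface, so new kinds of media plug in without changing the playback engine. This is only a sketch of that idea under my own assumptions; the names below are hypothetical and not the app's real API.

```python
# Hypothetical sketch of an extensible media framework.
# All class and method names here are assumptions for illustration.

from abc import ABC, abstractmethod

class MediaElement(ABC):
    @abstractmethod
    def present(self):
        """Render this element to the reader (play audio, show text, ...)."""

class SoundClip(MediaElement):
    """The only media type supported today, per the story's aural premise."""
    def __init__(self, filename):
        self.filename = filename
    def present(self):
        return f"playing {self.filename}"

class TextPassage(MediaElement):
    """A later addition needs only a new subclass; the engine is unchanged."""
    def __init__(self, text):
        self.text = text
    def present(self):
        return f"showing: {self.text}"

# The "engine" just iterates over elements, unaware of their concrete types.
scene = [SoundClip("wind.ogg"), TextPassage("The door creaks open.")]
for element in scene:
    print(element.present())
```

The design choice here is that the engine depends only on the abstract `present` method, so adding images or video later would not require touching existing playback code.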

It is still at the prototype stage and not fully stable - programmer speak for 'it has bugs'. I'll continue working on it, fixing bugs, improving performance, and enhancing its functionality to make it an even better experience. Look out for my next post in the coming months, when I'll publish the app to the Windows Store for anyone to experience.


I would like to thank Queens Collective for granting me the opportunity to be an artist-in-residence with them, without which this and the other creative endeavors I have explored over the past six months would not have been possible.



Irfan A.

Storyteller. Software Engineer