Rhythm and music are among the earliest and most natural forms of human self-expression; they are also means of transmitting and perceiving information. Rhythm and music act not only on the conscious level but also on the subconscious one, creating images. Programming, as a kind of art, is a process of operating with pure semantic forms, which means the highest level of involvement of consciousness in the creative act. The main idea of the LYCAY project is to provide musical accompaniment during programming. This accompaniment corresponds to the meaning of the program, so it allows programmers to look at their code from a different angle than usual. Working with music that corresponds to the code, programmers can perceive the code and algorithms they are creating differently: with not only their consciousness but their subconsciousness as well. They can thus see a more accurate and complete picture of what they create and involve all their feelings in the process of creation. Programmers will be able to open their minds and create something they could never have imagined before. The aesthetic beauty of code is expressed in the music: if the code is aesthetically beautiful, then its music should also sound beautiful.
The meaning of a program is determined by the flow of instructions and data at runtime. This flow depends on the possible values of the input parameters, so the meaning of the program is determined by its execution for all possible values of those parameters. Yet programmers can understand the meaning of a program without mentally interpreting it for every possible value of every parameter; they have a kind of intuitive feeling for the program. LYCAY was created to make this intuition clear.
LYCAY relies on the following mathematical formalism. Any programming language is defined by its formal grammar. The meaning of the formal grammar is a system of functions depending on the language's grammatical rules and atomic rules (i.e. concrete expressions: variable names, variable values and so on). If every grammar rule can be given a sound, then the sound of each rule corresponds to the meaning of that rule, and so the sound of the whole program corresponds to the meaning of the program as well. The musical representation of a program is obtained by translating the grammatical rules of a programming language (I use Java) into musical phrases. The music itself is generated with jMusic, a library developed by Andrew Sorensen and Andrew Brown.
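The compositional principle described above, in which the sound of a rule is built from the sounds of its sub-rules, can be sketched in plain Java. This is a toy illustration, not LYCAY's actual code: the rule names and pitch motifs below are invented for the example.

```java
import java.util.ArrayList;
import java.util.List;

// Toy sketch (not LYCAY's real implementation): each grammar rule has a
// short pitch motif, and a node's music is its own motif followed by the
// music of its sub-rules -- so the sound of the whole program is composed
// from the sounds of its rules.
public class RuleMusic {
    // Hypothetical rule names mapped to motifs (MIDI pitch numbers).
    static int[] motifFor(String rule) {
        switch (rule) {
            case "if":     return new int[]{60, 64};      // C4, E4
            case "loop":   return new int[]{62, 65, 62};  // D4, F4, D4
            case "assign": return new int[]{67};          // G4
            default:       return new int[]{72};          // C5 for anything else
        }
    }

    // A minimal parse-tree node: a rule name plus its sub-rules.
    static class Node {
        final String rule;
        final Node[] children;
        Node(String rule, Node... children) { this.rule = rule; this.children = children; }
    }

    // Recursively concatenate motifs: the parent's motif first, then the children's.
    static List<Integer> play(Node n) {
        List<Integer> pitches = new ArrayList<>();
        for (int p : motifFor(n.rule)) pitches.add(p);
        for (Node c : n.children) pitches.addAll(play(c));
        return pitches;
    }

    public static void main(String[] args) {
        // An "if" rule containing an assignment and a loop.
        Node program = new Node("if", new Node("assign"), new Node("loop"));
        System.out.println(play(program));
    }
}
```

In the real plugin the pitch sequences would be handed to jMusic for playback rather than printed.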
LYCAY is designed as a plugin for IDEA (IntelliJ IDEA is a very popular integrated development environment for Java, and it provides many convenient facilities for working with code, such as a parser), so there are many possibilities for extending LYCAY's functionality. For each rule of Java's grammar, a user can set an arbitrary algorithm that makes this rule sound the way he/she designs; this algorithm is itself written in Java. Simple examples: each note can depend on the previous one, each rule can influence its sub-rules, and so on. LYCAY plays the music in real time, simultaneously highlighting the line of code being played. Thus LYCAY determines what to play, plays it, and highlights what it is playing.
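To illustrate the kind of per-rule algorithm a user might supply, here is a sketch in which each note depends on the previous one. The interface and rule names are hypothetical, invented for this example; they are not LYCAY's real extension API.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of a user-supplied per-rule algorithm: a small
// interface that, given the previous pitch, produces the next one, so
// each note can depend on the note before it.
public class RuleVoices {
    interface Voice { int next(int previousPitch); }

    // Example user-defined voices for two invented rule names.
    static Voice voiceFor(String rule) {
        if (rule.equals("loop"))   return prev -> prev + 2;  // step upward
        if (rule.equals("assign")) return prev -> prev - 1;  // drift downward
        return prev -> prev;                                  // repeat the pitch
    }

    // Apply the rule's voice n times, starting from a given pitch.
    static List<Integer> render(String rule, int start, int n) {
        List<Integer> out = new ArrayList<>();
        int p = start;
        for (int i = 0; i < n; i++) {
            p = voiceFor(rule).next(p);
            out.add(p);
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(render("loop", 60, 3));   // 62, 64, 66
        System.out.println(render("assign", 60, 2)); // 59, 58
    }
}
```

A rule influencing its sub-rules could be modelled the same way, by passing the parent's last pitch as the child's starting pitch.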
Some features implemented in the current version: one can play any logically correct part of the code. By a "logically correct part" I mean a part that corresponds to some grammatical rule. LYCAY determines the complete grammatical rule that corresponds to the cursor position and plays it.
For example, one can play an equality operation, a loop or a conditional operator (see screenshot below).
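The cursor-based selection can be sketched as a search for the smallest grammatical node whose text span contains the cursor. This is a toy model with invented rule names; LYCAY itself obtains the parse tree from IDEA's parser.

```java
// Toy model of finding the complete grammatical rule at the cursor:
// each node carries the [start, end) character span of its text, and we
// descend into whichever child span contains the cursor offset.
public class EnclosingRule {
    static class Node {
        final String rule; final int start, end; final Node[] children;
        Node(String rule, int start, int end, Node... children) {
            this.rule = rule; this.start = start; this.end = end; this.children = children;
        }
    }

    // Assumes the root's span contains the offset; the deepest node whose
    // span contains the cursor is the smallest complete rule around it.
    static String ruleAt(Node n, int offset) {
        for (Node c : n.children)
            if (c.start <= offset && offset < c.end) return ruleAt(c, offset);
        return n.rule;
    }

    public static void main(String[] args) {
        Node method = new Node("method", 0, 100,
                new Node("loop", 10, 60, new Node("assign", 20, 30)));
        System.out.println(ruleAt(method, 25)); // assign
        System.out.println(ruleAt(method, 80)); // method
    }
}
```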
One can play a whole Java method and/or save the music as a MIDI file (see screenshot below).
When LYCAY is playing, one hears two tracks. The first is generated during the grammatical parsing of the code by mapping grammatical rules to music; while this track is playing, one can establish a correspondence between a rule and its sound. The second track is generated from the flow of lexemes of the code by mapping lexemes to music, which establishes a correspondence between the concrete text of the code and the music. So there are two kinds of connections between code and music and, therefore, with the subconsciousness.
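The principle behind the second track, a fixed mapping from concrete lexemes to sounds, can be sketched as follows. The hash-to-scale mapping is an invented example, not LYCAY's actual one; the point is only that the mapping is deterministic, so the same lexeme always produces the same sound and the listener can learn the correspondence.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of a lexeme track: each lexeme's text is hashed deterministically
// and folded onto a C-major scale, so identical lexemes always sound alike.
public class LexemeTrack {
    // C-major scale starting at middle C (MIDI pitches).
    static final int[] SCALE = {60, 62, 64, 65, 67, 69, 71};

    static int pitchOf(String lexeme) {
        int h = 0;
        for (char c : lexeme.toCharArray()) h = 31 * h + c;  // stable text hash
        return SCALE[Math.floorMod(h, SCALE.length)];
    }

    static List<Integer> track(String... lexemes) {
        List<Integer> pitches = new ArrayList<>();
        for (String lex : lexemes) pitches.add(pitchOf(lex));
        return pitches;
    }

    public static void main(String[] args) {
        System.out.println(track("for", "i", "=", "0"));
        // The same lexeme always maps to the same pitch:
        System.out.println(pitchOf("i") == pitchOf("i")); // true
    }
}
```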
I would like to note the difference between this approach and livecoding. Although the two seem very similar, built on similar concepts (code, and music that somehow depends on the code), there is an important difference of principle. In livecoding, one starts by writing musical operators that will play. When the algorithm is written, it starts running, and this calculation is translated into the music that is played. Livecoding sets up a process of calculation within which music is played; the music reflects the calculation, and the calculation influences the music.
LYCAY generates meta-music: it does not execute the algorithm whose music we would like to hear. LYCAY looks at the algorithm from above and plays it without calculating it. Thus, while livecoding is a thing-in-itself and can be used only when programming for the purpose of generating particular music, LYCAY can be used while programming anything in the given language.
The correlation between these two approaches is an interesting question to investigate.
To start LYCAY, you need IDEA 5.0. Copy the file lycay_#####.jar into the plugins folder of your IDEA installation (for instance, C:\Program Files\JetBrains\IntelliJ IDEA 5.0\plugins). Start IDEA and LYCAY will start automatically.
Documentation on LYCAY's architecture and instructions on programming in LYCAY are available at http://lycay.sourceforge.net, where all the necessary files can also be found. Requirements: Java 1.5, IDEA 5.0 (build 3461 or later).