[CM] 2nd International Linux Audio Conference
Juan Reyes
juanig@ccrma.Stanford.EDU
Thu, 29 Apr 2004 16:37:20 -0400
> while the
> starry-eyed composer clicks and drags widgets that control those
> processes, listening to the output as he does so --
I still wonder whether the "starry-eyed" composer has the real time (or
rather, big time) to **listen** while focusing on where to click next.
> as I sat there poking the variables
> that controlled that algorithm -- call it realtime algorithmic
> "improvisation" if "composition" seems too grand.
BTW, your description clearly outlines the aspect of algorithmic
composition that I also find to be "real-time": in particular the
'search', almost in AI terms, for the ideal sound. This is
truly algorithmic.