
CMA 2000

Digital Heir to Light Show Debuts in San Francisco at CMA Show

Dynamic real-time digital video visualization of music on big screens was a last-minute addition to the California Music Awards (CMA) event at the Bill Graham Auditorium in San Francisco. The digital transformation of the '60s light show debuted in the birthplace of the light show, as Onadime translated music into real-time digital video art at the Tower Records CMA show.

Onadime's music visualization technology was added to the show after producers viewed the dynamic digital tapestries Onadime created at the 2000 Rock and Roll Hall of Fame ceremonies. At the CMA show, Onadime's real-time music visualization was driven directly by the night's music during the award show's opening, closing, and breaks.

"We were happy to be asked to the party, and maybe rekindle a few smoldering light-show embers here in San Francisco," said Onadime president Bruce Mitchell, recognizing the irony of bringing a digital-world light show to San Francisco.

"I date back to the days of glass pans, colored oil, water, shapes, and overhead film projectors at the Carousel, Winterland and the Fillmore. Onadime technology is a wee bit more sophisticated," understated Mitchell. Onadime has been awarded a patent for its visual programming software, which features an extensible architecture that interfaces with a high-speed image renderer on Macintosh computers. The engine is founded on dynamic visual linking of sensors to effects in software: artwork and an artist's instructions combine into a data-flow image that responds to music or any other sensor input.
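The sensor-to-effect linking described above can be pictured as a small data-flow graph that is re-evaluated every frame. The sketch below is purely illustrative: the class names (Sensor, Effect, Composition) and the linking API are assumptions for explanation, not Onadime's actual software.

```python
# Hypothetical sketch of a sensor-to-effect data-flow engine.
# All names here are illustrative, not Onadime's real API.

class Sensor:
    """A source of time-varying values, e.g. audio amplitude or mouse position."""
    def __init__(self, name):
        self.name = name
        self.value = 0.0


class Effect:
    """A visual parameter driven by sensors, e.g. brightness or image scale."""
    def __init__(self, name):
        self.name = name
        self.value = 0.0


class Composition:
    """An artist's set of dynamic links mapping sensor values to effect values."""
    def __init__(self):
        self.links = []  # list of (sensor, effect, transform) triples

    def link(self, sensor, effect, transform=lambda x: x):
        # The transform is the artist's "instruction" shaping the response.
        self.links.append((sensor, effect, transform))

    def update(self):
        # Each frame, propagate current sensor readings through every link.
        for sensor, effect, transform in self.links:
            effect.value = transform(sensor.value)


# Usage: map audio amplitude to brightness with a clamped scaling transform.
amp = Sensor("audio_amplitude")
brightness = Effect("brightness")
comp = Composition()
comp.link(amp, brightness, lambda x: min(1.0, x * 2.0))

amp.value = 0.3   # a new audio reading arrives
comp.update()     # brightness now tracks the music
```

Because each link is just data, an artist can rewire the graph live: swapping the transform or the sensor changes the visual response without touching the rendering code.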

"Like light shows, Onadime's visual art compositions are unique, real-time events blending visual images to music. The difference lies in the richness of visual reality that can be created, recreated or altered with digital accuracy and flexibility," Mitchell explained.

The end result is unprecedented: a digital artist's "compositions" of images, effects, and dynamic-link mappings of sensors to effects are translated into data-flow relationships that respond instantly to music. Input can be music or sound from MIDI or Macintosh AV input, as well as pictures, QuickTime video, live video, and/or sound. In fact, music is only one form of input for Onadime compositions. Onadime can respond to speech or movement of any kind, whether in QuickTime footage, dancing on stage, or hand movements on a mouse.

The result is as variable as an artist's vision and the music being played through the interface.