
Visual music

Visual music is an artistic practice that translates musical compositions or sounds into visual forms, emulating the original auditory syntax through abstract imagery, color, rhythm, and motion to evoke spatial and temporal experiences akin to hearing music. This intermedia form often draws on synesthesia, where sensory experiences like sound and sight overlap, and encompasses a range from static paintings to dynamic films and installations. The history of visual music spans over three centuries, originating in theoretical ideas from ancient philosophers who explored correspondences between sound and color, and advancing through 18th-century inventions such as Louis-Bertrand Castel's Clavecin pour les yeux (Ocular Harpsichord) of 1734, an early color organ designed to project colored lights in response to musical notes. In the 19th and early 20th centuries, the movement gained momentum with the development of mechanical color organs and with abstract painting influenced by composers such as Arnold Schoenberg, whose atonal music inspired visual abstractions. Pioneering exhibitions, such as the 2005 "Visual Music" show at the Museum of Contemporary Art, Los Angeles, and the Hirshhorn Museum, highlighted over 90 works by more than 40 artists, underscoring its evolution from painting to multimedia. Key figures in visual music include Wassily Kandinsky, whose synesthetic experiences led him to associate specific colors with musical tones—such as bright yellow with high notes—and to produce seminal abstract works like Fragment 2 for Composition VII (1913), as detailed in his 1912 treatise Concerning the Spiritual in Art. Other influential artists include Oskar Fischinger, known for works like Ornament Sound (c. 1932), which translated drawn visual patterns directly into synthetic sound; Viking Eggeling and Hans Richter, founders of the 1920s Absolute Film movement in Germany; Len Lye, with his kinetic animations; and Norman McLaren, a pioneer in animated sound visualization.
Later contributors include John and James Whitney, who developed computer-generated visuals in the mid-20th century, and contemporary creators like Jordan Belson with Epilogue (2005) and Jennifer Steinkamp with interactive installations such as SWELL (1995). Notable works further illustrate visual music's breadth, including Mikhail Matiushin's Painterly-Musical Construction (1918), an early experiment in color-musical synthesis, and Georgia O'Keeffe's Blue and Green Music (1921), which evoked undulating rhythms through form and hue. Thomas Wilfred's Study in Depth: Opus 152 (1959) exemplified lumia, or "light music," using projected colored lights to mimic orchestral dynamics. In the digital era, artists like Scott Draves have produced algorithmic pieces such as Dreams in High Fidelity, leveraging software to generate evolving visual patterns synchronized with sound, reflecting the medium's adaptation to technology since the late 1990s. Visual music continues to influence contemporary art and media, bridging disciplines such as painting, film, and computer graphics, while facing challenges such as scholarly inaccuracies and varying interpretations of its nature. In recent years, particularly since 2020, artificial intelligence has emerged as a tool for generative visual music, enabling dynamic, algorithm-driven abstractions synchronized with sound. Its enduring appeal lies in the pursuit of universal sensory harmony, as seen in ongoing exhibitions and research that expand its historical canon.

Definition and Origins

Definition

Visual music is an art form that creates time-based visual imagery analogous to the temporal and structural elements of absolute music, which is non-narrative and devoid of programmatic content. This practice translates musical forms into abstract visual compositions, emphasizing non-representational shapes, colors, and movements to evoke synaesthetic experiences in which visual elements resonate with auditory perceptions. Unlike narrative-driven film or animation, visual music prioritizes pure abstraction to mirror music's intrinsic qualities, such as temporal progression and rhythm, without literal depiction or storytelling. The term "visual music" was coined by British art critic Roger Fry in 1912, specifically in reference to the improvisational abstract paintings of Wassily Kandinsky, which Fry described as evoking musical rhythms through color and form. This nomenclature highlighted the emerging idea of visual art operating like music in its temporal flow and emotional immediacy, independent of representational subjects. Central to visual music are characteristics like the precise synchronization of visual rhythms, colors, shapes, and movements with musical components including pitch, tempo, harmony, and dynamics, creating a direct analogue between auditory and visual stimuli. This distinguishes it from conventional music videos or representational animations, which often prioritize narrative or illustrative elements over abstract structural equivalence. Synaesthesia serves as a foundational concept in visual music, reflected in historical claims by artists such as Kandinsky, who described experiencing auditory sensations—like specific tones or harmonies—from visual stimuli such as colors and forms. However, debate persists over whether Kandinsky truly possessed synaesthesia or employed it metaphorically to articulate his artistic process, as analyses of his accounts reveal inconsistencies with clinical definitions of the condition.

Historical Development

The concept of visual music traces its origins to the early 18th century, when Jesuit scholar Louis Bertrand Castel proposed the "ocular harpsichord" in 1725, an instrument designed to associate specific colors with musical notes to create a synesthetic experience of "color music." This theoretical innovation, inspired by analogies between sound and light, laid foundational ideas for linking auditory and visual elements, though no fully functional prototype was built during Castel's lifetime. In the 19th century, these ideas advanced toward practical devices, exemplified by British painter Alexander Wallace Rimington's invention of the Colour Organ in 1893, a keyboard-controlled apparatus that projected colored lights in synchronization with music to evoke emotional responses through visual harmony. Rimington's device, patented and demonstrated publicly in London by 1895, represented an early attempt to perform "colour symphonies," influencing subsequent light shows and organ-like projections that bridged music and abstract visuals. Early 20th-century breakthroughs emerged within the Italian Futurist movement, where artists Bruno Corra and Arnaldo Ginna created the first hand-painted abstract films in 1911–1912, including titles like The Rainbow and The Dance, directly translating musical rhythms and moods into non-representational colored patterns on film strips. These pioneering works, though largely lost, anticipated synchronized audiovisual abstraction. Shortly after, in 1921, German filmmaker Walter Ruttmann released Lichtspiel: Opus I, recognized as the first abstract film explicitly synced to music, using oil paints on glass to generate fluid, rhythmic light forms that visualized tonal progressions. The 1920s through the 1940s marked a golden age of visual music, driven by innovative animations and projections. Oskar Fischinger produced a series of abstract Studies in the late 1920s and early 1930s, synchronizing geometric forms and colors to music through hand-drawn and mechanical techniques, establishing visual music as a cinematic art form.
In the 1930s, American animator Mary Ellen Bute created color organ-inspired films like Rhythm in Light (1934), employing oscilloscope-like visuals to abstractly interpret classical compositions, blending visual abstraction with musical structure. Paralleling these efforts, Danish-American artist Thomas Wilfred developed Lumia projections from the 1920s into the 1960s using his Clavilux device, which generated evolving, music-accompanied light compositions performed in theaters and galleries. Post-World War II expansion in the 1940s–1960s incorporated technological advancements, as seen in John Whitney Sr.'s analog computer animations, where he repurposed military equipment to generate parametric patterns synced to music, producing works like Film Exercises that automated complex visual harmonies. Similarly, at the National Film Board of Canada, Norman McLaren pioneered graphical sound techniques from the 1930s onward, drawing waveforms directly on film to create synthetic scores for abstract animations, such as Synchromy (1971), which reversed traditional image-sound hierarchies. By the mid-20th century, visual music gained institutional recognition through exhibitions in the 1960s–1970s, culminating in retrospectives like the 2005 "Visual Music" show, which highlighted this era's milestones in synesthetic art and abstract film.

Creation Techniques

Visual Instruments

Visual instruments represent a pivotal development in the early history of visual music, comprising mechanical and optical devices designed to produce synchronized light displays in response to musical performance. These pre-digital apparatuses, often resembling traditional musical instruments like harpsichords or organs, aimed to translate auditory input into visual output through direct physical linkages, enabling real-time synesthetic experiences. Emerging from 18th-century theoretical proposals, they evolved into practical hardware by the late 19th and early 20th centuries, primarily using keyboards or pedals to control lights projected onto screens or walls. One of the earliest conceptualizations was the ocular harpsichord proposed by French Jesuit mathematician Louis-Bertrand Castel in 1725. Castel's design theoretically mapped musical notes to specific colors, drawing on Isaac Newton's spectrum theory: each key of a harpsichord would lift a curtain to reveal light passing through colored glass pieces, creating "visual melodies" that could be appreciated by the deaf or in silent performances. Although Castel never built a functional instrument, his ideas inspired 19th-century realizations, such as devices using prisms or rotating wheels to generate color sequences aligned with musical scales, establishing a fixed correspondence between pitches and hues. In the late 19th century, practical implementations advanced with inventions like Bainbridge Bishop's color organ of the 1870s. This American artist attached lighting apparatus to a home organ, using arc lamps to project colored beams onto a screen, modulated by keyboard inputs to mimic musical phrases in light. Bishop's patented device (1877) employed gels and filters for hue control, with organ keys directly triggering light intensity and color changes to parallel note durations and harmonies.
Similarly, British painter Alexander Wallace Rimington's Colour Organ, patented in 1893, featured a five-octave keyboard integrated with lamps, prisms, and diaphragms; pressing keys activated rotating discs for hue and intensity variation, synced to organ pipes, while stops and pedals adjusted color blends and strengths to evoke rhythmic visual patterns. A landmark in this lineage was Thomas Wilfred's Lumia works and Clavilux instruments, developed from the 1920s through the 1960s. The Danish-American artist's Clavilux projected ever-changing colored forms—resembling auroras—onto screens, with keyboard controls modulating light via mechanical linkages to gels, motors, and shutters, allowing performers to compose visual "phrases" that echoed musical structures. Wilfred patented multiple versions, emphasizing fluid motion and form alongside color, and composed over 30 Lumia works for live presentation, where operators used the device to synchronize visuals with accompanying music in theater settings. These instruments operated on direct mechanical principles, linking sound production mechanisms—such as keys, pedals, or stops—to light generation components like incandescent lamps, carbon-arc sources, colored gels, prisms, or early projectors. For instance, a key might mechanically open a shutter to allow light through a specific filter while simultaneously sounding a pipe, ensuring temporal alignment between auditory and visual elements; intensity was often varied via rotating wheels or diaphragms to reflect dynamics, and hue selection relied on fixed mappings derived from analogies like Newton's alignment of the color spectrum with the musical scale. Early power sources included gas illumination, later transitioning to electric lighting for brighter projections, though setups required manual operation and were prone to hazards like overheating.
Despite their innovations, analog visual instruments faced significant limitations, particularly rigid fixed associations between notes and colors, which constrained expressive flexibility—for example, assigning a single hue to all C notes across octaves, regardless of context or performer intent. These constraints, rooted in early theories like Castel's or Newton's, limited adaptability in performance, as the mappings did not account for perceptual variations in octave cyclicity or emotional nuance, often resulting in repetitive visuals that failed to fully capture musical complexity. Their legacy endures in their influence on light concerts in theaters, where devices like the Clavilux enabled immersive live spectacles, paving the way for synesthetic art forms while highlighting the need for more dynamic technologies.
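The rigidity of these fixed note-to-color schemes is easy to see in code. The following Python sketch uses hypothetical hue values rather than any documented historical mapping; it shows why every C, regardless of octave or musical context, produced the same color:

```python
# Illustrative fixed pitch-class -> hue mapping in the spirit of early
# color organs. The hue values are hypothetical, not a documented
# historical assignment.
PITCH_CLASS_HUES = {
    "C": 0, "C#": 30, "D": 60, "D#": 90, "E": 120, "F": 150,
    "F#": 180, "G": 210, "G#": 240, "A": 270, "A#": 300, "B": 330,
}

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F",
              "F#", "G", "G#", "A", "A#", "B"]

def note_to_hue(midi_note: int) -> int:
    """Map a MIDI note number to a hue in degrees, ignoring octave.

    Because the mapping depends only on pitch class, every C
    (MIDI 48, 60, 72, ...) yields the same hue -- precisely the
    rigidity criticized in the analog instruments described above.
    """
    return PITCH_CLASS_HUES[NOTE_NAMES[midi_note % 12]]

# Middle C (60) and the C an octave above (72) get identical hues.
assert note_to_hue(60) == note_to_hue(72)
```

A performer wanting a different color for a climactic high C than for a low pedal C had no recourse in such a scheme, which is the expressive limitation later electronic and digital systems addressed.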

Graphic Notation

Graphic notation emerged in the modernist era as a means to represent musical structures through abstract visual forms, distinct from conventional staff-based systems. Wassily Kandinsky's paintings from 1911 to 1913, such as Composition VII, served as proto-notations, translating synesthetic experiences of sound into colors, lines, and shapes that evoked auditory sensations like chords and rhythms. Influenced by his chromesthesia, in which music triggered vivid visual imagery, Kandinsky described color as a "keyboard" for the soul, using dynamic forms to parallel musical progression and intensity. This approach laid foundational principles for graphic notation by prioritizing emotional and structural equivalence between visual and sonic elements. In the mid-20th century, Iannis Xenakis advanced these ideas with the UPIC (Unité Polyagogique Informatique du CEMAMu) system, conceived in the mid-1970s and first realized in 1977, which enabled composers to draw sound waves and timbres directly on a graphic tablet for computer-assisted composition. Users sketched curves and lines to generate wavetables, mapping the horizontal axis to time (from milliseconds to minutes) and the vertical axis to pitch, thus visualizing stochastic and probabilistic musical processes. Xenakis's first UPIC composition, Mycènes Alpha (1978), demonstrated this by converting hand-drawn graphics into complex electronic textures, emphasizing the system's role in democratizing composition for non-specialists. Prominent examples include John Cage's notational experiments in the 1950s, culminating in the Concert for Piano and Orchestra (1957–1958), where 63 pages of abstract graphics—featuring lines, numbers, and symbols—allowed performers to interpret pitch, duration, and amplitude through chance operations without a fixed score. The piano part employed 84 types of notation, such as note sizes for amplitude and spatial arrangements for timing, fostering indeterminacy and performer agency.
Similarly, György Ligeti's micropolyphonic works from the late 1950s onward, like the Kyrie from Requiem (1963–1965, developed from earlier ideas), used dense clusters of lines to notate overlapping voices, where visual proximity conveyed textural fusion and rhythmic complexity beyond individual pitches. Techniques in graphic notation employ lines for directional flow (e.g., rising for ascending pitch), shapes like circles or boxes for timbral clusters or registers, colors to differentiate instrumental families or dynamics, and spatial layouts to indicate register and duration, as seen in Earle Brown's December 1952 (1952), where vertical positioning suggests pitch and horizontal spread denotes time. Density of marks often represents intensity or polyphonic layering, as in Morton Feldman's Projection 4 (1951), where grouped squares evoke sustained textures without precise note values. Unlike traditional notation's linear specificity on staves, these methods embrace ambiguity, prioritizing perceptual and interpretive freedom to capture non-Western or aleatoric structures. Graphic notation fulfills a dual role as both performative art and analytical tool: in live settings, scores like Cornelius Cardew's Treatise (1963–1967) are projected as expansive drawings, inviting improvisers to respond to relational visuals such as curving lines for melodic arcs. As an analytical aid, it visualizes intricate forms like stochastic music in Xenakis's works, where probabilistic distributions appear as scattered points or gradients, helping composers model chaos and density. Modern extensions in the digital era, such as Eric Wenger's MetaSynth software (initially released in the mid-1990s), built on these principles by converting images into sound via pixel-based synthesis, where pixel brightness controls amplitude envelopes and color values map to stereo placement, allowing graphical sketches to generate audio directly. This tool preserved the focus on visual abstraction, enabling composers to treat photographs or drawings as scores for experimental sound design without traditional interfaces.
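The pixel-based image-to-sound mapping described above can be illustrated with a minimal additive-synthesis sketch in Python. This is a simplified analogy, not any particular product's engine: image rows act as sine oscillators on an exponential pitch axis, and pixel brightness sets amplitude; panning, envelopes, and resynthesis refinements are omitted.

```python
import math

def image_column_to_samples(column, duration=0.1, sr=8000,
                            f_lo=110.0, f_hi=880.0):
    """Render one image column as a mixture of sine partials.

    `column` is a list of brightness values in [0, 1], top row first.
    Row position maps to pitch (top = high, bottom = low) on an
    exponential frequency axis, and brightness maps to amplitude --
    a simplified take on the image-as-score idea.
    """
    n_rows = len(column)
    n = int(duration * sr)
    samples = []
    for i in range(n):
        t = i / sr
        s = 0.0
        for row, bright in enumerate(column):
            # Top row (index 0) gets the highest frequency.
            frac = 1.0 - row / max(n_rows - 1, 1)
            freq = f_lo * (f_hi / f_lo) ** frac
            s += bright * math.sin(2 * math.pi * freq * t)
        samples.append(s / n_rows)  # normalize by row count
    return samples

# A column with a single bright pixel yields a single sine tone.
tone = image_column_to_samples([0.0, 1.0, 0.0, 0.0])
```

Scanning a whole image column by column, left to right, would produce an evolving spectrum whose shape literally is the drawing, which is the core idea behind treating images as scores.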

Digital Methods

Digital methods in visual music emerged in the mid-20th century as computational tools enabled the transition from analog hardware to programmable systems for synchronizing sound and visuals. John Whitney Sr., a pioneering figure in computer animation, bridged analog and digital approaches during the 1960s and 1970s by adapting slit-scan techniques—originally mechanical processes involving moving slits to create distorted motion—into early digital frameworks using military surplus computers like the M-5 Antiaircraft Gun Director. This allowed for precise control over parametric patterns that responded to musical rhythms, as seen in works like Catalog (1961), where algorithmic variations produced abstract, music-like visual forms. By the 1980s, oscilloscope-based visuals advanced this lineage, with artists employing vector graphics to draw waveforms directly from audio signals on cathode-ray tubes, creating Lissajous figures that mirrored harmonic structures in real time. Jerobeam Fenderson exemplified this in later iterations, using analog oscilloscopes interfaced with digital audio to generate intricate, sound-driven vector patterns, as in his Oscilloscope Music series, where the left and right audio channels control horizontal and vertical deflections for synchronized geometric displays. Software tools proliferated in the 1990s, facilitating user-friendly creation of audio-visual compositions without specialized hardware. MetaSynth, developed by Eric Wenger, introduced spectral image-to-sound mapping, where users paint frequency-time images that the software resynthesizes into audio; conversely, audio spectra are visualized as editable images for granular manipulation, enabling composers to sculpt sounds and visuals interchangeably.
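The X–Y technique behind oscilloscope music can be sketched by synthesizing a stereo signal whose left channel drives the scope's horizontal deflection and whose right channel drives the vertical. A minimal Python example, with illustrative frequencies:

```python
import math

def lissajous_stereo(freq_left=440.0, ratio=2.0, phase=math.pi / 2,
                     duration=0.01, sr=44100):
    """Generate (left, right) sample lists that trace a Lissajous
    figure on an X-Y oscilloscope: left drives horizontal deflection,
    right drives vertical. A 2:1 frequency ratio corresponds to an
    octave and yields a closed figure-eight-like curve."""
    n = int(duration * sr)
    left, right = [], []
    for i in range(n):
        t = i / sr
        left.append(math.sin(2 * math.pi * freq_left * t))
        right.append(math.sin(2 * math.pi * freq_left * ratio * t + phase))
    return left, right

left, right = lissajous_stereo()
```

Because the drawing and the sound are literally the same signal, any change to the visible curve is audible and vice versa, which is what makes the medium compelling for visual music.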
Real-time patching environments like Max (commercially released in 1990) and its open-source counterpart Pure Data (developed by Miller Puckette in 1996) allowed artists to build modular networks of audio and video objects, routing signals for live synchronization—such as modulating video textures with oscillator outputs—to create responsive installations. For larger-scale interactive works, node-based platforms such as vvvv and TouchDesigner support real-time rendering of visuals driven by audio inputs, ideal for immersive environments where sensors or controllers trigger procedural effects. Core algorithms underpin these tools, transforming audio data into visual elements with mathematical precision. The Fast Fourier Transform (FFT) decomposes sound into frequency components, mapping amplitudes to colors or shapes in spectrum analyzers—for instance, low frequencies as warm hues and high harmonics as angular forms—to visualize musical spectra as dynamic, layered abstractions. MIDI-driven generative animation extends this through particle systems, where note data governs particle birth, velocity, and decay; in environments like Processing, incoming note triggers simulate flocking behaviors or explosive bursts aligned with beats, producing emergent visuals from simple rules. Post-2010 advancements integrated machine learning, enhancing generative capabilities. Neural networks, particularly Generative Adversarial Networks (GANs), analyze audio features such as timbre and tempo to produce abstract visuals, as in Runway ML's tools, which train on datasets to output synchronized animations from sound inputs, allowing artists to iterate on styles without manual coding. In live contexts, VJ software like Resolume (evolving through the 2020s) incorporates these techniques for real-time performance, where AI-assisted effects layers respond to DJ sets, blending clips with audio-reactive distortions.
By 2024–2025, AI-generated music videos have further advanced this, with tools creating fully synchronized visuals from audio tracks, as seen in releases like Linkin Park's "Lost" and remixes of classic tracks, enabling rapid production of immersive audiovisual content. Compared to analog methods, digital approaches offer effectively unlimited variability through algorithmic recombination, enabling endless permutations from finite inputs; heightened interactivity via real-time user control; and scalability for multi-screen installations or web distribution without physical degradation.
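The FFT-to-color mapping described in this section can be sketched as follows. The naive DFT and the warm-to-cool hue assignment are illustrative simplifications; production visualizers use optimized FFT libraries and richer palettes.

```python
import cmath
import math

def dft_magnitudes(samples):
    """Naive DFT magnitude spectrum (first half of the bins).
    Kept dependency-free for clarity; real tools use an FFT."""
    n = len(samples)
    mags = []
    for k in range(n // 2):
        s = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        mags.append(abs(s) / n)
    return mags

def spectrum_to_hues(mags):
    """Map each frequency bin to (hue_degrees, brightness): low bins
    get warm hues (near 0 = red), high bins cool hues (toward 240 =
    blue); normalized magnitude controls brightness."""
    n = len(mags)
    peak = max(mags) or 1.0
    return [(240.0 * k / max(n - 1, 1), m / peak)
            for k, m in enumerate(mags)]

# A pure sine landing exactly in bin 4 lights up a single hue band.
wave = [math.sin(2 * math.pi * 4 * t / 64) for t in range(64)]
hues = spectrum_to_hues(dft_magnitudes(wave))
```

Run per animation frame over a sliding window of the audio, this mapping produces the layered, beat-following color fields typical of spectrum-driven visualizers; broadband material spreads brightness across many hues, while a pure tone lights a single band.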

Media Applications

In Film and Animation

Visual music found early expression in abstract films of the 1920s and 1930s, where filmmakers synchronized non-representational imagery to musical structures. Oskar Fischinger's Studies series, comprising 13 short black-and-white films produced from the late 1920s, exemplified this approach through cutout animation and cameraless techniques, such as drawing geometric forms in charcoal on white paper to create fluid, rhythmic movements aligned with musical rhythms. These works, including Studie Nr. 5 (1930), emphasized orchestral density and spatial depth, transforming musical phrases into dynamic visual patterns. Similarly, Len Lye pioneered direct-on-film techniques in the 1930s, etching and painting abstract designs directly onto celluloid strips to produce vibrant, syncopated animations synced to jazz and dance tunes, as seen in A Colour Box (1935), which set hand-painted patterns to a Cuban dance tune. The graphical sound era advanced visual music by integrating image and audio production on the same film strip. At the National Film Board of Canada (NFB), Norman McLaren conducted optical soundtrack experiments from the mid-1930s through the 1950s, etching, drawing, and photographing patterns directly onto the film's soundtrack area to generate both synthetic sounds and corresponding visuals. This technique, used in works like Dots (1940) and Loops (1940), allowed precise synchronization of graphical waveforms to produce percussive and tonal music, effectively making the film a dual medium for visual and auditory abstraction. McLaren's innovations predated electronic synthesizers, treating the optical track as a canvas for "graphical music." Hollywood and experimental cinema intersected in the 1930s and 1940s through efforts to commercialize visual music. Mary Ellen Bute's Seeing Sound series, produced from the mid-1930s onward, visualized classical compositions using animations generated with mechanical devices and early oscillators to translate sound waves into luminous, oscillating forms.
These shorts, such as Synchromy (1938), screened in theaters as preludes to feature films, bridging experimentation with mainstream audiences by rendering music as geometric light patterns. In the 1950s and 1960s, Jordan Belson's psychedelic films, including those derived from the Vortex concert series (1957–1959), evoked cosmic immersion through layered abstractions of color and motion, often projected in planetariums to accompany electronic scores. Belson's works, like Allures (1961), intensified visual music's sensory impact, influencing the era's countercultural light experiences. Key techniques in these films included frame-by-frame painting, multiple exposures, and selective rotoscoping to ensure precise alignment of visual motifs with musical phrases. Fischinger's Kreise (1934), for instance, employed frame-by-frame painting on paper and multiple exposures in the three-strip GasparColor process to produce pulsating circles that responded to classical motifs, creating a luminous, rhythmic interplay of form and sound. Rotoscoping, though less common in pure abstraction, was adapted in some hybrid works to trace musical waveforms or subtle motions for enhanced synchronization. These methods prioritized analog precision, allowing artists to craft non-narrative sequences where visuals directly mirrored musical structure, tempo, and dynamics. By the late 1960s, visual music in film evolved toward hybrid analog-digital formats, building on the multi-media spectacles of the 1967 Montreal Expo. The Expo's light shows and experimental presentations, such as those in its themed pavilions, combined analog projections with emerging electronic controls for immersive, synchronized environments that foreshadowed digital integration in abstract cinema. These approaches expanded visual music's scale, using multiple screens and automated lighting to amplify musical responsiveness in live settings.
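McLaren's animated-sound principle—repeating a drawn pattern along the optical track so that the repetition rate determines pitch and the pattern's shape determines timbre—can be expressed as a digital analogy. The sketch below models the idea only, not the NFB's actual optical process:

```python
def drawn_cycle_to_tone(cycle, pitch_hz, duration, sr=8000):
    """Repeat one hand-drawn waveform cycle to produce a pitched tone,
    a digital analogue of photographing a card pattern onto a film's
    optical soundtrack: the repetition rate sets the pitch, and the
    shape of the drawing sets the timbre."""
    n = int(duration * sr)
    period = sr / pitch_hz  # samples per repetition of the drawing
    out = []
    for i in range(n):
        # Index into the drawn cycle (nearest-neighbor resampling).
        pos = (i % period) / period
        out.append(cycle[int(pos * len(cycle)) % len(cycle)])
    return out

# A crude hand-drawn ramp (a sawtooth-like card) played at 220 Hz:
card = [-1.0, -0.5, 0.0, 0.5, 1.0]
tone = drawn_cycle_to_tone(card, 220.0, 0.5)
```

Drawing the same card pattern closer together on the track (a shorter period) raises the pitch, while redrawing the card's contour changes the tone color, mirroring how McLaren controlled synthetic sound frame by frame.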

Computer Graphics

The application of computer graphics to visual music began gaining prominence in the 1970s and 1980s, building on early digital experimentation to produce synchronized abstract animations. John Whitney, recognized as a foundational figure in computer animation, shifted from analog techniques—such as those in his 1961 Catalog—to digital animation using systems like the IBM 360 computer by the late 1960s and 1970s. These efforts created parametric patterns that visualized musical rhythms through looping geometric forms, marking some of the first instances of digitally rendered visual music. By the 1990s, projects like Animusic, developed by Wayne Lytle starting in the mid-1990s, advanced this tradition with physics-based simulations of self-playing instruments. These computer-generated animations depicted robotic mechanisms—such as drum machines and string synthesizers—that appeared to autonomously produce sound, with motions precisely modeled from MIDI data to align with composed music tracks. Rendering techniques in computer graphics for visual music often exploit optical principles to parallel auditory ones, enhancing synesthetic expression. Ray tracing, a core method for simulating light propagation, has been adapted to generate visual representations of harmonic interference, where virtual light rays mimic sound wave superpositions to form evolving patterns akin to musical overtones. Complementing this, shader programming in the OpenGL Shading Language (GLSL) enables real-time manipulation of visuals driven by audio input, such as modulating colors and textures based on spectral data from fast Fourier transforms (FFT). This allows for fluid deformations of shapes—like warping meshes or blending hues—that respond instantaneously to frequency bands in music, facilitating live performances. Key software ecosystems have democratized procedural generation of visual music.
MilkDrop, released in 2001 by Ryan Geiss as a Winamp plugin, pioneered hardware-accelerated visualization through per-frame FFT processing, which analyzes audio waveforms to drive parametric equations deforming 2D and 3D primitives into hypnotic, beat-synced forms. Similarly, Houdini's Channel Operator (CHOP) networks process audio signals to parameterize procedural models, generating particle systems or geometry that evolve with musical amplitude and pitch, while Processing—an open-source Java-based environment—supports custom sketches integrating audio libraries for algorithmic visuals like oscillating grids tied to sound spectra. MIDI integration further enhances these tools, enabling live synchronization where note data from instruments directly controls graphic parameters, such as scaling fractals or triggering particle bursts, in real-time rendering pipelines. Notable contemporary works, particularly from the 2000s to 2020s, leverage these advancements for large-scale applications. teamLab, an interdisciplinary collective established in 2001, employs real-time computer graphics in interactive installations where projected visuals—rendered via custom engines—pulse and morph in harmony with ambient music, using sensor-driven algorithms to adapt patterns to sonic cues. Hardware evolution has underpinned this progression, from cathode ray tube (CRT) oscilloscopes in the mid-20th century, which displayed basic vector Lissajous figures as audio visualizers, to modern GPU-accelerated rendering. GPUs now handle parallel computations for intricate simulations, such as fractal geometries (e.g., Mandelbrot iterations) that expand and contract in sync with beats, enabling high-frame-rate outputs unattainable on earlier systems.
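The MIDI-to-graphics mapping mentioned above can be sketched as a toy particle system, where pitch class picks a hue and note velocity scales the burst. The specific mappings are illustrative, not drawn from any particular tool:

```python
import math
import random

class ParticleBurst:
    """Spawn particles from MIDI note events: pitch class sets hue,
    velocity sets particle count and speed -- a minimal version of
    the note-to-graphics mappings used in live audio-visual sets."""

    def __init__(self, seed=0):
        self.rng = random.Random(seed)
        self.particles = []  # each: [x, y, vx, vy, hue, life]

    def note_on(self, pitch, velocity):
        hue = (pitch % 12) * 30      # pitch class -> hue (illustrative)
        count = 1 + velocity // 8    # louder notes -> more particles
        speed = 0.5 + velocity / 127.0
        for _ in range(count):
            angle = self.rng.uniform(0.0, 2 * math.pi)
            self.particles.append(
                [0.0, 0.0,
                 speed * math.cos(angle), speed * math.sin(angle),
                 hue, 1.0])

    def step(self, dt=1 / 60):
        """Advance one frame: move particles and fade them out."""
        for p in self.particles:
            p[0] += p[2] * dt
            p[1] += p[3] * dt
            p[5] -= dt               # each particle lives ~1 second
        self.particles = [p for p in self.particles if p[5] > 0]

pb = ParticleBurst()
pb.note_on(60, 100)  # middle C, forte -> a burst of red-ish particles
pb.step()
```

In a real pipeline the `step` results would feed a renderer each frame; the same pattern—event in, parameter change out—underlies fractal scaling, mesh warping, and other note-triggered effects.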

Virtual Reality

Virtual reality (VR) has expanded visual music into fully immersive environments since the 1990s, enabling users to inhabit abstract audio-visual spaces where movement and sound synchronize in three dimensions. Early experiments, such as Char Davies' Osmose (1995), pioneered this integration by using body gestures to navigate ethereal virtual landscapes accompanied by interactive soundscapes, creating a symbiotic relationship between physical motion and evolving musical visuals. In this installation, participants floated through grid-like and organic realms, with spatialized audio—composed of natural and synthetic tones—triggering visual transformations, thus blending somatic immersion with auditory rhythms. Advancements in consumer VR hardware during the 2010s and 2020s, particularly with headsets like the Oculus Rift and HTC Vive, have democratized 360-degree visual music experiences, allowing real-time synchronization of abstract graphics to music in head-tracked environments. Applications such as Fantasynth transform user-selected tracks into dynamic particle systems and waveform visualizations that respond to head movements and gestures, enveloping the viewer in pulsating, music-driven geometries. Similarly, Beat Saber (2018), a gamified VR rhythm title released for major headsets, exemplifies this evolution by syncing neon block-slicing mechanics and environmental effects to electronic dance music (EDM) beats, where visual cues like glowing sabers and rhythmic light pulses heighten the perceptual fusion of sound and sight. Core techniques in VR visual music leverage spatial audio mapping to drive visuals, such as ambisonic soundscapes that animate particle fields or volumetric effects in response to musical frequencies and panning. For instance, tools like Ableton's Envelop for Live enable ambisonic audio to position sounds in virtual space, which in turn modulates visual elements like scattering particles or morphing structures, enhancing the sense of depth and directionality.
Haptic feedback further enriches these experiences by integrating tactile responses to music, with wearable devices delivering vibrations synced to basslines or melodies, simulating physical impacts that align touch with auditory and visual pulses in virtual concerts. Prominent works in the 2020s, showcased in SIGGRAPH's VR art galleries and Immersive Pavilions, utilize engines like Unity and Unreal for real-time rendering of interactive abstract worlds that react to user-input music. Projects in SIGGRAPH 2022's Immersive Pavilion, for example, featured multi-sensory musical journeys where participants co-create evolving visual symphonies through gestures, with responsive environments generating patterns and light orchestrations tied to live audio inputs. Recent advancements as of 2025 include explorations of generative AI to synchronize real-time music generation with visual elements in VR, enhancing immersive audio-visual experiences. Despite these innovations, challenges like cybersickness persist in VR visual music, addressed through smooth visual rhythms that align frame rates and animations to musical tempos for reduced disorientation. Joyful or soft music tracks have been shown to significantly mitigate cybersickness symptoms in VR sessions, promoting calmer perception of dynamic visuals. Post-2020, collaborative multi-user VR concerts have emerged as a key trend, enabling shared spaces for synchronized audio-visual performances, as seen in platforms like NOYS VR where remote audiences interact in real-time musical metaverses.

Scientific Foundations

The scientific foundations of visual music rest on perceptual and acoustic principles drawn from psychology, physics, and cognitive science, which explain how auditory and visual stimuli can be integrated into unified experiences. Synaesthesia research provides a key neurological basis: sensory experiences involuntarily cross modalities, such as sounds being perceived as colors. Early investigations in the 1880s, led by Francis Galton, documented these phenomena through surveys of mental imagery, revealing familial patterns and consistent associations, such as numbers evoking specific hues. For artists like Wassily Kandinsky, debate persists over whether such experiences were innate, rooted in hyperconnectivity between sensory brain areas, or acquired through prolonged exposure to music and art, as evidenced by analyses of his writings describing auditory-visual equivalences. Psychoacoustics further elucidates correspondences between sound attributes and visual qualities, showing that listeners intuitively map auditory features onto visual ones. For instance, higher sound frequencies are commonly associated with brighter colors and lighter tones, a pattern observed across cultures and supported by experimental studies of cross-modal matching. This principle informed Alexander Scriabin's 1911 score for Prometheus: The Poem of Fire, which specified a "color organ" to project lights corresponding to musical keys, such as the key of C triggering red hues, drawing on his personal synaesthetic perceptions and early psychoacoustic theories of sensory correspondence. From a physics perspective, analogies between sound and light spectra underpin visual representations of music, as both are oscillatory phenomena governed by wave equations. Sound waves, propagating as pressure variations in air, can be visualized with oscilloscopes, which display harmonics as complex waveforms and reveal overtones and timbre.
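The pitch-to-brightness correspondence described above can be expressed as a simple mapping. The function below is an illustrative model, not a standard psychoacoustic formula: it scales lightness linearly with log-frequency (equal steps per octave, mirroring pitch perception), over a default range spanning the piano (A0 at 27.5 Hz to C8 at 4186 Hz).

```python
import math

def pitch_to_lightness(freq_hz, f_lo=27.5, f_hi=4186.0):
    """Map a pitch to a lightness value in [0, 1], brighter for higher notes.

    Both the frequency range and the linear-in-log-frequency mapping are
    illustrative modelling choices.
    """
    freq_hz = min(max(freq_hz, f_lo), f_hi)  # clamp to the mapped range
    return math.log(freq_hz / f_lo) / math.log(f_hi / f_lo)
```

A visualizer could feed this value into the L channel of an HSL color, so a rising melody literally brightens on screen.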
Lissajous figures exemplify this: superimposing two perpendicular sinusoidal oscillations at harmonic frequency ratios (e.g., 2:1 for an octave) produces closed curves that illustrate frequency relationships and phase interactions, providing a geometric analog for musical intervals. Cognitive science contributes through Gestalt principles, which describe how the brain organizes disparate sensory inputs into coherent wholes, facilitating audio-visual unity. Principles like similarity (grouping similar sounds and visuals) and proximity (temporal alignment of stimuli) enhance cross-modal perception, as shown in experiments where synchronized auditory rhythms improve visual grouping. Functional magnetic resonance imaging (fMRI) studies from the 2000s further reveal the neural mechanisms, with activations during music-color associations indicating integrated processing in multisensory brain areas, supporting the perceptual binding essential to visual music. Microtonal and mathematical notations extend these foundations by employing geometry to visualize non-Western scales, transcending twelve-tone equal temperament. The 20th-century Bohlen-Pierce scale, which divides the 3:1 frequency ratio (a "tritave") into 13 equal steps of approximately 146 cents each, uses circular diagrams to map intervals, highlighting just intervals like 9:7 and 7:5 without octave equivalence and thus geometrically representing alternative harmonic structures.

Visual music has also significantly influenced other art forms, particularly through its integration into performance-based movements like Fluxus in the 1960s, where artists drew on musical and visual experimentation to create multimedia experiences. Nam June Paik's development of video synthesizers, such as the Paik-Abe synthesizer, exemplified this by enabling real-time manipulation of video signals in response to sound, blending electronic music with dynamic visuals in live settings.
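The Bohlen-Pierce figures quoted above (13 equal steps of the 3:1 tritave, roughly 146 cents each, closely approximating the just intervals 9:7 and 7:5) can be checked numerically; `nearest_bp_step` is a hypothetical helper, not part of any standard library.

```python
import math

def cents(ratio):
    """Size of a frequency ratio in cents (1200 cents per 2:1 octave)."""
    return 1200.0 * math.log2(ratio)

# The Bohlen-Pierce scale divides the 3:1 "tritave" into 13 equal steps.
bp_step = cents(3.0) / 13.0  # about 146.3 cents per step

def nearest_bp_step(ratio):
    """Closest Bohlen-Pierce scale degree to a just interval, and the error in cents."""
    target = cents(ratio)
    degree = round(target / bp_step)
    return degree, target - degree * bp_step
```

Running this confirms the scale's appeal: 9:7 lands within a few cents of the 3rd BP step and 7:5 within a few cents of the 4th, which is why circular BP diagrams can mark these just intervals almost exactly on equal-step grid points.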
These innovations extended to connections with Op art and kinetic art, where perceptual illusions of movement and color vibration inspired abstract visual patterns that evoke rhythmic responses akin to musical structures, fostering synesthetic interplay between sight and sound. The evolution of music videos from the 1980s to the 2000s owes much to visual music's emphasis on abstract, synchronized imagery, which turned the format into a platform for experimental audiovisuals. For instance, a-ha's "Take On Me" (1985) pioneered rotoscoping to blend live action with hand-drawn sketches, creating a seamless fusion of representational and abstract motion synced to the song's pop rhythms and influencing subsequent hybrid formats. This approach paralleled the rise of VJ culture in nightclubs, where real-time visualizers manipulated generative graphics and video loops in response to DJ sets, enhancing immersive club experiences through software like VJamm and Resolume, which treat visuals as a performative extension of music. In industrial applications, visual music principles have permeated advertising and music streaming, notably through Spotify's Canvas feature, which lets artists attach short, looping vertical videos to tracks, replacing static album art with dynamic, music-synced visuals to boost listener engagement. Additionally, tools like the Synesthesia VJ software use audio-reactive algorithms to generate visuals from sound inputs for live performances and installations. Visual music also intersects with related art forms such as light art, exemplified by James Turrell's installations, which manipulate light to evoke auditory-like perceptions, as in his "Skyspace" series and the ongoing Roden Crater project (initiated 1972), where colored light fields create synesthetic experiences blending visual immersion with implied sonic resonance.
Overlaps with sound art appear in Alvin Lucier's experimental works, such as "Music on a Long Thin Wire" (1977), in which physical vibrations produce evolving tones shaped by spatial acoustics, bridging auditory phenomena with perceptual mappings that echo graphical representations of sound waves. Post-2020 extensions of visual music have emerged in digital marketplaces, with NFT platforms enabling audio-visual pieces that combine generative music and synchronized animations as unique collectibles. For example, Adventure Club's 2020 NFT drop on Blockparty featured limited-edition audio-visual works, including a one-of-one (1/1) piece hailed as a landmark in merging electronic music with blockchain-verified visuals, paving the way for artist royalties and fan ownership. Concurrently, AI-driven art has fueled high-profile auctions, such as Christie's "Augmented Intelligence" sale (February 20 to March 5, 2025), in which algorithmically produced artworks sold for a total of $728,784, highlighting the fusion of computational creativity with established art traditions.
