Live2D

Live2D is an animation technology developed by Live2D Inc. that enables the creation of dynamic, real-time expressions for static illustrations—typically anime-style characters—by dividing images into layered parts and applying deformation, physics simulations, and motion controls without converting them to 3D models. The flagship tool, Live2D Cubism Editor, allows artists to model and animate artwork while preserving its original stylistic charm, supporting features like intuitive rigging, lip-sync, and physics for lifelike movements. Originally developed by Cybernoids Co., Ltd. (renamed Live2D Inc. in 2014), the company was founded on July 31, 2006, in Tokyo, Japan, with initial work on Live2D technology beginning as an innovative approach to 2D graphics animation. Key milestones include the 2008 beta launch of Live2D Vector, the 2013 release of Live2D Cubism 1.0, and subsequent versions such as Cubism 3 in 2017, Cubism 4 in 2019, and Cubism 5 in 2023, marking over a decade as an industry standard for real-time 2D expression. The technology has powered notable works, including the 2011 visual novel Oreimo Portable, the 2021 film Belle, and the 2024 TV animation Tono to Inu. Live2D's applications span mobile apps, console games, virtual YouTuber (VTuber) streaming, and video production, with integration via the Cubism SDK for Unity, native, Java, and web environments, supporting over 400 commercial titles worldwide. Its ecosystem includes free sample data for learning, educational programs for over 450 institutions (as of March 2025), and community resources, fostering adoption by professional creators for real-time character interactions that immerse audiences.

History

Origins and early development

Live2D technology originated in Japan as a pioneering method to impart three-dimensional movement and expressiveness to two-dimensional illustrations, circumventing the complexities of traditional 3D modeling. The company behind its creation, Cybernoids Co. Ltd., was established on July 31, 2006, by programmer and CEO Tetsuya Nakajo, initially focusing on advanced graphics and animation effects for digital content. Nakajo, recognized as a "Super Creator" by Japan's Information-technology Promotion Agency (IPA) in 2007, led the venture with an emphasis on innovative software solutions inspired by the demands of anime- and manga-style illustration. Development of the core Live2D technology commenced around 2008, when Cybernoids announced "Live2D Vector," an early prototype designed to enable real-time deformation of layered artwork for lifelike motion. This version introduced foundational concepts such as mesh-based deformation and parameter-driven animation, allowing illustrators to retain the flat aesthetic of 2D art while achieving fluid, parallax-like depth. The prototype addressed key challenges in content production, where static character designs needed dynamic interactivity without conversion to resource-intensive 3D models. By 2009, Cybernoids released the full version of Live2D Vector, marking the technology's initial entry into professional workflows and establishing its viability for applications in games and mobile media. The technology gained public traction in 2010 with the beta launch of Live2D Cubism, an evolved iteration that optimized performance for real-time rendering and expanded platform support, including the PlayStation Portable (PSP) and mobile devices. This version facilitated the first commercial integration in a mobile app, "Barcode Kanojo," where animated characters interacted dynamically with users, demonstrating Live2D's potential for engaging, low-overhead animations on handheld devices. The technology was further highlighted that year when Live2D was selected as a finalist in the IPA's Mitou software program, underscoring its technical breakthrough in blending 2D artistry with 3D-like expressivity. Early adoption accelerated in 2011, exemplified by its use in Bandai Namco Games' PSP title "Ore no Imōto ga Konna ni Kawaii Wake ga Nai Portable" (Oreimo Portable), where Live2D animated character portraits and cutscenes, enhancing narrative immersion without altering the source material's 2D style. This integration showcased the technology's scalability for console gaming, influencing subsequent developments. By 2012, Cybernoids advanced Cubism's beta with compatibility for the Unity engine, PS Vita, and additional mobile and desktop platforms, solidifying Live2D's role as a commercial standard and setting the stage for dedicated SDK releases to broaden industry access. In 2013, the first official version, Live2D Cubism 1.0, was released.

Key milestones and evolution

Cybernoids Co. Ltd. was renamed Live2D Inc. in 2014 in Tokyo, Japan. The beta launch of Cubism occurred in 2010, with the first official release (Cubism 1.0) in 2013, marking the transition from research prototype to a user-accessible tool for games and interactive media. In 2014, Live2D launched Cubism 2.0, which introduced multi-layer deformation features that enabled more complex and nuanced movements by allowing independent control over multiple image layers. This version expanded the tool's capabilities for professional production work. Cubism 3.0 arrived in 2017, incorporating a physics simulation system that automated realistic secondary motions such as hair swaying and clothing dynamics in response to character actions. This release coincided with Live2D's expansion into the emerging Virtual YouTuber (VTuber) market, where motion-captured Live2D avatars became a staple for live streaming, driven by pioneers adopting the technology for real-time facial tracking and expression. Following this, adoption grew rapidly, with Live2D powering characters in VTuber productions and contributing to the format's mainstream popularity. Subsequent updates built on these foundations. Cubism 4.0, released in 2019, added specialized tools for video production, including improved export options for non-game media and enhanced parameter editing for smoother workflows. In September 2023, Cubism 5.0 introduced AI-assisted features, such as automatic generation of facial movements, alongside optimizations for rendering on modern hardware like Apple's M-series chips. As of October 2025, Cubism 5.3 is in beta, with beta3 released on October 16, 2025, focusing on advanced rendering enhancements, including expanded color blending modes for richer visual effects, with ongoing beta testing emphasizing compatibility for immersive applications. Over its evolution, Live2D shifted from a fully proprietary suite to a hybrid model featuring an open SDK available at no initial cost, governed by the Live2D Proprietary Software License Agreement and Live2D Open Software License Agreement. This licensing structure allows free development and integration across platforms, with revenue-based royalties for commercial releases, facilitating widespread adoption. By 2025, Live2D had surpassed 400 commercial titles, spanning games, VTubing, and interactive media.

Technology

Core principles and rigging

Live2D operates by dividing static 2D illustrations, typically prepared as layered PSD files, into discrete components such as eyes, mouth, hair, and body parts to facilitate targeted deformation for pseudo-3D effects. Upon importing a file into the Live2D Cubism Editor, each layer or group is automatically converted into an ArtMesh, a polygonal mesh overlaid on the image content of that layer. This separation allows independent manipulation of elements, enabling realistic movements like blinking or head tilting without affecting the entire illustration. The rigging process involves assigning deformers to these ArtMeshes to control transformations such as rotation, scaling, and warping. Deformers function as hierarchical containers that group and modify one or more ArtMeshes by adjusting their underlying vertices. There are two primary types: warp deformers, which enable non-rigid distortions by directly shifting vertices for effects like facial expressions, and rotation deformers, which apply rigid rotations around a specified axis while preserving shape integrity. These are linked to parameters, numerical values that drive the degree of transformation; for instance, parameters like Angle X and Angle Y typically range from -30 to 30 degrees to simulate natural head turns. At the core of Live2D's deformation is a triangular mesh structure applied to each ArtMesh, where the image texture is mapped onto a grid of connected triangles defined by vertices. Movement is achieved by altering vertex positions based on parameter values, creating fluid pseudo-3D animations from static assets. Parameter-driven changes propagate hierarchically through deformer chains, ensuring coordinated motion across related parts, such as synchronizing eye and head rotations. Prior to animation, models are reset by setting all parameters to their default values (typically 0, representing the neutral pose) to establish a consistent base form. The rig is then tested in the editor's preview view, where users can interactively adjust parameters to verify deformation quality, check for artifacts like tearing, and refine mesh densities for smoother results. This iterative preview ensures the rigged model behaves predictably before export for integration into applications.
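The keyform blending at the heart of this pipeline can be illustrated with a short sketch. The following TypeScript is not Cubism source code; it is a minimal, self-contained example of the general technique—interpolating an ArtMesh's vertex positions between artist-authored keyforms as a parameter moves through its range—with all type and function names (Keyform, deformMesh) invented for illustration.

```typescript
// Illustrative sketch of parameter-driven mesh deformation (hypothetical
// names, not the Cubism implementation).

type Vec2 = { x: number; y: number };

interface Keyform {
  parameterValue: number; // the parameter value this keyform was authored at
  vertices: Vec2[];       // vertex positions captured at that value
}

// Linearly interpolate mesh vertices between the two keyforms that
// bracket the current parameter value.
function deformMesh(keyforms: Keyform[], value: number): Vec2[] {
  const sorted = [...keyforms].sort((a, b) => a.parameterValue - b.parameterValue);
  const last = sorted[sorted.length - 1];
  if (value <= sorted[0].parameterValue) return sorted[0].vertices;
  if (value >= last.parameterValue) return last.vertices;
  for (let i = 0; i < sorted.length - 1; i++) {
    const lo = sorted[i], hi = sorted[i + 1];
    if (value >= lo.parameterValue && value <= hi.parameterValue) {
      const t = (value - lo.parameterValue) / (hi.parameterValue - lo.parameterValue);
      return lo.vertices.map((v, k) => ({
        x: v.x + (hi.vertices[k].x - v.x) * t,
        y: v.y + (hi.vertices[k].y - v.y) * t,
      }));
    }
  }
  return last.vertices;
}

// Example: a head-turn parameter (-30..30) with keyforms at -30, 0, and 30;
// a value of 15 yields vertices halfway toward the right-facing pose.
```

Rotation deformers work analogously but replace the per-vertex interpolation with a rigid rotation of the whole group, which is why they preserve shape integrity.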

Animation techniques and integration

Animation in Live2D involves manipulating the parameters defined during the rigging process to create dynamic movements, such as eye blinks or head tilts, through keyframing techniques. Creators set keyframes on a timeline by adjusting parameter values via sliders or numerical inputs, with the software automatically generating in-between frames for smooth interpolation. This allows for reusable animation templates, like cyclic eye-blink motions, which can be applied to maintain natural expressions without manual repetition for every instance. Physics-based effects enhance realism by simulating natural sway in elements like hair or clothing using spring-mass models. These systems take inputs from core parameters, such as head angle or body position, and output normalized values (up to 100%) to drive secondary parameters controlling deformation. Settings include duration, ease of swinging (typically 0.7–0.99), and reaction time, enabling pendulum-like motion that responds in real time to primary movements. Physics calculations run at selectable frame rates (e.g., 60 FPS) and can be baked into keyframes for consistent playback. Facial expressions are achieved using blend shapes, which apply differential deformations to objects like ArtMeshes or deformers for subtle variations in features such as smiles or frowns. These shapes are parameterized and limited to prevent over-deformation, ensuring expressions blend seamlessly with keyframed movements. For lip synchronization, audio-driven input maps sound amplitude to mouth-opening parameters via components like MouthMovement in the Cubism SDK, producing realistic speech-aligned motions. Integration of Live2D models into applications occurs through the SDK, which supports exporting rigged and animated assets in formats like .moc3 for models and JSON files (.motion3.json, .pose3.json) for motions and poses. Runtime libraries handle rendering via graphics APIs such as OpenGL and Metal, with dedicated frameworks for platforms including Unity, web, and Java, allowing developers to load models and update parameters programmatically. Real-time face tracking is facilitated by hooks to external APIs, such as those in nizima LIVE, which map user facial movements to model parameters for interactive control. Performance optimization is critical for deployment, particularly on mobile devices, where the SDK targets 60 FPS on low-end hardware through techniques like efficient draw-order management and model structure tuning. In Unity, profilers identify bottlenecks in parameter updates or rendering, enabling reductions in ArtMesh count or physics complexity to balance quality and frame rates across Android and iOS. High-end applications, such as VR, leverage the SDK's native support for advanced rendering while maintaining real-time responsiveness.
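As a rough illustration of the physics and lip-sync behaviors described above, the sketch below implements a damped spring that converts head-angle input into a sway value, and an RMS-amplitude mapping for mouth opening. This is not SDK code: the class and function names are hypothetical, and the "ease" constant merely mimics the role of the 0.7–0.99 swing setting mentioned earlier.

```typescript
// Hypothetical sketch of two runtime techniques: spring-driven secondary
// motion and amplitude-based lip-sync (in the spirit of, but not copied
// from, the SDK's MouthMovement component).

class SwaySpring {
  private position = 0;
  private velocity = 0;
  // Values of "ease" near 1 preserve momentum longer and swing more freely.
  constructor(private ease = 0.9, private stiffness = 30) {}

  // input: normalized head angle in [-1, 1]; dt: seconds since last frame.
  step(input: number, dt: number): number {
    const accel = (input - this.position) * this.stiffness; // pull toward input
    this.velocity = (this.velocity + accel * dt) * this.ease; // damped momentum
    this.position += this.velocity * dt;
    return Math.max(-1, Math.min(1, this.position)); // drives a sway parameter
  }
}

// Map audio amplitude (RMS of the current sample buffer) to mouth opening.
function lipSyncValue(samples: Float32Array, gain = 4): number {
  let sum = 0;
  for (const s of samples) sum += s * s;
  const rms = Math.sqrt(sum / samples.length);
  return Math.min(1, rms * gain); // written to the mouth-open parameter each frame
}
```

Baking such physics output into keyframes, as the Editor supports, simply records the spring's per-frame values so playback no longer depends on the simulation.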

Software and tools

Official software from Live2D Inc.

Live2D Inc. develops a suite of tools centered around the Cubism Editor and Cubism SDK, enabling users to create, animate, and integrate 2D models with realistic movements. The flagship product, Live2D Cubism Editor, is available in FREE and PRO versions, with the PRO edition requiring an annual subscription starting at ¥14,280 for the first year (for general users or small-scale enterprises with annual revenue under ¥10 million), with reduced rates of ¥11,760 in the second year and ¥10,680 thereafter, for full functionality without restrictions on model complexity or commercial use. The FREE version supports basic model creation but limits texture files to one, ArtMeshes to 100, and motion parameters to 30, making it suitable for non-commercial experimentation. The Cubism Editor facilitates model rigging through features like automatic mesh generation, which creates deformable ArtMeshes from imported PSD layers by specifying density and boundary parameters, streamlining the preparation of illustrations for animation. Parameter binding allows users to link model movements—such as head tilts or eye blinks—to adjustable values, enabling expressive animations driven by physics simulations or external inputs like face tracking. The latest stable release (as of November 2025) is version 5.2.03, released on May 29, 2025, which includes enhanced tools for VR model previewing, allowing creators to test immersive interactions in virtual environments. Beta programs continue with version 5.3 alpha released in March 2025 and subsequent betas through October 2025, introducing advanced features like improved blend modes and IK parameter controllers for more natural limb movements. Complementing the Editor, the Cubism SDK in versions 5.x provides libraries for real-time rendering of Live2D models in applications, supporting multiple platforms via a C++ core with frameworks for Unity, native, web, and Java targets, with free access for non-commercial projects and commercial licensing based on usage scale. It enables efficient integration of animations into games and apps, handling tasks like model loading and parameter updates at 60 FPS or higher on modern hardware. Additional official tools include the Cubism Viewer, a standalone application for testing exported models and motions outside the Editor, ensuring compatibility before deployment. Sample Data packages offer pre-rigged tutorial models and assets to guide users through rigging and animation workflows. For scene management, nizima ACTION!! serves as a web-based video editor, updated in 2025 to support up to 20 scenes per project, allowing multi-model compositions with timeline-based animations and exports for streaming or social media. Licensing across these tools follows annual update cycles, with beta access available to registered users for early feature testing, such as the 5.3 alpha program initiated in early 2025.
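For developers, the SDK workflow starts from the files the Editor exports. The hedged sketch below fetches a model's assets by reading the .model3.json manifest that accompanies exported models; the field names follow commonly published sample manifests and should be checked against the current SDK manual before relying on them.

```typescript
// Hedged sketch: locating runtime assets from a Cubism export. Field
// names are taken from typical sample manifests and are assumptions.

interface Model3Json {
  FileReferences: {
    Moc: string;         // path to the .moc3 binary
    Textures: string[];  // texture atlas images
    Physics?: string;    // optional physics settings file
  };
}

async function loadModelAssets(manifestUrl: string) {
  const base = manifestUrl.slice(0, manifestUrl.lastIndexOf("/") + 1);
  const manifest: Model3Json = await (await fetch(manifestUrl)).json();
  // Fetch the binary model and its textures relative to the manifest.
  const moc = await (await fetch(base + manifest.FileReferences.Moc)).arrayBuffer();
  const textures = await Promise.all(
    manifest.FileReferences.Textures.map(async (t) => (await fetch(base + t)).blob())
  );
  return { moc, textures }; // handed to the SDK runtime for rendering
}
```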

Third-party animation and rigging tools

VTube Studio, developed by DenchiSoft and first released in 2020, serves as a prominent third-party application leveraging the Live2D Cubism SDK to animate pre-rigged models, particularly for VTubing workflows. It includes an auto-setup feature for model parameters, enabling rapid configuration of expressions, face tracking, and hand gestures to enhance fluidity without requiring deep modifications to the underlying rigging. The software supports cross-platform use on Windows, macOS, iOS, and Android, and operates on a freemium model where the core version is free, with optional donations or a one-time purchase for advanced features and full access. PrprLive, launched around 2020 as a Live2D-compatible viewer and streaming assistant, functions as an alternative tool for previewing models and executing basic animations through multi-expression switching and targetless auto-tracking. It emphasizes high-frame-rate rendering for smooth playback during streaming or testing, allowing users to load .moc3 files exported from the official Editor for quick iteration on parameter-driven movements like expressions. Although development ceased and it was delisted from Steam in 2024, community-maintained versions persist for legacy support up to 2025. Additional third-party integrations include Spine, a skeletal animation tool from Esoteric Software, which connects to Live2D-style VTubing workflows via community-developed plugins that adapt bone-based rigs for use alongside Live2D environments. These plugins facilitate hybrid workflows where Spine handles initial rigging before exporting to SDK-compatible applications. Similarly, extensions for Adobe Animate enable parameter syncing between Animate's timeline animations and Live2D models, allowing creators to blend traditional frame-by-frame techniques with Live2D's deformable mesh system for enhanced interactivity. Despite their utility, these third-party tools remain dependent on the Live2D SDK for loading and rendering models created in the official Editor, limiting standalone capabilities and ensuring compatibility only with properly exported .moc3 files. Live2D Inc. provides no official support or endorsements for these applications, requiring users to rely on developer documentation and community resources for troubleshooting.
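VTube Studio also exposes a plugin interface that external tools and scripts can drive. The sketch below, assuming the publicly documented JSON-over-WebSocket API on its default local port, simply queries the API state; the message fields follow the project's public documentation and should be verified against the current DenchiSoft/VTubeStudio spec.

```typescript
// Hedged sketch: querying VTube Studio's plugin API over a local
// WebSocket. Message shapes follow the publicly documented API.

const ws = new WebSocket("ws://localhost:8001");

ws.onopen = () => {
  // Ask whether the API is active before attempting plugin authentication.
  ws.send(JSON.stringify({
    apiName: "VTubeStudioPublicAPI",
    apiVersion: "1.0",
    requestID: "state-1",
    messageType: "APIStateRequest",
  }));
};

ws.onmessage = (event) => {
  const reply = JSON.parse(event.data as string);
  console.log(reply.messageType, reply.data); // e.g. an APIStateResponse
};
```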

Face-tracking and runtime software

Face-tracking software for Live2D models enables real-time control of characters using webcam or sensor inputs, mapping facial movements to model parameters such as eye gaze, blinks, eyebrows, and mouth shapes for applications like virtual streaming. These tools typically leverage blendshape data from ARKit-compatible devices or webcam-based libraries to drive Live2D's parameters, allowing VTubers to synchronize expressions without manual keyframing. VTube Studio, a widely adopted application for Live2D VTubing, utilizes the open-source OpenSeeFace library for webcam-based face tracking, supporting features like eye-gaze direction, blinking, and blendshape expressions on consumer hardware such as standard laptops with integrated cameras. Released in 2020 with ongoing updates through 2025, it integrates seamlessly with OBS Studio via the Spout2 plugin for transparent overlays in streams, achieving smooth performance at 30–60 FPS on mid-range PCs. Additionally, VTube Studio incorporates Google MediaPipe for enhanced webcam tracking accuracy, comparable to NVIDIA's solutions, and supports ARKit via connected iPhones for more precise capture of 52 facial blendshapes. nizima LIVE, developed by Live2D Inc., serves as an official runtime tool for real-time Live2D model control, featuring webcam face tracking and iPhone ARKit support for detailed expressions, including subtle eyebrow and mouth movements, via its "Perfect Sync" feature. The free version includes an OBS plugin for direct streaming and supports filters like dynamic accessories (e.g., cat ears) that respond to tracked motions, running efficiently on entry-level hardware with minimal latency; it has received updates as recent as June 2025, adding features like scene functions. iFacialMocap provides high-fidelity tracking via the iPhone's depth-sensing camera, capturing and transmitting 52 ARKit blendshapes in real time to compatible runtimes, enabling nuanced animations such as tongue and jaw movements for VTubing setups. This tool maps sensor data directly to Live2D parameters, supporting integration with applications like VTube Studio over local networks, and performs reliably on FaceID-equipped iPhones without requiring high-end PCs. These software solutions rely on APIs like Apple's ARKit for depth-aware facial capture and Google's MediaPipe for cross-platform webcam processing, converting raw tracking data into Live2D-compatible parameters (e.g., ParamAngleX for head tilt) with low overhead on consumer GPUs, typically under 10% utilization during streams. SDK export formats from Live2D facilitate this parameter mapping for seamless runtime playback.
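The common pattern behind all of these trackers—normalize the raw tracking signal, clamp it to the parameter's range, and smooth it to suppress jitter—can be sketched as follows. ParamAngleX and ParamMouthOpenY are Cubism's standard parameter IDs, but the TrackingFrame fields and the mapping function are hypothetical.

```typescript
// Illustrative sketch: converting tracked face data into Live2D parameter
// values with exponential smoothing. Tracker field names are assumptions.

interface TrackingFrame {
  headYawDegrees: number; // e.g. from ARKit or MediaPipe, roughly -30..30
  jawOpen: number;        // blendshape weight in 0..1
}

const smoothed: Record<string, number> = {};

function mapFrame(frame: TrackingFrame, alpha = 0.3): Record<string, number> {
  const targets: Record<string, number> = {
    ParamAngleX: Math.max(-30, Math.min(30, frame.headYawDegrees)),
    ParamMouthOpenY: Math.max(0, Math.min(1, frame.jawOpen)),
  };
  // Exponential smoothing suppresses webcam jitter at the cost of a little latency.
  for (const id of Object.keys(targets)) {
    const prev = smoothed[id] ?? targets[id];
    smoothed[id] = prev + (targets[id] - prev) * alpha;
  }
  return { ...smoothed };
}
```

The alpha value trades responsiveness against stability, which is why trackers typically expose it (or an equivalent) as a user-facing smoothing setting.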

Applications

Video games and interactive media

Live2D's application in video games has primarily focused on enhancing character expressiveness through real-time 2D animation, allowing static illustrations to convey emotions, movements, and interactions without full 3D modeling. This technology has been particularly impactful in genres like visual novels, rhythm games, and gacha-based titles, where animated portraits deepen player immersion during dialogues, performances, and collection mechanics. By integrating Live2D via SDKs, developers can create responsive characters that react to user input or game events, bridging 2D art with real-time interactivity. Early console adoption occurred with the 2011 PSP visual novel Ore no Imōto ga Konna ni Kawaii Wake ga Nai Portable, developed by Guyzware and published by Bandai Namco Games, which marked the first console title to utilize Live2D for animating character portraits and expressions during narrative scenes. This implementation allowed for subtle facial movements and body gestures, making dialogues feel more lifelike compared to traditional static sprites. The game's success helped demonstrate Live2D's potential for portable platforms, paving the way for broader use in handheld and mobile gaming. In mobile rhythm games, Live2D has been employed to synchronize character animations with music and player actions, adding visual flair to performances. A representative example is Hatsune Miku: Colorful Stage! (2020), developed by Colorful Palette and published by Sega, where Live2D animates characters like Hatsune Miku during live shows, enhancing the rhythm-based gameplay with expressive dances and reactions. This approach has contributed to the game's popularity, with millions of downloads and ongoing updates featuring new animated content. Gacha mechanics in mobile games have benefited significantly from Live2D, as animated "idols" or characters make pulls more engaging by showcasing dynamic reveals and interactions. Azur Lane (2017), a naval warfare gacha game by Yostar, exemplifies this, using Live2D for select ship-girl skins that respond to touchscreen inputs with movements like head tilts and blinks, fostering emotional connections in collection systems. Similarly, Honkai Impact 3rd (2016), miHoYo's action RPG with gacha elements for consoles, PC, and mobile, incorporates Live2D in character splash arts, birthday events, and pull animations—such as the Herrscher of the Void's fluid motions—to heighten the excitement of acquiring new valkyries. These features have played a key role in player retention, as the technology turns abstract summons into personalized, lively encounters. On console and PC platforms, Live2D has appeared in action and role-playing titles to support UI elements or companion characters. These integrations blend seamlessly with gameplay, providing cost-effective expressiveness for character rosters. Overall, Live2D's role in over 400 commercial titles by 2025 has transformed gaming and interactive media by enabling affordable, high-fidelity animations that boost engagement—evident in how Live2D idols in games like Azur Lane and Honkai Impact 3rd drive social sharing and repeat plays through their charm and interactivity. This has solidified Live2D as a staple for anime-style games, influencing design trends toward more emotive, player-responsive characters.

Virtual YouTubers and live streaming

Live2D has become integral to the Virtual YouTuber (VTuber) phenomenon, enabling real-time, expressive avatars that facilitate interactive live streaming on platforms like YouTube and Twitch. The technology's adoption surged with the launch of major agencies, starting with Nijisanji in early 2018, which was founded by ANYCOLOR Inc. (formerly Ichikara Inc.) specifically to promote Live2D models as a cost-effective alternative to 3D avatars for live broadcasts, allowing multiple talents to debut simultaneously and establishing the modern VTuber format of facial tracking integrated with 2D rigging. Hololive followed suit in 2018 by transitioning several talents from initial 3D models to Live2D avatars, enhancing accessibility for live interactions and contributing to the agency's rapid growth in subscriber numbers and stream viewership. This shift marked Live2D's dominance in enabling scalable, high-engagement VTubing, building on the groundwork laid by pioneers like Kizuna AI, who debuted in late 2016 as the first self-proclaimed VTuber, using 3D models and motion capture to popularize the concept of virtual personalities engaging audiences in real-time videos. VTubers leverage Live2D through seamless integration with streaming software, particularly Open Broadcaster Software (OBS), via plugins that support face tracking for natural movements. Tools like VTube Studio, a free application designed for Live2D models, connect directly to OBS as a virtual camera source, utilizing webcam-based facial recognition to animate expressions, head tilts, and eye gazes in real time, often requiring only a standard smartphone or computer camera for setup. Custom rigging in Live2D Cubism allows creators to design layered deformations for specialized emotes—such as winks, blushes, or exaggerated reactions—and full-body dances, where parameters control limb articulations and physics simulations for fluid, performative sequences during streams. These features enable VTubers to respond dynamically to interactions, enhancing viewer immersion without the need for expensive hardware. By 2025, the VTuber ecosystem has expanded dramatically, with over 10,000 active creators worldwide driving a market valued at approximately USD 2.86 billion, fueled by live streaming's accessibility and global appeal. Live2D remains the predominant technology for 2D avatars, powering the setups of leading agencies like Nijisanji and Hololive, where it facilitates the expressive animations essential for maintaining audience engagement in long-form broadcasts. Monetization through YouTube's Super Chat feature has profoundly impacted VTuber revenue, with donations during streams accounting for a significant portion of earnings; in early 2025, VTubers comprised 16 of the top 20 Super Chat recipients globally, generating hundreds of millions in collective income and underscoring the technology's role in enabling direct fan support. A 2025 analysis of over 1,900 VTubers revealed that Super Chats represent about 39% of primary income sources, though income inequality persists, with top agency-affiliated talents capturing the majority while independents face barriers to visibility. Recent evolutions in 2025 highlight Live2D's adaptability to emerging trends, such as augmented reality (AR) overlays that blend virtual avatars with real-world environments during streams, allowing VTubers to interact with physical props or backgrounds via tools like ARwall for enhanced production value.
Additionally, multi-model switching has gained traction, enabled by software like VTube Studio's hotkey system, which permits seamless transitions between different rigged avatars mid-stream—such as from a casual outfit to a performance costume—facilitating varied content without interrupting broadcasts. These advancements, combined with face-tracking integrations, continue to lower entry barriers and expand creative possibilities for live performances.

Other uses in animation and media

Live2D technology has found applications in animated series and films, enabling efficient production of expressive 2D characters for scripted works. In 2018, Aniplex acquired a majority stake in Live2D Inc. and announced a partnership to produce a feature-length animated film using the software's 2D animation capabilities, marking a significant step in integrating Live2D into theatrical releases. This partnership extended to practical implementations, such as Live2D Creative Studio's contributions to character animation in the 2021 feature film Belle, directed by Mamoru Hosoda, where the technology supported dynamic facial expressions and movements in key sequences. A notable example of Live2D's role in television animation is the 2024–2025 series Tono to Inu (The Corgi and the Samurai), co-produced by OLM and Live2D Creative Studio. This short-form anime, airing in multiple versions with different voice actors, utilizes Live2D almost entirely for its character animations, allowing for fluid, real-time-like expressions in a historical comedy setting. Similarly, the 2025 original TV anime Inherit the Winds, produced by Aniplex with animation by Live2D Creative Studio and Drive, incorporates Live2D to animate characters in a historical narrative set during the Bakumatsu period, enhancing emotional depth through subtle 2D deformations. In advertising, Live2D supports interactive banners and digital promotions, where characters respond to user interactions for engaging brand experiences, though specific campaigns remain proprietary. For educational purposes, Live2D is integrated into curricula at art institutions to teach animation and character design. The Toyo Institute of Art and Design employs Live2D in its programs to train illustrators in motion design, emphasizing its role in creating versatile animations for media applications. The Kyoto Institute of Design's Comic Illustration Course similarly uses Live2D to build practical skills in rigging and animation, preparing students for industry demands in character production. Live2D Inc.'s Education Aid Program provides free software licenses to eligible schools worldwide, facilitating hands-on tutorials and training in modeling and expression techniques as of 2025. By 2025, Live2D's adoption in e-learning platforms has grown through animated avatars in interactive courses, supported by official resources like the online training program, which teaches professional Live2D skills for educational content creation. This integration enhances engagement in digital learning environments by enabling customizable, expressive instructors.

Company and ecosystem

Live2D Inc. overview

Live2D Inc. is a technology company specializing in 2D animation software and services. Founded on July 31, 2006, in Tokyo, Japan, the company is headquartered at Koike Bldg. 7F, 1-20-2 Shinjuku, Shinjuku-ku, Tokyo 160-0022. As of November 4, 2025, it employs 134 individuals. The firm is led by CEO Tetsuya Nakajo, who also serves as a director alongside Genjiro Ishikawa, Masayuki Todate, and outside directors including Atsuhiro Iwakami. The company's core business revolves around the development of Live2D-related software and services, including the flagship Cubism Editor and Cubism SDK for developers. It also provides Live2D graphics production and film animation production services, supporting applications in gaming, virtual streaming, and video production. Revenue streams primarily derive from professional software subscriptions, SDK licensing fees, and partnerships that integrate Live2D technology into commercial products, powering over 400 titles worldwide. From its origins as a startup focused on innovative 2D expression techniques, Live2D Inc. has evolved into an industry standard, particularly in the gaming and VTubing sectors. Key growth milestones include the release of advanced versions like Cubism 5 in 2023 and expansions into educational programs such as the LEAP Co-Branding initiative in 2023. The LEAP program supports over 200 institutions worldwide as of 2025. The company has forged strategic partnerships with game engine providers for seamless SDK integration in game development and with Aniplex Inc. since 2018 for animation collaborations. While facing competition from 3D tools that offer broader spatial capabilities, Live2D Inc. emphasizes its niche in efficient, anime-style 2D expression to sustain market leadership.

Marketplace, resources, and community

The official Live2D marketplace, known as nizima, was launched initially under the name "2jima" before being renamed and redesigned in 2019. It serves as a platform for buying, selling, and commissioning original illustrations, Live2D models, assets, streaming resources, motions, and backgrounds, enabling creators to monetize their work through primary and secondary sales. By 2025, the marketplace hosts over 10,000 assets and templates, supporting a diverse ecosystem for animation content. Creators benefit from a low 10% commission on sales, retaining 90% of proceeds for primary transactions, while secondary sales allow customizable royalty rates of up to 60% for original artwork derivatives. Live2D provides extensive free resources to support users, including comprehensive tutorials, manuals, and sample models available on the official documentation site. These materials cover Editor basics, SDK integration, and advanced techniques like model rigging and physics, with downloadable sample data under a free material license agreement for learning and testing. The platform also offers an Education Aid Program to foster skill development among creators worldwide. The Live2D community thrives through official forums, Discord servers, and regional groups, such as a Discord community with approximately 40,000 users focused on discussions, learning, and showcasing creations. The Live2D Creators Forum allows users to report issues, share tips, and request features related to the tools. Annual events like "alive," held since 2014, bring together professional and amateur creators for hybrid in-person and online gatherings to demonstrate techniques, announce updates, and celebrate contributions, with the 2023 edition marking its 10th anniversary at Akihabara UDX. Open-source engagement occurs via the official GitHub organization, which maintains 18 repositories including SDK samples and frameworks, encouraging bug reports, pull requests, and suggestions to enhance accessibility. In 2025, ongoing updates to documentation and tools, including multilingual support in English and Japanese for apps and manuals, continue to expand global accessibility.

    This is the Live2D Cubism manual website. For a detailed product introduction, please refer to the official Cubism website.Notes On Using Cubism Editor · Manual List · You Can Also Download The...