VBridger is a face tracking plugin designed for VTube Studio and Live2D that allows users to make better use of iPhone X ARKit tracking on their Live2D models.
All Reviews:
Mostly Positive (69) - 76% of the 69 user reviews for this software are positive.
Release Date:
Apr 10, 2022

Buy VBridger

Buy VBridger + Editor BUNDLE

Includes 2 items: VBridger, VBridger - Developer Mode

-15%
$25.48

Downloadable Content For This Software (1)

About This Software

VBridger allows VTubers to use face tracking to its fullest potential

VBridger augments tracking data, letting users combine, mix, and modify live data for use with VTuber avatars. It comes with several base settings and samples for different types of models. If you have a standard Live2D model, you can use our VTS-compatible settings to enhance tracking quality. If you have a model rigged for ARKit using our parameter guide, you can use our ARKit settings for advanced expression control. If you have an ARKit-compatible VRM model, you can use VMC to send tracking data, with input curves to tune and calibrate the tracking to better fit your face.
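To give a concrete picture of what an input curve does, here is a minimal sketch in Python (purely illustrative; the thresholds, exponent, and parameter names are example values, not VBridger's own defaults):

    def input_curve(raw, in_min=0.05, in_max=0.70, exponent=1.5):
        # Example calibration curve: clamp the raw ARKit value to the range the
        # user's face actually produces, normalize it, then ease it so small
        # movements stay subtle while big ones still reach the full output range.
        t = min(max((raw - in_min) / (in_max - in_min), 0.0), 1.0)
        return t ** exponent

    # A face whose raw JawOpen never exceeds 0.70 still drives the model
    # parameter all the way to 1.0 after calibration.
    print(input_curve(0.35))  # mid-range input, eased below linear (~0.31)
    print(input_curve(0.70))  # the user's personal maximum maps to 1.0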

With the VBridger - Editor DLC, riggers can unlock the full potential of VBridger and their rigs by gaining the ability to create new outputs and custom controls for their models. Use your face to toggle VRM expressions via VMC, or create logic-based expressions to add flutter to your Live2D wings; the possibilities are endless.
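As an illustration of what a logic-based expression might look like, here is a plain-Python sketch (not the Editor's actual scripting or node system; every name below is made up for the example):

    import math

    def custom_outputs(inputs, t):
        # inputs: dict of tracking values in the 0..1 range; t: time in seconds.
        smiling = inputs["MouthSmileLeft"] > 0.6 and inputs["MouthSmileRight"] > 0.6
        return {
            # Logic-based toggle: hold an expression while the user is clearly smiling.
            "ExpressionJoy": 1.0 if smiling else 0.0,
            # Generated output: a constant sine-wave flutter for a wing parameter,
            # scaled up slightly while the mouth is open.
            "WingFlutter": 0.5 + 0.5 * math.sin(t * 8.0) * (0.3 + 0.7 * inputs["JawOpen"]),
        }

    print(custom_outputs({"MouthSmileLeft": 0.8, "MouthSmileRight": 0.7, "JawOpen": 0.2}, 1.0))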

Currently available input sources:
•Tracking data from the iPhone ARKit FACS-based face tracking system, or compatible alternatives, via the following apps:
•iFacialMocap (iPhone)
•FaceMotion3D (iPhone)
•MeowFace (Android)
•VTube Studio (iPhone)
•MediaPipe (Webcam), illustrated in the sketch after this list
•NVIDIA (Webcam) *Requires the NVIDIA webcam DLC from VTube Studio; it's free!
•Additionally, use your microphone to generate audio inputs.
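For the curious, the sketch below shows the kind of webcam landmark tracking the MediaPipe entry above refers to, using MediaPipe's public Face Mesh API. It is only an illustration of that input type, not VBridger code, and the mouth-openness estimate and landmark indices are our own example:

    import cv2
    import mediapipe as mp

    face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1, refine_landmarks=True)
    cap = cv2.VideoCapture(0)  # default webcam

    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_face_landmarks:
            lm = results.multi_face_landmarks[0].landmark
            # Rough mouth-openness signal: vertical gap between the inner lips
            # (landmarks 13 and 14 in the Face Mesh topology).
            print(f"mouth_open ~ {abs(lm[13].y - lm[14].y):.3f}")

    cap.release()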


Currently available output software:
•VTube Studio, via the VTube Studio API, allowing control of Live2D models.
•Virtual Motion Capture (VMC) protocol, allowing any VMC-compatible app to receive face data from VBridger. As long as an output shares the name of a BlendShapeClip on your VRM, you can control it with VBridger; a minimal example follows this list.
VMC currently only handles the model's facial tracking; it cannot send head rotation, eye rotation, or body control.
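To make the VMC output path concrete, here is a minimal sketch that sends one blendshape value over the VMC protocol's OSC messages using the python-osc package (the localhost address, port 39539, and the clip name "Joy" are common conventions and example values, not something VBridger dictates):

    from pythonosc.udp_client import SimpleUDPClient

    # Many VMC-compatible receivers listen for OSC on UDP port 39539 by default.
    client = SimpleUDPClient("127.0.0.1", 39539)

    # Set a blendshape by name, then apply the pending values. The name must
    # match a BlendShapeClip defined on the receiving VRM, which is the same
    # matching rule described for VBridger's outputs above.
    client.send_message("/VMC/Ext/Blend/Val", ["Joy", 0.8])
    client.send_message("/VMC/Ext/Blend/Apply", [])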

More on the way!

System Requirements

    Minimum:
    • Requires a 64-bit processor and operating system
    • OS *: Windows 7+
    • Processor: AMD / Intel CPU running at 2.5 GHz or higher
    • Memory: 1 GB RAM
    • Graphics: AMD/NVIDIA graphics card with at least 2GB of dedicated VRAM and DirectX 11+
    • DirectX: Version 11
    • Network: Broadband Internet connection
    • Storage: 500 MB available space
    Recommended:
    • Requires a 64-bit processor and operating system
    • OS: Windows 10
    • Processor: AMD / Intel CPU running at 3.0 GHz or higher
    • Memory: 4 GB RAM
    • Graphics: AMD/NVIDIA graphics card with at least 4GB of dedicated VRAM and DirectX 11+
    • DirectX: Version 11
    • Storage: 1 GB available space
* Starting January 1st, 2024, the Steam Client will only support Windows 10 and later versions.

