A Windows application that uses multiple webcams to capture human motion using AI. Captured motion can be exported in real time via OSC (VMT, VMC) protocols or shared memory, or saved as BVH or FBX files. Hand tracking and face tracking are supported.
All reviews:
7 user reviews - not enough user reviews to generate an overall score
Release date:
December 30, 2021


Early Access Software

Start using it right away and take part in the software's development process.

Note: This Early Access software is not yet complete and may change further. If you are not interested in its current state of development, you may want to wait and see whether it progresses further before deciding to purchase.

What the developers have to say:

Why Early Access?

"Our application has already been used by a large number of users since its release in June 2021. We have received a lot of feedback on our Discord server, and we have upgraded our app more than 15 times so far, though many requests are still on the to-do list. We plan to continue upgrading as follows while accepting feedback from users.
Q1 2022:
- Add upper body only mode (=> Released in v1.15)
- Improve performance by reducing rendering (=> Released in v1.15)
- Add read/write function of setting preset
Q2 2022:
- Improve tracking accuracy using time series data
- Add data output format (=> Released in v1.16)"

Approximately how long will this software be in Early Access?

"For 1 to 2 years."

How is the full version planned to differ from the Early Access version?

"In the future version, we plan to improve the user interface, reduce CPU/GPU usage, improve estimation accuracy, and add data output options."

What is the current state of the Early Access version?

"Basically, all necessary functions have been implemented, and the software is in a practically usable state in terms of load and accuracy, for example for use with VR applications and for sending data to DCC tools or game engines."

Will this software be priced differently during and after Early Access?

"We plan to raise the price as new features are implemented."

How are you planning on involving the community in your development process?

"We accept suggestions and bug reports on our Discord server."


About This Software

--------
PLEASE TRY DEMO VERSION FIRST
Make sure the app works in your environment before purchasing. The demo version is limited only in its data export functionality.
--------

This is a Windows application that uses multiple webcams to capture human motion using AI.

No special equipment required

You can capture human motion if you have the following:
  • a mid-range PC
  • 2 or more webcams
  • a room of about 2.5 m x 2.5 m
You can use regular webcams like the ones used in video conferencing. You can also use apps that turn your smartphone or tablet into a webcam.

Real-time on a mid-range PC

For example, it runs at:
  • around 17 fps on a Surface Pro 7, which does not have a dedicated GPU
  • 30 to 60 fps on a GTX 1080 Ti

What you can do with MocapForAll

  • You can output captured motion to the network (*1) via the VMT protocol (*2) and the VMC protocol (*3) in real time.
  • You can save the captured motion to files in BVH format and FBX format.
  • You can output captured motion to shared memory in real time.

(*1) Both the VMT protocol and the VMC protocol use UDP/OpenSound Control.
(*2) "VMT protocol" here refers to the message format used in the communication of Virtual Motion Tracker. The official website of Virtual Motion Tracker does not use the term "VMT protocol", but MocapForAll uses it for convenience.
(*3) "VMC protocol" is the message format used for communication between applications such as VirtualMotionCapture. Note that VirtualMotionCapture itself is not required for other compatible apps to communicate with each other using the VMC protocol.

These output options allow you to do the following:

Use in SteamVR via Virtual Motion Tracker
Through Virtual Motion Tracker, the captured motion can be used as virtual trackers in applications running on SteamVR.
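For illustration, the sketch below sends a single virtual tracker pose over UDP/OSC, the way a motion source would. The python-osc package, Virtual Motion Tracker's default port 39570, and the /VMT/Room/Unity argument layout are assumptions not taken from this page; check the Virtual Motion Tracker manual for the authoritative format.

# Minimal sketch: send one virtual tracker pose via the "VMT protocol"
# (UDP/OpenSound Control). Assumed, not taken from this page: the
# python-osc package, VMT's default port 39570, and the
# /VMT/Room/Unity argument layout.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 39570)  # assumed VMT default port

index = 0                                # tracker index
enable = 1                               # 1 = tracker enabled
time_offset = 0.0                        # time offset in seconds
x, y, z = 0.0, 1.0, 0.0                  # position in room space (meters)
qx, qy, qz, qw = 0.0, 0.0, 0.0, 1.0      # rotation quaternion (identity)

client.send_message(
    "/VMT/Room/Unity",
    [index, enable, time_offset, x, y, z, qx, qy, qz, qw],
)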

Use in Unreal Engine 4, Unreal Engine 5, and Unity
You can send the captured motion to Unreal Engine 4, Unreal Engine 5, or Unity for game development or video production.
  • Plugins for linking data directly to UE4, UE5, or Unity are available from the online manual.
  • As described under "Use in other apps via VMC protocol" below, it is also possible to link via EVMC4U and VMC4UE using the VMC protocol.
  • A Unity sample to read data from the shared memory written by MocapForAll is available from the online manual.
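For the shared-memory route, the sketch below only shows the general technique of opening a named shared-memory block on Windows and reading raw bytes from it. The mapping name, size, and unpack format are hypothetical placeholders; the actual name and data layout written by MocapForAll are documented in the online manual.

# Minimal sketch: open a named shared-memory block on Windows and read
# raw bytes from it. The mapping name, size, and unpack format below are
# HYPOTHETICAL placeholders; see the online manual for the actual name
# and data layout used by MocapForAll.
import mmap
import struct

MAP_NAME = "MocapForAllSharedMemory"  # hypothetical mapping name
MAP_SIZE = 4096                       # hypothetical size in bytes

shm = mmap.mmap(-1, MAP_SIZE, tagname=MAP_NAME)  # Windows named mapping
raw = shm.read(MAP_SIZE)
shm.close()

# Without the documented layout this only shows the mechanics:
# print the first four 32-bit floats of the block as an example.
print(struct.unpack_from("<4f", raw, 0))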

Use in other apps via VMC protocol
You can send the captured motion to various applications via the VMC protocol. The following are confirmed to work:
  • Sending bones and facial expression morphs to Unity using EVMC4U
  • Sending bones and facial expression morphs to Unreal Engine using VMC4UE
  • Sending bones to Blender using VMC4B
  • Sending bones to VSeeFace, and receiving facial expression morphs from VSeeFace
  • Sending tracker data to VirtualMotionCapture

Save animations to files
You can save the captured motion as BVH files and FBX files, which can be used with Blender, etc.
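As a quick illustration of reusing the exported files, the snippet below imports a BVH clip into Blender through its Python API (run from Blender's scripting workspace); the file path is a placeholder and the options shown are Blender's stock BVH importer settings.

# Minimal sketch: import a BVH file exported by MocapForAll into Blender.
# Run inside Blender's Python (scripting workspace); the path is a placeholder.
import bpy

bpy.ops.import_anim.bvh(
    filepath="C:/mocap/take01.bvh",  # placeholder path to your exported file
    axis_forward="-Z",
    axis_up="Y",
    update_scene_fps=True,           # match the scene frame rate to the BVH file
)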

Create programs to receive data
As the output specifications are open to the public (except for FBX), you can even create your own programs to receive data from MocapForAll.
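As a starting point, here is a minimal Python receiver for VMC-protocol bone transforms. The python-osc package, the customary default port 39539, and the /VMC/Ext/Bone/Pos argument order (bone name, position x/y/z, rotation quaternion x/y/z/w) are assumptions not stated on this page; match the port to your settings and verify the message layout against the published output specifications.

# Minimal sketch: receive VMC-protocol bone data over UDP/OSC.
# Assumed, not taken from this page: the python-osc package, the customary
# default port 39539, and the /VMC/Ext/Bone/Pos argument order
# (bone name, position x/y/z, rotation quaternion x/y/z/w).
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_bone(address, name, px, py, pz, qx, qy, qz, qw):
    # Print each bone transform as it arrives.
    print(f"{name}: pos=({px:.3f}, {py:.3f}, {pz:.3f}) "
          f"rot=({qx:.3f}, {qy:.3f}, {qz:.3f}, {qw:.3f})")

dispatcher = Dispatcher()
dispatcher.map("/VMC/Ext/Bone/Pos", on_bone)

# Listen on all interfaces; match the port to MocapForAll's output settings.
server = BlockingOSCUDPServer(("0.0.0.0", 39539), dispatcher)
server.serve_forever()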

System Requirements

    Minimum:
    • Requires a 64-bit processor and operating system
    • OS: Windows 10
    • Processor: 1.1 GHz quad-core processor
    • Memory: 8 GB RAM
    • Storage: 2 GB available space
    • Additional Notes: Tested on Surface Pro 7
    Recommended:
    • Requires a 64-bit processor and operating system
    • OS: Windows 10
    • Processor: 4.5 GHz quad-core processor
    • Memory: 16 GB RAM
    • Graphics: GeForce GTX 1080 Ti
    • DirectX: Version 12
    • Storage: 2 GB available space
