
Commit 139a4b3

committed: Merge branch 'main' of https://github.com/MetaGenAI/MetaGenNeos into main (2 parents: 35ce2f3 + a229689)

1 file changed: README.md (32 additions, 3 deletions)
@@ -13,8 +13,8 @@ You can visit http://metagen.ai/ to find more about the vision and motivation be

## How to set up and load the plugin

- 1. Download `metagenneos_v0.8.5.zip` and `run_neos_metagen.bat` and `grpc_csharp_ext.dll` from the [Releases tab](https://github.com/MetaGenAI/MetaGenNeos/releases/).
- 2. Unzip the contents of `metagen-0.9.zip` into the Neos installation folder, and copy `run_neos_metagen.bat` and `grpc_csharp_ext.dll` into the Neos installation folder also.
+ 1. Download `metagenneos_v0.8.9.zip`, `run_neos_metagen.bat`, and `grpc_csharp_ext.dll` from the [Releases tab](https://github.com/MetaGenAI/MetaGenNeos/releases/).
+ 2. Unzip the contents of `metagenneos_v0.8.9.zip` into the Neos installation folder, and copy `run_neos_metagen.bat` and `grpc_csharp_ext.dll` into the Neos installation folder as well.
3. Download ffmpeg from https://www.gyan.dev/ffmpeg/builds/ffmpeg-git-full.7z, and copy the contents of `bin` to the System32 folder (`C:\Windows\System32`).
4. Create a folder called `data` inside the NeosVR folder.
5. Execute `run_neos_metagen.bat` (double click on it). This will start Neos with the plugin loaded, and it will open your local home.
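The setup steps above can also be scripted. A minimal Python sketch, assuming the three release files have already been downloaded (the function name and paths are illustrative, not part of the plugin):

```python
import shutil
import zipfile
from pathlib import Path

def install_plugin(zip_path, bat_path, dll_path, neos_dir):
    """Unzip the plugin release into the Neos installation folder (step 2),
    copy the launcher .bat and the gRPC dll next to it, and create the
    `data` folder used for recordings (step 4)."""
    neos_dir = Path(neos_dir)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(neos_dir)          # zip contents go directly into the install folder
    shutil.copy(bat_path, neos_dir)      # run_neos_metagen.bat
    shutil.copy(dll_path, neos_dir)      # grpc_csharp_ext.dll
    (neos_dir / "data").mkdir(exist_ok=True)
    return neos_dir / "run_neos_metagen.bat"   # step 5: execute this to launch Neos
```

Running the returned `.bat` (by double clicking it, as in step 5) still has to be done by hand.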
@@ -28,10 +28,12 @@ The basic functionality is the same as explained in the release of Metagenbot Be
* [Video in English here](https://www.youtube.com/watch?v=PgQmuIQYoBE&ab_channel=GuillermoValle) ([another video by sirkitree](https://www.youtube.com/watch?v=79xguu735XE&ab_channel=sirkitree))
* [Video in Japanese here](https://twitter.com/sleeping_vrc/status/1355868840081510400) (thanks sleepingkaikai!)

+ Note: these videos are no longer completely up to date, but they should give an overall idea of what can be achieved with the plugin.

### Recording

You can record yourself and other users by pressing Start record. This will record the movement, voice, and hearing (hearing only for one user). Optionally it can also record the vision. Note:
- * Only the users who have the (local) checkmark "Record me" checked will be recorded.
+ * Only the users who have a checked value field in the slot `metagen config/users` will be recorded.
* The recordings are saved in a folder called `data` in the Neos installation folder, and are organized by world and by recording session (each given a unique hash). Within each recording session the recordings are found in numbered folders.
* The recordings are done in a robust way: if you crash during a long or important recording, the resulting files will either not be affected or be easily fixable. All recordings are also written directly to disk, so one can make arbitrarily long recordings without memory issues, although recordings are currently chunked automatically into maximum lengths of 10 minutes.
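The crash-robust, chunked writing described above can be sketched as a writer that appends each sample straight to disk and rotates to a new numbered folder once the chunk length is exceeded. This is a simplified illustration of the idea, not the plugin's actual code (all names are made up):

```python
import os
import time

class ChunkedRecorder:
    """Append samples directly to disk, rotating files every `chunk_seconds`,
    so a crash loses at most the unflushed tail of the current chunk."""

    def __init__(self, session_dir, chunk_seconds=600):  # 600 s = 10 minutes
        self.session_dir = session_dir
        self.chunk_seconds = chunk_seconds
        self.chunk_index = 0
        self.chunk_start = None
        self.file = None
        os.makedirs(session_dir, exist_ok=True)

    def write(self, sample, now=None):
        now = time.time() if now is None else now
        # Rotate to a new numbered folder when the chunk length is exceeded.
        if self.file is None or now - self.chunk_start >= self.chunk_seconds:
            if self.file:
                self.file.close()
                self.chunk_index += 1
            chunk_dir = os.path.join(self.session_dir, str(self.chunk_index))
            os.makedirs(chunk_dir, exist_ok=True)    # numbered folder per chunk
            self.file = open(os.path.join(chunk_dir, "motion.log"), "a")
            self.chunk_start = now
        self.file.write(sample + "\n")
        self.file.flush()                            # hit the disk on every sample

    def close(self):
        if self.file:
            self.file.close()
```

Because every sample is flushed immediately and finished chunks are closed, a crash only ever affects the chunk currently being written.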

@@ -47,8 +49,33 @@ You can press Start play to play the last recording made in the current world. Y
You can export the recordings as either a native Neos animated mesh (as showcased in the above videos), and/or as a BVH file. You can do this by selecting the checkboxes. These checkboxes determine whether these exports are generated while doing a recording, but also while doing a playback! The latter is useful in case you want to generate an animation of a previously recorded motion/voice acting but with a different avatar, or you want to generate a BVH file for a different skeleton! Note:
* Generating an animation while recording has the advantage that the animation will record all blendshapes and bone transforms of every mesh under the user avatar (for example, it will record if you are holding and using a gun, or if you have facial or other animations!)

### Recording fields and objects

This is a very cool feature Joris requested. We can record arbitrary value fields and objects.

See an example of field recording here: https://streamable.com/5eq7nx.

TODO: write docs for object recording.
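Conceptually, recording a value field amounts to storing (time, value) pairs, and playing it back amounts to resampling between them. A minimal sketch of that idea in Python (illustrative only; these function names are not the plugin's API):

```python
from bisect import bisect_right

def record_field(read_value, timestamps):
    """Sample a value field at the given timestamps: the recorded form of a
    field is just a time-sorted list of (time, value) pairs."""
    return [(t, read_value(t)) for t in timestamps]

def sample_at(samples, t):
    """Play back a recorded float field at an arbitrary time by linearly
    interpolating between the two surrounding samples (clamping at the ends)."""
    times = [s[0] for s in samples]
    i = bisect_right(times, t)
    if i == 0:
        return samples[0][1]
    if i == len(samples):
        return samples[-1][1]
    (t0, v0), (t1, v1) = samples[i - 1], samples[i]
    return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
```

Interpolation like this only makes sense for continuous value types; discrete fields would instead hold the last recorded value.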
### Puppeteer a Neos avatar from Unity

These are the instructions to puppeteer a Neos avatar from Unity. Here it is being driven from a BVH file, for example: https://twitter.com/guillefix/status/1394045763345334273.

**Note:** this currently only works for the rig provided. Different rigs need different translations between the rig and the Neos proxy conventions. An automated system to adapt to any rig will be coming in the future (see discussion with lox in the MetaGen discord).

1. Download this Unity project: https://github.com/guillefix/shadermotion-bvh-utils
2. Open the scene in `Assets/shadermotion-dev/Example/Example2_motion_stream.scene`
3. Press play.
4. In Neos, having loaded the plugin as described above, check the "External source" checkmark in the Debug play section, and then press Start Play. The BVH file gn1.bvh should be driving the rig in Unity (dancebase), which should be driving the default avatar in Neos.

You can change the avatar by dropping the avatar slot into the avatar slot field in the MetaGen UI. Some avatars won't work because of a scaling issue. Making the scale in the AvatarRoot component of the avatar match the scale of the avatar slot sometimes fixes this (needs further investigation).

In the Unity scene, in the dancebase gameObject, in the BVHAnimationLoader script, there are two fields, `SkeletonScale` and `SkeletonShiftY`, which sometimes need to be tweaked when using avatars that are smaller or larger (a way to automate this will also be investigated in the future).
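Those two fields essentially compensate for the size difference between the BVH skeleton and the target avatar, so a height ratio gives a reasonable starting point before fine-tuning by hand. A rough heuristic sketch (this is an assumption about how the fields behave, not something verified against the BVHAnimationLoader script; the function is hypothetical):

```python
def guess_skeleton_params(bvh_height, avatar_height, bvh_hip_y=0.0):
    """Guess initial values for SkeletonScale and SkeletonShiftY: scale by
    the ratio of avatar height to BVH skeleton height, and shift the hips
    by the resulting change in hip height (hypothetical model)."""
    scale = avatar_height / bvh_height
    shift_y = bvh_hip_y * (scale - 1.0)   # how much the hips moved after scaling
    return scale, shift_y
```

For example, an avatar half the height of the skeleton would suggest a scale around 0.5; the exact values would still need manual tweaking as described above.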

### Generate animations from video

This is a special case of the above instructions, but using ThreeDPoseTracker.

This uses a version of [ThreeDPoseTracker for Unity](https://github.com/digital-standard/ThreeDPoseUnityBarracuda) which I've modified to send pose data to Neos. This allows you to generate an animation from a video. You can check [here](https://youtu.be/x-VGy3X0bME?t=162) for guidelines on which videos will give the best results, but this is still a technology in development. Check this video for a demo:

[![image](https://user-images.githubusercontent.com/7515537/111398938-96675b00-86c4-11eb-8f9f-7bbe0e34d8b7.png)](https://www.youtube.com/watch?v=k5a_MJhzbdc&ab_channel=GuillermoValle)
@@ -58,6 +85,8 @@ To use it:
1. Download this Unity project and unzip it: https://drive.google.com/file/d/1G2OTyhVysEKXAmIU0-K2IkWU99DgeARa/view?usp=sharing
2. Open the Unity project. Open the scene SampleScene in Scenes. Select Video Player and, on its Video Player component, drop a video onto the Video Clip property. Then play the scene and, after it has begun playing, go to Neos, check the "External source" checkmark in the Debug play section, and then press Start Play. The 3D movement inferred by ThreeDPoseTracker should be reproduced on the avatar you are playing with.
-------------

Credits:
