VR Avatar: Difference between revisions

From Computer Laboratory Group Design Projects
Latest revision as of 19:12, 9 November 2018

Client: Matthew Johnson, Frontier <mjohnson@frontier.co.uk>

With the growth of online VR spaces, users lack a consistent identity and a way of expressing themselves to each other. Create 1) a VR Avatar API that provides a customisable, animated model representing the user, and 2) an example app to demonstrate its application. The model must smoothly animate to reflect the user's physical pose from the hand controller and headset position and orientation data (perhaps employing inverse kinematics). Provide some means by which a user could drive emotive animations and expressions (such as shows of surprise, pity, or anger). Consider customisations for the avatars, and provide some back-end service or 'Shop' from which extensions or customisations can be downloaded.
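The inverse-kinematics step hinted at above can be kept surprisingly small: with only headset and controller positions as input, each arm reduces to a two-bone chain whose elbow bend follows from the law of cosines. The sketch below is a minimal, hypothetical illustration of that idea (the function name and the 2D simplification are assumptions, not part of the brief; a real VR engine would solve this in 3D with joint limits and smoothing):

```python
import math

def two_bone_elbow_angle(upper_len, fore_len, target_dist):
    """Interior elbow angle (radians) for a two-bone arm chain.

    upper_len:   shoulder-to-elbow bone length
    fore_len:    elbow-to-hand bone length
    target_dist: distance from shoulder to the tracked hand controller

    Returns pi for a fully straight arm, 0 for a fully folded one.
    Hypothetical helper for illustration; engines such as Unity or
    Unreal ship full IK solvers with limits and pole-vector control.
    """
    # Clamp the target distance to the chain's reachable range so the
    # solver degrades gracefully when tracking data is noisy.
    lo = abs(upper_len - fore_len)
    hi = upper_len + fore_len
    d = max(lo, min(hi, target_dist))

    # Law of cosines: d^2 = a^2 + b^2 - 2ab*cos(elbow)
    cos_elbow = (upper_len**2 + fore_len**2 - d**2) / (2 * upper_len * fore_len)
    return math.acos(max(-1.0, min(1.0, cos_elbow)))

# Example: equal bone lengths, hand at full reach -> straight arm (pi).
angle = two_bone_elbow_angle(1.0, 1.0, 2.0)
```

Per frame, the avatar animation layer would feed this angle into the skeleton's elbow joint and orient the shoulder toward the controller, interpolating between frames to achieve the smooth motion the brief asks for.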