VR Confession Box
A VR confession box with an AI priest. Should use speech recognition and an embodied conversational agent
A VR virtual pet that can be interacted with via hand gestures
Use game AI and procedural animation to create a simulation of a virtual pet such as a puppy or maybe a peregrine falcon that can be interacted with via sound and hand gestures, tracked using the Leap Motion.
Simulation of bacteria in the gut microbiome using ECS in Unity
The aim of this project is to develop a real-time simulation of how bacteria in the gut microbiome interact. This project was suggested by Dr Alfonso Rodriguez-Herrera of Kilkenny Hospital, who is one of the world experts in the field.
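To give a flavour of the ECS (entity-component-system) approach, here is a minimal sketch of the update loop. Note this is an illustrative Python version, not Unity's actual C# DOTS API; the component names, species labels, and energy constants are all invented for the example.

```python
import random

# Components are plain data arrays indexed by entity id (data-oriented layout).
positions = []   # (x, y) location of each bacterium in the gut lumen
species = []     # species label, e.g. "lactobacillus" (hypothetical)
energy = []      # energy reserve; a real sim would handle death/division

def spawn(kind, x, y, e=5.0):
    """Create a new bacterium entity and return its id."""
    positions.append((x, y))
    species.append(kind)
    energy.append(e)
    return len(positions) - 1

def metabolism_system(dt):
    """Each tick every bacterium burns energy; one system, one concern."""
    for i in range(len(energy)):
        energy[i] -= 0.1 * dt

def motility_system(dt, rng):
    """Random-walk stand-in for chemotaxis: jitter each position."""
    for i, (x, y) in enumerate(positions):
        positions[i] = (x + rng.uniform(-1, 1) * dt,
                        y + rng.uniform(-1, 1) * dt)

rng = random.Random(42)
spawn("lactobacillus", 0.0, 0.0)
spawn("e_coli", 1.0, 1.0)
for _ in range(10):
    metabolism_system(1.0)
    motility_system(1.0, rng)
```

The point of the layout is that each system iterates over one tightly packed array, which is what makes Unity's ECS fast for thousands of entities.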
Medical Training in VR
I recently visited Kilkenny hospital and discovered that there are lots of opportunities to train doctors in various techniques using VR. A lot of training that is currently done using either videos or physical models could be augmented with VR simulations. A couple of examples include how to insert a feeding tube or how to insert a laryngeal mask. Check out this video:
VR Game AI Editing tools
I have developed lots of behaviours & procedural animations for creatures in C# for Unity. Currently, adding or editing behaviours requires using either the Unity editor or Visual Studio. The aim of this project would be to make an editor that allows creatures to be edited and created directly in VR using motion controllers. A bit like this dialog tree editor from Westworld:
Buck Rogers & the Planet of Zoom VR
A VR version of this cool game I used to play on the MSX
VR I Ching
The I Ching is a 3,000-year-old divination text from China. Six numbers between 6 and 9 are chosen by throwing sticks. These are turned into a hexagram, which can then be looked up in the I Ching book to provide insight and guidance into a question posed by the thrower. The aim of this project is to use motion tracking to simulate throwing the sticks and then to provide the text from the I Ching book for the corresponding hexagrams.
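The divination logic itself is simple to sketch. The snippet below assumes the common three-coin method (each coin counts 3 for heads, 2 for tails, so each line sums to 6–9) rather than the full yarrow-stalk procedure, which produces the same values with different probabilities:

```python
import random

def throw_line(rng):
    """Three coins: heads = 3, tails = 2; the sum is always 6, 7, 8 or 9."""
    return sum(rng.choice((2, 3)) for _ in range(3))

def cast_hexagram(rng):
    """Six throws, built bottom line first, as in traditional practice."""
    return [throw_line(rng) for _ in range(6)]

def line_types(values):
    """7/9 are yang (solid) lines, 6/8 are yin (broken);
    6 and 9 are additionally 'changing' lines."""
    return ["yang" if v % 2 == 1 else "yin" for v in values]

rng = random.Random(0)
hexagram = cast_hexagram(rng)
lines = line_types(hexagram)
```

In the VR version, the physics of the thrown sticks would replace `rng`, and the six resulting values would index into the text of the book.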
VR Potter's wheel with deformable physics
A simulation of a pottery wheel that uses hand tracking and deformable physics to allow the user to learn how to make pottery
Cross platform Tunepal App using React Native
Tunepal is a popular music retrieval app with native clients for Android, iOS and the web. The aim of this project would be to rebuild and update Tunepal using React Native so that a common codebase can be used to build all the client apps.
Exploring the psychedelic experience through virtual reality
Using VR to convey what a psychedelic experience looks and sounds like. The first step would be to go through trip reports on Erowid to discover the characteristics of various psychedelics. For example, LSD visuals are quite distinct from the visions and contact with non-human intelligences that people report when they smoke DMT. The project would then attempt to reproduce the experience in VR so that people can see what it looks like without taking a psychedelic. This could be sound driven, like SoundSelf. It might also incorporate shader programming to generate visuals, along with generative 3D art and texturing using geometry shaders and other advanced Unity programming techniques.
A framework for procedurally creating and animating walking creatures
The aim of this project would be to create a framework that can make a variety of random walking creatures built from components. Similar to the No Man’s Sky creatures. Check out my swimming creatures from DEEP
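A common starting point for procedural walking is a phase-offset gait: each leg follows the same periodic foot-lift curve, shifted in phase so that some feet are always planted. Here is a minimal sketch of that idea (the lift height, period, and four-legged layout are arbitrary example values, not taken from my DEEP creatures):

```python
import math

def foot_lift(t, period, phase, lift=0.2):
    """Foot height for one leg: a sinusoidal swing, clamped at zero so the
    foot stays on the ground during its stance phase."""
    s = math.sin(2 * math.pi * t / period + phase)
    return max(0.0, s) * lift

def gait_phases(num_legs):
    """Spread leg phases evenly around the cycle."""
    return [2 * math.pi * i / num_legs for i in range(num_legs)]

phases = gait_phases(4)
heights = [foot_lift(0.25, 1.0, p) for p in phases]
```

In a full framework, these foot targets would drive an inverse-kinematics solver per leg, and the body would bob and tilt in response to the planted feet.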
Real-time live music visualisation in VR
The aim of this project would be to create a virtual reality experience with a beautiful visual environment that responds dynamically to music. It could be used by someone while playing an instrument.
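The core of any music-reactive visual is spectral analysis: split each audio frame into frequency bins and map band energy onto visual parameters. The sketch below uses a naive DFT on a synthesized tone purely for illustration (a real app would use an FFT over microphone input, and "bloom" is just a stand-in name for whatever visual parameter you drive):

```python
import cmath, math

def dft_magnitudes(samples):
    """Naive O(n^2) DFT — fine for a sketch; use an FFT in practice."""
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) / n
            for k in range(n // 2)]

# Synthesize one frame of a pure tone landing in bin 4 of a 64-sample window.
n = 64
frame = [math.sin(2 * math.pi * 4 * t / n) for t in range(n)]

mags = dft_magnitudes(frame)
dominant_bin = max(range(len(mags)), key=lambda k: mags[k])
# Map spectral energy onto a visual parameter in [0, 1], e.g. bloom intensity.
bloom = min(1.0, mags[dominant_bin] * 2.0)
```

The same per-band energies could scale particle emission, colour, or geometry in the VR scene, updated once per audio frame.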
Generative, visionary art in virtual reality
The aim of this project would be to create some beautiful, generative psychedelic, visionary art that can be experienced in virtual reality. The experience can be created through a combination of pre-generated assets and art created algorithmically by using mathematical functions such as fractals.
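As an example of the algorithmic side, the classic fractal colouring function counts how many iterations of z = z² + c a point survives before escaping. This sketch samples a tiny grid of escape times for one Julia set (the constant c and grid bounds are arbitrary example choices); in Unity, values like these would feed a texture or drive vertex colours in a shader:

```python
def julia_escape(x, y, c=complex(-0.8, 0.156), max_iter=50):
    """Iterations before z = z*z + c escapes |z| > 2 — the standard
    Julia-set escape-time colouring function."""
    z = complex(x, y)
    for i in range(max_iter):
        if abs(z) > 2.0:
            return i
        z = z * z + c
    return max_iter

# Sample an 8x8 grid over [-1.5, 1.5] x [-1.5, 1.5].
grid = [[julia_escape(-1.5 + 3.0 * i / 7, -1.5 + 3.0 * j / 7)
         for i in range(8)] for j in range(8)]
```

Animating c over time gives an endlessly morphing form, which is one cheap route to "visionary" generative visuals.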
Virtual Reality Wingsuit Simulation
A simulation of what it’s like to fly in a wingsuit. This could use the Vive controllers
Check out this demo of Tony Cullen’s (DT228/4) Game Engines assignment from a few years ago:
Conversational agent in VR
Build a chat bot that you interact with in VR
Simulation of Riding on the Back of a Dragon
Inspired by these scenes from the movie "How to Train Your Dragon"
The aim of this project is to create a VR simulation of riding on the back of a dragon. I expect you will build a hardware controller with some reins, sensors and an Arduino so that you can control the virtual dragon in the same way as in the movie. You should be able to fly over a virtual world and get your dragon to breathe fire, etc. You could also use fans and haptics to enhance the sense of immersion. Check out my lovely vertical video where we tried to make something like this at a GameCraft:
ET Bike Chase Simulator
Similar to the above, but using VR, an exercise bike and fans to allow the user to experience this classic scene:
Networked Gesture Controlled Robot Arm
The aim of this project would be to use a Kinect or Leap Motion to track a person's movements and transmit them remotely to a robot arm, or even a humanoid robot.
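The networking side boils down to serialising each tracked pose into a compact packet and streaming it (e.g. over UDP) to the robot. The sketch below shows one possible wire format — the joint list, the little-endian frame layout, and the degree units are all assumptions for illustration; a real robot arm will define its own protocol:

```python
import struct

# Hypothetical 6-DOF arm: one angle (degrees) per joint, plus a frame counter.
JOINTS = ("base", "shoulder", "elbow", "wrist_pitch", "wrist_roll", "gripper")

def pack_frame(frame_id, angles):
    """Serialise one tracked pose: uint32 frame id + six float32 angles."""
    if len(angles) != len(JOINTS):
        raise ValueError("expected one angle per joint")
    return struct.pack("<I6f", frame_id, *angles)

def unpack_frame(payload):
    """Decode a payload back into (frame_id, angles) on the robot side."""
    frame_id, *angles = struct.unpack("<I6f", payload)
    return frame_id, list(angles)

pose = [10.0, 45.0, -30.0, 0.0, 90.0, 15.0]
payload = pack_frame(7, pose)
fid, decoded = unpack_frame(payload)
```

Each 28-byte packet could then be sent with a plain UDP socket; the frame counter lets the receiver drop out-of-order packets, which matters more than reliability for live teleoperation.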
Nanobot Chemistry/Biology/Inside a Computer VR game
The aim of this project would be to give the user the experience of being shrunk down to nano size so that they can (for example) manipulate individual atoms and assemble molecules. The nano-scale player could also explore gameplay based on biology, or a model of a simple computer in VR that the user can observe and manipulate. You could use the Kinect/Rift to allow the player to experience the world. This should be a serious game – in other words, a fun game that can also teach the player something useful. Check out this prize-winning project from DT228 grad Mark Dunne from some years ago as inspiration: