VR I Ching
The I Ching is a 3,000-year-old divination text from China. Six numbers between 6 and 9 are chosen by throwing sticks. These are turned into a hexagram, which can then be looked up in the I Ching to provide insight and guidance on a question posed by the thrower. The aim of this project is to use motion tracking to simulate throwing the sticks and then to display the text from the I Ching for the corresponding hexagrams.
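The throw itself is easy to simulate in software. Here is a minimal sketch using the three-coin method (a common alternative to yarrow sticks, in which heads counts 3 and tails counts 2, so three coins always sum to a number between 6 and 9); the function names are illustrative, not from any particular library:

```python
import random

def throw_line(rng=random):
    """One line via the three-coin method: heads = 3, tails = 2,
    so three coins always sum to a value between 6 and 9."""
    return sum(rng.choice((2, 3)) for _ in range(3))

def throw_hexagram(rng=random):
    """Six throws, traditionally built from the bottom line up."""
    return [throw_line(rng) for _ in range(6)]

def to_symbols(lines):
    """6 and 8 are yin (broken) lines; 7 and 9 are yang (solid).
    6 and 9 are 'changing' lines that yield a second hexagram."""
    return ["-- --" if n in (6, 8) else "-----" for n in lines]

hexagram = throw_hexagram()
print(hexagram)
print(*to_symbols(hexagram), sep="\n")
```

In the VR version, the motion-tracked stick throw would replace the random number generator, and the resulting hexagram would index into the book's text.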
Exploring the psychedelic experience through virtual reality
Using VR to convey what a psychedelic experience looks and sounds like. The first step would be to go through trip reports on Erowid to discover the characteristics of various psychedelics. For example, LSD visuals are quite distinct from the visions and contact with non-human intelligences that people report when they smoke DMT. The project would then attempt to reproduce the experience in VR so that people can see what it looks like without taking a psychedelic. The experience could be sound-driven like SoundSelf, and might incorporate shader programming to generate visuals, along with generative 3D art and texturing using geometry shaders and other advanced Unity programming techniques.
A framework for procedurally creating and animating walking creatures
The aim of this project would be to create a framework that can generate a variety of random walking creatures built from components, similar to the creatures in No Man’s Sky. Check out my swimming creatures from DEEP.
Real-time live music visualisation in VR
The aim of this project would be to create a virtual reality experience with a beautiful visual environment that responds dynamically to music. It could be used by someone while playing an instrument.
Generative, visionary art in virtual reality
The aim of this project would be to create some beautiful, generative, psychedelic, visionary art that can be experienced in virtual reality. The experience can be created through a combination of pre-generated assets and art created algorithmically using mathematical functions such as fractals.
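As a starting point for the fractal approach, here is a minimal sketch that computes Mandelbrot set escape counts and renders them as ASCII; a Unity version would compute the same per-pixel iteration counts in a fragment shader and map them to colour instead of characters. All names here are illustrative:

```python
def mandelbrot_iterations(c, max_iter=50):
    """Count iterations of z -> z*z + c before |z| exceeds 2."""
    z = 0j
    for i in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:
            return i
    return max_iter

def ascii_mandelbrot(width=60, height=24):
    """Map escape counts over the region [-2, 1] x [-1.2, 1.2]
    to a character ramp for a quick terminal preview."""
    chars = " .:-=+*#%@"
    rows = []
    for y in range(height):
        im = -1.2 + 2.4 * y / (height - 1)
        row = ""
        for x in range(width):
            re = -2.0 + 3.0 * x / (width - 1)
            n = mandelbrot_iterations(complex(re, im))
            row += chars[min(n * len(chars) // 51, len(chars) - 1)]
        rows.append(row)
    return "\n".join(rows)

print(ascii_mandelbrot())
```

Animating the region bounds or the iteration limit over time is a cheap way to get the evolving, breathing quality that visionary art needs.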
Virtual Reality Wingsuit Simulation
A simulation of what it’s like to fly in a wingsuit. This could use the Vive controllers.
Check out this demo of Tony Cullen’s (DT228/4) Game Engines assignment from a few years ago:
Conversational agent in VR
Build a chatbot that you interact with in VR.
Unity Game AI Game
I have developed lots of Game AI examples in Unity and C#. For example, these bots:
The aim of this project is to build a framework for AI bots to fight each other. You would program a procedural scene with rooms, corridors and cover, and then program the AI for a team of bots that patrol the scene, attack bots from other teams, find cover, defend themselves and manage their health and ammo. Other people could then program teams of bots to fight yours. I would like the second-semester Game Engines assignment to be for students to program a team of bots that has to kill all of your bots. There would be rules to follow: for example, teams would have to use certain APIs for movement, firing and collision detection, and would not be able to manipulate or query the state of any of the other bots directly. This is a bit like Robocode, but in C# and Unity. You would use steering behaviours, finite state machines, navgraph generation and A* pathfinding, and possibly also behaviour trees.
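The pathfinding part can be sketched as A* over a simple 4-connected grid (the grid format and names here are illustrative; in the real project you would run this over the navgraph generated from the procedural scene, and in Unity it would be written in C#):

```python
import heapq

def a_star(grid, start, goal):
    """A* over a 4-connected grid of (x, y) cells.
    grid[y][x] == 1 marks a wall. Returns a path list or None."""
    def h(p):  # Manhattan distance heuristic, admissible on a grid
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    open_heap = [(h(start), start)]
    came_from = {}
    g = {start: 0}
    while open_heap:
        _, current = heapq.heappop(open_heap)
        if current == goal:
            path = [current]
            while current in came_from:
                current = came_from[current]
                path.append(current)
            return path[::-1]
        x, y = current
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= ny < len(grid) and 0 <= nx < len(grid[0]) \
                    and grid[ny][nx] == 0:
                tentative = g[current] + 1
                if tentative < g.get((nx, ny), float("inf")):
                    g[(nx, ny)] = tentative
                    came_from[(nx, ny)] = current
                    heapq.heappush(open_heap,
                                   (tentative + h((nx, ny)), (nx, ny)))
    return None

# A wall across the middle row forces a detour to the right.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(a_star(grid, (0, 0), (0, 2)))
```

The same search, layered under a finite state machine (patrol / attack / find cover) and steering behaviours for local movement, gives each bot its core decision loop.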
Simulation of Riding on the Back of a Dragon
Inspired by these scenes from the movie “How to Train Your Dragon”:
The aim of this project is to create a VR simulation of riding on the back of a dragon. I expect you would build a hardware controller with reins, sensors and an Arduino so that you can control the virtual dragon in the same way as in the movie. You should be able to fly over a virtual world and get your dragon to breathe fire, etc. You could also use fans and haptics to enhance the sense of immersion. Check out my lovely vertical video where we tried to make something like this at a GameCraft:
ET Bike Chase Simulator
Similar to the above, but using VR, an exercise bike and fans to allow the user to experience this classic scene:
Networked Gesture Controlled Robot Arm
The aim of this project would be to use a Kinect or Leap Motion to track a person’s movements and transmit them remotely to a robot arm, or even a humanoid robot.
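The networking side can be sketched quickly. This assumes a hypothetical wire format of six joint angles sent as little-endian 32-bit floats over UDP; the real format would depend on the robot arm's controller:

```python
import socket
import struct

# Hypothetical wire format: 6 joint angles in degrees,
# little-endian 32-bit floats (24 bytes per packet).
JOINT_FORMAT = "<6f"

def pack_joints(angles):
    """Serialise six joint angles for transmission."""
    return struct.pack(JOINT_FORMAT, *angles)

def unpack_joints(packet):
    return list(struct.unpack(JOINT_FORMAT, packet))

def send_pose(sock, address, angles):
    """Fire-and-forget UDP, a reasonable fit for high-rate tracking
    data where a late packet is worse than a dropped one."""
    sock.sendto(pack_joints(angles), address)

# Example: loop one pose back through a local UDP socket.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))  # OS-assigned port
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
pose = [0.0, 45.0, -30.0, 90.0, 10.0, 0.0]
send_pose(sender, receiver.getsockname(), pose)
packet, _ = receiver.recvfrom(1024)
print(unpack_joints(packet))
```

On the sending end, the Kinect or Leap Motion skeleton would be reduced to these joint angles each frame; on the receiving end, the angles drive the arm's servos.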
Nanobot Chemistry/Biology/Inside a Computer VR game
The aim of this project would be to give the user the experience of being shrunk down to nano size so that they can (for example) manipulate individual atoms and assemble molecules. The nano player could also play through gameplay based on biology, or explore a model of a simple computer in VR that the user can observe and manipulate. You could use the Kinect/Rift to allow the player to experience the world. This should be a serious game – in other words, a fun game that can also teach the player something useful. Check out this prize-winning project from DT228 grad Mark Dunne from some years ago as inspiration: