Augmented and Virtual Reality Development in Unity
- Through the projects outlined below, I have developed experience in several aspects of Augmented Reality (AR) and Virtual Reality (VR) development.
- In AR, I have experience developing image-target-based AR systems that use real-world 2D images as triggers (or targets) to overlay virtual content onto a real-world scene, as well as using the Unity engine’s built-in AR functionality to create open-world interactive AR experiences for Android and iOS devices.
- For VR, I have experience integrating SteamVR into Unity to transform standard Unity scenes into VR environments, developing teleport-based locomotion systems that allow users to navigate those environments, and implementing interactable objects in VR worlds.
- My specific skills in this area are summarised as follows:
- Configuring image targets and developing systems for projecting digital AR content into the real world using Vuforia Engine and Unity.
- Designing 3D environments for AR and VR applications using pre-made 3D models and animations.
- Writing code for camera controllers that allow users to switch perspectives inside AR and VR environments (illustrated in the first sketch after this list).
- Programming character controllers and locomotion systems that allow users to move around AR and VR environments.
- Writing code to enable user interactivity with characters and objects in AR and VR Unity scenes (illustrated in the second sketch after this list).
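
The two sketches that follow are minimal, illustrative examples of the kind of Unity C# scripting described in the list above; they are not code from a specific project, and the class names, key bindings and tags are assumptions. The first shows a simple perspective-switching camera controller: two cameras are assigned in the Inspector and toggled with a key press.

```csharp
using UnityEngine;

// Minimal sketch of a perspective-switching camera controller.
// Both cameras are assigned in the Inspector; the key binding is an assumption.
public class CameraSwitcher : MonoBehaviour
{
    [SerializeField] private Camera firstPersonCamera;
    [SerializeField] private Camera overviewCamera;

    private void Start()
    {
        // Begin in the first-person view.
        SetFirstPerson(true);
    }

    private void Update()
    {
        // Toggle between the two perspectives on a key press.
        if (Input.GetKeyDown(KeyCode.C))
        {
            SetFirstPerson(!firstPersonCamera.enabled);
        }
    }

    private void SetFirstPerson(bool firstPerson)
    {
        firstPersonCamera.enabled = firstPerson;
        overviewCamera.enabled = !firstPerson;
    }
}
```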
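
The second sketch outlines basic object interactivity: a ray is cast from the camera through a click or screen tap, and any object carrying a hypothetical "Interactable" tag responds (here, by rotating).

```csharp
using UnityEngine;

// Minimal sketch of tap/click interaction with scene objects.
// The "Interactable" tag must exist in the project's tag list, and the
// rotation response is purely illustrative.
public class TapInteractor : MonoBehaviour
{
    [SerializeField] private Camera sceneCamera;

    private void Update()
    {
        // Mouse click in the editor; also fires for single screen taps on mobile.
        if (!Input.GetMouseButtonDown(0)) return;

        Ray ray = sceneCamera.ScreenPointToRay(Input.mousePosition);
        if (Physics.Raycast(ray, out RaycastHit hit) && hit.collider.CompareTag("Interactable"))
        {
            // Simple visual response to confirm the interaction.
            hit.collider.transform.Rotate(0f, 45f, 0f);
        }
    }
}
```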
AR and VR Projects
Physical Computing and Interaction Design using Microcontroller Systems and Unity
- I have acquired project-based experience in the area of physical computing by developing several interactive systems that can sense and respond to phenomena in the real world.
- My projects have focused either on using Arduino-based microcontroller systems for specific functions, or on using Arduino hardware to feed analog sensor inputs into a software application developed in Unity.
- As a result of the breadth of projects completed, I have experience working with a wide range of sensor data, including motion, sound, light and climate data.
- My specific skills in this area are summarised as follows:
- Rapid prototyping using a wide range of sensors and devices.
- Writing code to control and interact with Arduino microcontroller systems.
- Applying knowledge of electronics to design and build circuits that can connect and synchronise different types of sensors, actuators and microcontrollers.
- Developing systems for automatically collecting and processing sensor data gathered in real time from different types of sensors and devices (see the sketch after this list).
- Developing applications for tracking and visualising sensor and device outputs.
- Designing and implementing interactive user interfaces for smart devices and games.
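
As a rough illustration of the Arduino-to-Unity pipeline described above, the sketch below reads an analog value streamed over serial into a Unity scene. The port name, baud rate and one-reading-per-line message format are assumptions, and System.IO.Ports is only available when the Unity project's API compatibility level includes the .NET Framework class libraries.

```csharp
using System;
using System.IO.Ports;
using UnityEngine;

// Minimal sketch: read a 10-bit analog reading that an Arduino prints once per
// line (e.g. via Serial.println(analogRead(A0))) and expose it as a 0..1 value
// that other scripts in the scene can consume. Port name and baud rate are
// assumptions and would be set per machine.
public class ArduinoSensorReader : MonoBehaviour
{
    [SerializeField] private string portName = "COM3";
    [SerializeField] private int baudRate = 9600;

    private SerialPort port;

    public float NormalisedValue { get; private set; }

    private void Start()
    {
        port = new SerialPort(portName, baudRate) { ReadTimeout = 50 };
        port.Open();
    }

    private void Update()
    {
        try
        {
            // Parse the latest line and map the 0-1023 ADC range to 0..1.
            string line = port.ReadLine().Trim();
            if (int.TryParse(line, out int raw))
            {
                NormalisedValue = Mathf.Clamp01(raw / 1023f);
            }
        }
        catch (TimeoutException)
        {
            // No new reading this frame; keep the previous value.
        }
    }

    private void OnDestroy()
    {
        if (port != null && port.IsOpen)
        {
            port.Close();
        }
    }
}
```

A more robust version would read the serial port on a background thread rather than inside Update, but the structure above is enough to drive scene objects from live sensor values.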
Physical Computing and Interaction Design Projects
Digital Twinning, Deep Learning and Automatic Knowledge Graph Construction
- My recent research projects have focused on digital twins and on developing deep learning algorithms for knowledge graphs.
- These research projects have allowed me to apply my programming skills to the tasks of:
- Extracting and processing numerical and textual data collected from real-world sensors and social media.
- Automatically generating graphical models using search and deep-learning algorithms (a minimal sketch of a knowledge-graph structure follows this list).
- Visualising information extracted from data using 2D graphical models and advanced 3D visualisations.
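
As a purely illustrative sketch of the kind of structure an automatic knowledge-graph construction pipeline populates, the C# below stores extracted (subject, relation, object) triples and exposes them for querying or visualisation; the class names and example triples are hypothetical and not taken from a specific project.

```csharp
using System;
using System.Collections.Generic;

// Minimal sketch of a knowledge graph as a collection of triples.
// The example entities and relations below are invented for illustration.
public class KnowledgeGraph
{
    public record Triple(string Subject, string Relation, string Object);

    private readonly List<Triple> triples = new();
    private readonly HashSet<string> nodes = new();

    public IReadOnlyCollection<string> Nodes => nodes;

    public void AddTriple(string subject, string relation, string obj)
    {
        triples.Add(new Triple(subject, relation, obj));
        nodes.Add(subject);
        nodes.Add(obj);
    }

    // Outgoing edges of a node, e.g. for rendering it in a 2D or 3D view.
    public IEnumerable<Triple> EdgesFrom(string subject)
    {
        foreach (Triple t in triples)
        {
            if (t.Subject == subject) yield return t;
        }
    }

    public static void Main()
    {
        var graph = new KnowledgeGraph();
        graph.AddTriple("Sensor_01", "locatedIn", "Building_A");
        graph.AddTriple("Building_A", "hasDigitalTwin", "Twin_A");

        foreach (Triple edge in graph.EdgesFrom("Sensor_01"))
        {
            Console.WriteLine($"{edge.Subject} --{edge.Relation}--> {edge.Object}");
        }
    }
}
```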