VR in STEM Education

The proposed research is to develop a Virtual Reality (VR) learning scenario for STEM (science) education. The software will enable students to interact with and augment an immersive 3D virtual learning environment, allowing them to experience inquiry-based learning (IBL) in the science classroom.
IBL has established itself as the pedagogy (teaching strategy) of choice for the future of STEM education within Europe and Ireland. This project will seek to embed and scaffold the essential features of scientific inquiry – Question, Evidence, Analyse and Explain – within a virtual learning environment in order to facilitate student-directed inquiry.
The project will have access to the HTC Vive virtual reality headset - https://www.vive.com/eu/product/.

Keywords: Virtual Reality, IBL, Tactile, Virtual Learning Environments, STEM Education, HTC Vive.

Requirements: knowledge of SteamVR, Unreal, or Unity.
Main tasks:

    Tablet Reading Tool

    Spritz is an application/technique for reading content quickly without needing to move your eyes. It displays each word on the screen in a fixed location, quickly moving on to the next word; using such a technique you can comfortably read at 500+ words per minute. However, one problem is that your eyes get tired and you naturally need to take a break from time to time. The focus of this project is to implement a word-based reading tool based on Spritz, then add an element that observes a webcam/selfie-cam to identify when the user is no longer focused on the screen and pause the playback of words accordingly. Playback can then resume when the user refocuses on the words. I expect that this would take the form of a reading app.

    Keywords: reading, spritz, eye tracking, computer vision
    Requirements: knowledge of Java/Python programming (or equivalent).
    Main tasks:
    • gather an archive of reading material from online news (or equivalent) APIs
    • implement a variable-speed reading technology in the spirit of spritz
    • develop the eye tracking tool to observe user-focus
    • regulate the playback based on the user-focus
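The variable-speed playback task could start from a per-word timing schedule. The sketch below is a minimal Python illustration, assuming a base duration of 60/wpm seconds per word with heuristic multipliers for long words and clause-ending punctuation; these multipliers are invented for illustration and are not Spritz's actual timing model.

```python
def word_durations(words, wpm=500):
    """Return (word, seconds) pairs for an RSVP-style reader.

    Assumes a base duration of 60/wpm seconds per word, stretched
    for long words and clause-ending punctuation -- simple
    heuristics, not Spritz's proprietary timing.
    """
    base = 60.0 / wpm
    schedule = []
    for word in words:
        duration = base
        if len(word) > 8:                   # long words need more fixation time
            duration *= 1.5
        if word and word[-1] in ".,;:!?":   # pause at clause boundaries
            duration *= 2.0
        schedule.append((word, duration))
    return schedule

text = "Spritz displays each word at a fixed focal point, eliminating eye movement."
schedule = word_durations(text.split(), wpm=500)
total = sum(d for _, d in schedule)
print(f"{len(schedule)} words in {total:.2f}s")   # → 12 words in 1.86s
```

A full reader would feed this schedule into a display loop, and the eye-tracking component would simply pause and resume iteration over it.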

    Your Life in VR

    Virtual Reality is fast becoming a popular technology. At the same time, it is easy to capture vast archives of personal multimedia data. This project aims to bring together both of these fields to develop a first-of-its-kind VR lifelog visualisation tool that allows an individual to explore their past life data in a 3D environment. This involves developing for the HTC Vive.

    Keywords: lifelog, VR
    Requirements: a strong ability to develop software and an interest in developing visual games.
    Main tasks:
    • Gather lifelog data, or use existing datasets
    • Import the data into the Unity engine
    • Design and develop a 3D world for navigating your life experience data
    • Engage in a small user trial to evaluate the VR system.
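One common way to approach the import step is to preprocess the lifelog records outside the engine into a simple JSON structure that a Unity scene can load at runtime. The Python sketch below groups image records by calendar day; the record schema ('timestamp', 'path') is an assumption made for illustration, not a standard lifelog format.

```python
import json
from collections import defaultdict

def group_by_day(records):
    """Group lifelog image records by calendar day for a 3D timeline.

    `records` is a list of dicts with 'timestamp' (ISO 8601 string,
    e.g. '2024-03-01T09:15:00') and 'path' keys -- an assumed schema
    for illustration only.
    """
    days = defaultdict(list)
    for rec in records:
        day = rec["timestamp"][:10]   # 'YYYY-MM-DD' prefix of ISO 8601
        days[day].append(rec["path"])
    return dict(days)

records = [
    {"timestamp": "2024-03-01T09:15:00", "path": "img_0001.jpg"},
    {"timestamp": "2024-03-01T13:40:00", "path": "img_0002.jpg"},
    {"timestamp": "2024-03-02T08:05:00", "path": "img_0003.jpg"},
]
print(json.dumps(group_by_day(records), indent=2))
```

The resulting JSON could then be read on the Unity side (e.g. with its built-in JSON utilities) to place each day's images in the 3D world.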

    Brain Computer Interfaces

    Using contextual sources (e.g. wearable cameras) and EEG (sensing brain activity using wearable devices), it is possible to identify important events in real-time for an individual, record these events from their point of view, and then develop a life-experience search engine for the individual, as a form of digital memory archive. This includes learning the identifiers of interesting events from contextual and EEG sources, developing associated event segmentation and importance weighting algorithms, and developing subsequent retrieval tools for this archive.

    Keywords: lifelog, EEG
    Requirements: knowledge of Java/Python programming (or equivalent) and an interest in data analytics.
    Main tasks:
    • Gather data using a portable EEG and a lifelogging camera
    • Time-align the data into a dataset
    • Apply a state-of-the-art event segmentation algorithm
    • Enrich and segment using EEG data
    • Begin a user trial
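The time-alignment step can be sketched as nearest-neighbour matching between the two timestamp streams. The Python example below assumes both devices report Unix timestamps in seconds and that any clock offset between them has already been corrected; real EEG and camera clocks would need that correction first.

```python
from bisect import bisect_left

def align(eeg_times, image_times):
    """Pair each image timestamp with the index of the nearest EEG sample.

    Both inputs are sorted lists of timestamps in seconds -- a
    simplified stand-in for real device clocks.
    """
    pairs = []
    for t in image_times:
        i = bisect_left(eeg_times, t)
        # choose the closer of the two neighbouring EEG samples
        if i == 0:
            j = 0
        elif i == len(eeg_times):
            j = len(eeg_times) - 1
        else:
            j = i if eeg_times[i] - t < t - eeg_times[i - 1] else i - 1
        pairs.append((t, j))
    return pairs

# 2 Hz EEG samples and two lifelog image captures (toy values)
pairs = align([0.0, 0.5, 1.0, 1.5, 2.0], [0.3, 1.6])
print(pairs)   # → [(0.3, 1), (1.6, 3)]
```

With the streams aligned, each lifelog image can be annotated with the EEG features recorded around its capture time, which is the input the segmentation and weighting steps need.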

    Self-search Engine

    The challenge of this project is to build a search engine for personal lifelog data from wearable cameras, biometric data and smartphone sensors. This search engine should operate in a manner similar to Google, but search a person’s life instead of web pages. Images and multimedia data can be made searchable using online APIs, such as the Computer Vision API from Microsoft. Knowledge of machine learning would be useful here, but is not essential.

    Keywords: Lifelog, search engine, computer vision API
    Requirements: knowledge of Java/Python programming (or equivalent).
    Main tasks:
    • Interface with online APIs to get metadata for multimedia lifelog data
    • Index this data into a search engine
    • Provide Google-style search over this life data archive.
    • Evaluate the user-feedback for the system.
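The indexing and search tasks can be sketched with a small inverted index that maps API-returned concept tags to lifelog moments. In the Python sketch below the moment ids and tags are invented for illustration; a real system would populate them from a vision API such as Microsoft's Computer Vision API.

```python
from collections import defaultdict

def build_index(docs):
    """Build an inverted index from concept tag to lifelog moment ids.

    `docs` maps a moment id to the list of tags a vision API returned
    for it -- the ids and tags here are invented for illustration.
    """
    index = defaultdict(set)
    for doc_id, tags in docs.items():
        for tag in tags:
            index[tag.lower()].add(doc_id)
    return index

def search(index, query):
    """Return ids of moments matching every query term (AND semantics)."""
    results = [index.get(term, set()) for term in query.lower().split()]
    return set.intersection(*results) if results else set()

docs = {
    "moment_01": ["beach", "dog", "sunset"],
    "moment_02": ["office", "laptop"],
    "moment_03": ["beach", "laptop"],
}
index = build_index(docs)
print(sorted(search(index, "beach laptop")))   # → ['moment_03']
```

A Google-style interface would sit on top of this, adding ranking (e.g. by tag confidence or recency) rather than the plain boolean match shown here.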