These are the project proposals from Prof Alan Smeaton, Director, Insight Centre for Data Analytics at DCU.
Details on me, who I am, what I do, etc. can be found here.
Students interested in having me as their FYP supervisor should note the following about me and the projects I supervise:
- I don't supervise easy projects; all my projects are challenging and open-ended. That means that when we start a project we don't always know where it will take us, but in more than three decades (yep, that's right, three decades) of supervising projects like these, they've always ended up in a good place;
- Students who work with me tend to do well in their FYP marks ... the majority end up with First Class marks for their projects. Over the last several years, for example, I think maybe half a dozen of my projects did not end up with Firsts, and I supervise about 5 or 6 projects each year, so that's a good ratio;
- If you put the two points above together, you'll realise that students who work with me tend to be good students ... they work hard and do well, so draw your own conclusions.
For this year I'm interested in supervising projects like the following. If you don't know what IBM's Watson is, check out some YouTube videos. I'll give a brief overview of project options with me to the CA4 and EC4 classes during the second week of semester, and after that I'll wait to be contacted by you if you're interested. But be warned: I have a cutoff and don't take on all project supervision requests.
I've sketched several project ideas from previous years below, just to give an indication of the areas I work in. These were opening shots which got teased into fuller project proposals; most of them were taken.
- I have one of these: https://www.myo.com/ It communicates in real time over Bluetooth to a laptop and sends EMG signals. I'd like to see how it could be used in conjunction with a handheld instrument which in turn has sensors like accelerometers. So think of a surgeon's scalpel with a built-in accelerometer working in tandem with a Myo to record a surgeon's actions in theatre. Or, as a second possibility, think of using the Myo in conjunction with dextrous use of the fingers, like playing piano or tin whistle ... how accurately can music be replicated with less effort from the fingers, would we notice the difference in the music, would two music pieces be identical?
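A first technical step in a project like this would be fusing the two sensor streams, which arrive at different rates. A minimal sketch (the timestamps and sample formats here are hypothetical, not the Myo SDK's actual API) pairs each EMG sample with the accelerometer reading closest to it in time:

```python
from bisect import bisect_left

def align_streams(emg, accel):
    """For each EMG sample (t, value), find the accelerometer sample
    (t, x, y, z) whose timestamp is closest, giving fused records."""
    times = [t for t, *_ in accel]
    fused = []
    for t, value in emg:
        i = bisect_left(times, t)
        # pick the nearer of the two neighbouring accelerometer samples
        candidates = [j for j in (i - 1, i) if 0 <= j < len(accel)]
        j = min(candidates, key=lambda j: abs(times[j] - t))
        fused.append((t, value) + tuple(accel[j][1:]))
    return fused
```

A real system would do this continuously as samples arrive, but nearest-timestamp matching is the essential idea.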
- I recently did a segment on RTE TV with Jonathan McCrea on what makes some things more or less memorable than others. He wore a consumer-grade EEG and a wearable camera, I quizzed him on what things he remembered, and we then correlated that with the EEG readings. It would be a really nice project to repeat this on a larger scale rather than just a few hours for TV, and to correlate real-time EEG readings with prompting people about important things that are happening which they should remember, but where their EEG readings indicate they are not in a good "memory encoding" state.
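Assuming each event is reduced to one continuous EEG feature (say, band power during encoding — a hypothetical choice, not the measure from the TV segment) and a binary remembered/forgotten outcome, the correlation step could be sketched as a point-biserial correlation:

```python
def point_biserial(scores, remembered):
    """Correlation between a continuous EEG feature per event and a
    binary recalled (1) / not-recalled (0) outcome for that event."""
    n = len(scores)
    mean = sum(scores) / n
    sd = (sum((s - mean) ** 2 for s in scores) / n) ** 0.5
    group1 = [s for s, r in zip(scores, remembered) if r]
    group0 = [s for s, r in zip(scores, remembered) if not r]
    m1 = sum(group1) / len(group1)   # mean feature when remembered
    m0 = sum(group0) / len(group0)   # mean feature when forgotten
    p = len(group1) / n
    return (m1 - m0) / sd * (p * (1 - p)) ** 0.5
```

A value near +1 would suggest the EEG feature strongly predicts later recall, which is exactly what a real-time prompting system would need.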
- The Alexa family of home devices allows voice-controlled interaction because of their far-field microphones. The platforms include Amazon Echo and Google Home, and they're used for controlling home devices. But what about managing your work: your assignments, deadlines, material to be read on Loop, coursework, etc.? I'd like to use one of these to help with a student's learning by interacting with the student, driven by material appearing on Loop. Details of the devices can be found here.
- One of our partners has developed technology to turn HD video footage from a drone into a 3D model ... see this clip from All Hallows, and there's a similar one for the Trinity campus. They're doing DCU shortly. So what could you do with this 3D model? There's a bit of creativity needed, but lots of potential.
- Another one of our partners has lots of EEG data and HD camera monitoring from infants in intensive care units in Cork University Maternity Hospital. They're interested in identifying the seizures that the babies have, using the EEG, which absolutely identifies seizures, and also using whatever twitches or other facial features might indicate seizures. We've implemented Eulerian magnification (see here) of video in MATLAB to accentuate tiny movements, which could then be correlated with EEG-determined seizures.
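The core idea of Eulerian magnification is to band-pass filter each pixel's intensity over time, amplify the filtered signal, and add it back. A minimal single-pixel sketch in Python (our actual implementation is in MATLAB; the crude difference-of-moving-averages band-pass below stands in for the proper temporal filters applied over a spatial pyramid):

```python
def moving_average(x, k):
    """Trailing moving average with window size k."""
    out = []
    for i in range(len(x)):
        window = x[max(0, i - k + 1): i + 1]
        out.append(sum(window) / len(window))
    return out

def magnify(trace, alpha=10.0, fast=2, slow=8):
    """Band-pass one pixel's intensity over time by differencing a
    fast and a slow moving average, then amplify the passed band
    and add it back to the original signal."""
    band = [f - s for f, s in zip(moving_average(trace, fast),
                                  moving_average(trace, slow))]
    return [v + alpha * b for v, b in zip(trace, band)]
```

A tiny twitch (a brief bump in the trace) comes out roughly alpha times larger, while a constant signal passes through unchanged.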
- Finally, I'm also interested in supervising projects which sit on top of IBM's Watson system. I've supervised Watson projects in each of the last couple of years, as I have access to a Watson instance through IBM contacts. In previous years I've supervised a video captioning system, a tourist app for 1916 hotspots, an evaluation of StackExchange answer ranking and a summarisation system for papers on immunology.
Intellectual Property arising from these projects
The School of Computing has formulated a policy on the distribution of intellectual property arising from student projects. This is based on an a priori disbursement agreement between lecturer and student, in cases where the lecturer has a non-trivial involvement in the project, where the lecturer wants to put such an agreement in place, and where the student agrees.
In the case of these projects there is a considerable amount of background knowledge provided, and a significant amount of direction and facilitation by myself and other members of the Insight Centre for Data Analytics. Consequently we shall require students to sign over intellectual property development rights for such projects before the commencement of the project. This has been standard practice for all FYPs supervised by me for about the last decade, and guarantees that investment in my work by funding agencies does not lead to IP leakage.
- Using eye tracking to determine the relevance of images - The Tobii eyeglasses are a pair of glasses that record video combined with eye tracking of what the wearer is actually looking at. This lets us record what the wearer sees, and specifically what they are looking at, which is useful in applications where human behaviour is important, like shopping or searching, for example. See http://www.youtube.com/watch?v=HncswXCcYBE/&feature=youtu.be for an example; the equipment we have is http://www.tobii.com/eye-tracking-research/global/products/hardware/tobii-glasses-eye-tracker/. This project will combine an image retrieval system, which finds images from a database based on colour, shape, texture and objects, with the eye tracking glasses, to see what parts of the images a user is looking at. Based on the user's eye tracking we can make judgements about relevance to feed back into the retrieval system. What is expected ... a working system which integrates an already-existing image retrieval tool with the software analysis from the eye tracker, and a set of experiments with real users.
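The feedback loop in this project could be sketched as follows. The per-image fixation totals and the 0.3-second threshold below are purely illustrative assumptions, not parameters from the Tobii software:

```python
def relevance_from_fixations(fixations, threshold=0.3):
    """Map total fixation time per image (seconds) to a pseudo-relevance
    judgement: images looked at longer than `threshold` count as relevant."""
    return {img: total >= threshold for img, total in fixations.items()}

def rerank(scores, judged, boost=0.5):
    """Boost the retrieval scores of images judged relevant and demote
    the rest, returning image ids in the new ranked order."""
    adjusted = {img: s + (boost if judged.get(img) else -boost)
                for img, s in scores.items()}
    return sorted(adjusted, key=adjusted.get, reverse=True)
```

A fuller system would feed the judged images back as query expansion rather than a flat score boost, but this shows the shape of the loop.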
- Video analysis from a UAV drone - We have access to UAV drones with built-in stability and a video camera transmitting back to an iPhone/Android handset, described in http://www.expansys.ie/parrot-ar-drone-green-outdoor-hull-indoor-hull-battery-charger-194641/. This project is about analysing the video streamed from the drone in order to locate something known. So take a large open area and send the drone out to 'find' something. In an open indoor lab, it might be to find where I left my phone or keys. It's a bit of overkill for an application like that, but it's a proof of concept and would involve route planning, route tracking, etc.
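The route-planning part could start from a simple "lawnmower" sweep that guarantees the camera covers the whole search area. A sketch, where the area dimensions and spacing between passes are made-up parameters:

```python
def lawnmower(width, height, spacing):
    """Generate a back-and-forth sweep covering a width x height area,
    with `spacing` metres between passes, as (x, y) waypoints."""
    waypoints = []
    y, left_to_right = 0, True
    while y <= height:
        xs = (0, width) if left_to_right else (width, 0)
        waypoints += [(xs[0], y), (xs[1], y)]
        left_to_right = not left_to_right  # reverse direction each pass
        y += spacing
    return waypoints
```

The spacing would be chosen from the camera's field of view and altitude so that consecutive passes overlap slightly.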
- Fushigi is a form of contact-free juggling. It is based around using a transparent ball, about the size of a tennis ball, and performing a series of movements that create the illusion that the ball is gravity-free and suspended in mid-air. It's all trickery of course: the ball is kept in the one place while the arms flail around doing different motions, creating the illusion. Details can be seen at http://www.youtube.com/watch?v=3BXbMFGWL-k This project is to create a smartphone app using the OpenCV library to track the position of the Fushigi ball and measure how much it moves in space, thus giving feedback to the person on how well they are performing ... think of it as a smartphone-based Fushigi training app with real-time video analytics and user feedback.
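Once OpenCV has located the ball in each frame (by colour thresholding or circle detection, say), the feedback metric itself is simple. One possible score, assuming the tracker yields per-frame (x, y) ball centres in pixels (an assumed interface, not OpenCV output as-is):

```python
def steadiness_score(positions):
    """Given per-frame (x, y) ball centres from a tracker, score how
    steady the ball is: the mean distance of the ball from its average
    position, in pixels.  Lower means a better 'suspended' illusion."""
    n = len(positions)
    cx = sum(x for x, _ in positions) / n
    cy = sum(y for _, y in positions) / n
    return sum(((x - cx) ** 2 + (y - cy) ** 2) ** 0.5
               for x, y in positions) / n
```

The app could report this score live over a sliding window of recent frames, so the performer sees immediately when the ball drifts.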
- Analysis of a Twitter dataset - We have collected several tens of millions of tweets specifically on the topics of fast food and soft drinks, and the project here would be to analyse this dataset for social networking, sentiment, geography, and inter-connectedness. From the dataset I would like to determine a cohort of Twitter users whose tweets could then be tracked to monitor sentiment towards fast food or fizzy drinks.
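A first pass at the sentiment part could be a simple lexicon count; the word lists below are toy examples only, and a real project would use an established sentiment lexicon or a trained classifier:

```python
POSITIVE = {"love", "great", "tasty", "delicious"}
NEGATIVE = {"hate", "awful", "greasy", "gross"}

def sentiment(tweet):
    """Crude lexicon sentiment: +1 per positive word, -1 per negative.
    Positive totals suggest favourable sentiment, negative unfavourable."""
    words = tweet.lower().split()
    return (sum(w in POSITIVE for w in words)
            - sum(w in NEGATIVE for w in words))
```

Aggregating this score per user over time is one way to pick out and then monitor the cohort described above.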
- Yahoo! have created a collection of 100,000,000 images, each of them tagged and many geo-tagged, and for certain research groups (including ours) they have provided access to processing resources to access and analyse these images. The project idea here is to take a user photo, locate fragments of the user photo that occur in the Yahoo! dataset, and automatically build a set of tags or captions for the new photo from the tags of the images in the 100m dataset to which it is similar. So if you take a picture of the Molly Malone statue in Dublin, other photos of the same thing from the dataset will be tagged with things like DUBLIN, STATUE and MOLLY MALONE, and your new photo can be similarly tagged.
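Once the similar images have been retrieved, the tagging step is essentially weighted voting. A sketch, where the similarity scores and tags are made-up illustrations rather than real dataset values:

```python
from collections import Counter

def propagate_tags(neighbours, k=3):
    """Given (similarity, tags) pairs for the dataset images most
    similar to a new photo, vote for each tag with the similarity as
    its weight and return the k best-supported tags."""
    votes = Counter()
    for sim, tags in neighbours:
        for tag in tags:
            votes[tag] += sim
    return [tag for tag, _ in votes.most_common(k)]
```

Weighting by similarity means a tag seen on one near-duplicate can outvote a tag seen on several only-vaguely-similar images.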
- DCU student records are like the public body records that most organisations have ... they are messy and full of errors. Name variations (Donnacha vs. Donncha, or Alan vs. Allan), spelling errors, and variations in addresses (Millview Road, Millview Rd., M'view Rd., etc.) all cause incorrect processing. Of course when postcodes are introduced these problems will all disappear, but until then we're stuck with messy data. This project will investigate data cleaning methods and evaluate some of them on a real dataset from DCU.
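Two standard building blocks for this kind of cleaning are abbreviation normalisation and approximate string matching. A sketch of both (the abbreviation table is illustrative, not a complete mapping):

```python
def normalise(address):
    """Expand common address abbreviations before comparison."""
    replacements = {"rd.": "road", "rd": "road", "st.": "street"}
    return " ".join(replacements.get(w, w)
                    for w in address.lower().split())

def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming: the
    minimum number of insertions, deletions and substitutions
    needed to turn string a into string b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]
```

Records whose normalised fields sit within a small edit distance of each other become candidate duplicates for merging or manual review.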
(Student Agreement Form - DOC; 26K)
Date: 21 September 2017.
The best way to contact me is by email ... don't just call to my office without checking with me first, as I probably won't be there!