Showcased at the ICAM Senior Exhibit with 150+ guests in attendance
Purpose: This is my senior project, which was showcased on June 11-13, 2019 at "Silience", the ICAM Senior Exhibition
Categories: Computer Vision Program, Facial Recognition, Interactive Design
Outcome: Created a web application in two quarters that portrays machine learning in an ironic way
"Smart Select" is an online music listening platform that recommends songs based on users' current moods. Once started, users can either manually press buttons to indicate their emotions on the left or initiate the live detector on the right. "Smart Select" will then redirect users to recommended YouTube music playlists.
However, the two input methods yield different results. If users choose their emotions manually by clicking the emotion buttons, "Smart Select" generates a YouTube playlist from a mood keyword search, for example, happy, sad, or neutral. If users instead input their emotions through facial recognition, they receive lists of animated memes that I deliberately chose. Even worse, "Smart Select" secretly spies on users' emotions and steals their data by taking screenshots.
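The two recommendation paths above can be sketched as follows. This is a minimal illustration, not the app's actual code: the function name, playlist URLs, and emotion labels are all hypothetical stand-ins.

```python
from urllib.parse import quote_plus

# Hypothetical curated meme playlists, standing in for the lists
# the app deliberately redirects facial-recognition users to.
MEME_PLAYLISTS = {
    "happy": "https://youtube.com/playlist?list=MEME_HAPPY",
    "sad": "https://youtube.com/playlist?list=MEME_SAD",
    "neutral": "https://youtube.com/playlist?list=MEME_NEUTRAL",
}

def recommend(emotion: str, source: str) -> str:
    """Return a YouTube URL for the given emotion.

    source == "button":   honest mood-keyword search.
    source == "detector": the deliberately curated meme list.
    """
    if source == "button":
        # Straightforward keyword search, e.g. "happy music playlist".
        return ("https://www.youtube.com/results?search_query="
                + quote_plus(f"{emotion} music playlist"))
    # Facial-recognition path: redirect to a prank playlist instead;
    # unknown emotions fall back to the neutral list.
    return MEME_PLAYLISTS.get(emotion, MEME_PLAYLISTS["neutral"])
```

The ironic twist lives entirely in the branch on `source`: the same emotion produces an earnest playlist from one input method and a prank from the other.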
What makes my app unique is its ability to detect users' real-time emotions, in a funny way. Nowadays, the majority of apps provide recommendations based on usage history. Although using past data to guess users' needs seems somewhat plausible, this method can hardly predict the future precisely. Gathering real-time data and returning an immediate output would significantly help existing apps provide more personalized recommendations. In such a fast-moving world with the increasing influence of data, I firmly believe that technology built on real-time data will soon be commonplace in people's daily lives. However, I doubt the accuracy of guessing people's emotions with a machine alone, and I question whether doing so can guarantee internet privacy and safety.
Therefore, "Smart Select" expresses my mixed feelings about machine learning.