
Perfect dressing without putting on clothes

27 February 2013



A user can choose a garment to try out from the Magic Closet with a wave of the hand


The team comprising Dr Yan (second from left), Mr Nguyen (third from left) and Dr Liu (fourth from left) with their Best Technical Demo Award at the ACM MM Conference 2012

Help is at hand for those who have trouble deciding what to wear or how best to mix and match their clothes for the best look.

A "magic closet" developed by an NUS team will allow users to try out different outfits with a wave of the hand. It will also dispense tips and suggestions on dressing up for different occasions. This eliminates the hassle of taking off clothes and putting on garments that may be of wrong sizes, as well as jostling with crowds lining up for fitting rooms.

The team that created the application comprises NUS Department of Electrical and Computer Engineering's Dr Yan Shuicheng, Dr Liu Si, Mr Tam V Nguyen and Mr Feng Jiashi.

Project leader Dr Yan highlighted that the Magic Closet can be tapped as an iPhone or Android application for personal use, or as a plug-in system for online shops to let their customers try out and select clothes they like. It can also be used by traditional stores as a virtual wardrobe for clients. "To the best of our knowledge, Magic Closet is the first one to investigate the task of occasion-oriented clothing recommendation and clothing pairing, which mines the matching rules among more semantic attributes from real images automatically," he said.

To choose the proper attire, the user just enters an occasion, such as a wedding, conference, interview or general shopping. The Magic Closet cleverly pulls out from its database the most appropriate items from the user's personal clothing photo album, or automatically matches the user-specified reference clothing (upper-body or lower-body) with the most suitable item from online shops.
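
As a rough illustration of what such occasion-oriented recommendation involves, the sketch below ranks a user's garments by how many occasion-suitable attributes they carry. The garment names, attribute lists and occasion rules are hypothetical examples, not the team's actual data or algorithm.

    # A rough, hypothetical sketch of occasion-oriented recommendation: the garments,
    # attributes and occasion rules below are illustrative examples only.
    from dataclasses import dataclass

    @dataclass
    class Garment:
        name: str
        attributes: frozenset  # e.g. {"formal cut", "dark colour", "long sleeve"}

    # Hypothetical matching rules: attributes considered suitable for each occasion.
    OCCASION_RULES = {
        "wedding": {"formal cut", "light colour", "long sleeve"},
        "conference": {"formal cut", "dark colour", "collared"},
        "interview": {"formal cut", "dark colour", "collared", "long sleeve"},
        "shopping": {"casual cut", "short sleeve"},
    }

    def recommend(wardrobe, occasion, top_k=3):
        """Rank the user's own garments by how many occasion-suitable attributes they carry."""
        wanted = OCCASION_RULES[occasion]
        scored = [(len(garment.attributes & wanted), garment) for garment in wardrobe]
        scored.sort(key=lambda pair: pair[0], reverse=True)
        return [garment for score, garment in scored[:top_k] if score > 0]

    wardrobe = [
        Garment("navy blazer", frozenset({"formal cut", "dark colour", "long sleeve", "collared"})),
        Garment("floral sundress", frozenset({"casual cut", "light colour", "short sleeve"})),
        Garment("white shirt", frozenset({"formal cut", "light colour", "long sleeve", "collared"})),
    ]

    print([garment.name for garment in recommend(wardrobe, "interview")])
    # -> ['navy blazer', 'white shirt']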

The Magic Closet first records a catalogue of the clothes in a wardrobe from various predetermined sources, employing a Microsoft Kinect camera to capture defining features such as colour, cut, collar type and sleeve length. When the user places a request for a garment, the software selects the best matches from its database of 25,000 outfit images - drawn from online shopping websites and photo-sharing sites - and displays pictures of the selected attire.
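
Conceptually, a catalogue built from such attributes can be indexed so that a request is answered by intersecting the images that carry each requested attribute value. The schema and helper names in this sketch are assumptions for illustration, not the actual Magic Closet implementation.

    # Illustrative sketch of indexing a clothing catalogue by the attributes the
    # article mentions (colour, cut, collar type, sleeve length). Hypothetical schema.
    from collections import defaultdict

    def build_index(catalogue):
        """Map each (attribute, value) pair to the set of image ids that carry it."""
        index = defaultdict(set)
        for image_id, attrs in catalogue.items():
            for attr, value in attrs.items():
                index[(attr, value)].add(image_id)
        return index

    def lookup(index, **query):
        """Return the image ids matching every requested attribute value."""
        sets = [index.get(item, set()) for item in query.items()]
        return set.intersection(*sets) if sets else set()

    catalogue = {
        "img_00001": {"colour": "red", "cut": "slim", "collar": "V-shaped", "sleeve": "long"},
        "img_00002": {"colour": "blue", "cut": "regular", "collar": "round", "sleeve": "short"},
        "img_00003": {"colour": "red", "cut": "slim", "collar": "round", "sleeve": "long"},
    }

    index = build_index(catalogue)
    print(lookup(index, colour="red", sleeve="long"))  # {'img_00001', 'img_00003'}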

The database images are then fed into Amazon's Mechanical Turk online labour market, where workers categorise the outfits. A sophisticated machine learning algorithm customised by the researchers then learns matching rules from this large-scale annotated database, enabling the system to fulfil such requests.
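
One simple way to picture how matching rules could be mined from such crowd labels is to count which upper-body and lower-body attribute combinations appear more often in outfits marked as well matched than in those marked otherwise. The toy example below does exactly that; the label format and thresholds are assumptions, and the real system learns from a far larger and richer database.

    # Toy illustration of mining attribute matching rules from crowd-labelled outfits.
    # The label format and thresholds are assumptions made for this sketch.
    from collections import Counter
    from itertools import product

    def mine_rules(labelled_outfits, min_support=2):
        """Keep upper/lower attribute pairs seen mostly in outfits workers marked as well matched."""
        good, bad = Counter(), Counter()
        for upper_attrs, lower_attrs, well_matched in labelled_outfits:
            target = good if well_matched else bad
            for pair in product(upper_attrs, lower_attrs):
                target[pair] += 1
        return {pair for pair, count in good.items() if count >= min_support and count > bad[pair]}

    labelled = [
        ({"white", "collared"}, {"dark", "formal"}, True),
        ({"white", "long sleeve"}, {"dark", "formal"}, True),
        ({"bright", "sleeveless"}, {"dark", "formal"}, False),
    ]

    print(mine_rules(labelled))
    # e.g. {('white', 'dark'), ('white', 'formal')}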

Dr Yan revealed that the invention is almost ready for commercial use. "We are currently fine-tuning it to increase its ability to detect users' shape and figure from various angles besides the frontal view. We also plan to make use of the large-scale annotation data collected from Amazon's Mechanical Turk, such as 'red', 'V-shaped collar' and 'long sleeve', and develop some machine learning algorithms to further refine the retrieval performance," he said.

For its innovativeness, the Magic Closet, which was jointly developed with the Chinese Academy of Sciences in China, clinched the Best Technical Demonstration Award at the 20th Association for Computing Machinery (ACM) Multimedia Conference 2012 in Japan.

A demo of the Magic Closet can be viewed in this video.

