Google can now identify clothing with new fashion app
Google recently launched a fashion photo identification tool called Style Match. Built on Google Lens technology, the feature lets users snap a picture of an item of clothing and then surfaces similar designs from around the Internet. Although many tech companies have introduced similar features, there is no denying that Google's entry into the competition has helped make image recognition more popular.
Google Lens was first announced at Google I/O 2017. The application can extract text and links from an image and is also capable of identifying some of the world's famous landmarks. Google Lens arrived on Google Pixel phones at the end of 2017, expanded to all Android phones in March this year, and has also been available on iOS devices since March 16th.
Google has recently announced some updates that make Google Lens even more convenient. At I/O 2018, Google said that Google Lens could be integrated into the camera applications on phones from 10 manufacturers: LG, Motorola, Xiaomi, Sony, Nokia, Transsion, TCL, OnePlus, BQ, and Asus. Accordingly, when you open the camera on a device from one of these brands, you can point it at an object and search for similar products.
When pointed at a casual outfit, the app automatically analyzes the image and then makes recommendations accordingly.
Notably, Google Lens can capture and identify key product characteristics using information collected through Google Shopping.
However, as a brand-new feature, the application is not yet complete and still has shortcomings. Some users have reported that they cannot find correct results for designs from famous brands. For example, after submitting a photo of a pair of black Converse shoes, Google Lens returned images of shoes in the same color without a single suggestion bearing the Converse brand name.
Users also tried an image from a recent Dior advertising campaign, selecting the brand's luxurious Mary Jane high heels for analysis. However, the results did not include the name of the high-end French fashion house; instead, they showed products from completely different labels.
In addition, Google Lens cannot always name a product precisely because of the capture angle or lighting. For example, in a recent Stuart Weitzman ad, supermodel Kate Moss wore a stylish sheer blouse. Because of her pose, however, the application identified it as a long-sleeve design with asymmetrical cuts, and instead suggested coat designs from Rick Owens and Elizabeth and James.
Moreover, products with large price differences placed side by side raise concerns that the app could dilute the value and prestige of high-end brands. In the image above, a $2,000 Rick Owens jacket was suggested next to a $39 blouse sold in mass-market stores.
We hope Google will continue to improve the usability of such applications, while also addressing the remaining shortcomings, to provide a more enjoyable and useful experience for users in the future.
By: Quinn Abrams