Young fashion shoppers today are demanding personalization more than ever. According to an IBM study, 52% of female Generation Z consumers would like tools that allow them to customize products for themselves. This coincides with an ever-increasing expectation of speed in product delivery. While several fast fashion retailers can get product to shelves in weeks, the majority of clothing items take anywhere from six to 12 months to develop. Technology is making an impact throughout the supply chain to accelerate this, including in the creative process itself.
At Tommy Hilfiger, we have always been industry pioneers, driven by our vision to break conventions. We are constantly seeking new ways to evolve and respond to our consumers' expectations. In this age of newness, it's more important than ever to deliver on the instant gratification that consumers are looking for. Two years ago, we had the unique opportunity to bridge the gap between the runway and the consumer and pave the way for the future of retail. In September 2016, we launched #TOMMYNOW, a "see-now-buy-now" model that shortens the typical 18-month production process to just six months, and built an experiential runway show around the product, creating content and engagement for our physical and virtual consumers.
Success in fashion retail was once predicated on having an instinct for the next big thing. But in an omnichannel, social media-saturated consumer environment, human powers of divination alone – even those informed by trend forecasters – are not enough. The best fashion businesses are increasingly combining human judgment with artificial intelligence (AI) to improve customer service, identify consumer trends, offer greater personalisation and ensure their supply chain is as responsive as it can be. Last month, Marks & Spencer announced it was replacing switchboard staff with AI technology to increase the speed at which it deals with customer complaints and queries. The system recognises human speech and routes calls to the relevant departments, and will be used in all 640 M&S UK stores by the end of this month, as well as its 13 UK call centres.
With that in mind, we've outlined 50 ways in which retailers are putting AI into action, from personalising beauty to forecasting demand. If there's one sector where AI has been making a lot of noise, it's beauty. While the predominant function of Sephora's Virtual Artist app is to allow beauty buyers to try on products virtually via augmented reality, the brand recently introduced a colour match tool, powered by AI. This tool determines the particular shade of a product in a photo and suggests similar products available at Sephora that the consumer can then try on and purchase. Olay's Skin Advisor is an online consultation platform that can tell the true age of a user's skin from a selfie. Using AI to evaluate problem areas as well as the overall condition of the skin, it also provides personalised skincare routines and reports.
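To make the colour-match idea concrete, here is a minimal sketch of the kind of nearest-neighbour shade matching such a tool might perform. This is not Sephora's actual implementation; the catalogue, shade names and RGB values below are all hypothetical, and a production system would use computer vision to sample the colour from the photo first.

```python
# Illustrative sketch of colour matching: given an (R, G, B) value sampled
# from a product photo, find the closest shade in a small hypothetical
# catalogue by Euclidean distance in RGB space.
import math

# Hypothetical catalogue: shade name -> representative (R, G, B) value.
CATALOGUE = {
    "Ruby Red": (190, 30, 45),
    "Coral": (255, 127, 80),
    "Nude Beige": (222, 184, 160),
    "Deep Plum": (90, 20, 70),
}

def closest_shade(sample_rgb, catalogue=CATALOGUE):
    """Return the catalogue shade nearest to the sampled colour."""
    return min(
        catalogue,
        key=lambda name: math.dist(sample_rgb, catalogue[name]),
    )

# A pixel sampled from a photo of a red lipstick:
print(closest_shade((200, 40, 50)))  # -> Ruby Red
```

Real systems typically match in a perceptually uniform colour space (such as CIELAB) rather than raw RGB, since Euclidean distance in RGB does not track how similar two colours look to the human eye.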
A new partnership between computing giant IBM and the Fashion Institute of Technology (FIT) in New York aims to embed AI into the full spectrum of the fashion industry. The partnership will apply a suite of AI tools covering deep learning, natural language processing and computer vision across design and development, merchandising, supply chain and retail. The FIT/Infor Design and Technology Lab (DTech Lab) will build on a previous collaboration with the technology heavyweight, in which it worked with IBM and leading fashion brand Tommy Hilfiger. That project, Reimagine Retail, focused on using AI to strengthen the brand's competitive position through optimisations in product design, supply chain and market insights. "Reimagine Retail was a powerful example of what happens when fashion partners with a global tech leader to advance challenging innovations," said Michael Ferraro, director of the FIT/Infor DTech Lab.