Belen Farias, Jonah Loeb, Fallon Shaughnessy
Shopping can be great. It can also be miserable. Amazon has built its fortune on recognizing the latter. Online shopping has increasingly displaced traditional mall browsing, as users enjoy the seamlessness of the online consumer experience from the comfort of their homes. However, online shopping has drawbacks: products are shown only as digital pictures, and shoppers lose the ability to try on items and feel prospective clothing options in real time. Our future human will have the customizability of the online experience while in stores, through the use of our personal shopping assistive device.
The future shopping assistant is an augmented reality device designed to enhance the in-person retail experience in malls by bringing the same powerful recommendation pipelines used in online shopping to the real world. Taking the form of contact lenses, the future shopping assistant recommends products and stores by highlighting them in the user's surroundings. The device can then provide pricing, reviews, and source information for any product the user picks up.
Hayden is a middle schooler who is passionate about keeping up with popular culture. Hayden and her friends often browse Instagram and VSCO to see the latest fashion trends. She enjoys following these popular styles and likes to know where she can find the newest, most popular clothing in the stores where she shops. Like her peers, Hayden looks to conform her style to what others her age like. She worries what her friends think of her and would like to know whether a shirt is cool and trendy before she buys it.
John is a recent college graduate who has landed his first job at a bank in San Francisco. With a newfound income, John is in the market to find new work attire that can complement the clothes his mom bought him for graduation. John considers himself a shopping novice, so he desires a shopping experience that would allow him to compare brands easily. John prefers to try on his clothes and feel them before buying. However, he works long hours and is often tired after work, so time is of the essence.
Veronica is the director of an art gallery in New York City. She is well established in the art community, with over thirty years' experience in the industry. Veronica cares a lot about her aesthetic; she describes her fashion sense as “professional with a flair”. She takes pride in the uniqueness of her wardrobe and spends hours handpicking pieces that cater to her taste. The last thing she wants is a cookie-cutter wardrobe, as she sees her attire as a reflection of her gallery and brand.
The recommendation pipeline takes a holistic approach to determine the best products. It factors in the user's shopping behavior, personality, and style along with current fashion trends, reviews, and sales (for a more detailed breakdown, see the machine learning input/output diagram). The assistant also takes into account the social aspect of in-person shopping. When shopping in a group, the assistant will recommend stores that have something for everyone and can recommend products that go together, should the group want to coordinate.
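As a rough illustration, the blend of signals described above could look like the following sketch. Every weight, field name, and function here is a hypothetical assumption for illustration, not a specification of the actual pipeline.

```python
# Hypothetical sketch of the holistic scoring described above. The weights
# and field names are illustrative assumptions, not a spec.

def score_product(product, user):
    """Blend personal fit with social signals into one ranking score."""
    style_match = len(set(product["tags"]) & set(user["style_tags"])) / max(len(product["tags"]), 1)
    in_budget = 1.0 if product["price"] <= user["budget"] else 0.0
    review_score = product["avg_rating"] / 5.0   # peer reviews, normalized to 0..1
    trend_score = product["trend_rank"]          # 0..1, current fashion trends
    sale_bonus = 0.1 if product["on_sale"] else 0.0
    return 0.4 * style_match + 0.2 * review_score + 0.2 * trend_score + 0.2 * in_budget + sale_bonus

def recommend(products, user, top_n=3):
    """Rank a store's products for this user and keep the best few."""
    return sorted(products, key=lambda p: score_product(p, user), reverse=True)[:top_n]
```

A real pipeline would learn these weights from feedback rather than fixing them by hand; the sketch only shows how the listed signals could combine into one ranking.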
If the user would like a demonstration of a product, for example to see how an article of clothing fits, the future shopping assistant can produce a virtual model of the user wearing the clothes. The user can dress the model in clothes from their closet back home or from the store to compare, and can share the model with others in the store and online.
- Mode 1: For users who need guidance on how to shop at specific stores and want recommended products.
- Mode 2: For users who want help finding a specific item in a store.
- Mode 3: For users who want to style themselves on the go; they can access their closet and products without having to be physically near them, and can create outfits and store them for later.
Users will have the option to choose what information they are alerted about. The built-in notifications include:
- Alerts users when an item that was previously unavailable comes back in stock.
- Notifies users of new products that fit their criteria (style/budget).
- Alerts users when stores have sales (to promote product use, we could team up with companies to offer discounts).
- Recommends products based on people whose style the user is interested in.
- Alerts users when new outfit combinations and style choices are available, as a form of stylistic advice.
- Reminds users to update their choices (swiping through recommended styles and brands improves the algorithm).
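A minimal sketch of how these opt-in alerts might be filtered, assuming hypothetical alert-type names and event fields:

```python
# Sketch of user-configurable notifications. The alert-type names and the
# event dictionary shape are assumptions based on the list above.

ALERT_TYPES = {"back_in_stock", "new_product", "store_sale",
               "style_match", "outfit_idea", "update_reminder"}

def filter_alerts(events, enabled):
    """Return only the events the user has opted in to."""
    enabled = enabled & ALERT_TYPES   # ignore unknown preference keys
    return [e for e in events if e["type"] in enabled]
```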
- Take pictures of the clothes/items you own to train the device, and mark your favorite clothing items.
- Fill out a survey on the colors, textures, patterns, and combinations you are interested in.
- Tell the product which season you are shopping for and whether it is for a specific event.
- Record body measurements and ideal fit.
- Set an ideal price tag and an overall budget for monthly clothing spending.
- Note brands and items you are specifically interested in.
- Import photos/styles you prefer.
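These setup inputs could be gathered into a single profile object. The sketch below assumes hypothetical field names; it only mirrors the list above.

```python
# One possible shape for the setup inputs listed above. All field names
# are hypothetical assumptions; a dataclass keeps the survey data in one place.
from dataclasses import dataclass, field

@dataclass
class ShopperProfile:
    closet_photos: list = field(default_factory=list)     # photos of owned items
    favorites: list = field(default_factory=list)         # marked favorite items
    preferred_colors: list = field(default_factory=list)  # survey answers
    preferred_patterns: list = field(default_factory=list)
    season: str = ""                                      # e.g. "summer"
    event: str = ""                                       # e.g. "wedding"
    measurements: dict = field(default_factory=dict)      # body measurements + ideal fit
    monthly_budget: float = 0.0
    preferred_brands: list = field(default_factory=list)
    inspiration_photos: list = field(default_factory=list)

    def is_complete(self):
        """Assume the device needs at least photos, measurements, and a budget."""
        return bool(self.closet_photos and self.measurements and self.monthly_budget > 0)
```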
The user first buys the contact lenses, then inputs all of their data (see the inputs above), and finally selects the mode they are interested in.
For Mode 1:
- The user is at a shopping center and needs help selecting different styles.
- The device gives recommendations based on the user's prior inputs.
- The user is given a list of possible stores and directions on how to get to each one.
- e.g., the user walks into a store like Zara.
- The lens highlights clothes the user would potentially be interested in, along with each product's star rating and reviews.
- A "Style Me" button shows possible combinations of the product with existing items in the user's closet; the user can swipe through the combinations.
- The user is asked whether they are satisfied with the recommendation, feedback that improves the machine learning model.
- The lens also displays images of others and how they have used the item.
- It can mix and match the item with others in the store to create potential outfits.
- Once the product is bought, the user confirms on the lens to add it to their list of existing items.
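The Mode 1 steps above could be sketched roughly as follows; all item fields and function names are illustrative assumptions.

```python
# Sketch of the Mode 1 in-store flow: highlight on-taste, well-reviewed
# items, preview "Style Me" pairings, and add a purchase to the closet.
# Field names are illustrative assumptions.

def highlight_items(store_items, liked_tags, rating_floor=4.0):
    """Items the lens would highlight: matching the user's taste and well reviewed."""
    return [i for i in store_items
            if set(i["tags"]) & liked_tags and i["avg_rating"] >= rating_floor]

def style_me(product, closet):
    """Swipe-through pairings of the product with items already owned."""
    return [(product["name"], owned["name"]) for owned in closet]

def confirm_purchase(closet, item):
    """Once bought, confirming on the lens adds the item to the closet."""
    closet.append(item)
    return closet
```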
For Mode 2:
- The user needs help finding an item near them.
- The lens asks the user to input what they are searching for and uses their location data to find the item nearby.
- A map is then displayed with directions to the item.
- Once the item is bought, it is added to the user's list of items.
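The Mode 2 lookup could be sketched as a nearest-store search, assuming hypothetical store records with coordinates and stock lists:

```python
import math

# Sketch of the Mode 2 search: given a query and the user's location,
# return nearby stores that stock the item, nearest first. Coordinates
# and the stock field are illustrative assumptions.

def find_item_nearby(query, user_xy, stores):
    matches = [s for s in stores if query in s["stock"]]
    return sorted(matches, key=lambda s: math.dist(user_xy, s["xy"]))
```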
For Mode 3:
- The user clicks on the "my closet" tab.
- This feature displays an image of the user onto which their clothes can be overlaid.
- It allows the user to mix and match different combinations of outfits.
- These combinations can be saved for future use.
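The Mode 3 mix-and-match and save steps could be sketched as follows, assuming hypothetical closet categories:

```python
import itertools

# Sketch of the "my closet" feature: generate top/bottom pairings from
# the saved closet and store chosen outfits for later. The category
# names are illustrative assumptions.

def outfit_combinations(closet):
    tops = [i for i in closet if i["category"] == "top"]
    bottoms = [i for i in closet if i["category"] == "bottom"]
    return [(t["name"], b["name"]) for t, b in itertools.product(tops, bottoms)]

saved_outfits = []

def save_outfit(outfit):
    """Save a combination for future use, skipping duplicates."""
    if outfit not in saved_outfits:
        saved_outfits.append(outfit)
    return saved_outfits
```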
How We Get from Here to There
The technology we would incorporate in our shopping assistant is already emerging. Augmented reality devices, and specifically augmented-reality smart glasses, have already been produced by companies like Apple, Toshiba, and Epson. Many products currently on the market boast GPS, motion sensors, and video and picture capabilities. The major drawbacks are battery life and reliability. However, these early prototypes are promising examples of how augmented reality can transform a space.
Our product is inspired by the machine learning behind current online shopping models. In particular, we look to build on the ability to filter clothing selections based on the person, as well as to provide the social component of peer reviews and trends. Amazon has begun to explore technology in the fashion domain, particularly with the creation of the Echo Look. The Echo Look provides style suggestions by analyzing what the user is wearing. Combining a high-quality camera with machine learning algorithms that evaluate fit, color, and style, the Echo Look acts as a fashion advisor that responds within a minute.
We are beginning to see emerging technology that could eventually contribute to the product we propose for the future human. By combining machine learning with the high-functioning sensors and imaging used in augmented reality, the shopping assistant can provide customizable experiences to consumers in a future that may not be as far off as we think. We project this reality could arrive within the next fifty years.
Ethics and Society
- Not everyone may be comfortable using wearable devices: users would not have the ability to stop the display of targeted ads.
- People may not feel comfortable with the idea of having their style chosen for them: some shoppers may prefer an entirely hands-on shopping experience without assistance.
- Accessibility: those who buy this item would likely already be well off, creating a large socio-economic gap between users and non-users.
- Depending on how the AI is trained, it may not recommend appropriate styles for people of color, since models of color are underrepresented online.
- It could therefore be biased in the types of clothes it recommends.
- It might not serve people who do not fit conventional beauty standards well.
- May not be ethical to increase consumption so much: what does this mean for fast fashion?
Future Directions and Limitations
The personal shopping assistant we created utilizes augmented reality to transform the inside of shopping spaces. Because augmented reality works by enhancing an existing environment, we are constrained by how the physical space is structured. For example, store layouts can differ drastically in organization: some stores have wide open spaces, while others are cluttered; some are busy and loud, while others are empty and quiet. The user's experience can vary based on these environmental variables, which is a drawback for a system that promises customizability.
Therefore, a potential next step would be to create an entirely virtual environment that completely mimics the real store. This would let users have their own settings, independent of other shoppers or environmental influences, and would be the ultimate union of the in-store and digital consumer experiences.