Multi-Modal Interaction for Samsung Note21
The SRI Bangalore MMI UX team collaborated with Samsung's executive team to develop a new product, "Bixby Marker", that enhances the everyday smartphone experience through Multimodal Interaction (MMI).
"Bixby Marker" is a cross-action application feature for Samsung devices that lets users complete actions frictionlessly through Region-of-Interest-based multimodality with the help of Bixby. It understands user intention by extracting foreground context from the region of interest and offers an appropriate response to user utterances.
The MMI UX team led this entire effort from idea to prototype in collaboration with the NLU development team at SRI Bangalore. The idea is expected to be commercialized in 2021-22.
Case Study
Enhancing the future shopping journey with the help of AI and chatbots
Spatial AI & Emotional Chatbot is an experimental project in collaboration with the SR Experience and Insight Lab (E&I Lab) at Samsung HQ for The Future Tech Exhibition '21. The project brief was to create engaging, personalized indoor/retail experiences with spatial intelligence and an emotional chatbot by leveraging our capabilities in natural language processing, indoor positioning, tracking, navigation, etc.
The objective was to understand the pre-pandemic and current purchase behavior of MZ consumers across retail settings (offline/online), identify their needs, aspirations, and values, and envision what the best purchase experience of the future could look like.
Case Study
Exploring the future of moment capturing and functional everyday use cases with AR glasses
Glance is an exploration to understand, design, and document multimodal natural interactions for Extended Reality devices, taking AR glasses as the next computing device.
Understanding the AR glasses domain in terms of usability and emotional experience, across everyday and moment-capturing scenarios, helped me draw directives on do's and don'ts when designing for AR glasses.
The study examines privacy concerns and the habitual ways humans interact with computing devices such as desktops, mobiles, and wearables, as well as with the environment around them.
The project distills high-level insights from the research and develops rationally futuristic concepts and use cases for everyday AR glasses.
Case Study
Designed 3D emoji outfits and headgear for Samsung's AR Emoji studio
AR Emoji 3D Assets is a commercialized project delivered by our team for Samsung mobile phones. The goal was an ethnographic study of Middle Eastern and Indian Punjabi apparel and accessories to understand the fashion trends of Gen Z users. Using the research insights, we created delightful 3D assets for the AR Emoji studio.
Kindly click on "Case Study" for detailed information about this project.
Case Study
A controller for everyday interaction with a future device (AR glasses + controller)
The project brief was to develop design concepts focused on an interaction set for a near-future commercial AR glass.
This project helped me understand the hard truth about extended reality from a commercial standpoint: the research indicates that the extended-reality scenarios promised by industry and movies will not apply to an everyday AR glasses scenario.
In general, humans' everyday interactions and the spaces around them are dynamic. A user may move between public, private, and group settings in a single day, so the interaction practices for AR glasses must be socially acceptable.
Our team set out to find insights into the ideal ways users would want to interact with near-future AR glasses. We collaborated with Samsung's Advanced Technology Lab to demo the interactions and conduct usability testing.
Case Study
Multimodal Interaction (MMI) guidebook for designing HRI and AI
It is a collaborative project between Samsung India, Samsung Headquarters (South Korea), and KAIST (South Korea).
The responsibility of this project was to create a single, comprehensive interaction guidebook for designers and developers working on future human-robot interaction and its artificial intelligence (AI) levels.
To develop the guidebook, our team conducted in-depth research into robots and their market, human-computer interaction (HCI), human-robot interaction, artificial intelligence, etc.
This project gave me the opportunity to work with diverse stakeholders, including research professors from the university, product managers, and fellow designers. My main contributions were research, testing and analysis of the framework, and collaboration.
Case Study
A multimodal interaction guidebook for designing XR
The core idea of this project is to create an MMI framework for XR to help Samsung designers and engineers.
To develop this framework, we conducted intensive research across the XR domain, covering its industrial evolution, interaction trends, market structures, competitors' design processes, the multiple modalities, and the distinct evaluation methods in XR.
This process gave us a deep understanding of XR and let us derive multimodal-interaction do's and don'ts for designing in Extended Reality.
Case Study
Volumetric capture to create a 3D digital human avatar
Collaborated with Samsung's AR Vision Intelligence team.
In phase 1 of the project, the UX team supported the existing 3D avatar creation technology developed by the AR Vision team and ideated realistic experiences users could apply to their daily digital content creation.
In phase 2, we proposed concepts to the development team that require additional or future technology to scale up the existing experience and technology. In this project, I worked with different stakeholders, including PMs, engineers, and fellow UX designers.
© 2017 www.aravindmaya.com. All rights reserved.