Intuitive AI-Enabled Teaming: KELSIE
What It Is:
My Booz Allen team developed an AI assistant that layers AI with mission context to control robotic assets through natural-language voice commands. The UX team designed a VR simulation demo built on my previous product, TableTop Commander, showcasing how KELSIE could be employed on a Naval vessel to empower Human-Machine Teaming. Users tested KELSIE by giving the on-craft robot teammates commands in natural speech, and KELSIE would help the robots understand the command, or what we call “deducing the intent”.
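To make the “deducing the intent” idea concrete, here is a minimal sketch of how a transcribed voice command might be mapped to a structured robot tasking. Everything in it (the `deduce_intent` function, the intent patterns, the `RobotTasking` structure) is a hypothetical illustration for explanation only, not the actual KELSIE implementation, which layers richer NLP models and mission context on top of this basic idea.

```python
import re
from dataclasses import dataclass

@dataclass
class RobotTasking:
    intent: str           # e.g. "survey" or "return_to_base"
    target: str | None    # mission-context detail, e.g. a grid square

# A layered approach in miniature: first match the utterance against known
# intent patterns, then pull mission-context details (like a target grid)
# out of the match. These patterns are illustrative placeholders.
INTENT_PATTERNS = {
    "survey": re.compile(r"\b(survey|scan|sweep)\b(?:\s+(?:sector|grid)\s+([\w-]+))?"),
    "return_to_base": re.compile(r"\b(return to base|come back|rtb)\b"),
}

def deduce_intent(utterance: str) -> RobotTasking | None:
    """Map a transcribed voice command to a structured robot tasking."""
    text = utterance.lower()
    for intent, pattern in INTENT_PATTERNS.items():
        match = pattern.search(text)
        if match:
            # Only some patterns capture a second group (the target).
            target = match.group(2) if pattern.groups >= 2 else None
            return RobotTasking(intent=intent, target=target)
    return None  # unrecognized: the assistant would ask the operator to rephrase

print(deduce_intent("KELSIE, have unit two survey grid Bravo-7"))
# -> RobotTasking(intent='survey', target='bravo-7')
```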
The Challenge:
A prevalent limitation in much of the military’s wargaming and Command and Control (C2) is the lack of automation for tasking that could be delegated to unmanned teammates. KELSIE facilitates C2 for human-machine teaming using Natural Language Processing (NLP), drastically decreasing warfighters’ cognitive load so they can stay mission-focused.
Where I Came In:
I was part of the original project team, so I was fortunate to be involved in the simulation’s conception. My primary role was to facilitate the production of, and collaborate on, deliverables for our UX Lead.
Persona capture for the demo
Use case capture for the demo
Discovery & Definition
The first items our team produced were personas and use cases for KELSIE, built around our target user: in this case, a member of the Robotic Control Watch (RCW) aboard a Naval vessel. It was crucial that the demo we built reflect actual current Navy policies and procedures, both to make users feel fully immersed and to demonstrate use case applicability.
Some key deliverables in this phase were persona maps, use case identification, and subject matter expert (SME) interviews.
The complete SME interview document
Development & Measurement
Once our user and use case were solidified, we moved into developing the user flow and storyboards for the virtual experience. We reviewed the user flows with our Navy SMEs to confirm realistic procedural parameters and to ensure the commands were feasible.
Storyboard iterations created by our Graphic Artist depicting the virtual simulation experience laid out by our user flow.
Various iterations of the demo user flow
Delivery
The VR Environment
The final environment featured a futuristic ambiance that appealed greatly to warfighters and command leadership, along with recognizable tabletop map features, digitized and made more efficient than their physical counterparts.
Post-project Reflection
Aesthetically, the design choices for some of the modals featured in the demo (the white cards with tutorial prompts pictured in the video) were not strategized or created by the UX Design team, since we were offboarded early due to other project timeline conflicts. Looking back, had we stayed on the project longer, we would have decluttered the environment and consolidated the modals into a single tabbed panel so the user could more easily focus their attention within the experience.
The User Demo
The finalized demo walked any audience member through a step-by-step tutorial on how to use the capability within a specific hypothetical scenario. The audience was primarily warfighters (Navy sailors, specifically), so the tutorial language leaned heavily on naval jargon to create a more immersive experience.