• Innovation through disruptive and scalable technology.
  • Cutting-edge AI research.
  • Accelerating innovations in research and service.
  • We strive for (and achieve) excellence!
  • “SotA” (State-of-the-Art).
  • Visual demo of research and service innovation.
  • Human. Machine. Experience Together.

Multimodal Interaction

Making interaction with AI technologies more impactful by using multiple modalities

Natural HyperInteraction – Over the past few years, AI assistants have gained a lot of traction and have seamlessly integrated into the lives of millions of Koreans. Most AI assistants today use only speech as input, waking up on a specific wake-up word (e.g., “Aria”). We are focused on making fundamental changes to how people interact with AI assistants by adding multiple modalities to the interaction, and we call this modified and improved version the “Natural Hyper Interaction” based AI assistant.
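
As a rough illustration of what adding modalities to the wake-up decision could look like, the sketch below fuses a speech wake-word score with a vision-based gaze cue and a touch signal. The names, weights, and thresholds here (ModalityScores, should_wake, the 0.7/0.3 weighting) are hypothetical assumptions for this sketch, not the actual Natural Hyper Interaction design.

```python
from dataclasses import dataclass

# Hypothetical per-frame observations from independent speech, vision, and touch modules.
@dataclass
class ModalityScores:
    wake_word_prob: float   # confidence that the wake-up word (e.g. "Aria") was spoken
    gaze_on_device: float   # confidence that the user is looking at the device
    touch_detected: bool    # whether the user touched the device

def should_wake(scores: ModalityScores,
                speech_threshold: float = 0.85,
                fused_threshold: float = 0.6) -> bool:
    """Decide whether to activate the assistant.

    Speech alone can trigger a wake-up when it is highly confident,
    but a weaker speech signal can still trigger when supported by
    gaze or touch -- the core idea of combining modalities.
    """
    if scores.touch_detected:
        return True
    if scores.wake_word_prob >= speech_threshold:
        return True
    # Simple weighted fusion of the two soft signals (illustrative weights).
    fused = 0.7 * scores.wake_word_prob + 0.3 * scores.gaze_on_device
    return fused >= fused_threshold

if __name__ == "__main__":
    # A quiet utterance of the wake-up word while looking at the device -> wakes up.
    print(should_wake(ModalityScores(wake_word_prob=0.6, gaze_on_device=0.9, touch_detected=False)))
    # Background speech with no supporting cue -> stays asleep.
    print(should_wake(ModalityScores(wake_word_prob=0.4, gaze_on_device=0.1, touch_detected=False)))
```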

Hola (Hang on with Language Assistant) – AI robots for kids have found their place in many households. Most of these robots work only from voice commands and/or basic computer vision. At AIC, we are building an education robot that uses multimodal input (speech, vision, touch) to provide an effective and natural way of learning.
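
As a minimal sketch of routing multimodal input on such a robot, the example below dispatches speech, vision, and touch events to separate handlers. The InputEvent type, handler names, and payloads are illustrative assumptions, not Hola's actual architecture.

```python
import queue
from dataclasses import dataclass
from typing import Callable

# Hypothetical event emitted by each sensing module on the robot.
@dataclass
class InputEvent:
    modality: str   # "speech", "vision", or "touch"
    payload: str    # e.g. a recognized utterance, an object label, or a touch zone

def handle_speech(payload: str) -> str:
    return f"Answering the child's question: '{payload}'"

def handle_vision(payload: str) -> str:
    return f"Starting a lesson about the object the child is showing: {payload}"

def handle_touch(payload: str) -> str:
    return f"Reacting playfully to a touch on the {payload}"

HANDLERS: dict[str, Callable[[str], str]] = {
    "speech": handle_speech,
    "vision": handle_vision,
    "touch": handle_touch,
}

def run(events: "queue.Queue[InputEvent]") -> None:
    # Drain the queue and route each event to the handler for its modality.
    while not events.empty():
        event = events.get()
        print(HANDLERS[event.modality](event.payload))

if __name__ == "__main__":
    q: "queue.Queue[InputEvent]" = queue.Queue()
    q.put(InputEvent("vision", "toy dinosaur"))
    q.put(InputEvent("speech", "What do dinosaurs eat?"))
    q.put(InputEvent("touch", "head"))
    run(q)
```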

AI for mobility – AI assistants for cars primarily rely on speech input. We are working on AI assistants designed specifically for in-car use by leveraging the natural interaction methods humans already use.

  • Lucas
  • Sud
  • Victor
  • Jace
  • Snow
  • Leo
  • Won