The Meta Assistant service provides natural control and response functions by analyzing sensor data and text recognized from the user's voice input, drawing on the user profile, context, knowledge, and search results. The service covers weather reports, public-transport navigation, IoT control, translation, nearby search, YouTube search, news search, and more.
The system synchronizes with an NLP (Natural Language Processing) engine for analyzing text input, a UPS (User Profile Service) for user profile lookup, a KMS (Knowledge Management System) for knowledge consultation, and an SGW (Service Gateway) for search result collection.
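The integration described above can be sketched as a simple per-utterance orchestration: the assistant consults the NLP engine, UPS, KMS, and SGW in turn and combines their outputs into a response. This is a minimal illustrative sketch; all class and method names are assumptions for illustration, not the actual ThinQ.AI API.

```python
# Hypothetical stubs for the four backing systems. Real implementations
# would call remote services; these return toy values so the flow runs.
class NlpEngine:
    def analyze(self, text):
        # Toy intent detection keyed on a keyword.
        intent = "weather" if "weather" in text.lower() else "search"
        return {"intent": intent, "text": text}

class UserProfileService:
    def lookup(self, user_id):
        return {"user_id": user_id, "locale": "en-US"}

class KnowledgeSystem:
    def consult(self, intent):
        return {"domain": intent}

class ServiceGateway:
    def search(self, analysis, profile):
        return [f"{analysis['intent']} result for {profile['user_id']}"]

class MetaAssistant:
    def __init__(self, nlp, ups, kms, sgw):
        self.nlp, self.ups, self.kms, self.sgw = nlp, ups, kms, sgw

    def handle(self, user_id, text):
        analysis = self.nlp.analyze(text)                 # text analysis (NLP)
        profile = self.ups.lookup(user_id)                # profile lookup (UPS)
        knowledge = self.kms.consult(analysis["intent"])  # knowledge consultation (KMS)
        results = self.sgw.search(analysis, profile)      # search result collection (SGW)
        return {"intent": analysis["intent"],
                "knowledge": knowledge,
                "results": results}

assistant = MetaAssistant(NlpEngine(), UserProfileService(),
                          KnowledgeSystem(), ServiceGateway())
response = assistant.handle("user-1", "What's the weather today?")
print(response["intent"])  # weather
```

The point of the sketch is the sequencing: each subsystem contributes one ingredient (analysis, personalization, knowledge, search), and the assistant merges them before generating the final response.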
The Meta Assistant service provided by ThinQ.AI has the following features:
Uses knowledge to determine the appropriate service and to generate and provide a system response based on user input or context.
Provides a function to organize information so that the context of the input text is maintained across past conversations.
Provides the ability to generate the system's response text and the corresponding digital human expression.
Provides image-based multi-modal services.
Provides a new multi-modal audio-visual conversation UI that does not require a trigger word.
Meta Assistant takes user speech as input and provides a natural response and control appropriate to the current state.
Meta Assistant can be used in various fields.
It can be used to provide users with customized daily information such as weather, directions, translation, and news search through conversations.
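The use cases above amount to routing a recognized intent to a domain-specific handler. A minimal dispatch-table sketch follows; the handler names and return strings are illustrative assumptions, not part of the actual service.

```python
# Toy handlers for the service domains mentioned above. In a real system
# each would call the corresponding backend (weather, transit, etc.).
def weather(query):    return f"Weather report for: {query}"
def directions(query): return f"Transit route for: {query}"
def translate(query):  return f"Translation of: {query}"
def news(query):       return f"News results for: {query}"

# Dispatch table mapping a detected intent to its handler.
HANDLERS = {
    "weather": weather,
    "directions": directions,
    "translate": translate,
    "news": news,
}

def respond(intent, query):
    handler = HANDLERS.get(intent)
    if handler is None:
        return "Sorry, I can't help with that yet."
    return handler(query)

print(respond("weather", "Seoul tomorrow"))
```

A table like this keeps the conversation layer decoupled from the individual domains: adding a new capability means registering one more handler rather than changing the routing logic.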
It is a knowledge-based intelligent system that delivers the information users want and enables a digital human to fulfill its role.