Google’s newest AI assistant was recently unveiled at a major event alongside the much-awaited Pixel smartphones. Included in the reveal was the announcement that Assistant will be an open platform.
Going open platform gives developers access to Google Assistant so they can build their own features for the system, which could then be integrated into different products, reports The Next Web.
Developers could give Assistant the ability to look up information on local businesses, search through YouTube videos, and a lot more beyond what Google initially included in the system.
Assistant will essentially have two kinds of actions: Direct and Conversation. A Direct action handles simple commands such as “turn off the light.” A Conversation action, on the other hand, requires back-and-forth interaction with the user to narrow down and accomplish an end goal, such as booking an Uber or finding a specific place to eat.
Developers will be working with what amounts to a learning system, in which even more complex activities could be incorporated as needs arise.
Additionally, natural language interactions built using api.ai can be converted into actions for Assistant.
Google will release more details for developers come December and unveil an Embedded Assistant SDK next year. The SDK will help hardware manufacturers integrate Assistant into their products.
A growing AI that any developer can “teach” new tricks is certainly an intriguing proposition. It will be interesting to see what developers get Assistant to do as the months roll by. Alfred Bayle