Breadcrumbs, a “scaled down” Siri API, Set to Support Informed, Intelligent Assistance

As the details surrounding Apple’s iOS 9 come to light, it is clear that Apple totally “gets” the idea of intelligent assistance. As Mark Gurman reports in this article in 9to5Mac, Apple is pursuing an initiative codenamed “Proactive” that will, like Google Now, ingest data from resources residing on the device, including Contacts, Calendar, Passbook and selected third-party apps, in order to provide highly informed responses, search results and suggestions. These are either displayed on the lock screen (similar to Google Now) or used to inform responses to entries in the “Spotlight Search” text box or spoken input to Siri.

Gurman credits the 2013 acquisition of a personal productivity app called Cue (founded as Greplin) for providing the resources that transform personal information into cards that support scheduling and discovery of user intent. This is where the comparison with Google Now comes in: Google Now keys off personal information provided by phone owners, including location, scheduled events, search results and even email contents, in order to predict “what’s next” for the phone’s owner.

In a video interview with Robert Scoble in late 2013, Cue’s founder and CEO Daniel Gross noted that the app supported connections to over two dozen sources. In addition to Google properties (like search, Gmail, Google Calendar…), the list included Dropbox, LinkedIn, Evernote and Salesforce. Cue would aggregate, scroll through and tag (that is, create semantic understanding of) information from those sources to make an “intelligent snapshot” of a person’s context. That means it understood how items such as appointment descriptions or purchase confirmations affected a person’s activity calendar.

Apple’s new API, called “Breadcrumbs,” sounds like the sort of intelligent connector that Cue created in order to incorporate third-party data into results from Siri or Spotlight search. Beyond the creation of calendar events based on understanding a user’s context, Breadcrumbs creates a point of entry for third parties to influence how Siri provides specific, individualized responses to the types of questions that, today, prompt her to say some variation of “Here’s what I found on the Web for you.”

9to5Mac is in the camp of publications that believe Apple is very close to releasing a “full Siri API,” which would enable the mobile personal assistant to gain full access to the data associated with third-party apps. I would argue that this sounds like the sort of “one-way” connection that would not necessarily benefit the individuals who are the source of all that data. Indeed, Apple has reportedly backed off introducing such a connector in favor of Breadcrumbs, which, according to Gurman, “allows Siri to index parts of apps that have been recently accessed,” thus enabling search results to reflect recent interests and activities.
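
Since nothing about Breadcrumbs has been documented publicly, the mechanics are guesswork. Purely as a way of making Gurman’s description concrete, here is a minimal, hypothetical Swift sketch of what an app-side “breadcrumb” might carry and how recency could shape results; every type, field and identifier below is an assumption introduced for illustration, not an Apple API.

```swift
import Foundation

// Hypothetical illustration only: these names do not come from Apple. They
// model what "letting Siri index parts of apps that have been recently
// accessed" could look like from an app's point of view.

/// One recently viewed piece of app content that the assistant may index.
struct BreadcrumbEntry {
    let appBundleID: String     // which app the content lives in
    let contentID: String       // deep-link identifier back into that app
    let title: String           // human-readable label shown in results
    let keywords: [String]      // terms Spotlight/Siri could match against
    let lastViewed: Date        // recency signal for ranking
}

/// A stand-in for the on-device index the rumored API would feed.
final class BreadcrumbIndex {
    private var entries: [String: BreadcrumbEntry] = [:]

    /// Record (or refresh) an entry each time the user views the content.
    func record(_ entry: BreadcrumbEntry) {
        entries[entry.contentID] = entry
    }

    /// Return matches ordered by how recently the user touched them,
    /// mirroring the claim that results "reflect recent interests."
    func search(_ term: String) -> [BreadcrumbEntry] {
        let needle = term.lowercased()
        return entries.values
            .filter { entry in
                entry.title.lowercased().contains(needle) ||
                entry.keywords.contains(where: { $0.lowercased().contains(needle) })
            }
            .sorted { $0.lastViewed > $1.lastViewed }
    }
}

// Example: a travel app exposes an itinerary the user just opened.
let index = BreadcrumbIndex()
index.record(BreadcrumbEntry(
    appBundleID: "com.example.travel",
    contentID: "itinerary/SFO-JFK-0610",
    title: "Flight to New York, June 10",
    keywords: ["flight", "New York", "itinerary"],
    lastViewed: Date()))
print(index.search("new york").map(\.title))  // surfaces the itinerary in Spotlight-style results
```

The salient point is the direction of the data flow: the app volunteers a small, recent slice of what the user has touched, rather than handing the assistant full access to everything it stores.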

What should interest the community of customer care and customer experience specialists are the aspects of Breadcrumbs that will facilitate intelligent interactions between the device-based mobile personal assistant (MPA) and the remote expertise embodied in enterprise intelligent assistants (IAs). A truly open API into Siri would invite developers to build their own MPAs, an outcome Apple is highly unlikely to ever want. By contrast, Breadcrumbs, like Cue, lays the foundation for Siri (or Proactive) to consult with, and give structure to responses from, enterprise IAs or remote specialists in order to make those responses more relevant to folks using iOS.
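
To make that distinction concrete, here is a short, hypothetical Swift sketch of the pattern described above: the device-side MPA keeps control of the conversation, consults a domain-appropriate enterprise IA, and renders whatever comes back in its own structured format. All of the type names and the protocol are assumptions introduced for illustration; none of this reflects an actual Apple or vendor interface.

```swift
import Foundation

// Hypothetical illustration only: "EnterpriseAssistant", "AssistantQuery" and
// "AssistantCard" are invented here to show the shape of an MPA consulting a
// remote enterprise IA.

/// The context the on-device assistant can contribute to a question.
struct AssistantQuery {
    let utterance: String           // what the user asked ("Where is my order?")
    let locale: String              // e.g. "en_US"
    let recentAppContext: [String]  // breadcrumbs: content the user touched lately
}

/// A structured answer the MPA can render consistently, whoever produced it.
struct AssistantCard {
    let title: String
    let detail: String
    let suggestedAction: String?
}

/// Contract a remote, enterprise-operated intelligent assistant would satisfy.
protocol EnterpriseAssistant {
    func answer(_ query: AssistantQuery) async throws -> AssistantCard
}

/// The device-side assistant: it decides which enterprise IA to consult and
/// gives structure to whatever comes back, rather than exposing itself as an
/// open platform for third-party MPAs.
struct MobilePersonalAssistant {
    let specialists: [String: any EnterpriseAssistant]  // keyed by domain, e.g. "banking"

    func respond(to utterance: String, domain: String, recentContext: [String]) async -> AssistantCard {
        let query = AssistantQuery(utterance: utterance,
                                   locale: "en_US",
                                   recentAppContext: recentContext)
        guard let specialist = specialists[domain],
              let card = try? await specialist.answer(query) else {
            // Today's fallback behavior, as described above.
            return AssistantCard(title: "Web results",
                                 detail: "Here's what I found on the Web for you.",
                                 suggestedAction: nil)
        }
        return card
    }
}
```

Again, the point of the sketch is who holds the reins: third parties plug in as consulted specialists whose answers get shaped by the MPA, not as replacement assistants.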

Defining how MPAs interact with a multiplicity of enterprise IAs is one of the major topics baked into Opus Research’s coverage of the Intelligent Assistance field. It will also serve as a topic of discussion in the Intelligent Assistants Developers and Implementers group on LinkedIn and from the podium at the Intelligent Assistants Conference-2015 in New York in October.

Categories: Intelligent Assistants, Articles, Mobile + Location