One smartphone assistant can claim to be the best after crushing its rivals in a new test




The venture capital firm Loup Ventures recently put the three leading virtual digital assistants to the test. Google Assistant, Amazon's Alexa, and Apple's Siri were each given the same 800 questions and tasks and were evaluated on their ability to understand each query and respond to it or answer it correctly. The results? Google Assistant understood all 800 questions and answered 93% of them correctly. Siri understood 99.8% of the questions and answered correctly 83.1% of the time. Alexa nearly matched Google Assistant's perfect comprehension with a score of 99.9%, but fared worst on accuracy at 79.8%.
All three digital assistants improved on last year's results, when Google Assistant answered 86% of the questions correctly, Siri was right 79% of the time, and Alexa gave the right answer 61% of the time. The report, written by analysts Gene Munster and Will Thompson, notes an important factor: this test, like last year's, measures how these digital assistants respond on a smartphone rather than on a smart speaker. The distinction matters because the lack of a screen could change how a speaker responds compared to a phone. Since this test was done on phones, the questions were shorter, and the screen allowed the digital assistants to answer certain questions without having to announce the answer verbally.

Tests show that Google Assistant is the best digital assistant available on a smartphone.

Each of the three digital assistants was asked questions across five categories: Local, Commerce, Navigation, Information, and Command. Google Assistant earned the highest score in each of them except Command. This last group of questions focused on phone-related tasks such as email, sending texts, the calendar, and music. Siri ranked first in this category with 93%, compared to 86% for Google Assistant. Siri finished second in Local ("Where's the nearest bookstore?") and Navigation ("Which subway should I take to get downtown?"), while finishing last in Commerce ("Order me a 24-pack of Coca-Cola and three boxes of M&M's"). In that department, Alexa finished second, and it also ranked second in Information ("What time are the Yankees playing tonight?"). Alexa, with its Amazon affiliation, was the favorite in the Commerce category but failed to surpass Google Assistant. Alexa also had the disadvantage of being an app rather than a feature of the native operating system, which kept it from doing well in the Command category.

The test was performed on an iPhone running iOS 12.4 and a Pixel XL running Android 9 Pie, with Alexa tested through its iOS app. Interestingly, while Alexa ranked last in accuracy, Amazon's digital assistant improved by 18 percentage points over the last 13 months, the best gain of the trio. Google Assistant's score rose 7 percentage points over the same period, while Siri gained 5 percentage points.

Since the first test in April 2017, Google Assistant and Siri have improved most in the Commerce category. Alexa, which was not part of the first test, also saw its biggest improvement over the last 13 months in the Commerce category.

