New Studies Find That Google Assistant Is the Most Accurate Virtual Assistant, While Alexa Is the Most Improved

We don’t need a study to tell us that the Google Assistant is the best virtual assistant on the market right now. Anyone who has used the Google Assistant alongside competitors such as Siri and Alexa will agree that it outshines everything else the market has to offer. Now, two independent studies conducted by two separate firms confirm that the Google Assistant is indeed the best of the lot.

The two studies, released by Stone Temple Consulting and ROAST, give us a bit of insight into how the various assistants differ. The Stone Temple study compared the assistants across over 5,000 different queries, rating them on how many questions they attempted to answer along with the accuracy of those answers.

Here, the Google Assistant pulled ahead of the competition, answering over 90% of the questions with an accuracy rate of about 80%. Interestingly enough, the accuracy varies when you switch from the Assistant on your phone to the one on Google Home. On Google Home, it answered only about 85% of the questions, with an accuracy rate of about 65%.

Surprisingly, Microsoft’s Cortana comes in second place, just a couple of points behind Google Home, while Apple’s Siri came in last, managing to answer 80% of questions with just 40% of the answers being accurate. Amazon’s Alexa saw huge improvements between the 2017 and 2018 studies: it was able to answer over 80% of the questions posed, up from roughly 50% last year.

Stone Temple also noted that the Assistant’s spoken result didn’t always match the featured snippet answer box shown in a web search. Sometimes the Assistant didn’t read out a result even when a featured snippet existed, and in other instances it read out a result from a different website than the one listed in the featured snippet.

The second study, conducted by ROAST, focused exclusively on the Google Assistant and involved 10,000 questions in total, split across 22 topics including hotels, restaurants, education, and travel information. In this study, the Google Assistant was able to answer about 45% of the questions posed.

Source: 9to5Google