Rank Ranger Blog

How Diverse is 'People Also Ask'? An Intent Analysis Study


People Also Ask Study Banner


Related Questions, or as most of us know them, People Also Ask (PAA), has developed into one of the more intriguing features on the Google SERP. The feature is both prominent and powerful. This dynamically loading set of what are essentially Featured Snippets has the potential to seriously alter a user's search process, or search 'journey' if you will.

I wanted to qualify that potential a bit by analyzing what intent looks like inside the PAA box. 

  • How many intents are represented by the initial four PAA questions? 
  • Does Google prefer to dig deep into one specific topic or does Google prefer to offer a broader topical look with its PAA questions? 
  • Does Google give one intent more attention than another?  
  • Does Google treat one keyword category and type differently than another in regards to the PAA? 

I think I've made my point.... I want to know how Google goes about meeting intent via the PAA! 





Summary of 'People Also Ask' Intent Analysis and Method Used



Let me quickly outline what I did here. I took 250 keywords and analyzed the first four (well, usually four) questions within the PAA box... the four that Google initially shows a user on the SERP... and I analyzed the intents reflected in those questions. In doing so I determined the number of unique intents reflected by the four questions so as to calculate the average number of intents within the initial PAA questions shown on the SERP. (For more on the process of how I made this determination see the section below). 

Out of these 250 keywords I created data subsets. That is, I compared the overall data set to subsets (each subset contained 30 keywords) such as long tail keywords, opinion based keywords, etc. I then did the exact same thing but according to niche industry (i.e., finance, health, etc.). 
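Mechanically, the aggregation step is simple once the intent labels exist. Here's a minimal sketch in Python, assuming the per-question intent labels were assigned by hand as described; the keywords and labels below are illustrative stand-ins, not the study's actual data:

```python
# Each keyword maps to the manually assigned intent labels of its first
# four PAA questions (illustrative examples only).
paa_intents = {
    "chocolate cake recipes": ["recipe", "recipe", "moistness", "moistness"],
    "penny stocks to buy": ["viability", "viability", "viability", "how-to"],
    # ... one entry per keyword in the 250-keyword set
}

def unique_intent_count(labels):
    """Number of distinct intents among a box's initial questions."""
    return len(set(labels))

# Average number of unique intents across all boxes studied.
average_intents = sum(
    unique_intent_count(labels) for labels in paa_intents.values()
) / len(paa_intents)
```

With the full 250-keyword set in place of the toy dictionary, `average_intents` is the 2.8 figure reported below.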


'People Also Ask' Intent Data Analysis Summary
 


Before we start getting into nitty-gritty details... here's a quick summary of what I found. 
  • On average there are 2.8 intents represented by the initial four PAA questions 
  • The initial PAA intent is usually reflected in two of the four initial questions 
  • Only 7% of initial PAA questions are unaligned to the original query 
  • 10% of PAA boxes contain an irrelevant question




The Difficulty in Analyzing Intent within 'People Also Ask' 




Complex Maze


Before I get into the heart of the data and analysis we need to talk about the difficulty in classifying intents within PAA. There are no purely objective criteria by which to determine whether a specific PAA question reflects a unique intent or is part of the same "intent landscape" reflected in another PAA question within the same box. Fundamentally, this is a subjective process that is made particularly hard by questions that are not always light years apart from each other. Obviously, within any PAA box, some questions are of a distinct nature compared to the others. However, and more often than you would think, the questions are merely slight variations of each other. The problem is, even a slight variation can reflect a unique intent. 

Let me show you. 

Here's the PAA for the keyword chocolate cake recipes: 


Chocolate Cake PAA


Have a look at the 3rd question: How do you make chocolate cake moist? Is that the same thing as asking how do you make chocolate cake per se? In a way, yeah. In another way, it's a bit more specific than that. In this instance, I decided it was a separate intent. Could you have gone the other way with it? Sure. Now, this case was a bit easier than others. If we look at the 4th question, we see that Google is predisposed towards catering to users worried about their chocolate cake's moistness. That supports the thinking that the question How do you make your chocolate cake moist? goes beyond the intent of finding a good chocolate cake recipe. 

This, in a way, highlights my method of parsing intents. When I had two questions that I thought might or might not reflect the same user intent, to differentiate in these cases, I asked myself, "Could the same user be satisfied with the answer to either of these questions?" If the answer was more or less "yes" then the questions reflected the same user intent and vice versa. 

Let me show you another case. This is the PAA box I saw for the keyword corelle plates:


Corelle Plates PAA


The first question wants to know if Corelle plates are oven safe while the second question does not even mention the brand! However, I considered them both to be reflections of the same user intent. The same user who wants to know if Corelle is oven safe may very well be on a quest to find dinnerware that can take a beating. I considered these two questions to be subsets of the same intent... finding long lasting and versatile dinnerware. Could you make the case that they should not be categorized together? I think it would be hard, but you could.    

Since this process can quickly turn into a 'splitting of hairs' exercise, I asked one unfortunate fellow on our team to review my determinations and offer their thoughts. We then went through a back-and-forth exercise until we hashed out the intent parsing we both saw as being the most correct. At a minimum, the data reflected in this study is consistent. 




How Diverse is Intent within 'People Also Ask'? 



How many intents, how many different users, are targeted within the initial four questions shown in your average PAA box? 


Average PAA Intents


The average PAA box is pretty diverse, with nearly three unique intents reflected within its initial four questions. In real terms, that means close to 3 out of the 4 initial PAA questions target unique users. This, of course, needs to be qualified. How wide is the intent spectrum here? In what manner is Google showing diverse intents? Is Google trying to go deeper into one topic or offer a wider range of topics altogether?

For right now let's break the overall average down a bit. To do that let's look at two different keyword datasets. The first I'm calling "Keyword Category." Meaning, the dataset contains various industry categories/niches. I could continue to poorly explain this or I could just show you the data: 


Average PAA Intents per Keyword Category


There's really not much deviation between any of the niches shown within the Keyword Category dataset and the baseline or overall average (which, again, stood at 2.8 intents). At most, we're looking at a .3 intent increase for Products & Brands keywords and a .3 intent decrease for Recipe keywords. All things considered, that's pretty darn consistent with the overall average and reflects roughly an 11% increase or decrease from the "baseline."


I also looked at different types of keywords, long tail keywords for example. Here too, the data was relatively consistent with the overall average: 


PAA Average per Keyword Type


The data for some of the subcategories here is slightly more divergent from the overall average compared to some of the niche keywords. Both long tail and how-to keywords reflected 3.2 average intents within the PAA box... that's a .4 increase compared to the overall average. 

The jump seen in the average number of intents in the PAA box for long tail keywords can be at least partially explained by the increased difficulty in interpreting the query type in general. 

For example, the keyword why did luke not kill darth vader at the end of return of the jedi seemed to be very tricky for Google. Here, Google shows a PAA with four unique intents: 


Star Wars PAA Box



As any good Star Wars fan can see, the questions seem loosely related to the query. That is, they are not a natural extension of the original query as most PAA boxes tend to be. 

A clear instance of a long tail keyword presenting Google with a harder interpretation challenge is did elvis die on the toilet or was he abducted by aliens: 


Elvis PAA


Aside from being the most bizarre if not downright fun keyword I've ever used in an SEO study, the intents reflected in the PAA box are a bit all over the place, with one question asking how a monarch died and another asking about Elvis' manager. 

I would argue that the opposite is the case for how-to keywords. That is, for this keyword subset Google does such a good job interpreting the query that it can offer more highly relevant intent instances than it usually can. 

The PAA box for how to inspect for black mold manifests this well: 


Black Mold PAA


I mean, it's almost perfect. All of the top concerns someone could have about 'mold' are addressed in the first four questions of the PAA box. 





What Intent Patterns Does Google Display within 'People Also Ask'? 



How does the data look from a per-question perspective? Obviously, not every question reflects a unique intent; otherwise, the average shown earlier would be 4, not 2.8. Google is doubling up intent somewhere within the average PAA box. The question is, where? Is there a consistent pattern to it? 

Analyzing this is a bit tricky. So what I did was break down the number of questions each intent inside the PAA boxes received. In other words, let's say the first question shown in the PAA box indicated intent "X." Now imagine if question #2 reflected intent "Y." Lastly, let us suppose that questions #3 and #4 also reflected intent "Y." In this instance, the first intent seen within the PAA Box here (i.e., the intent manifested by the very first question in the PAA box) has but one question that aligns to it, question #1. At the same time, intent "Y," or the second intent shown in the PAA box, has three questions that align to it. 
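This per-question tallying can be sketched as a simple pass over a box's hand-assigned intent labels, counting questions per intent in order of each intent's first appearance:

```python
def questions_per_intent(labels):
    """Count how many questions align to each intent, ordered by the
    intent's first appearance in the box (dicts preserve insertion order)."""
    counts = {}
    for label in labels:
        counts[label] = counts.get(label, 0) + 1
    return list(counts.values())

# The scenario described above: Q1 reflects intent "X", Q2-Q4 intent "Y".
print(questions_per_intent(["X", "Y", "Y", "Y"]))  # -> [1, 3]
```

Averaging these per-position counts across all boxes gives the per-intent figures reported below.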

It's a bit of an interesting way to break it down, but it's the best I've got. Let's take a look at a real example before I present the data itself. 

Below is a PAA box for the keyword penny stocks to buy:


Related Questions on Penny Stocks


The first three questions, to varying degrees, all deal with the viability of buying penny stocks while the last question reflects a separate intent (i.e., how to go about buying penny stocks). According to how I've constructed this, intent #1 (the first intent shown by the PAA box, which obviously happens with the very first question) has three questions that align to it while intent #2 is only shown in the last question. 


For the entire keyword set studied the data looks like the following: 


Questions per PAA Intent


What's genuinely interesting here is that Google seems to have an intent hierarchy within the PAA box. The first intent, which is synonymous with the first PAA question, is the most prevalent. That is, the intent embodied by the first PAA question accounts for nearly two of the initial four PAA questions! 

Now, obviously, the fourth intent (should there be one) can only be present within one question, the fourth, because there is no initial fifth question. Still, the combination of the 3rd and 4th question into the PAA box's third intent is a marginal occurrence. On average, the third intent encompasses just 1.1 PAA questions.   

Here's where things get interesting, however. When I looked at how Google "chunks" intent within the PAA box by keyword type the data pretty much matched that seen in the dataset overall:


People Also Ask Relevant Questions Per Intent Type



However, when we look at different keyword categories/niches, a different narrative emerges: 


People Also Ask Relevant Questions Per Intent Niches


PAA boxes within both the Health and Recipe categories had the initial intent embedded within two questions. Recipes reflected a .4 point increase in the average number of questions representing the PAA box's initial intent. While that is not an enormous difference by any stretch, it does stand in sharp distinction to the other subcategories studied. (I will remind you, each of the data subsets analyzed contained just 30 keywords. However, the trends across all subsets are consistent.) 

The bottom line is, Google gives the first intent extra opportunity within the PAA box. This suggests that Google considers the first intent within the PAA box to be the most relevant to users. Otherwise, why would it offer the user more opportunities to engage with that intent? 




Qualifying Google's Intent Practices within 'People Also Ask' 



Translating the data shown above into a qualitative analysis of Google's intent practices is a complex matter. The main question I was faced with having seen the results has to do with why Google offers such a diverse set of intents within the initial PAA questions. Is it because Google is so good at understanding user intent that it can offer access to so many related topics or is it because Google is a bit baffled at what users want (and is subsequently trying to cover all bases by offering a diverse set of intents)? 

To help answer this I analyzed the PAA boxes I came across to determine if the first question presented was highly relevant to the initial query. Having already seen that the intent reflected by the first PAA question generally appears within a second PAA question as well, knowing this became all the more important. 


People Also Ask Relevant First Questions


Just 7% of all the initial PAA questions I analyzed were not aligned to the query's intent. That's not a lot at all. Just so you understand what I'm referring to when I say the first PAA question is unaligned, here's what I saw for the keyword how to shop online: 


Shop Online PAA


Obviously, the user is not intending to order an online shop... I'm not even sure how one would order an online shop, being that a digital entity is hard to send in the mail. What's interesting here is that the questions that follow are right on target. Peculiar. 

Forget the first question... how accurate is the PAA box overall? Meaning, what percent of PAA boxes contain intents that are entirely aligned to the initial query? 


Entirely Accurate People Also Ask Boxes


Looking at the relevancy of all of the questions in a PAA box, not just the first, gives us a slight increase of three percentage points. Just to be clear, this stat doesn't mean all of the questions were off the mark. Rather, it reflects that at least one question was unaligned to the original search query. 
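The relationship between the two stats (first-question alignment vs. whole-box alignment) can be sketched with a quick aggregation. The alignment flags below are illustrative stand-ins, not the study's actual data:

```python
# Hypothetical per-question alignment flags for each PAA box: True means
# the question is aligned to the original query. Illustrative data only.
boxes = [
    [True, True, True, True],    # fully aligned box
    [False, True, True, True],   # first question off target
    [True, True, False, True],   # a later question off target
]

# Share of boxes whose FIRST question is unaligned (the 7%-style stat).
first_q_unaligned = sum(1 for box in boxes if not box[0]) / len(boxes)

# Share of boxes with AT LEAST ONE unaligned question (the 10%-style stat).
any_q_unaligned = sum(1 for box in boxes if not all(box)) / len(boxes)
```

By construction, `any_q_unaligned` can never be lower than `first_q_unaligned`, which is why the box-level figure sits a few percentage points above the first-question figure.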

The point is... the PAA is highly accurate. The idea that Google offers a diverse set of intents in order to compensate for an inability to interpret the query simply doesn't fit. That is not to say Google doesn't miss from time to time. As mentioned earlier, long tail keywords seem a bit more difficult for Google (which is reasonable). 

The clearest example of Google struggling with intent that I came across was for the keyword ways FDR compensated for having polio as president:


FDR PAA Box

The first question is on the mark. However, for the record, FDR was Franklin Delano Roosevelt... Teddy Roosevelt was a different American president who had no disabilities (see question #3). For some reason, for the second question, Google harped on the connection (or lack thereof - they were 5th cousins) between the two Roosevelts which has no connection to the query's intent (i.e., learning more about FDR in relation to his disabilities that resulted from contracting Polio). 

However, while Google's PAA boxes might not be perfect, they do seem to be highly refined and well-tuned. 




Intent Target and 'People Also Ask' - Takeaways 



Time to regroup and take stock of where we stand with the PAA feature after having had a good look at a lot of data. That is, what does the above PAA data indicate and what can you take away from it on a more practical level? 


'People Also Ask' Intent is Diverse



The most notable outcome of the data is that it paints a clear picture of PAA intent. Google presents a diverse set of intents within the initial four PAA questions. And since the questions are very much in sync with the original query, the PAA box speaks well of Google's ability to understand intent. 

Practically speaking, with a more diverse intent showing comes, all things being equal, a more diverse set of sites. Meaning, in a general sense there are more opportunities for all sorts of sites within a given PAA box. That is, rather than drilling down into one topic, Google tends to offer users a broader set of subject matter within the PAA feature. The result is a broader range of sites becoming relevant to a given PAA box.  


The First 'People Also Ask' Question Gets Special Treatment 



Google seems to have an intent hierarchy. Meaning, the first question in the PAA box is there because it is highly aligned to the initial query. As shown, Google tends to double up on the intent reflected within the first PAA question. This doubling up of intent at the first PAA position clearly shows Google's inclination to consider one intent more relevant than another. 

Aside from pointing towards the complexity of how Google goes about selecting the questions for a given PAA box, there are some real-world implications. With more than one question aligned to a relevant intent, a user has options to choose from. While that might mean more opportunity, it also means more competition at the same time. Though I have not specifically studied it, I would imagine being the first question shown for a specific intent is better than being the second question aligned to that intent... not hard to fathom. 

With that, it may be worth your while to target the PAA boxes you want to show within. That would mean studying the intents within the PAA boxes you so desire and identifying that "doubled up" intent and targeting it like there's no tomorrow!  


Google Treats 'People Also Ask' Boxes Equally Across the Board 



All things considered, the PAA box is quite consistent. The average number of intents within the PAA box hardly moves no matter the niche and no matter the type of keyword. That means, generally speaking, you don't really need to consider any special conditions for your industry when targeting the PAA box. Google's approach to the PAA appears to apply equally across the board. That makes a great deal of sense since the SERP feature aims at meeting a user's ancillary needs, which are universal (as opposed to local SERP features that only apply to a locally minded user).  




Relating Questions to the People - The Creation of a Monster! 




Gorilla Eating a Bus


If you think Related Questions (the formal name of People Also Ask) is big now, this is just the beginning! Our SERP Feature Tracker shows that Related Questions reside on nearly 30% of all SERPs (desktop, US) and that's for a normalized dataset! For the most common queries out there, that number is considerably higher! 

People Also Ask is an easy way for Google to offer access to multiple intents on the SERP... which has very much become the search engine's mantra. As time goes on the feature is sure to get some "upgrades" and will surely become a source of a serious amount of chatter within the SEO industry. We're seeing the very beginnings of Related Questions becoming a major part of a sound SEO strategy. 

If I had to pick to a SERP feature that has gone a bit under the radar but will make a splash soon enough it would have to be People Also Ask. Should be interesting to see how things develop in the very near future!   


About The Author
Mordy is the CMO of Rank Ranger as well as the host of The In Search SEO Podcast. Despite his numerous and far-reaching marketing duties, Mordy still considers himself an SEO educator first and foremost. That's why you'll find him regularly releasing all sorts of original SEO research and analysis!



