00:00
Unlike traditional search engines that crawl an entire page, large language models have something called a token limit, which means they don't read your page in full. They prioritize what's most structured, most clear, and most useful within that limit.
With this in mind, we need to rethink how we structure content.
00:18
This shift is what we call on-page LLM SEO. Another acronym.
In this video, we're going to talk about optimizing your page for large language models. Let's go. In the AI search era, your content has to be both machine-readable and answer-ready.
00:37
Though keyword optimization is still important, and some of the things we will be sharing might seem familiar to you as SEO folks, call it AEO, GEO, LLM SEO, whatever. The key difference between SEO and all those acronyms is the focus.
00:52
It's restructuring content in a way that shifts the spotlight onto what actually matters for LLMs, so you'll know exactly how to adapt proven SEO tactics for the realities of AI-driven search. Let's talk about the first one.
Since large language models have a crawl limit on each page,
01:09
it makes total sense to have a key takeaway section at the top of the page. Now, this is not something new, but let me add something on top of it.
LLMs work especially well with question-and-answer formats because they are trained on conversational patterns and structured Q&A data.
01:24
Instead of just putting a list of key takeaways, why not turn it into the Q&A format? For example, we are writing an article about AI search optimization.
We have a key takeaway section at the top, but instead of pointers, we will put it in the Q&A format.
01:40
In traditional SEO, we would present it this way, where we just share what is most important. But in on-page LLM SEO, we turn whatever we can into the Q&A format, like the sketch below.
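As a rough sketch only, not the exact markup from the video, the heading and the two Q&A pairs below are placeholders based on the example article; a key takeaway section in Q&A format could look like this in HTML:

    <h2>Key Takeaways</h2>

    <h3>What is AI search optimization?</h3>
    <p>AI search optimization means structuring your content so large language
    models can find, understand, and cite it within their token limit.</p>

    <h3>Why put key takeaways in a Q&amp;A format?</h3>
    <p>LLMs are trained on conversational patterns and structured Q&amp;A data,
    so question-and-answer content is easier for them to parse and reuse.</p>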
Important pointers that we can't turn into questions, we keep in a list. What's more, we can add an FAQ schema to this Q&A section.
01:58
This will help traditional search engines to understand more about your page. Large language models rely on traditional search engines for answers.
So if we optimize that part of the content for traditional search engines, it indirectly helps the large language models. I hope it makes sense.
02:14
Anyway, the FAQ section looks like this to your visitors, but in the back end, the FAQ block looks like this, which includes all the questions and answers. Now, this FAQ block is Rank Math's, available even in the free version.
Some other plugins have this feature as well.
02:30
If you do not use WordPress, you would have to add an FAQ schema like this to your page source code. Now, if you are using WordPress but you are not using a plugin that offers an FAQ block, maybe you can consider using Rank Math.
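For reference, here is a minimal sketch of what that markup could look like, a JSON-LD script placed anywhere in your page's HTML; the questions and answers are placeholders mirroring the on-page Q&A, so swap in your own:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "FAQPage",
      "mainEntity": [
        {
          "@type": "Question",
          "name": "What is AI search optimization?",
          "acceptedAnswer": {
            "@type": "Answer",
            "text": "AI search optimization means structuring your content so large language models can find, understand, and cite it."
          }
        },
        {
          "@type": "Question",
          "name": "Why put key takeaways in a Q&A format?",
          "acceptedAnswer": {
            "@type": "Answer",
            "text": "LLMs are trained on conversational patterns, so question-and-answer content is easier for them to parse and reuse."
          }
        }
      ]
    }
    </script>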
Anyway, if we test the source code of this page with the Google Rich Results Test tool,
02:52
we will see an FAQ schema detected, and all the questions and answers are part of the schema, which allows search engines to easily understand that part of the page. If you want to create this FAQ block on your page, all you need to do is go to the area where you want to add the block, click on Add Block, and search for FAQ,
03:11
and this is the block, or you can click on Browse All, scroll all the way down to the Rank Math blocks, and you will find the FAQ block. From here, you can add your questions and answers.
If you need another set of questions and answers, you can just click on this.
03:27
There you go. Now, if you are stuck and need an AI to write the answer for you, just click on this and Content AI will write it for you.
Of course, you need to fact check the text. Then feel free to add an image for your question.
For example, this. If you want to see how this looks, let's visit the post.
03:48
Here you go. Then if you need to organize the questions and answers, you can simply move them up or down.
In on-page LLM SEO, add a key takeaway section to all your content-rich pages, but turn it into the Q&A format paired with the FAQ schema.
04:04
That way, you are not only optimizing your page for LLMs, but also for traditional search engines. In traditional SEO, we were taught to add the FAQ section at the bottom of the page, and we have already talked about moving it to the top of the page instead.
04:21
But what if we add the FAQ block wherever it makes sense? You see, we have the FAQ block in the key takeaway section, but what if, in this section where we talk about llms.txt optimization, we add an FAQ block? Because you see, in the content about AI search optimization,
04:39
we talked about adding the llms.txt file to the site. But there are claims that large language models do not crawl the llms.txt file, and that becomes a common question that many will ask related to llms.txt. For this section, we add an FAQ block to address this question.
04:56
Now, you must be thinking, can you really add multiple FAQ blocks on the same page? The answer is yes.
Let's go to the page on the front end, and this is the additional FAQ block that we have added to the page. Let's copy the page source code and test it with the Google Rich Results Test tool.
05:15
This will tell us if there are any errors with the schema or markup. Here we go.
We have the FAQ schema and no errors. Let's check it out.
We can see that the new question and answer has been added to the schema as well.
05:30
The only thing you do not want to do: if you add FAQ blocks on the page, do not add an additional FAQ schema through the schema generator. We recommend just using the FAQ block.
For every section of your page, if there are relevant questions related
05:46
to the H2 or H3 heading you're writing about, add the FAQ section. Questions and answers about a specific topic help not only traditional search engines but also the LLMs.
Now, in traditional SEO, we are writing for humans and search engines to index.
06:04
But for on-page LLM SEO, we are writing for LLMs to understand the relationship between concepts and entities, while humans still benefit. For example, in traditional SEO, we would write something like, Optimizing for voice search helps users find content through spoken queries.
06:20
But in entity-rich content, we would write something like, Optimizing for voice search allows AI assistants like Alexa, Siri, or Google Assistant to match questions with answers, helping users find content through spoken queries. The meaning is the same, but we have included entities.
06:38
Alexa is an entity, Siri is an entity, Google Assistant as well, and we group the separate entities into one broader entity called AI assistants. We associate those entities with the key point, which is that optimizing for voice search helps users find content through spoken queries.
06:54
We are essentially associating those entities with voice search optimization. Another example would be, Adding visual elements makes content easier to scan and understand.
This is how you would typically write content in traditional SEO. Yes, visual elements can be an entity, but it is not specific.
07:13
In traditional SEO, you could have defined visual elements like tables, lists, headings, etc., on other parts of the page. But as we have mentioned, LLMs do not crawl the entire page, so we need to add those definitions into the sentences that matter to LLMs.
07:30
Such as, Adding lists, tables, headings, images, and other visual elements makes content easier to scan and understand. Lists are an entity, tables are an entity, headings and images as well.
We have grouped them into visual elements as one entity, and we associate these
07:47
entities with making content easier to scan and understand. Because the thing is, traditional search engines can read the whole page, so they usually understand the meaning from the text around it and don't require you to mention entities in every sentence.
But LLMs read text in chunks and can't see the whole page at once,
08:06
so you need to give context, mention key entities in your sentences, and associate them with your key points. Of course, the next question is, if I put entities all over the page, wouldn't it make the entire page hard to read, because you are adding entity after entity in almost every sentence?
08:23
And that's why we recommend doing this. What we were taught in traditional SEO is that we should optimize our H2 and H3 headings with keywords and optimize the first sentence after each heading with a direct, punchy one-sentence answer so that we can increase our chance
08:41
of getting the featured snippet on Google. But in on-page LLM SEO, we should turn the headings into questions whenever possible, and we should still optimize the first sentence the same way, but add entities to it.
For example, our article is talking about all the different AI search optimizations.
08:58
Our first H2 heading is a question, and we answer it in a short and punchy line. Let's scroll down.
We'll talk about this in a while. Right here.
If we were to optimize for traditional SEO, we would have added a heading like this, Understand keyword research in the AI search era.
09:17
But if we were to optimize for LLMs, we would write this instead, How to do keyword research in the AI search era, because it is a question instead of a statement. Just in case you don't know, we have covered this exact topic about keyword research in the AI search era right here.
09:33
You might be interested in checking that out, but do it later. We have left the link in the description.
Anyway, right after the heading, we will optimize the following sentence for featured snippets as well as LLMs, such as having a direct, punchy one-sentence answer that is about 20 to 40 words and associating entities
09:51
which are bolded for your reference. ChatGPT is an entity, Perplexity is an entity, Claude as well, and they use multiple SEO-optimized keywords to search on traditional search engines, which are also entities.
We are associating these entities with the key point, giving these
10:09
entities meaning and association. The first sentence is optimized mainly for featured snippets and LLMs, and the rest of the section is there to solidify the understanding for humans.
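To make the pattern concrete, here is a minimal HTML sketch of a question heading followed by an entity-rich first sentence; the wording, and the entities bolded with strong tags, are placeholders based on the example above:

    <h2>How to do keyword research in the AI search era</h2>
    <p>
      AI search tools like <strong>ChatGPT</strong>, <strong>Perplexity</strong>, and
      <strong>Claude</strong> turn your question into multiple SEO-optimized keywords
      and run them on traditional search engines such as <strong>Google</strong> and
      <strong>Bing</strong>, so keyword research still decides what they find and cite.
    </p>
    <!-- The rest of the section expands on this answer for human readers. -->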
Let's do another example. In traditional SEO, we would have used this statement as
10:25
the heading, Create other multimedia formats for AI search optimization. But optimizing it for LLMs, we will use this instead.
Why focus on other multimedia formats for AI search optimization? This is just an example for you.
Don't blame me if my grammar or English isn't proper, all right?
10:42
Now, right after the heading, we would optimize the first sentence with entities to associate them with the main point. Again, for your reference, the bold text marks the entities.
We are essentially saying that AI search platforms like this and this do not only pull text and entities.
10:59
They pull answers from images, videos, audio, and other formats. Thus, creating other multimedia content formats gives you a higher chance of being cited and seen.
Well, this is not exactly an entity, but this is the main point. You don't exactly need to plaster the entities all over the content,
11:17
but only in the first sentence or paragraph right after the heading, because those are the places LLMs will most likely read. I hope it makes sense.
Now, what we have done up to this point is already helping with voice search optimization because we have structured our content with questions and answers,
11:34
but an additional way to optimize your content is through a speakable schema. Now, technically, LLMs do not process schema the same way Google does, but adding a speakable schema still pays off because it forces you to highlight short, natural-sounding passages on your page.
11:51
In other words, you are not just marking up text, you are shaping content that works in voice search, AI-driven summaries, and conversational answers. Let's say you want to mark this question heading and answer with a speakable schema.
If you are not using Rank Math Pro, you will be editing this heading
12:07
and sentence as HTML and giving them a CSS class, let's say other-media-formats. Then you would add a speakable schema like this to your page source code, where you select this class as part of the speakable schema, making the heading the question and the paragraph the answer.
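For reference, a minimal sketch of that manual markup, assuming the CSS class is named other-media-formats and is applied to both the question heading and the answer paragraph; the headline and URL are placeholders:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "AI Search Optimization",
      "url": "https://example.com/ai-search-optimization/",
      "speakable": {
        "@type": "SpeakableSpecification",
        "cssSelector": [".other-media-formats"]
      }
    }
    </script>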
12:23
However, if you are using Rank Math Pro, all you need to do is click on the heading block, go to the block settings, expand the Advanced tab, and in the Additional CSS class(es) field, add the name of the CSS class.
12:38
This can be anything, but we recommend descriptive text. Do not leave any spaces, because words separated by spaces are treated as separate classes.
If you need to add a space, use a dash instead. Then do the same for the paragraph as well.
Following that, the next thing you want to do is go to the Rank Math tab,
12:54
go to the Schema options. The speakable schema is within the Article schema, so you want to edit it.
From here, you want to enable the speakable schema. Click on Add Property.
Then, when entering the CSS class you have created, add a dot in front of it, because the field expects a CSS selector.
13:11
That's basically it. Let's click Save for this post.
Now, if you want to mark another set of question and answer with the speakable schema, let's say this heading and the first paragraph, you can click on the heading block, go to the block settings, expand the Advanced tab, and right here, create a new CSS class,
13:29
say keyword-research. Do the same for the paragraph.
Edit the speakable schema, click on Add Property, add a dot right in front, paste the new CSS class, and save for this post. Now, save the page.
13:44
Now, to know if the speakable schema works, let's visit the page, view its page source code, copy everything, and go to the schema.org validator.
Select the Code Snippet option and paste your source code here. Run the test, and in the BlogPosting schema,
14:01
we can see that the speakable schema is added without any errors. Now, you might have another question like, Can I use the same CSS class on other pages?
For example, we have created the keyword research CSS class earlier. Can we use the same CSS class on other pages?
14:16
The answer is yes. It doesn't matter, and it does not affect your overall SEO, GEO, AEO, or whatever.
Similar to traditional SEO, we add relevant data such as the statistics from other people, statistics from our own research,
14:33
add tables, add bullet points, and of course, images with alt text to separate our content, making it scannable and accessible. For LLM SEO, these same elements carry extra weight.
Structured formats like tables and bullet points don't just help readers.
14:49
They also make it easier for AI systems to parse information in chunks. And statistics, when written in clear standalone sentences, for example, 72% of marketers plan to increase AI budgets in 2025, are more likely to be cited directly in AI summaries.
15:06
While it's not guaranteed, well-placed structured elements are more likely to be surfaced directly in AI answers, since LLMs and AI search engines favor content that is concise, scannable, and clearly organized.
Here are the other on-page LLM SEO optimizations you can do.
15:24
Once you have completed your article, you want to make sure to add a table of contents to the page. Simply go to the section where you want to add the table of contents.
It is usually right at the top of the page, ideally after the key takeaway section. Now, as a Rank Math user, even the free version,
15:40
you can click on Add Block, search for table, and you will see Rank Math's table of contents block here. As soon as you have added the block to the page, all your heading text will appear automatically.
You can even edit the text if you want to, or hide certain heading text.
15:55
For example, this one is an H3. To hide it, we can simply check this, and all H3 heading text will be hidden.
You can even customize the look and feel of the table of contents. If your SEO plugin wants you to upgrade to the premium version to get the table of contents feature, you may want to consider switching to us.
16:13
Anyway, other than adding the table of contents, if you have a table that contains original research data or information cited from other sources, like in our case, this table includes original research statistics, we can click on the table block and then click on this button to add
16:30
a caption right below. We can say, Original research by Rank Math SEO.
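The table block's caption button handles this for you; if you are hand-coding HTML instead, the standard caption element does the same job. A minimal sketch with placeholder data, reusing the earlier example statistic:

    <table>
      <caption>Original research by Rank Math SEO</caption>
      <thead>
        <tr><th>Metric</th><th>Value</th></tr>
      </thead>
      <tbody>
        <tr><td>Marketers planning to increase AI budgets in 2025</td><td>72%</td></tr>
      </tbody>
    </table>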
Also, if you have embedded a video on your page, you can add the transcript of the video to the page. Or if you have added a podcast episode, add the transcript to the page as well.
16:45
Do not hide the transcript behind a JavaScript action, for example, an accordion, though it really depends on how the accordion or the element is built. For example, if I add a Kadence accordion block to the page, let's say we add a title, See Transcript,
17:01
and hidden within the accordion, we want to add a line that says, This is a test transcript to see if the accordion will hide this text behind JavaScript. And then we select Start with All Panes Collapsed.
Let's save the page, visit it.
17:18
Let's go to the section, and this is how it looks. You see the text is hidden within the accordion.
Now, to know whether the text within the accordion is hidden behind JavaScript or not, simply right-click on the page and view its page source.
17:33
Then press Control+F or Command+F to find the text, and let's search for the exact line of text. Here you go.
If we can see the content here, it means that the text is in the HTML, and it is safe for both search engines and LLMs to find it.
17:48
But if you cannot find the text here, then it is practically hidden from LLMs. That's because LLMs do not render JavaScript like a browser; they rely on text that is accessible in the raw HTML.
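As a rough illustration of what passes that view-source check, here is a minimal sketch using a native HTML details element rather than the Kadence block from the video: the transcript text sits in the raw HTML, whereas content injected by a script after the page loads would not.

    <!-- The transcript is in the HTML itself, so it shows up in View Source. -->
    <details>
      <summary>See Transcript</summary>
      <p>This is a test transcript to see if the accordion will hide this text
      behind JavaScript.</p>
    </details>

    <!-- By contrast, a container that a script fills in after page load starts out
         empty in the raw HTML, so crawlers and LLMs that do not run JavaScript
         never see the transcript. -->
    <div id="transcript-loaded-by-javascript"></div>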
So what do you think about our on-page LLM optimization? Is it helpful to you?
18:04
If yes, can you do us a favor and smash that thumbs up? And in the meantime, drop a comment down below and let us know what topics you want us to cover.
Our channel is all about helping you grow your brand traffic, and if you haven't subscribed to us yet, consider doing so. This is Jack from Rank Math.
I'll see you in the next video.