Mark Auyoung

Fundamentals: Frameworks for Search Engine Optimization (SEO) Best Practices

There are only 3 types of users you need to impress to make your SEO indomitable. 

  • Website Visitors
  • Google’s Crawler (Googlebot)
  • Awareness Users

I promise you, if you use this framework to inform every single one of your SEO decisions, you will be following SEO best practices without knowing it. 

A lot of marketers see Search Engine Optimization (SEO) as a magical black box that only the oracles known as SEO practitioners can communicate with. Although there is a bit of technicality to it, SEO is fairly straightforward. 

There are three major categorizations of SEO tactics:

  • On-Page
  • Technical
  • Off-Page

Each category focuses on a different type of user with different needs. Addressing those needs consistently will build you a powerful SEO foundation.

[Image: Search Engine Optimization Key Users & Categories]

But first, we need to understand who these users are, how they tie into your SEO strategy, and what their needs are.


Website Visitors

There’s an old SEO adage that says “you have two seconds to capture a user’s attention before they leave your website”. The insight from this is that people who visit your site want to have a good user experience. 

We need to understand that these visitors had a specific question, searched a particular query, and landed on your website. To convey that you’re the answer to their search, you need to signal that your page is relevant to their search and will satisfy their search intent.

On-page SEO is the optimization of all elements on an individual web page with the goal of satisfying a user’s intent. This will naturally cause keyword rankings to rise and your users to be engaged with your website. 

User Need: Satisfy their search intent while providing a great user experience.

But where exactly do you start?

At the beginning of their search journey. 

Users who are looking for a specific answer will type queries into the search engine in particular ways. Without getting too nitty-gritty, these search queries trigger search engines to pull up listings they deem relevant. To influence this process, we need to understand the anatomy of a search engine results listing.

[Image: Parts of a Search Engine Results Listing]

URL: The URL displays what site you’re going to be visiting. A recent update replaced the page’s actual URL with a “breadcrumb” trail showing which section of the domain you will be navigating to. To influence this, your website will need structured data.
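What does that look like in practice? As a minimal sketch (the page names and URLs here are placeholders), breadcrumbs are commonly described with schema.org’s BreadcrumbList format, embedded in the page as JSON-LD:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "BreadcrumbList",
      "itemListElement": [
        { "@type": "ListItem", "position": 1, "name": "Blog",
          "item": "https://www.example.com/blog" },
        { "@type": "ListItem", "position": 2, "name": "SEO Fundamentals",
          "item": "https://www.example.com/blog/seo-fundamentals" }
      ]
    }
    </script>

Google can read this and display a trail like “example.com › Blog › SEO Fundamentals” in place of the raw URL.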

Title: The title is influenced by the “title tag” on your website. This is a special HTML tag in the code of your website reserved for identifying the title of the page. Depending on your website, optimizing this tag could require development knowledge or be as simple as installing a website plug-in and typing in your desired title.

Snippet: The snippet is dynamically generated by Google but also takes into consideration what is in your “Meta-description” tag. Similar to the title tag, this HTML tag tells Google what the page is going to be about. 

Now that you understand what these tags are, optimizing them is simple. 

  • Make sure their search query is in these tags 
  • Make sure the title/snippet is engaging
  • Follow character limits for each section 
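To make this concrete, here is a minimal sketch of what the two tags look like in a page’s <head> (the wording is a placeholder):

    <head>
      <!-- The title tag becomes the clickable headline in the listing -->
      <title>Top 10 Vacation Destinations in Orange County</title>
      <!-- The meta description feeds the snippet shown under the title -->
      <meta name="description"
            content="From Disneyland to Huntington Beach, here are the ten Orange County spots worth planning a trip around.">
    </head>

As for character limits, commonly cited guidelines are roughly 50–60 characters for titles and around 155 characters for descriptions before Google truncates them in the listing.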

From here the user is on our website. How do we make sure they know they’re on the right page? 

Include their query on the page to create consistency and make their user experience amazing. 

Headers: The page’s headers should include the query or keyword the visitor typed to reach your site in the first place, so there is no confusion about whether they’ve landed in the right place.

Images: All images on the website should have “alt” text associated with them. This alt text displays when the image cannot be loaded and also allows screen readers to understand what is being displayed in an image.
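For example, a hypothetical image tag with descriptive alt text might look like this:

    <!-- Alt text is shown if the image fails to load and is read aloud by screen readers -->
    <img src="/images/disneyland-castle.jpg" alt="Sleeping Beauty Castle at Disneyland, Anaheim">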

Accessibility: The internet was built for a variety of users, therefore general accessibility standards must be met. You can find a full list of best practices here.

Semantic Structure: Organize content in a way that makes sense semantically. Imagine the webpage you are trying to build is the table of contents of a book. Organizing your headers (H1, H2, H3) and content on a webpage in a hierarchical manner allows users to quickly navigate to what they need in an intuitive way.

E.g. 

H1: Top 10 Vacation Destinations in Orange County

H2: Disneyland, Anaheim CA

H3: Things to Try in Disneyland
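In HTML, that outline maps directly onto heading tags. A sketch (the indentation is only for readability, and the extra sibling heading is illustrative):

    <h1>Top 10 Vacation Destinations in Orange County</h1>
      <h2>Disneyland, Anaheim CA</h2>
        <h3>Things to Try in Disneyland</h3>
      <h2>Huntington Beach, CA</h2>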

Linking: Providing links both internally and externally allows users to find more information related to what they are searching for. Linking internally to other pages on your website creates a great user experience, letting visitors explore related topics and see more of your site. Linking externally points users to other useful resources.
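A quick sketch of the two kinds of links (the URLs and page names are placeholders):

    <!-- Internal link: keeps the visitor exploring your own site -->
    <p>Planning a trip? See our <a href="/blog/disneyland-packing-list">Disneyland packing list</a>.</p>

    <!-- External link: points the visitor to a useful outside resource -->
    <p>Check park hours on the <a href="https://disneyland.disney.go.com/">official Disneyland site</a>.</p>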

User Intent: All of these best practices exist to create consistency and a great user experience. But at the end of the day, the user is on your website to find an answer. No matter how well your website is designed, if their question is not answered, the user will not have completed their journey.

Provide the resources, answer, or value that the user is looking for to create a truly great on-page experience. 


Google’s Crawler (Googlebot)

Technical SEO is really about making it as quick and easy as possible for Google’s Crawler to access and process your website. 

Google’s crawler literally has to process and keep track of every change on the internet. This is a huge job and the more efficiently your website can load and deliver what it is about, the easier the crawler’s job is. 

User Need: Crawl your website as quickly and efficiently as possible.

How quickly your website loads depends on countless variables, some of them very technical, but here are a few that can provide the most results with the least amount of work.

CDN (Content Delivery Network): A CDN serves as a checkpoint for your users’ browsers. When you visit a website, your browser is tasked with retrieving the website from a server. Depending on where the website is hosted, this may take a little longer. A Content Delivery Network (CDN) is a network of servers placed around the world that all keep a copy of the website.

How this works: when a browser is tasked with retrieving a website, rather than being forced to pull from wherever the original server is, the CDN service determines the server in its network closest to the user and pulls the copy of the website from there. This saves time and allows the page to load quicker. In the case of the crawler, the page is retrieved much more quickly.

Minify JS/CSS: Code is often written with humans in mind. It is formatted in a way that allows a developer to quickly browse and understand what they are looking at. Unfortunately, this formatting includes a lot of extra spacing and padding that computers do not need in order to process the code. The removal of spacing and reformatting of the code is known as “minifying”. Reducing the size and processing time helps save Google’s crawler time in understanding your code.
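As a tiny illustration, here is the same CSS rule before and after minifying:

    /* Human-readable version */
    .hero-title {
        font-size: 32px;
        color: #1a1a2e;
    }

    /* Minified version: identical meaning, fewer bytes to download and parse */
    .hero-title{font-size:32px;color:#1a1a2e}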

Image Size Optimization: Although images can create a better user experience, using images that are unnecessarily large can do the opposite. Waiting for images to load takes a decent amount of time, and the effect is multiplied by how many images you have on your page. With many modern websites built around images, scaling your images down before uploading can help with page load speeds, which in turn helps Google’s crawler.
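Beyond scaling files down before upload, a closely related technique (not covered above) is HTML’s srcset attribute, which lets the browser pick the smallest image file that fits the visitor’s screen. A sketch with placeholder file names:

    <img src="/images/beach-800.jpg"
         srcset="/images/beach-400.jpg 400w,
                 /images/beach-800.jpg 800w,
                 /images/beach-1600.jpg 1600w"
         sizes="(max-width: 600px) 400px, 800px"
         alt="Huntington Beach at sunset">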

Lazy-Loading: In a similar vein, lazy-loading (asynchronous loading) also tackles the problem of long page load times. This optimization technique loads content on a page only when it becomes visible to the user; otherwise it is not loaded. This is in contrast to loading the entire web page up front. Doing this allows for a quicker first interaction with the page, which benefits both the user and the crawler.
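Modern browsers support a native version of this for images via the loading attribute; a minimal sketch:

    <!-- The browser defers fetching this image until it is about to scroll into view -->
    <img src="/images/ferris-wheel.jpg" alt="Ferris wheel on the Santa Monica Pier" loading="lazy">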

Sitemaps & Robots.txt: Although you generally want the pages on your website to be crawled and indexed by Google, there are instances where you may not want them to be.

Identifying and letting Googlebot know what these pages are will help save it time. Sitemaps are essentially a table of contents for your website. They highlight what pages are on your website and how they’re categorized. Most often they are dynamically generated, but SEOs who want more control can create them manually.
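A hand-written sitemap is just an XML file listing your URLs; here is a minimal sketch with placeholder entries:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2020-01-15</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/blog/seo-fundamentals</loc>
      </url>
    </urlset>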

Robots.txt is a set of instructions for Googlebot and other crawlers about what to crawl and what not to crawl. This is useful, for example, when you don’t want Google indexing member-only content on your website, or the different filter pages of your e-commerce site. Giving the crawler clear instructions about what does and does not need to be crawled saves Google time.
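A minimal robots.txt along those lines might look like this (the paths are placeholders):

    # Applies to all crawlers, including Googlebot
    User-agent: *
    # Keep member-only content and e-commerce filter pages out of the crawl
    Disallow: /members/
    Disallow: /shop/filters/
    # Point crawlers at the sitemap described above
    Sitemap: https://www.example.com/sitemap.xml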

Awareness Users

Off-page SEO consists of all the activities you do outside of your website to increase your traffic and rankings. This type of SEO encourages you to find new, qualified users outside of your website. From there it is your job to make sure that when they do find you, they get a good first impression. 

A lot of these activities fall under brand building and content ideation which makes sense based on where these users are in the marketing funnel (Awareness & Interest). 

User Need: Find your brand, trust it, and find a reason to interact with you.

Many of these tactics require a base understanding of linking and how Google uses links to determine relevance. In essence, backlinks can be seen as votes of approval for your website. When another website links to yours, an association is made in Google’s eyes between your domain and theirs.

Not every link is considered equal. Backlinks from high-authority websites, such as .gov or .edu domains or CNN, which themselves have a lot of backlinks, are worth more. Each domain has “link juice” that it is able to give or receive by linking to or from other domains. The more backlinks you have from high-authority websites, the higher your theoretical “domain authority”.

Backlink Audit: A backlink audit is a review of all of the sites that have linked to your website. There is such a thing as a bad backlink: websites that are spammy or engage in shady marketing tactics have bad domain authority, and sites like this linking to your site may reflect poorly on your domain. Regularly auditing your backlinks gives you a good idea of where your site is being linked from, what users are associating your site with, and whether you need to remove any associations. The removal of an association with a domain is called disavowing: if you notice domains you don’t want to be associated with, you can submit a list to Google and they will take it into consideration. Creating trust is important for users who first come into contact with your brand. If they notice an unsavory website linking to yours, this may damage your brand image.

Brand Mention: Noticing brand mentions on other websites can help you find linking opportunities if those sites aren’t already linking to you. Most of the time, because you are already being mentioned, you can contact the webmaster of the domain and ask them to add a link or adjust the anchor text (the text that appears as a hyperlink) to change how you’re portrayed to new users.

Local Directory/Social Media: Being present on these platforms is very important from a brand perspective. Millions of people use social media, and not having a presence there says a lot about how much you value your brand. If you do not establish yourself on these directories and social media platforms, potential customers will take it into their own hands and portray your brand however they see fit.

Content Gap Analysis: Finding out what competitors are doing is a great way to start your initial content building. Chances are they’ve done the research to target keywords that work. Worst case, you rank for the same relevant terms; from there, you can build on top of their work. After you provide valuable content, other sites that cite your work are likely to link back to yours. Users who click through, with the context of the anchor text, are then on your website and in your funnel.

Fast, Useful, Known

There are lists containing hundreds of SEO best practices, ever growing and changing. The ones above are only a few that can be utilized. What does not change is that people will always look for answers to their questions. None of these tactics will work if you look at them as a checklist.

Take the time to understand each of these types of users and run every tactic through them. If a tactic does not benefit any of these users, chances are it will not remain an SEO best practice for long.

At the end of the day, if your website provides users with what they need as quickly and easily as possible, you will be following SEO best practices.

Tired of flashy numbers and useless reports? Make sure your AdWords and SEO campaigns aren’t wasting your money. Use this step-by-step checklist to make sure you’re getting the most out of your marketing budget.

SEO & PPC Checklist

Cheers,

Mark
