The search engine optimization of a website is divided into three key areas:
- On-page optimization: focuses on elements within a page, such as content, keyword choice, and metadata.
- Off-page / off-site optimization: refers to actions taken outside your website to improve its ranking, such as inbound links, social media, and link-worthy content.
- Technical on-site optimization: improves the structure of the website as a whole.
The term metadata means data that describes other data. Each page contains an area of metadata made up of individual meta tags. These meta tags are small snippets of text that help search engines identify important information about the page. This information is contained within the source code of the page.
A search result mainly contains three pieces: the title tag, the URL, and the meta description.
Besides the content of your page, the title tag is regarded as the most important on-page element to optimize. Place the keywords or information describing the page at the front of the title tag and the brand name at the end. Some best practices:
- 55 characters or less.
- Beginning of title tag has the most weight.
- Differentiate the page and brand with a pipe symbol (|).
- Separate multiple keywords with hyphens.
- Avoid using any special characters.
- Shorten brand name if too long.
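Putting those practices together, a title tag for a hypothetical product page (the page topic and brand name here are invented for illustration) might look like this:

```html
<head>
  <!-- Keywords first, brand last, separated by a pipe; kept under 55 characters -->
  <title>Trail Running Shoes - Buying Guide | ShoeCo</title>
</head>
```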
Because keywords in the meta description do not directly help a site rank, it is often overlooked, but it can be used to direct traffic to your website. The meta description is not visible on the page or in the browser; it is hidden in the page's source code and displayed publicly only in search results. However, a well-crafted meta description has been found to increase click-throughs to your website. By writing a meta description with keywords you think a user is likely to search for, more of the words within that description will be bolded in the results.
The meta description should contain information about the page that entices a user to click, naturally incorporating keywords that users might use while performing a search.
You want the meta description to accurately describe the content. Google is also able to generate its own meta description based on the content of the page, usually a block of text from the page that includes the keywords the user searched for. It is a good idea to control this area to ensure you present the best information possible, but if you don't know which keywords are best, it may be better to let Google choose the meta description for you. Moreover, social networks use this as the description of the page when you (or others) post that page's link.
Additionally, calls to action in the meta description can increase website visits. For example: “Learn more”, “Read our article to discover…”, etc.
Some best practices:
- Keep it under 160 characters so the end is not cut off due to length.
- Include keywords to draw attention.
- Include “calls to action”.
- Avoid using quotations or special characters.
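Following those practices, a meta description for a hypothetical page (the wording is invented for illustration) might look like this:

```html
<head>
  <!-- Under 160 characters, with likely search keywords and a call to action -->
  <meta name="description"
        content="Compare lightweight trail running shoes for every budget. Read our guide to find the right fit before you buy.">
</head>
```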
Meta keywords tags (<meta name="keywords" content="...">) were previously a major component of an SEO strategy, but today they are used only in certain circumstances.
URLs describe a page to both visitors and search engines. URLs should be relevant and contain important keywords while remaining brief. Keywords within the URL are still useful; they just do not play as large a role as they used to. When possible, it is best to leave parameters out of URLs.
URLs are set at design time; you can change them later, but a changed URL loses some of the history and authority it has built up. The decision to change existing URLs, whether during a redesign or for SEO purposes, is circumstantial and should be weighed heavily before changing anything just for SEO. Best practices:
- Don't change URLs for the sake of changing them; if you must change one, use a 301 redirect.
- Optimize the URL from the start.
- Incorporate keywords into URL where possible.
- Keep URLs short and succinct.
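As a sketch of the difference (both URLs are invented for illustration), compare a short keyword-bearing URL with a parameter-heavy one:

```
Better: https://www.example.com/running-shoes/trail
Avoid:  https://www.example.com/index.php?id=1432&cat=7
```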
Like the documents you write, web pages have headings. Heading tags not only stylistically break up the content on the page but are also useful for SEO.
Subsequent heading tags (H3 and beyond) are given little weight by search engines from an SEO standpoint, so focus on the H1 (one per page) and H2s (as many as needed).
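A minimal page outline following that advice (the topic and headings are invented for illustration):

```html
<h1>Trail Running Shoes</h1>        <!-- one H1 stating the page topic -->
<h2>Choosing the Right Fit</h2>     <!-- H2s divide the content as needed -->
<p>...</p>
<h2>Caring for Your Shoes</h2>
<p>...</p>
```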
Creating and Optimizing Unique Content
Content is king. Your page needs quality content around the topic or theme of your page. There are some best practices all pages can follow:
- Make content relevant to the theme of your site.
- Create your own unique content that adds value.
- Read content out loud to see if it sounds natural.
- Add images or other resources when applicable.
- Link to other relevant pages on the site; internal links help search engines crawl it.
Off-page or off-site SEO is about what you can do outside of or off your site to help ensure the success of your on-site and technical SEO efforts. These include building links to your site as well as increasing your brand recognition and visibility through social media.
Historically, PageRank was an algorithm that analyzed web links to determine the relative importance (popularity) of websites. Each page was assigned a value (0-10) based on the number and quality of links pointing to it, with each successive rank requiring more effort to obtain. The passing of rank was known as "link juice." Today this concept is called a page's authority, which is made up of many factors, including links.
In the past, a page's PageRank was public knowledge, but Google updated public PageRank for the last time in 2013, and a site's authority is no longer tracked by its PageRank. The concept of gaining authority through off-site SEO efforts is still similar, however. Google's current ranking factors are unknown; incoming links remain a valuable ranking factor, but low-quality links can result in penalties. Aside from PageRank, other authority metrics are still available.
There are many aspects of backlinks that Google will analyze. Links should be earned naturally; one way to do this is to cater to your user base by providing an excellent user experience and creating great content. Some form of outreach is also necessary to ensure that content is discovered by the right people, who are willing to share and link to it.
Google looks at many different factors when judging your site's authority based on its backlink profile, for example the number, quality, relevancy, and placement of the links to your site. The best type of link sits in content surrounded by words relevant to your site. The anchor (clickable) text of a link is also important.
Social media can indirectly impact your site as well. A decent correlation between social media usage and higher rankings has been observed. A good social media strategy can increase your overall brand visibility online and earn you a better reputation, allowing your site to receive more traffic.
Google has stated it does not have an algorithm that takes specific social media factors into account when determining a site's ranking. Tweets and social profiles are all considered "nofollow" links, which pass no authority from one site to another; however, these links can still increase visibility in search results, social channel referrals, and brand recognition.
Social media is useful not only for SEO but also for your broader marketing and branding efforts.
An effective SEO strategy is not just about content; you also need to lay a structural foundation for your content so that search engines can more easily discover it. Technical SEO focuses on how well search engine spiders can crawl your site and index your content.
Sitemaps help point search engines to the existing pages on your site, which helps ensure pages are not missed by crawlers. Sitemaps come in two forms:
- HTML version: easy for users to read and understand.
- XML version: created just for search engine robots, and more specific to SEO.
An HTML sitemap is a simple page that contains links to important pages within a site, and can be considered a general overview.
XML sitemap files contain behind-the-scenes, page-specific information that lets search engines analyze content in a more logical and intelligent manner. By uploading sitemap files to Google Search Console, you can inform search engines of the presence of your site and its pages. A variety of XML sitemap creation tools exist.
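A minimal XML sitemap with a single entry, following the sitemaps.org protocol (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/trail-running-shoes</loc>
    <lastmod>2024-01-15</lastmod>     <!-- when the page last changed -->
    <changefreq>monthly</changefreq>  <!-- how often it tends to change -->
    <priority>0.8</priority>          <!-- relative importance, 0.0 to 1.0 -->
  </url>
</urlset>
```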
The robots.txt File
Sitemap files only tell search engines which pages to include in search results; you also have the ability to exclude specific pages from being crawled by using a robots.txt file.
The robots.txt file is a protocol created in the early days of the Internet to prevent robots from crawling areas they were not supposed to access. However, robots can ignore the information in robots.txt. Also remember that robots.txt is a publicly available file, so anyone can see which sections of the server you do not want robots to access. Some examples of directives:
- User-agent: specifies which robots the following directives apply to.
- Crawl-delay: the amount of time a robot should wait before requesting another page.
- Disallow: instructs robots not to crawl the referenced content.
- #: marks a comment or note.
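A small robots.txt combining those directives (the disallowed paths are invented for illustration; remember that not every crawler honors every directive):

```
# Apply the rules below to all robots
User-agent: *
Crawl-delay: 10
Disallow: /admin/
Disallow: /cart/
```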
Overcoming Error Codes
Errors aren’t just frustrating to viewers of your site. They also have ramifications for SEO. Each error has a specific status code, which helps search engines and developers to understand what went wrong.
A 404 page is an error page displayed when a page is not found. A common question is whether you should redirect error pages (like 404s) to other pages, such as the home page. Redirecting is not always a good idea, and not all 404 pages are bad: search engines recognize these pages and exclude them from the index. However, a large number of 404 pages will harm your site, so you should first minimize the number of 404 pages produced.
Furthermore, watch out for mishandled 404 pages (soft 404s): pages that no longer exist but still return status code 200 (OK). A large number of soft 404s creates problems and is very bad for SEO, because search engines will see a large amount of duplicate content, which harms your site.
Another common status code is 500, which means a server error occurred but the server cannot be more specific about the cause. Status code 503 means the service is unavailable; this happens when the server is down, under maintenance, or temporarily overloaded. In those cases it is important to ensure that a 503 is returned instead of a 500.
Redirects are also a type of status code, used to transfer users from deleted pages to new or updated pages. Most of the time users won't notice the difference; however, the redirect status code gives search engines instructions on how to handle the page.
- 301 (permanent): lets search engines know a page no longer exists; search engines will eventually credit the new page and increase its ranking.
- 302 (temporary): the redirect is in place for a limited time, and search engines should not transfer trust; this status code is used mostly when performing site maintenance.
A big reason to recommend 301 redirects over 302 redirects is that the permanent redirect passes about 95% of the old page's authority to the new page. The 302 redirect, on the other hand, passes little to no authority, and search engines keep the old page in the index.
Be careful of chaining redirects, i.e. Page A → Page B → Page C. It is best to have both Page A → Page C and Page B → Page C. It is also best that the old page and the new page are relevant in content.
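On an Apache server, for example, the flattened redirects above could be declared with mod_alias in an .htaccess file (the paths are placeholders; other web servers have equivalent directives):

```apache
# Send both old pages straight to Page C with a permanent (301) redirect
Redirect 301 /page-a /page-c
Redirect 301 /page-b /page-c
```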
Another type of redirect is the meta refresh. No HTTP status code accompanies a meta refresh, because the redirect is executed at the page level rather than by the server itself. Meta refreshes are generally not recommended, because they do not provide clear signals to search engines about what happened to the previous page.
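For reference, a meta refresh looks like this; it runs in the browser after the delay given in seconds, so the server sends no redirect status code (the URL is a placeholder):

```html
<!-- Redirects to the new page after 0 seconds; discouraged for SEO -->
<meta http-equiv="refresh" content="0; url=https://www.example.com/new-page">
```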
Keyword Theory and Research
When people enter search terms into Google or other search engines, they generally have a specific goal in mind. The better you are able to understand the audience of your site and their needs, the better you’ll be able to target keywords that will direct your audience to your site.
You must first understand how people search and why they choose keywords. People use many different types of queries to find information about one subject, and it is Google's job to provide the most relevant pages for those queries. It is therefore better for your website to have a set of pages dedicated to the topic. A good practice is to answer a variety of questions around a particular topic: this makes your website highly relevant to a broader "focus keyword phrase" while still allowing individual pages to target and rank for very specific keywords.
Also consider proper keyword selection when building your site and pages. More specificity means less competition and a better chance of ranking higher for those queries, and users will be more engaged with the content of your site.
Stages of User Search
Users usually begin with a broad search topic and then gradually refine that to be more specific as they discover exactly what they want. Having an organic presence is useful in all stages of the cycle.
- Awareness: at the beginning of the path, organic search helps users gain awareness of your brand, products, or services.
- Evaluation: users evaluate options and develop a specific preference.
- Preference: users identify their needs and seek a resolution.
- Purchase: users compare merchants and prices.
Types of Search Queries
Each type of query has a different intent associated with it. These types of queries closely align with the stages of user search.
- Navigational: users already know what they want to find; they need the search engine's help to find it.
- Informational: users are looking for a bit more information to make an informed decision before a purchase; they are more likely to notice suggested search queries.
- Transactional: users are ready to buy and need to compare the details of purchase choices.
Head, Tail, and Long-Tail Keywords
Not all keywords are created equal. Some keywords will help you build authority for your site by getting the right kind of attention. Both the more general head terms and the more specific tail terms can be used to identify keywords for your site. Long-tail keywords (which account for a low volume of searches individually) are extremely beneficial for driving the right traffic to your site.
Long-tail keywords are used less frequently and are difficult to optimize for, since you can't predict the exact long-tail phrase a customer might use. However, the majority of your traffic will come from long-tail keyword phrases. Because of their specificity, they have a higher probability of conversion, as the searcher is later in the user search cycle and buying funnel. These phrases are also less competitive, so they are easier to rank for and can attract more traffic.
Popular search terms make up about 30% of the searches performed online; the remaining 70% are known as long-tail keywords.
How Users Conduct Searches
A study from Blue Nile Research shows that 50% of users formulate their queries as fragments like "rental textbooks", while the other 50% use more specific phrases like "rent a textbook online". When it comes to questions vs. statements, 27% prefer the question form and 73% prefer non-questions.
The most commonly used words in question queries are How, Why, Where, Which, and What. You can generate keyword and content ideas from potential user questions. Seasonality also impacts searches and site launches.
For more on Google SEO Fundamentals, please refer to the wonderful course here https://www.coursera.org/learn/seo-fundamentals
I am Kesler Zhu, thank you for visiting my website. Check out more course reviews at https://KZHU.ai
All of your support will be used for maintenance of this site and more great content. I am humbled and grateful for your generosity. Thank you!
Don't forget to sign up for the newsletter so you don't miss any chance to learn.
Or share what you've learned with friends!