Types of SEO: On-Page, Off-Page, and Technical SEO for Your Site
When you want to learn more about a service, pick a product, or find a nice cafe nearby, what is the first thing you do? Chances are, you type a request into a browser and google it. And yes, ‘google’ became a verb over ten years ago. Google or Bing will get the desired result to you in a matter of seconds. The official Google guides define SEO as making small modifications to parts of your website that, together, can noticeably impact its user experience and its performance in organic search results.
SEO and Search Engine Ranking Factors
Google and Bing search results for the same query may vary, as the two search engines use different ranking algorithms. For example, it is believed that Google's RankBrain adjusts page rankings depending on whether users are satisfied with the search results. Instead of processing individual keywords, the RankBrain algorithm turns them into concepts or ideas.
Unique content is no less important for a successful ranking than user experience. Google warns that “if your site suffers from largely identical or duplicate content, the ranking of the site may suffer considerably, or the site might be removed entirely from the Google index”. User experience indicates how user-friendly and transparent your website is for visitors, from navigation and content to mobile usability.
To improve your website performance for search results, you need to conduct search engine optimization.
There are three main types of SEO:
- Internal (On-Page SEO)
- External (Off-Page SEO)
- Technical SEO
Internal Search Engine Optimization (On-Page SEO)
Internal, or on-page, SEO focuses on the website's content. It optimizes page content and structure to achieve the best possible search engine ranking and attract as much organic traffic as possible.
Here are the main elements of on-page SEO:
- HTML title tag
- Correct page meta description
- Page structure and headers
- Text optimization
- Image optimization
HTML Title Tag
HTML elements are also called tags. Using the <title> tag properly is especially important for on-page optimization. It is displayed as the page name and forms the clickable link that directs users from the search results page to the website. A good title is useful beyond SEO.
A title that is too short, uninformative, or copy-pasted from the page description may negatively impact your page ranking. <title> should contain a brief and accurate description of the page or its main purpose. A good title may include the name of the brand, product, or service, its main characteristics, the key offer, or a short publication title if it is a blog post.
A few tips for a good title:
- You only make a first impression once. The title is what makes your page's snippet attractive at first glance.
- A recognizable brand or project name in the title makes visitors trust your content more.
- Be concise. Often, only the first 60 characters of the title are displayed.
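Putting these tips together, a title tag might look like this (the brand and product names are hypothetical placeholders):

```html
<head>
  <!-- Brand plus key offer, kept under ~60 characters to avoid truncation -->
  <title>Handmade Leather Wallets | Acme Goods</title>
</head>
```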
When speaking at Google’s Webmaster Conference in Mountain View in 2019, Phiroze Parakh, a software engineer at Google, paid special attention to how web search results are built and what role page titles and descriptions play.
How to Fill Out Meta Description
A meta description is the text in the <meta name="description"> tag. It can be displayed directly below the title in search results, as well as when the page is shared on social media or other resources.
A good meta description combined with a good title gives your potential visitor a snippet worth clicking on. Google will only display your meta description if it helps users get a better idea of your page; otherwise, the description will be generated automatically from the page content, taking the user's query into account.
Here are the key factors for a good meta description:
- Information. A competent and clear product or service description.
- Structure. Includes the basic keywords as well as a call to action.
- Conciseness. Oftentimes, only the first 155 characters of the description meta tag are displayed.
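As an illustration (the shop and wording below are hypothetical), a description that follows these factors might look like this:

```html
<head>
  <!-- Clear description with key phrases and a call to action, under ~155 characters -->
  <meta name="description"
        content="Shop handmade full-grain leather wallets with free shipping over $50. Find your perfect wallet today.">
</head>
```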
Structure and Headers
Official Google sources, including John Mueller in this video, repeatedly emphasize how important an appropriate website content structure is. In particular, he discusses the benefits of using headings (H1) and subheadings (H2) on the pages of your website. A well-structured page is not only user-friendly but also improves page indexing by offering search services additional context for relevant keywords.
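For example, a blog post (the topic here is made up) could be structured with one main heading and subheadings for each section:

```html
<body>
  <h1>A Beginner's Guide to Espresso</h1>  <!-- single main heading for the page topic -->

  <h2>Choosing the Beans</h2>              <!-- subheadings give search engines extra context -->
  <p>…</p>

  <h2>Grinding and Tamping</h2>
  <p>…</p>
</body>
```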
One of the important criteria for optimizing texts is a clear structure and subject. Unique content is not the only factor important for page search ranking. For example, critical topics form a separate category often called Your Money or Your Life (YMYL – Static.googleusercontent.com). It includes:
- News about vital topics such as international events, business, politics, science, technology, etc.
- News about civics, government, and law, such as elections, changes in important legal issues, etc.
- Finances, especially investments, pension funds, taxation, loans, and insurance.
- Recommendations and product reviews, especially on the websites where you can purchase such items.
- Health and safety, including emergency advisories.
When ranking such texts, the reliability and qualifications of each author are taken into account, in addition to the authority of the source website. Google states that priority in this category is given to materials prepared by field experts. This ranking approach is known as Expertise, Authoritativeness, and Trustworthiness (E-A-T).
To improve your page ranking, use appropriate keywords as natural elements of the context. They should not stand out from or disturb the text structure. Keyword stuffing is a sure way to get a search engine penalty. Some platforms, such as WordPress, allow you to add tags or hashtags to your publications to specify the subject of the text. The best way to optimize your text for search engines is to write high-quality, informative, and natural content.
Image Optimization
According to a 2015-2018 search traffic analysis, nearly 20% of U.S. searches are done through Google Images. During last year's AMA session on Reddit, Gary Illyes repeatedly discussed the importance of SEO and the potential of graphic content.
In addition to a properly selected image format, it is worth monitoring file size. Data flows are constantly increasing, and the load on network infrastructure is growing exponentially. For example, in 2017, almost 70% of all mobile traffic was images. At the same time, data is getting bigger while user patience is getting smaller: a one-second loading delay can reduce conversion by 7%.
The main recommendations for image optimization are quite simple:
- Choose the appropriate image format.
- Use image compression without quality loss.
- Use unique images instead of stock photos.
- Specify image sizes, especially when using Progressive Web Applications.
- Manage extra data stored with each image (meta).
- Choose mobile-friendly images.
- Try to use search engine-friendly file names and add a description to your images via the alt attribute.
- Respect copyright.
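A single image tag can cover several of these recommendations at once; the file name and alt text below are illustrative:

```html
<!-- Descriptive file name, alt text for search engines and accessibility,
     explicit dimensions to avoid layout shifts, and lazy loading for off-screen images -->
<img src="/images/handmade-leather-wallet.webp"
     alt="Brown handmade leather wallet shown open"
     width="800" height="600" loading="lazy">
```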
External Optimization (Off-Page SEO)
When evaluating your website's performance and quality, external factors are as important as internal ones. For example, a page's ranking in search results can vary depending on how often other websites link to it or how many users have shared it on social media. Off-page SEO covers activities aimed at these external ranking factors. It includes almost anything that links to your page or contributes to its external reputation.
This is not only about the number of links pointing to your website, but also about their quality. For example, many links from dubious sites can lead to a downgrade or a penalty. At the same time, high-quality links from an authoritative resource can significantly raise the website ranking.
Here are the main elements of the off-page search engine optimization:
- Analysis and link weight-related activities (link building).
- Social media marketing (SMM).
- Search engine optimization for local queries & geo-targeting.
- Search engine reputation management (SERM).
Analysis and Work with Links
Most websites accumulate link mass over time, mostly from user references and links on pages of other websites. A hyperlink is a path from one page to another. Search services index millions of links daily for two main purposes: to discover new pages and to improve search engine ranking.
External links can be built (link building) or earned (link earning). Link building means building relationships with other sites and services in order to get links to your website from them. Websites earn links organically, without interference, mainly for good, memorable, and useful content.
Webmasters are strongly advised to separate indexable links (dofollow) from those that may not be included in the index (nofollow). There are a number of additional attributes created to simplify link evaluation. For example, paid placements should be marked with the rel="sponsored" attribute, and user-generated links should carry the rel="ugc" attribute.
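In markup, these attributes are applied per link; the URLs below are placeholders:

```html
<!-- Regular editorial link: indexable by default -->
<a href="https://example.com/guide">A useful guide</a>

<!-- Paid placement -->
<a href="https://example.com/partner" rel="sponsored">Partner offer</a>

<!-- User-generated content, e.g. a link in a comment -->
<a href="https://example.com/blog" rel="ugc">Commenter's blog</a>

<!-- A link you do not want to vouch for -->
<a href="https://example.com/other" rel="nofollow">Unvetted resource</a>
```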
Link mass analysis is also called backlink analysis. Backlinks are also called inbound links, as they point from a page on one website to a page on another. In the context of ranking, they act like votes or likes for the page address and contribute to organic ranking. Search engines generally treat backlinks as confirmation that the content they point to is valuable and high quality. Some experts believe that link weight is fundamental to Google PageRank, but official sources dismiss this view as outdated and irrelevant.
Link quality and quantity are evaluated differently. Manipulations, including those targeting quantitative algorithms, led to the separation of SEO into white, gray, and black hat. If you want to learn more about this topic, we discuss it in our White Hat vs Black Hat SEO post.
By origin, links fall into three categories: organic (natural), manual, and automatic (self-created). Organic links are created without the page owner's intervention.
Manual links are placed deliberately as a result of actions by the owner or a representative of the page. Automatic links are the result of recurring automated actions, for example, links within forum profiles or email signatures that are repeated each time a user sends an email or leaves a comment.
Link value is an important parameter of off-page SEO, necessary for understanding link performance on social media. Value is a metric of how useful other users find the information on your page. Social media can generate millions of links because users continuously leave comments and send messages. Such links are not suitable as a ranking element of the total link weight and therefore carry the 'nofollow' attribute, which excludes them from direct indexing. Nevertheless, the number of page mentions on social media is a clear indicator of its value to users.
In addition to social media itself, forums and Q&A boards are also part of off-page SEO. Some forums enjoy a very high confidence rating, and users willingly click on their links. This means that links from these resources to your page will positively impact its rating. To benefit from posting links on forums, it is recommended that you first establish yourself as an active participant. The higher your authority and the more useful information you bring to the community, the more attention you can attract to your website.
Social bookmarking is another way to use social media to categorize and distribute your content. These are special web services where users themselves can bookmark the websites they like, indicate their category, and share their opinions. A growing number of social bookmarks is another positive signal for search engine services about the growing recognition of the value of your content, which entails the increasing page ranking.
The role of social media in search engine optimization is still developing. One should not underestimate its potential and significance. For example, Google recently filed a patent Spammer Detection and Fake User Identification on Social Media.
Search Engine Optimization for Local Queries
It’s not news that search queries take your location into account. When searching, you can also limit the area where you need to get relevant results. For example, you can find a cafe near your office, or look for the nearest grocery store if you have moved to a new area. Just as importantly, you can see the opinions of other visitors without leaving your search.
In addition to reviews, local listings will also show the address, contacts, opening hours, and peak visitor times, so that you can choose a good time for your visit. Local optimization is a popular and promising direction, so we have compiled a comprehensive local SEO checklist for anyone interested in this topic.
Brand Mentions and Reputational Optimization
The more often your brand is mentioned and the more credible the sources, the better it is for your ranking. A mention does not always include a direct link to your website, which complicates the search bot's work. For such mentions, Google coined the term "implied links" as part of a patent related to the Panda algorithm.
“An implied link is a reference to a target resource, e.g., a citation to the target resource, which is included in a source resource but is not an express link to the target resource.”
Search algorithms cannot yet distinguish between critical feedback and subjective, emotional reviews. Therefore, the task of separating negative reviews from positive ones is so far entrusted to people. A BrightLocal survey showed that in just a year, users became less likely to be influenced by negative product or service reviews online, more interested in positive reviews, and generally more thoughtful about the reviews they read.
At the same time, only 13% of users are willing to try products or services despite low ratings or poor reviews. Therefore, working with negative feedback and responding to reviews is definitely worthwhile.
According to WEF studies, more than a quarter of a business's total capitalization is directly related to its reputation. Search engine reputation management (SERM) is part of a company's comprehensive Online Reputation Management. Its main goal is to build a positive perception of the company's online presence and manage negative reviews. It is difficult to overestimate the importance of this work, as 88-92% of potential employees or customers will first turn to search services to research your company or services.
Technical SEO
Technical SEO primarily analyzes the structure and technologies used to create the website. However, not all aspects of technical SEO require the knowledge and skills of a developer. A responsive layout, as well as up-to-date certificates and protocols for secure data transfer, will positively impact your website rankings.
Here are the basic steps of technical SEO:
- Adaptation to mobile devices.
- Improvement of the website loading speed.
- Optimization of the link structure.
- Optimization of the URLs
- Management of the duplicate content.
Adaptation to Mobile Devices
More than 50% of network traffic worldwide comes from smartphones. A responsive layout and mobile-friendliness are becoming mandatory for a modern website. This trend is so established that in May 2019, Google announced a change in the indexing order of new websites: priority is now given to the mobile version (mobile-first indexing). Previously, websites with mobile versions did not receive such priority. This step was under development for over two years, and to help developers of new websites, Google launched tools such as Google's Mobile-Friendly Testing Tool and Google's Mobile Usability tool, also available in your website's search console.
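A responsive layout typically starts with the viewport meta tag, which tells mobile browsers to match the page width to the screen instead of rendering a zoomed-out desktop view:

```html
<head>
  <!-- Without this, mobile browsers render the page at desktop width -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
</head>
```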
Improving Website Loading Speed
There are many parameters that can impact a website's loading speed, from image size and quality to the order in which page content is loaded. Website structure changes every year: the faster we can transfer and download information, the more content site owners try to post. Images, photos, and videos can occupy about half of the total data that makes up an average website. Therefore, one way to control loading speed is to reduce this volume by compressing images and video without losing quality. It is also recommended to use a reliable hosting provider that can ensure uninterrupted access to your site for the required number of users.
A more detailed approach involves evaluating scripts, styles, and the total number of external requests a page must send in order to load. If you use too many external scripts, plugins, or snippets, the page will need to contact a third-party server to load each of them, adding loading time. Asynchronous loading helps your page avoid stumbling over a particularly slow script and lets it load the rest of the content. Optimizing styles and templates controls their quality and speed; for example, you can predefine lighter styles for users with slow connections and switch the content on the fly to a specially optimized version.
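In HTML, asynchronous and deferred loading are controlled with the async and defer attributes; the script URLs here are placeholders:

```html
<!-- Downloads in parallel with parsing and runs as soon as it is ready -->
<script async src="https://example.com/analytics.js"></script>

<!-- Downloads in parallel but waits to run until the document is parsed -->
<script defer src="/js/app.js"></script>
```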
Google research confirms a simple principle – no one likes to wait. The longer your page loads, the more users you lose. PageSpeed Insights is a tool to comprehensively optimize your page loading speed.
Website Link Structure Optimization
Experienced Internet users can tell a lot about a link without even opening the page. The quality of link addresses and anchor texts impacts both conversion and users' readiness to share your links. Compare the two links below. Which one would you rather share and click?
Therefore, it is important to follow a few simple rules to ensure the best optimization results for you and your users:
- Make your links readable to users.
- Use only lower case in link addresses.
- Use a dash “-” as a separator between words.
- Try to make links as short as possible, especially if you use a CMS.
- Avoid extra words or characters.
- Do not forget about the keywords, but don’t overdo it!
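Applying these rules, a hypothetical before-and-after might look like this:

```html
<!-- Hard to read, share, or trust:
     https://example.com/index.php?id=8472&cat=13&sessid=ab12 -->

<!-- Lowercase, hyphen-separated, short, and keyword-bearing: -->
<a href="https://example.com/blog/local-seo-checklist">Local SEO checklist</a>
```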
It is important for both users and search services to know what your site is about, and the structure and navigation of the resource help with this. To get acquainted with a new site, Google's crawler, Googlebot, must follow its links. The more logically the site is built, the easier it is to determine the relationships between pages and to identify sections, subsections, and their contents.
A well-thought-out structure will allow automated tools to more accurately determine the importance of the content of a page and, accordingly, will improve your position in the search.
One of the reasons the structure is so important is internal competition. Without adequate structure, all your pages will compete with each other for every visitor because they are equally important in the eyes of the search service. Navigation and the site map will clearly determine the location of each page and grow the competitiveness of the website in search rankings.
Another reason to pay special attention to navigation is user experience. The overall website usability often lags in the pursuit of increased conversion. Navigational discomfort on the page will confuse and turn users away. This, in turn, will lower the value of your page and decrease its ranking.
Managing Duplicate Content
The more information we produce and store, the more likely content duplication becomes. Websites are no exception, especially given their dynamic nature and the need for regular content updates. Duplicates are complete or substantial copies of pages or content that were already published. However, duplicate content is not always the result of an error or a manipulation attempt. For example, forums often offer several options for displaying a discussion page, including a text-only view, and news websites offer print versions of their posts. These are examples of duplicates of the main pages or page materials.
Canonicalization is using special attributes to mark one of the duplicate pages as the main version for search ranking services.
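For example, a print version of an article can point search engines to the main page with a canonical link (the URL is a placeholder):

```html
<head>
  <!-- Tells search engines which duplicate is the primary version to rank -->
  <link rel="canonical" href="https://example.com/news/annual-report">
</head>
```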
Multiple page copies with little difference look to search tools like an attempt to inflate link mass and will most likely lead to a low ranking for the entire cluster of links. Since search services try to find and mark the primary source of information, it is important for them to understand what is considered plagiarism and what is not. Even when using canonical links, try to avoid content duplication when possible. Google Support offers a whole arsenal of recommendations to simplify this task.
To improve your ranking, regularly monitor updates to search services' recommendations. Consistency, perseverance, and a systematic approach will help you achieve excellent results! Remember that SEO is an ongoing process of ranking improvement and issue fixing rather than a one-time action.