Here is a collection of common questions about Search Engine Optimisation along with my best attempt at answering each one. Enjoy!
SEO stands for Search Engine Optimisation. It is an umbrella term for the set of processes aimed at getting a website to appear in search engines for relevant search queries, and ultimately at building traffic through those search engines.
SERP stands for Search Engine Results Pages. This refers to the page displayed by a search engine (e.g. Google / Bing) in response to a query by a user. The SERP lists details of pages that are considered by the search engine to be relevant to the query. SERPs are also home to other results like PPC adverts, shopping adverts, knowledge graphs and featured snippets.
Meta title tags are an important SEO factor. They are HTML elements that act as the title of a web page and appear as the clickable title text shown in a Search Engine Results Page (SERP). Each page should have a single, specific meta title tag that is unique to that page. When more than one page on a website shares the same title tag, it makes it difficult for search engines to correctly categorize and rank each page.
Meta Title tags are important because the more relevant and appealing the title tag is, the more likely a user is to click on your result in the SERP. They are also a relevance factor in that they help to indicate to search engines what the content of the page is about.
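As a quick illustration, the title tag sits in the document's head section (the page name and wording below are example values only):

```html
<head>
  <!-- Illustrative example: a unique, descriptive title for this page -->
  <title>Men's Trail Running Shoes | Example Store</title>
</head>
```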
A meta description is an HTML tag containing a snippet of text which summarizes a page’s content. Meta descriptions appear as the summary text under the clickable title text (Meta Title) in a SERP.
Meta descriptions give search engines extra information in relation to page content. Also, a well optimised meta description increases the likelihood of click through from SERP.
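Like the title tag, the meta description lives in the page's head section. A sketch (the copy below is purely illustrative):

```html
<head>
  <!-- Illustrative example: a meta description summarising the page -->
  <meta name="description"
        content="Browse our range of men's trail running shoes, with free delivery and 30-day returns.">
</head>
```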
Image Alt Text tags are HTML attributes used to describe the appearance and function of an image on a page. The key purpose of Image Alt Text is to describe images to visually impaired users who use screen readers when browsing. However, it is also used by search engines as a factor in indicating the context of a page’s content.
Image alt texts are used by search engines to determine the content of the image and surrounding content. Therefore by using image alt texts containing keywords relevant to your page topic, your page is more likely to appear for relevant search queries.
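In practice, the alt text is set via the `alt` attribute on the image element (the file path and description below are example values):

```html
<!-- Illustrative example: alt text that describes the image's content -->
<img src="/images/trail-shoe-blue.jpg"
     alt="Blue men's trail running shoe with a studded rubber sole">
```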
A broken link is any link on your site that, when clicked, leads to a page with a 404 (not found) error. This may be because the webmaster has deleted the page that the link used to point to, typed the wrong URL into the hyperlink, or changed a URL without updating the link.
Broken links can prevent crawlers from indexing your site properly and waste crawl budget.
The term “Link Profile” refers to the network of inbound links pointing to your site, as well as the characteristics of those links, i.e. whether they are considered trustworthy and authoritative by Google (and other search engines) or whether they are seen as “spammy” or low quality.
Links and citations from trusted sources send signals to search engines that your website is a trusted source. Great content alone is not enough to rank well in search results. You need to be seen as a trusted source, which is where the link profile comes in. With sufficient links from other trusted websites, your site will be seen as trustworthy as well and search engines will rank your content higher in search results for relevant queries.
Crawl Budget is the number of pages Googlebot (the Google search engine crawler) crawls and indexes on a website within a given timeframe.
Prioritizing what to crawl, when, and how much resource the server hosting the site can allocate to crawling is important particularly for big sites, or those that auto-generate pages based on URL parameters.
If a website isn’t optimised to allow Googlebot (or other search engine crawlers) to crawl its pages efficiently, the crawler will eventually hit its “crawl limit” (its allocated bandwidth allowance for the site) and move on to the next website on its crawl schedule. If your site is not optimised in this regard, Googlebot may not index all of your pages.
There are a number of things you can do to optimise for crawl budget, including: a) making sure that crawled pages return either a 200 or 301 status code; b) ensuring that your pages load as quickly as possible; c) blocking pages that do not need to be indexed via the robots.txt file; d) avoiding long redirect chains; e) sticking to HTML content where possible (as opposed to Flash, XML, etc.); and f) keeping your XML sitemap up to date and including all your key content in it.
An XML sitemap is a file that lists the URLs for a site in a format (.xml) that is easily readable by search engine crawlers. In it you can provide information about each URL: when it was last updated, how often it changes, and how important it is in relation to other URLs on the site.
XML sitemaps feed search engines data on the pages of the site as well as the crawl priority or hierarchy of site content, allowing them to crawl your site more effectively and efficiently, ensuring that your key content is indexed.
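A minimal sitemap entry might look like the following (the URL and values are illustrative; the `urlset` namespace is defined by the sitemaps.org protocol):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative sitemap with a single URL entry -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2020-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```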
A robots.txt file is a simple text file which is stored at root level of a website and informs web robots (e.g. search engine crawlers) which areas of the website should not be processed or scanned.
The robots.txt file instructs search engines how to crawl and index pages on a website. By restricting access to pages that do not need to be crawled, it allows crawlers to spend their crawl budget more effectively on your important content.
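A short robots.txt sketch (the disallowed paths and sitemap URL are example values, not recommendations for any particular site):

```text
# Illustrative robots.txt: block all crawlers from internal search
# results and the cart, and point them at the XML sitemap
User-agent: *
Disallow: /search/
Disallow: /cart/

Sitemap: https://www.example.com/sitemap.xml
```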
HTTPS stands for Hypertext Transfer Protocol Secure. It is an evolution of the Hypertext Transfer Protocol and is used for secure communication over the internet.
If you’re running both HTTP and HTTPS versions of your site, it is very important to make sure that their coexistence doesn’t impede your SEO. Without clear signals, search engines are not able to figure out which version to index and which one to prioritize in search results. As a result, you may experience a number of problems, including pages competing with each other, traffic loss and poor placement in search results.
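The usual remedy is to permanently redirect all HTTP requests to their HTTPS equivalents so that only one version gets indexed. On an Apache server with mod_rewrite enabled, that might be sketched in an .htaccess file like this (a minimal example; your server setup may differ):

```apache
# Illustrative .htaccess rule: 301-redirect every HTTP request
# to the HTTPS version of the same URL
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```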