When you want to "do search engine optimization", one of the first steps is a technical SEO audit: a review of the technical settings of a website that matter for SEO, because where they are missing, your ranking in Google's results suffers.
The Google Lighthouse tool is a perfect starting point. It can be reached directly from the Chrome browser: press CTRL + SHIFT + I on the page you want to scan to open the developer tools.
Click on Lighthouse here and run the audit!
I ran it here on my own page: sollabong.com
Lighthouse is a free, open-source, automated analysis tool from Google that lets website developers and operators continuously monitor page load speed and optimization.
Lighthouse scans the site during the audit and scores each category on a scale of 0 to 100:
Performance: The first report evaluates how quickly the key elements of the page appear to users. The score combines several metrics, such as when the first elements are painted (First Contentful Paint) or the time until the page first becomes interactive (Time to Interactive).
Accessibility: the accessibility score. This includes whether each element is properly labelled (e.g. alt attributes on images, descriptive text on links), whether the contrast is sufficient, and whether the text is readable. In short, whether crawlers and users (readers) can make logical sense of your content.
Best Practices: monitors and evaluates the application of technology solutions in line with current trends and industry best practices.
SEO: This report checks some basic search engine optimization steps. It does not cover much, but it works well as a starting point for a developer.
Let's take a look at the points listed in SEO:
1. ‹ title ›
The page has a title. This is important because search engines display it as the title of the result, and it is one of the first things they look at.
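A minimal sketch of what that looks like in the page's head (the title text is just an illustration):

```html
<head>
  <!-- Shown in the browser tab and as the clickable headline in search results -->
  <title>Sollabong – Web Development Portfolio</title>
</head>
```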
2. ‹ meta ›
You can give the page a description, keywords, and many other parameters that search engines will use, and so will social sites when someone shares your page.
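For illustration, a description and a couple of social (Open Graph) tags might look like this (all the values are made up):

```html
<head>
  <!-- Often shown as the snippet under the title in search results -->
  <meta name="description" content="A technical SEO audit walkthrough with Google Lighthouse.">
  <!-- Open Graph tags control how social sites render the page when it is shared -->
  <meta property="og:title" content="Technical SEO audit with Lighthouse">
  <meta property="og:description" content="The basic SEO checks Lighthouse runs, explained.">
</head>
```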
3. The site does not block indexing,
i.e. we have given search engines permission to store our website in their index and show it in their results.
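One way indexing gets blocked is a robots ‹ meta › tag in the page's head; make sure a tag like this is not present on pages you want indexed:

```html
<!-- This tag asks search engines NOT to index the page -->
<meta name="robots" content="noindex">
```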
4. robots.txt is valid
On the one hand, the robots.txt file must be uploaded to the root directory of your site (per-page indexing rules belong in a robots ‹ meta › tag in the ‹ head › section instead). On the other hand, if your robots.txt file is malformed, robots will not be able to understand how to crawl or index your website.
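A minimal, well-formed robots.txt might look like this (the paths and sitemap URL are illustrative):

```
# Applies to all crawlers
User-agent: *
# Keep the admin area out of the index
Disallow: /admin/
# Everything else may be crawled
Allow: /

Sitemap: https://sollabong.com/sitemap.xml
```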
5. alt attribute on images
Give each image a short alternative text here. If it is not specified, search engines may ignore the image completely, and screen readers cannot tell users what the image shows.
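A sketch of an image with alt text (the file name and description are made up):

```html
<img src="lighthouse-report.png"
     alt="Lighthouse audit report showing scores for the four categories">
```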
6. hreflang
Hreflang links tell search engines which language or regional version of a page should be listed in the search results for a given language or region.
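For a page available in, say, English and Hungarian, the hreflang links could look like this (the URLs are illustrative); each language version should list all the others, including itself:

```html
<head>
  <link rel="alternate" hreflang="en" href="https://sollabong.com/en/">
  <link rel="alternate" hreflang="hu" href="https://sollabong.com/hu/">
  <!-- Fallback for visitors whose language has no dedicated version -->
  <link rel="alternate" hreflang="x-default" href="https://sollabong.com/">
</head>
```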
7. The document avoids plug-ins
Search engines cannot index the content of plug-ins, and many tools restrict or do not support plug-ins.
8. Sitemap
A sitemap helps crawlers work through your page: you effectively hand the search engine robot a map of how the pages and subpages logically relate to each other.
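Such a map is usually a sitemap.xml file in the site root; a minimal sketch (the URLs are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://sollabong.com/</loc>
  </url>
  <url>
    <loc>https://sollabong.com/blog/</loc>
  </url>
</urlset>
```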
9. Structured data is valid
Search engines use structured data to understand what content is on a page. For example, you can say that your page is an article, a job advertisement, or a FAQ.
It’s good to know about the Schema.org system in this regard, but I’ll write about that in detail in a later post.
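As a taste, structured data is most often embedded as a JSON-LD script in the page's head; a sketch for marking a page up as an article (all the values are made up):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Technical SEO audit with Lighthouse",
  "datePublished": "2021-01-01"
}
</script>
```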
If you pay attention to just these few things, believe me, your clients will be very grateful; and if it’s your own portfolio page, you’ll make it much easier for customers to find you.
This only scratches the surface of SEO; the rest is really best left to the SEO professionals.