Technical SEO is one of the three main pillars of SEO, along with on-site SEO and off-site SEO. SEO experts use these three areas as the foundation of their work when optimizing websites. In this article, we introduce a technical SEO checklist for SEO experts and enthusiasts.
Technical SEO is a topic that only some of us handle carefully, yet it underpins everything we do. And if you look at SEO in its entirety, which part of it is not technical?
Our checklist covers SEO issues, common mistakes, tips, and advice. With it, we aim to cover, as effectively as possible, all the important elements of building a website that is user-friendly, fast, visible, practical, and easy to understand.
What is Technical SEO?
Technical SEO refers to the process of optimizing your website for the crawling and indexing phases. In other words, technical SEO helps crawlers and search engines interpret and index your website without problems.
Technical SEO is called "technical" because it has little to do with the actual content of the website or with its promotion. Its main goal is to optimize the infrastructure of a website and, as a result, increase site traffic.
To get an overview, it helps to keep the three main pillars of SEO in mind: technical SEO, on-site SEO, and off-site SEO.
On-site SEO is about producing optimized content and relating it to what the user is searching for. Off-site SEO, often equated with link building, is the process of earning links from other websites to build trust during the ranking process.
Website load time
Server response time is the time it takes for the server to start returning a page's HTML. When you request a page, a message is sent to the server, and it takes time for the server to send the response back to you.
Server response time also determines how long Googlebot has to wait for your data. Whether it takes one, two, three seconds, or more can decide whether a visitor converts or leaves. Google recommends keeping server response time below 200 ms.
Here are three steps you can take to test and improve server response time:
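As a starting point for testing, you can measure the time to first byte yourself. The sketch below uses only the Python standard library; the tiny local server exists only so the example is self-contained — in practice you would point `measure_response_time` at your own site's URL.

```python
import threading
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

def measure_response_time(url):
    """Return seconds elapsed until the first response bytes arrive."""
    start = time.monotonic()
    with urllib.request.urlopen(url) as resp:
        resp.read(1)  # wait for the first byte of the body
    return time.monotonic() - start

class OnePageHandler(BaseHTTPRequestHandler):
    """Stand-in for a real site: serves one small HTML page."""
    def do_GET(self):
        body = b"<html><body>ok</body></html>"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, fmt, *args):
        pass  # silence per-request logging

# Serve on an ephemeral local port so the example runs anywhere.
server = HTTPServer(("127.0.0.1", 0), OnePageHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

elapsed = measure_response_time(f"http://127.0.0.1:{server.server_port}/")
print(f"Time to first byte: {elapsed * 1000:.1f} ms")
server.shutdown()
```

A local measurement like this excludes network latency, so repeat the test against your live URL from several locations before comparing against the 200 ms target.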
If a website loads very slowly, one of the first things to check is its images, because the image files may be large. We are not talking about their dimensions on the page — we are talking about their size on disk.
Besides the visual information an image carries, it also adds a large number of bytes to a page, so the server and browser spend more time transferring it. If we optimize the images, the page loads faster because the extra bytes and irrelevant data have been removed. The fewer bytes the browser has to download, the faster it can load and render the page content.
JPEG, PNG, and GIF are the most commonly used image formats, so there are many tools available for compressing them.
Here are some tips and tricks for optimizing images:
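As one illustration (the file names here are placeholders), a page can serve a compressed WebP version with a JPEG fallback, declare explicit dimensions so the browser can reserve layout space, and lazy-load images that are below the fold:

```html
<picture>
  <!-- photo.webp / photo.jpg are placeholder file names -->
  <source srcset="photo.webp" type="image/webp">
  <img src="photo.jpg" width="800" height="600"
       alt="Product photo" loading="lazy">
</picture>
```

The `<source>` element lets browsers that understand WebP download the smaller file, while older browsers fall back to the JPEG.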
You may see a message like this when you run a speed test with Google PageSpeed Insights:
If Google detects resources that delay the first render because they block page loading, you should optimize your CSS delivery. You can use two options to do this:
For each resource you will have a specific option:
Below you can see an example of how to minify your CSS.
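For instance, here are the same rules before and after minification — the content is identical, only the whitespace and the final semicolon are removed:

```css
/* Before minification */
body {
  margin: 0;
  padding: 0;
  font-family: Arial, sans-serif;
}

/* After minification */
body{margin:0;padding:0;font-family:Arial,sans-serif}
```

On a large stylesheet, removing whitespace and comments like this can cut a meaningful share of the bytes the browser has to download before rendering.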
Redirects can save you a lot of trouble, such as broken links and crashed pages, but too many of them cause problems of their own. Many redirects make your website load more slowly: the more redirects there are, the longer the user waits for the landing page. Another thing to note is that you should have only one redirect per page; otherwise you risk a redirect loop. A redirect loop is a chain of redirects that leaves the browser unable to decide which page to show, ending in a very ugly error. And if you have a 404 page, there are many ways to customize it and give users helpful pointers — don't miss that opportunity.
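As a sketch of the "one redirect per page" rule, on an Apache server a single permanent (301) redirect can be declared in a `.htaccess` file — the paths and domain here are placeholders:

```apache
# .htaccess — one 301 redirect, straight from the old path to the final page
Redirect 301 /old-page/ https://example.com/new-page/
```

Pointing the old URL directly at its final destination, rather than through an intermediate URL, is what keeps a chain (and a potential loop) from forming.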
Website performance and usability
With more than 50 percent of users worldwide browsing the Internet on mobile devices, Google has prioritized mobile-first indexing. You need to make sure your website is optimized for mobile devices, both in design and in speed and performance. It is generally preferable to build one responsive site that adapts to mobile and desktop rather than maintaining a separate version for each.
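The starting point of a responsive design is the viewport meta tag plus CSS media queries. A minimal sketch (the class name and breakpoint are arbitrary examples):

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  .sidebar { width: 300px; }
  /* On narrow screens, let the sidebar fill the width instead */
  @media (max-width: 600px) {
    .sidebar { width: 100%; }
  }
</style>
```

The same HTML serves every device; only the stylesheet adapts, which is exactly the single-site approach described above.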
URLs are very important, and you need to get them right the first time so you don't have to change them later. URLs that are descriptive and contain keywords are useful for users and search engines alike. However, many people forget this and build websites with dynamic URLs that are not optimized at all. That does not mean Google rejects such URLs — they can still rank — but eventually you will have to migrate them to better ones to improve performance, and that migration will be a struggle.
We've talked many times before about how important clean, readable URLs are. Avoid ambiguous parameters in the URL. They also make link building harder: you may miss link-exchange opportunities simply because of how the URLs look.
If you're a WordPress user, or you use another site builder or CMS, you should use its permalink options to set your link structure. Creating a standard URL is not difficult. You can follow these 3 tips for this purpose:
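For example (these are hypothetical URLs), compare a dynamic URL with a descriptive one:

```text
Dynamic:     https://example.com/index.php?id=123&cat=7
Descriptive: https://example.com/blog/technical-seo-checklist/
```

The second tells both the user and the search engine what the page is about before it is even opened.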
In August 2014, Google announced that it had added the HTTPS secure protocol to its list of ranking signals and advised all sites to switch from HTTP to HTTPS.
HTTPS encrypts data, preventing it from being altered or corrupted during transmission, and protects it against attacks. The improved data security brings further benefits for your site and its traffic, some examples of which are given below:
You also want to make sure that all the other versions of your domain point to the correct version of the site. If people reach any other variant, they should be automatically redirected to the canonical version.
These are all the versions:
- http://example.com
- http://www.example.com
- https://example.com
- https://www.example.com
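One way to collapse every variant onto a single canonical HTTPS host is with rewrite rules — the sketch below assumes an Apache server with mod_rewrite enabled, and example.com stands in for your domain:

```apache
# .htaccess sketch: send any non-HTTPS or www request to https://example.com
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} ^www\. [NC]
RewriteRule ^ https://example.com%{REQUEST_URI} [L,R=301]
```

Each request is redirected at most once, directly to its final URL, which also respects the one-redirect-per-page rule discussed earlier.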
Site migration is the recommended operation when a website changes completely and no longer uses its old domain. If you make such a move, apply 301 (permanent) redirects.
When it comes to Google's crawlers, the robots.txt file usually comes up first. Your robots.txt file tells Google's crawlers which pages they may crawl and which they may not. Use it to control crawler access to your content.
You can find your robots.txt file online at http://domainname.com/robots.txt.
Make sure your directives are in the correct order. You can use the robots.txt testing tool in Google Search Console. The tool is easy to use and shows you whether your robots.txt file blocks anything. Ideally, the test should report no errors.
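A minimal robots.txt might look like this (the /admin/ path is a placeholder — block only sections you genuinely don't want crawled):

```text
User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
```

The `User-agent: *` line applies the rules to all crawlers, and the `Sitemap:` line points them at your XML sitemap, which is covered below.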
Content marketing and SEO expert James Parsons points out the importance of this in an article.
Search Console can provide insightful information about the indexing status of your pages. The steps are very simple: open the Index section and review the coverage report, which produces a chart of your indexed pages over time.
An XML sitemap tells Google how your website is organized.
With a sitemap, Google's crawlers can read and understand your site structure more easily, and a good structure means crawlers understand the website better. For larger sites, you can use dynamically generated XML sitemaps. Also make sure your XML sitemap, robots.txt file, and meta robots tags stay consistent with one another.
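A minimal XML sitemap follows the sitemaps.org protocol; the URLs and date below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2023-01-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/technical-seo-checklist/</loc>
    <lastmod>2023-01-01</lastmod>
  </url>
</urlset>
```

Each `<url>` entry lists one page; `<lastmod>` helps crawlers prioritize pages that have changed recently.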
Search Console can once again come to the rescue. In the crawl section you can find the Sitemaps report, where you can add, manage, and test sitemap files.
At this point you have two options: test a new sitemap, or test one you have already added. For the first:
For the latter: select the sitemap you have already submitted and click through to see the test results.
Since we talked about redirects for migrating a site, it's worth knowing why Google does not recommend using a meta refresh to move a website.
Here are three ways to define a redirect:
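For reference, the meta refresh that Google advises against for migrations looks like this (the target URL is a placeholder); a server-side 301 redirect is the preferred alternative:

```html
<!-- Meta refresh: sends the browser elsewhere after 0 seconds.
     Not recommended for site moves; use a server-side 301 instead. -->
<meta http-equiv="refresh" content="0; url=https://example.com/new-page/">
```

Because this redirect lives inside the page rather than in the HTTP response, crawlers handle it less reliably and it passes ranking signals less cleanly than a 301.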
Hreflang tags map URLs to languages and regions. It is recommended to use the rel="alternate" and hreflang="x-default" attributes so that the correct language or regional URL is served in search results. There are other conditions, which can be as follows:
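A typical set of hreflang annotations in a page's `<head>` looks like this (the language codes and URLs are illustrative):

```html
<link rel="alternate" hreflang="en" href="https://example.com/en/" />
<link rel="alternate" hreflang="de" href="https://example.com/de/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

Each language version must list all the others, including itself, and `x-default` marks the fallback page for users whose language has no dedicated version.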
Tracking your website's performance is really important. Without measuring your results, there will be no progress.
Post-migration problems from HTTP to HTTPS are particularly difficult to track: the move can break the tracking code and lead to data loss.
When we talk about technical SEO, we must also think about duplicate content, which is a serious problem. To check for and remove duplicate content, click on Duplicate title tags in the HTML Improvements section of Google Webmaster Tools.
Structured data is a way to help Google better understand your content, and it helps users get what they need directly from the page they want via the search results. If a website uses structured data markup, Google may display it as a rich result on the search page:
In addition, structured data can be used for the following:
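Structured data is usually added as a JSON-LD block in the page's `<head>`; the headline and author below are placeholder values:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Technical SEO Checklist",
  "datePublished": "2023-01-01",
  "author": { "@type": "Person", "name": "Jane Doe" }
}
</script>
```

Google's Rich Results Test can validate markup like this before you deploy it.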
John Mueller said in a Google Webmaster Hangout that Google does not encourage canonicalizing a blog's subpages to the blog root. Because subcategory pages are not true copies of the blog homepage, it makes no sense to do so.
User-friendly website
Google recommends using AMP to improve UX, which is very important to the company. Since Google's AMP changes affect many sites, it is worth learning how AMP works and how to set it up on different platforms such as WordPress, Drupal, Joomla, and many others.
Google AMP does not directly affect SEO, but it can affect indirect factors.
Breadcrumbs were originally a way to find your way back home, and websites use them for the same purpose: to guide the user through the site. They help visitors see where they are on the site and offer shortcuts for easier navigation.
Breadcrumbs can improve the user experience and help search engines get a clearer picture of the site structure. Another advantage is that they reduce the number of actions and clicks a user has to make on the page: instead of pressing the back button over and over, users can follow the trail of links to get where they want to go. This is especially useful on large sites and e-commerce sites.
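A breadcrumb trail is typically an ordered list of links inside a `<nav>` element; the section names here are placeholders:

```html
<nav aria-label="Breadcrumb">
  <ol>
    <li><a href="/">Home</a></li>
    <li><a href="/blog/">Blog</a></li>
    <li aria-current="page">Technical SEO</li>
  </ol>
</nav>
```

The `aria` attributes make the trail accessible to screen readers, and the same structure can be annotated with BreadcrumbList structured data so search engines can show it in results.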