URLs are too long
Posted: Sun Feb 02, 2025 3:49 am
On many occasions, we see links to social networks marked with this attribute.
In my opinion, just as we pass value through our internal linking strategy, we should also pass value to social networks so that they at least receive authority from their owner's domain.
We can run into this problem with the systems we use to share information on social networks.
In these cases, the decision is often to reduce the number of social networks on which information or a product is shared.
Not everything is ready for social media.
This warning indicates that a URL contains too many characters, which can cause problems both at the device level and with Google itself.
If our URLs have too many characters, we should be aware that not all devices will handle them the same way.
Above all, we may run into problems on older smartphones.
Newer devices will not have any problems, but as SEOs we should know that an overly long URL usually carries too much information, which may be duplicated in another URL.
We tend to find these kinds of URLs on domains with little control over their URL structure.
To fix this, it is advisable to create a shorter, more precise URL and set up a 301 redirect from the old URL to the new one.
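As a rough reference, on an Apache server a redirect of this kind could be declared in the .htaccess file; the paths below are hypothetical examples, not taken from any real project.

# Hypothetical example: send the old, overly long URL to its shorter replacement
Redirect 301 /blog/2023/05/complete-guide-to-choosing-the-best-running-shoes-for-beginners-and-experts /guides/running-shoes

Whichever mechanism we use (server rules, a CMS plugin, etc.), the important thing is that every old URL ends up pointing to a single, definitive short URL rather than a chain of redirects.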
Robots.txt not found
It is hard for SEMrush to get this check wrong.
It is telling us that the robots.txt file (mentioned earlier) is not present on our domain.
If we don't have one, we are giving Google free rein to crawl everything, which is certainly not the most convenient thing for our project.
To solve this problem, it is best to carry out a thorough study of the rules we want to define in our crawling strategy and capture them in this file.
Be careful not to have more than one robots.txt file scattered around the domain without any control.
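As a reference, a minimal robots.txt could look like the sketch below; the blocked paths and the sitemap URL are hypothetical placeholders, not recommendations for any particular project.

User-agent: *
Disallow: /search/
Disallow: /cart/
Sitemap: https://www.example.com/sitemap.xml

The file has to live at the root of the domain (for example https://www.example.com/robots.txt) for crawlers to read it, which is also why keeping a single controlled copy is so important.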
Pages have hreflang language mismatch issues
If we want Google to understand the international structure of our domain, we need correct markup that tells Google which version is which.
Hence the importance of having these tags correctly configured.
This is a warning that we should cross-check against the international targeting data in Search Console.
Both tools must report this data as correct if we want the internationalization of the language versions to work properly.
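For reference, correctly paired hreflang annotations in the <head> of each language version could look like this sketch; the URLs and language codes are hypothetical examples.

<link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/" />
<link rel="alternate" hreflang="es-es" href="https://www.example.com/es-es/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />

The mismatch warning typically appears when the hreflang value does not match the language actually declared or detected on the destination page, so each URL should point to content that really is in the language its tag announces.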