If you want your website to get noticed by visitors and potential clients, following at least the basic SEO practices is essential. To compete for the top page in Google, however, you have to go further. Advanced SEO can require a ton of website changes: adding tags, optimizing website speed, and making sure bots can crawl your pages – all of that goes into the fight for the top 10.
However, making website changes without knowing how to code is like moving in the dark: there will be plenty of stumbling blocks and obstacles that can only be avoided if you understand the code. If you are not a coder, consider paying a developer salary so that there’s a techie on your team.
Why is good development important for SEO?
It’s common to think that the only task of SEO is to stuff as many keywords as you possibly can into your content. That was true ten years ago. Now the algorithms have changed, and there are many other things GoogleBot takes into account while crawling a page.
Here are a few things that matter:
- Time for a script to load;
- Page speed – the page has to load in under 2 seconds;
- Cross-device friendliness;
- Website security – making sure the website and its services have a valid SSL certificate;
- Responsive design for the business website.
All of these tasks are directly linked to development, so SEO is as much a part of web development as it is of marketing. It’s only through code that you can improve user experience and lift the average website conversion rate.
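Take the security item above as an example: a certificate check can be sketched in a few lines. This is a minimal illustration using Python’s standard library, not a replacement for proper monitoring; the hostname in the commented example is a placeholder.

```python
import socket
import ssl
import time

def cert_is_current(not_after, now=None):
    """Return True if a certificate's notAfter timestamp is still in the future."""
    expiry = ssl.cert_time_to_seconds(not_after)  # parses e.g. "Jun 4 18:00:00 2030 GMT"
    return expiry > (time.time() if now is None else now)

def fetch_cert_expiry(hostname, port=443):
    """Open a TLS connection and return the certificate's notAfter field."""
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=5) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            return tls.getpeercert()["notAfter"]

# Example (requires network access):
# print(cert_is_current(fetch_cert_expiry("example.com")))
```

A scheduled job running a check like this can warn you before an expired certificate starts scaring off both visitors and crawlers.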
SEO-friendly development: best practices
Provide structured data
When GoogleBot is crawling websites, it’s trying to understand the content of each page. If you want to give a better understanding of the page to a bot, add structured data to describe the key elements of the page.
Structured data is written using the schema.org vocabulary, and Google publishes reference markup you can start from. Once it’s in place, you’ll have to test the code; for that, developers use the Structured Data Testing Tool (SDTT).
Red flags to watch out for:
- Provide a relevant description of the page. Google doesn’t show rich results for time-sensitive content.
- Mark the content that your readers can see on the website. Otherwise, Google bot will consider your structured data potentially misleading.
- Don’t be misleading while labeling the content.
- Don’t mark pages that are a summary of links to individual pages (lists of jobs, announcement board page, etc.).
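As a sketch of what structured data looks like in practice, the snippet below generates a JSON-LD block for an article page. The headline, author, and date are hypothetical placeholders – describe your real, visible page content, per the rules above.

```python
import json

# Hypothetical article details -- swap in the data actually shown on your page.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "SEO-friendly development: best practices",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2019-04-01",
}

# Emit the <script type="application/ld+json"> block for the page <head>.
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(article, indent=2)
    + "\n</script>"
)
print(snippet)
```

The resulting block goes into the page’s `<head>`, where GoogleBot can pick it up while crawling.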
Prerender your page
Developers who power their websites with JS have noticed that GoogleBot struggles to render their pages, and plain HTML pages are commonly treated better in rankings than JS-heavy ones. The reason is that Google struggles with rendering rich technology – media, snippets, and so on. To make rendering easier for the bot, you can essentially do it yourself: the bot gets an HTML snapshot, while a website user still sees the page built with top JS frameworks. By the way, React is one of the most popular JS frameworks and you can find out why over at this website.
Tools for prerendering: you can either go with external ones like Prerender.io or use PhantomJS on your server-side.
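The routing logic behind prerendering is simple: detect bot user agents and hand them the snapshot. A minimal sketch (the crawler token list and file names are assumptions, not a complete catalog of bots):

```python
# Known crawler user-agent substrings; extend this for other bots you care about.
CRAWLER_TOKENS = ("Googlebot", "bingbot", "Baiduspider", "DuckDuckBot")

def wants_prerendered(user_agent):
    """Decide whether to serve the static HTML snapshot instead of the JS app."""
    ua = user_agent.lower()
    return any(token.lower() in ua for token in CRAWLER_TOKENS)

def choose_page(user_agent):
    # Bots get the prerendered snapshot; humans get the normal JS bundle.
    return "snapshot.html" if wants_prerendered(user_agent) else "app.html"
```

Services like Prerender.io do exactly this at the proxy level, so you don’t have to maintain the snapshots yourself.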
Find out if your website is Bing-friendly
With an audience reach of over 11 million, Bing is a search engine you shouldn’t ignore. Though 99% of search engine optimization practices are shared between Bing and Google, there are a few peculiarities.
1) Down-level website experiences
Unlike Google, which, with a few issues, can still crawl rich media, Bing might not be able to index your JS-rich website. To make sure the crawlers can still get in, use down-level experiences so that a user with an older browser version sees an HTML version of the website. Bingbot will see the version with rich media and interactive elements as well, and indexing will get much easier.
2) Use 301-redirects
Instead of the temporary 302 redirects used on many websites, be sure to implement permanent 301 redirects between pages. Bing treats a permanent redirect as a strong ranking signal, so using the right redirects can significantly improve your website’s position.
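The difference is just the status code the server returns. A sketch of the application-level logic (the paths below are hypothetical; in practice the same mapping is usually configured in the web server or CMS):

```python
# Map of retired URLs to their permanent homes -- hypothetical paths.
REDIRECTS = {
    "/old-blog/seo-tips": "/blog/seo-tips",
    "/promo-2018": "/pricing",
}

def redirect_response(path):
    """Return (status, headers): a 301 for moved pages, 404 otherwise."""
    target = REDIRECTS.get(path)
    if target is None:
        return 404, {}
    # 301 (not 302) tells Bing and Google the move is permanent.
    return 301, {"Location": target}
```

Whatever stack you use, the point is the same: reach for 301 whenever a page has moved for good.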
3) Flash is okay, as long as pages have separate sitemaps
It’s no secret that Google has no love for Flash-powered pages. So, if you want to have good rankings on Big G, go slow on powering your website with Flash. Keep in mind, however, that Bing is likely to rank these pages pretty well. To make the job easier for BingBots, create a separate sitemap for any page that uses Flash.
Make your links crawlable
Search engines discover websites through links. If the link to your page has technical errors, it will be impossible for a search bot to get to the content of the page itself. The URL may not seem significant in terms of SEO, but a poorly optimized one will make matters worse.
When writing an <a> tag, don’t disregard the href attribute – links built on any other attribute will not be crawled by Google. At the end of the day, your link needs a plain href pointing at the target URL.
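One way to audit this is to parse a page and keep only the href-based anchors a bot can follow – a minimal sketch using Python’s built-in html.parser (the sample HTML and URLs are made up for illustration):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect the href values a search bot can actually follow."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:  # an <a> without href is invisible to crawlers
                self.hrefs.append(href)

html_doc = """
<a href="/blog/seo-tips">Crawlable link</a>
<a onclick="goTo('/hidden')">Not crawlable: no href</a>
<span data-url="/also-hidden">Not a link at all</span>
"""

parser = LinkCollector()
parser.feed(html_doc)
print(parser.hrefs)  # only the href-based link survives
```

The onclick- and data-attribute “links” in the sample never make it into the list, which is exactly how GoogleBot sees them.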
Avoid ‘zombie pages’
When Brian Dean wrote a post about top SEO tips, he had a lot to say about zombie pages – the ones no one cares about. “The more – the merrier” has been a common misconception when it comes to pages – after all, it’s yet another chance for your website to be found. Is it really so?
Turns out, pages no one visits are dead weight for your website, and not creating them in the first place is the right thing to do. One page with more SEO weight will rank higher than a few insignificant ones – Google has said so themselves.
“Those smaller pieces themselves might not be that clearly kind of targeted where we can really recognize actually all of the information is here and they can navigate to the rest of the site here.”
John Mueller, Webmaster Trends Analyst at Google, April 2018
Best web development tools for website developers
It’s crucial to know all the newest SEO trends and practices. However, optimization is mostly analysis and inspection: while gross errors can be spotted manually, some issues in the code can only be caught with the help of tooling. Developers all over the world use SEO tools to improve website inspection and audits. Let’s take a look at the most popular ones.
Google Search Console
Google’s own Search Console is hands down the most important tool for tracking and analyzing your website’s performance. Once your website is verified, Search Console will alert you to any malware, errors, or notable changes in your website’s performance and current rankings.
Google PageSpeed Insights
Website speed is crucial for SEO rankings. A page should load in under 2 seconds, and a script in under 5 seconds. If either operation takes longer, there’s a risk crawlers won’t index the page. To keep website speed under control, developers use Google PageSpeed Insights.
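You can also spot-check the budget yourself with a rough wall-clock measurement. This sketch only times a single download (PageSpeed Insights measures far more), and the commented URL is a placeholder:

```python
import time
import urllib.request

def fast_enough(seconds, budget=2.0):
    """Apply the 2-second page-load budget mentioned above."""
    return seconds <= budget

def measure_load(url):
    """Rough wall-clock time to download a page body, network latency included."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=10) as response:
        response.read()
    return time.monotonic() - start

# Example (requires network access):
# print(fast_enough(measure_load("https://example.com")))
```

A single measurement is noisy, so run it several times and look at the typical value rather than one lucky fetch.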
Google Chrome 41
Google Chrome is constantly updated, and you would expect GoogleBot to use the latest version of the company’s browser. Yet it doesn’t!
For website rendering, Google uses Chrome 41, released in 2015. A lot has changed in web development since then, and newer features may not be supported by the crawler. To check whether your website is renderable, open it in Chrome 41: if you see no errors, chances are Google’s crawlers won’t either.
Deepcrawl
Out of the many tools available for SEO audits, Deepcrawl is among the most advanced. It will tell you if there are duplicate pages on your website and inspect it for failed URLs and paginated pages.
The good news is, you won’t be getting a list of mistakes with no suggestions – the tool offers a few possible solutions you can implement right away.
- Good SEO is crucial – high rankings alone can bring thousands of people to your website, so you no longer need to put as much money into social media promotion.
- Design and performance also make a difference – make sure your page load time is under 2 seconds and your website is mobile-friendly and works across platforms.
- Apart from hiring a developer who knows SEO practices and takes them into account, consider using tools to track your website’s performance. That way, you’ll have a three-dimensional outlook on how your website is doing.