Minimalistic web design is all the rage nowadays, with many companies adopting simple color palettes and single-page designs for their websites. These single-page websites rely on extensive CSS and JavaScript components and are light on text; the aim is to appear hip and trendy. But are such websites easily crawlable? And the big question: how do such sites compare to multi-page web designs when it comes to SEO? Matt Cutts, head of Google's Webspam team, answers these questions and more regarding the effectiveness of single-page websites.
Google bots are getting better at understanding JavaScript
To create stunning, dynamic effects on single-page websites, web developers often use a lot of JavaScript. Matt assures viewers that Google bots have gotten much better at reading JavaScript; in fact, he says that even complex and unusual JavaScript code on a website can still be read pretty well.
Matt’s suggestions
Although Matt is fairly confident in the ability of Google bots to read the JavaScript and CSS components of single-page websites, he strongly suggests that web developers and designers test their sites for search engine friendliness. Rather than gambling your SEO efforts on a single-page website that may struggle to appear in search results, include search-engine-readable text wherever possible. Testing can help you remove obscure CSS and identify areas that are difficult for Google bots to crawl.
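One quick sanity check is to look at the raw text a crawler would see before any JavaScript runs. The sketch below shows one illustrative way to do that in TypeScript; it is not a method Matt describes, just a rough stand-in for a non-JS crawler. It assumes Node 18+ (for the global fetch), and the URL and the 200-character threshold are placeholder values.

```typescript
// crawl-check.ts — a rough sketch of a "what does a non-JS crawler see?" test.
// Assumes Node 18+ for the global fetch; the URL and threshold are placeholders.

async function extractVisibleText(url: string): Promise<string> {
  const res = await fetch(url);
  const html = await res.text();

  // Strip script/style blocks and tags, leaving only the text that is
  // indexable without executing any JavaScript.
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, "")
    .replace(/<style[\s\S]*?<\/style>/gi, "")
    .replace(/<[^>]+>/g, " ")
    .replace(/\s+/g, " ")
    .trim();
}

async function main() {
  const text = await extractVisibleText("https://example.com/");
  if (text.length < 200) {
    console.warn("Very little indexable text found — most content may be rendered by JavaScript.");
  }
  console.log(text.slice(0, 500));
}

main().catch(console.error);
```

If this prints almost nothing while the rendered page is full of text, that content is being injected by JavaScript, which is exactly the kind of area Matt suggests testing.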
In this video, Matt also warns that website design is not the only factor that determines how much traffic a website will attract, and suggests that companies think about their content layout and other factors that may have a significant impact.