How to tackle SEO before you ship a website to production or even build it
If you need, or have an opportunity, to gain search traffic, make sure your potential contractor understands that SEO starts early, at the research and development stage.
A commonly overlooked way to benefit from organic traffic is to address SEO at the website-building stage itself, whether it is a first build or a redesign.
New search engine algorithms have made links unsuitable as the main, reliable promotion tool. Link building is not completely obsolete, but its ROI has dropped significantly. The primary tasks SEO specialists perform today are web analytics and ongoing consulting on content and website improvement.
A bit of theory: internal and external SEO factors
What do specialists and clients mean by search engine promotion? Everyone wants good positions and rankings, but how do you achieve the actual result? That's where the argument reaches its peak. To put it simply, SEO is an attempt to proactively influence a search engine algorithm: at its worst, an attempt to artificially drive up the metrics; at its best, a series of orderly tweaks and adjustments. Search engines keep track of three groups of factors:
- First, external factors: who links to your website and how;
- Second, internal factors: the quality of your website and how satisfying it is for users;
- Third, click factors: how many clicks your snippets get compared to your competitors'. You can improve your snippets with structured data markup, comments, rating widgets via external websites and much more.
You can influence both external and internal factors. Snippet formatting and proper use of text in the "title" and "description" meta tags are very important too, though they can only add a few percent to your traffic volume. Let's take a closer look.
A bit of history: the time of working with external factors has long gone
Working with external factors was vital at the rise of search engine technology. It allowed search engines — and later SEO specialists — to succeed.
THE ORIGINS OF THE CITATION INDEX
The revolutionary concept of PageRank by Larry Page and Sergey Brin of Google was borrowed from the academic environment. The more scientists cite a colleague, the higher his or her citation index and authority become. Since the 1960s, academic journals have had their reputation weighed by citations: the more often a journal's articles are cited, the higher its impact factor. It all directly correlates with money: the higher your citation index and the more articles you have published in high-impact journals, the better your chances of receiving a grant or subsidy. That's why scientists pay close attention to such factors.
Young postgraduates from Stanford adapted the old idea to a new objective: automated web search. Judging resources by how often they are referenced proved a viable hypothesis, and Google took off. Russia's Yandex independently came to a similar conclusion.
GOLDEN YEARS OF CITATION SPAMMING
When people realised that search engines are a source of potential clients, orders and profit, a struggle for a place in the sun began. Unlike in the academic environment, there were no ethical barriers in business, and soon people figured out that links and references can be manipulated to create an appearance of authority. Direct purchase and exchange of backlinks, just for the sake of it, became a widespread promotion method. Buying links was easy, relatively cheap and effortlessly automated; it was easy to analyse your competitors, and ROI could reach thousands of percent.
A rad story to read about those times: https://thinkgrowth.org/confessions-of-a-google-spammer-4f2e0c3e9869
Later, several backlink purchase services like Sape emerged and turned into platforms with huge revenue streams, making their founders multimillionaires.
That was quite a blow to both the revenues and the quality of service of search engines. They started making their requirements more complex, while SEO specialists kept looking for workarounds to keep purchasing backlinks efficiently: a classic "shell versus armour" arms race.
THE PENGUIN ALGORITHM
Search engines saw the ultimate solution in either ignoring link mass altogether or at least diminishing its priority. Google's Penguin algorithm made that possible.
WHAT’S GOING ON WITH LINKS TODAY?
External factors still affect rankings, but they are much harder to manipulate these days. Instead of counting raw link mass, search engines now specifically target quality links: for example, links in social media content posted by live users (search engine AI can easily tell real users from bots), or mentions on high-authority online media websites that are outside the backlink trading game by default.
You can still pay online magazines and other media to include your links in articles, or run a promotional offer on social media requiring users to mention your website, but these are mostly vain attempts: the ROI is nowhere near what it was in the backlink purchase era. So forget the mass backlink purchase mechanism. It's ineffective. Targeted link placement (paid or not), however, is still worth working with.
What else should you do? The answer is both simple and complicated: follow search engines' recommendations and build truly great websites.
Internal factors — the costly foundation of a modern SEO success
Working on the website itself is very important. Ridiculous SEO texts, like "The XXX smartphone became ingrained in our everyday life. Visit us to buy the XXX smartphone in New York", were the go-to feature of optimisation for a long time. Back then you could optimise your website in a month and deal with links once a year or so. Now the emphasis is different: you have to build a solid, quality website and constantly improve and develop it. Each year this process becomes more complex and expensive, and that poses quite a problem.
It requires the involvement of a wide range of specialists: designers, programmers, copywriters, photographers and others, which means more expenses and extended periods of time for coordination between all sides.
Link building is not dead. Today it is a separate area of expertise which only gets more sophisticated every year. Think of it as the Wild West: every year there are more people, less space, and the competition only gets tougher.
When to turn to internal factors
Ideally it should be an endless cycle like this: gather feedback — improve existing solutions — introduce new hypotheses — gather feedback.
Practice shows that SEO is often forgotten. Why is that? First, clients tend to be inert and still nurture the idea of SEO as a combo of primitive optimisation and backlink purchases. Second, everyone wants traffic in the future, but no one is ready to work hard for it now, right at the web development stage. What do they actually want? Let's look at the worst case scenario:
- CEO: wants an awesome website on a minimum budget with maximum functionality. The CEO is often put off by additional SEO-related tasks, which increase both development time and budget.
- Marketer: wants an awesome website to report to the boss and add to the resume. The marketer also wants trendy features like call tracking, concealed user data gathering, automatic recommendations and many more added to the website. The marketer is not sure SEO works and is not ready to pay an increased price today for a vaguely predictable result in an abstract future.
- Corporate IT specialist: based on his (or her) SEO experience from 2008, knows for sure that all you need is links; the more, the better. Too sceptical about any outsiders intervening in his area of responsibility.
- Contractor (company/freelancer): dreams of making good money and delivering an amazing project to include in the portfolio.
- PRO TIP: Search engine optimisation (SEO) goes along so logically with website development or redesign that many agencies offer such consultancy as a separate service (i.e. they ensure a newly built or redesigned website follows best practices). Check out our article on how to analyse a commercial offer for SEO services.
Soon after the website is launched, the CEO asks the marketer: "Why are so few clients coming from search?". And this is important for everyone:
- CEO needs clients from search to improve marketing ROI.
- The marketer needs clients to free up additional funds for new, interesting tests. Besides, a significant share of conversions comes through search engines and eventually lifts your overall CVR.
- The IT specialist needs clients too: to prove his decisions right, gather contacts and build up the CRM base.
- The contractor needs clients to flow in through search engines too. Otherwise the agency's reputation is compromised: it has produced a "lopsided" website that is not indexed, drives no traffic and demands constant investment in paid channels.
That's when the company turns to an SEO agency again. A bad SEO agency immediately starts purchasing links and eventually gets the website banned from search engines; a good one shrugs its shoulders and proposes a redesign programme. So our CEO is mourning:
money has just been spent and here you go again — new expenses!
To avoid excess expenditure, you should start working on SEO before you even start building a website. Even more: if SEO traffic is important in your project, analyzing it should be the first step of web development.
SEO at every stage of web development
There are many ways to improve SEO, but we are going to dissect them for managers rather than tech experts. What should you pay attention to when you redesign a website or build one from scratch?
Development cycle. When we speak of steps to take at each stage of the cycle, we mean the industry standard scheme. The principal stages are:
- Information architecture (IA)
- Website prototype and graphical concept
- Page layout (slicing)
- Programming and CMS deployment
- Launch and ongoing support
Though it is a standard, there are some variations to the cycle on the market of development.
Variation used by GRIN tech: following the same steps, we skip drawing mobile version templates and rely on the front-end developer to adapt them. It gives the designer more time to work on the project while leaving the cost sheet intact.
Some agencies skip stages of IA and prototypes, and get to design right away. It only suits small projects.
The semantic core is the basis for information architecture (IA)
A semantic core is basically a map of search demand organised as a request tree. It represents the real structure of demand and answers the question: "How do people search for a particular product or service?". It helps you understand which key categories to include in the navigation menu, how to organise an e-commerce catalogue, which pages should drive traffic and much more. Search engines understand synonyms, so there is no need to use exact search request templates in your texts.
Collecting semantic data is probably the first thing you want to execute, alongside developing the information structure: the resulting architecture and prototypes must reflect actual search requests. Depending on the scale of the project, it can take from one to three weeks.
In the case of online stores, the website structure is usually identical to the catalogue, and the semantic core helps specify the order of inclusion and additional subsections.
Why is it important? Any marketing activity had better start with analysis. Are you sure your picture of the target audience is real? Aren't you selling to your clients in the way that is convenient for you rather than for them?
A website selling complex machinery is a great example: imagine a catalogue structure that mirrors the manufacturer's nomenclature rather than the tasks clients need to perform with those devices.
Evaluate search requests before development to avoid mistakes like that. It may surprise you, but clients are likely to categorise your products and services in their own way.
How it's done. You gather more or less popular search requests that suit your topic and structure them hierarchically. In SEO slang the result is called a "semantic core", or just "semantics".
You can do it manually with the help of Google's Keyword Planner. Wide-ranging topics require automated services, but it's not that simple: automated means of building semantics don't always work perfectly, so a lot of manual hours will be required later.
Or you can delegate this task to an agency or freelancer.
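The core of this grouping step can be sketched in a few lines of Python. This is a toy head-term heuristic with made-up queries, not a production clustering tool; real services cluster by SERP similarity and search volume:

```python
from collections import defaultdict

def build_semantic_tree(queries):
    """Group raw search queries by their first word -- a crude
    head-term heuristic for sketching a request tree for IA work."""
    tree = defaultdict(list)
    for query in queries:
        head = query.split()[0]
        tree[head].append(query)
    return dict(tree)

# hypothetical queries pulled from a keyword planner export
queries = [
    "smartphone cases",
    "smartphone chargers",
    "laptop stands",
    "laptop sleeves 13 inch",
]
tree = build_semantic_tree(queries)
# two candidate top-level catalogue categories emerge:
# "smartphone" and "laptop"
```

Even this crude grouping hints at which categories deserve their own section in the navigation; a real semantic core does the same at the scale of thousands of requests.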
Technical factors
This is the combination of features that improve indexing, ensure your visibility for target requests and keep indexing errors at bay. Here are the most important technical factors:
- Common CMS logic:
  - checks for duplicates in the structure;
  - URL organisation;
  - sitemap generation and robots.txt;
  - independently editable metadata on pages and in categories.
- Server settings: www/non-www pages and redirects from any old versions.
- Page load speed.
- Mobile excellence, since Google introduced mobile-first indexing.
Technical elements are implemented at the layout and programming stages. Most technical factors are tied to CMS (Content Management System) settings: some CMSs are more SEO-friendly and require less effort and attention, while others do not meet search engines' requirements that well.
COMMERCIAL BREAK: all web development tools we use to back our solutions are well designed for SEO oriented development.
Why is it important? Because unresolved technical issues can compromise and even negate all optimization efforts, both present and future. For example:
- Bad addressing logic, with URLs overflowing with parameters, won't let search engines index your website properly.
- Duplicates: utility copies of some pages can slip into the index and lower your ranking, since the search engine can't determine which copy contains the needed information.
- Long load time: excessively bulky code and/or a weak hosting service lowers your chances of outcompeting websites that load fast and smooth.
How it is addressed. Going through all the details on tech factors optimisation requires a whole separate article. Some key examples are listed in the section about typical mistakes developers make. The outline looks like this:
- SEO specialists provide developers with documents on search engines’ requirements.
- Programmers create a test build of a website, secured from indexing via robots.txt or “http auth”. Unfortunately, 90% of developers forget to do this and your test build gets into indexing pool before an actual website is ready, which leads to the issue of duplicates.
- During the server setup and CMS deployment, SEO specialists remind developers which mistakes to avoid.
- Before publishing and after the programming part is done, SEO specialists make necessary final adjustments to a website, partially testing it, meanwhile consulting the rest of the team.
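The staging lock from step 2 can be as small as a robots.txt file like this in the root of the test domain (an illustrative fragment; note that robots.txt is advisory, so http auth on the web server is the more reliable option):

```
User-agent: *
Disallow: /
```

Just remember that this file must be replaced at launch, which is exactly the mistake described in the typical-mistakes section below.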
Behavioural factors
Search engines have found a way to measure the quality of a website while taking external links almost out of consideration. In a nutshell, it goes like this:
- A user is redirected to a website through a search request “X”.
- With a probability of about 95%, the website is hooked up to Google Analytics.
- This “benign Trojan horse” provides a search engine with all the information about a visit: time, number of pages viewed, amount and quality of interactions (first click, button pushes, video views, etc.)
- Your data is gathered and the system runs it through a benchmark. It is compared with average metrics in your topic/business area.
- If the score is above average, a website goes up in rankings through a request “X” and vice versa.
- You also lose points if a user visits your website, then goes back and enters an alternative form of request in a search engine. That is a clear sign that a user was unable to find what was needed on your page.
So you get a temporary ranking boost through texts, metadata and, to a certain extent, links. Later the engines analyse it all in the context of user behaviour, and you either solidify your position or lose it. If your website is clunky, confusing and inconvenient, you are going to lose positions, and no trick will save you from losing traffic. The more competitive your environment, the more significant behavioural factors become for ranking. Technical factors have a similar impact: at the end of the day they directly affect user experience.
The foundations of UX, UI and website’s structure are laid at the stage of prototyping, but partially at the stage of IA development.
Why is this important? Behavioural factors cannot really be manipulated the way links can. Nor can they be substantially improved post factum, when the game is on and you're running a website that needs restructuring, redesign and total conversion work. Behavioural optimisation done badly at the development stage is what negates all other efforts in the end. It is also what makes working with links pointless.
Behavioural factors can be boosted with bots or hired people, but in the end search engines can detect and identify these “fake users” in the mass of natural ones. There are known cases of websites developed by a major SEO agency dropping massively in their rankings due to such manipulations.
How it is addressed. The benchmarks search engines use are not publicly disclosed, so you cannot tell what score you need to make it to the top of the list. Cheating on a single parameter, like time to first click, is a dead end.
You'd better take the long but productive road of SEO-oriented development: build UX/UI that is convenient and accessible for users, so they read your texts and purchase your goods and services. The UX/UI industry solves this problem with the "customer decision journey" approach, prototypes, testing and continual improvement based on web analytics.
To decide what solutions are needed to achieve intended results, SEO specialists should analyze competitors and provide feedback to developers and designers (it’s important to evaluate how work of devs and designers affects traffic).
Commercial factors
This is a group of factors affecting the rankings of online stores and services; it does not apply to, say, media outlets or social networks. The general idea: some features make a sales proposition more attractive to the user, so it makes sense for a search engine to move such a website up the line and satisfy the user's request. Here are some of the factors:
- Wide and well-indexed variety of products/services
- Contact details that are easy to index
- Variety of payment methods
- Free shipping and different delivery options like customer pickup, courier delivery, pickup point, etc
- Average or below average pricing
- Operational calculators (this one borders on behavioural factors).
Commercial factors mostly fall outside the development cycle and relate more to general business strategy. The catalogue structure is developed from the start, before the IA stage (see the "semantic core" section). Proper placement of contact details matters at the prototyping stage. Structured data markup should be added at the page slicing stage. More at Schema.org, and here is a guide list for the contacts section.
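Structured contact data is usually embedded as schema.org markup. A minimal JSON-LD sketch (the organisation name, address and hours are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Store",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example St",
    "addressLocality": "New York"
  },
  "openingHours": "Mo-Fr 09:00-18:00"
}
</script>
```

Markup like this is what lets search engines index your contact details reliably and enrich your snippets.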
Why is it important? Because commercial factors are simply common sense, not a search engine requirement that only matters to robots. People love convenience, variety and fair pricing. Businesses in the luxury segment don't really fit this mould and require different strategies and approaches; to be honest, SEO traffic won't be a major source of their customer flow anyway.
How it is addressed. Business’s back office is the front line of action when it comes to commercial factors. You should use every bit of potential these specialists have to offer and implement it during development stage.
Content
Design and code usually take priority when a website is developed: "First, we need to make a website. We'll figure out what to fill it with later." No! Don't neglect the content!
Over years of practice, the industry came to realise that this is a flawed approach. You risk getting a website with simply no room for content: the texts written after development are too long, the pictures too large or too numerous. Your goal is the opposite: create the website around your content and make the most of it.
You should start working on content, with SEO requirements in mind, at the IA (information architecture) stage. That will make your website a sensible container for your content and present information to users in a clear and intuitive way. Among other things, developers won't have to constantly rework the database structure because of parameters nobody thought of earlier.
SEO perspective is:
- build semantic core,
- organise structure,
- review content plan and analyse how all three correlate.
On large projects, SEO development often forces the content plan to grow (some groups of specific search requests require dedicated pages). On small projects it is usually enough to use the pages proposed by the IA specialist; you will only need to specify block placement requirements for those pages.
Why is it important? Well, it's almost a rhetorical question: content is the purpose of your website. If your website is not about content, it is too early to start thinking about SEO.
From a SEO-development standpoint content is important for two reasons:
- It provides initial indexing for the preferred search queries.
- It sustains behavioral metrics after indexing is done.
Responsive / Mobile
A responsive (adaptive) approach is the best way to create the mobile version of your website. As mobile traffic grows all around the world, mobile versions are slowly becoming the primary ones.
You can start introducing responsive elements at the prototyping stage; the major work is done at the design stage, when interface solutions start emerging. At GRIN tech we only create desktop templates and individual complex elements, and responsive behaviour is implemented by the front-end developer. That gives the designer more time to do a quality job without increasing the budget.
Why it's important. Since the spring of 2015 Google has ranked websites with responsive design much higher, while websites without it lose positions. The "mobile-friendly" label your website gets in search results gives its ranking an additional boost (like any other extra implementation).
Mobile versions are, of course, not just an SEO area of competence: your website should also look good on a smartphone to attract customers. The repercussions of skipping both a mobile version and responsive design can be drastic. At least 50% of search requests come from mobile devices (some regions and markets have a much higher percentage). If your website lacks responsive design, it won't participate in mobile search results and you will lose over 50% of your potential clients. This is especially upsetting when the desktop version is ranked well.
Recommended reading: GRIN tech's post with the dramatic headline MOBILE APPS ARE DYING. The post actually dives into the question "Can the PWAs (Progressive Web Apps) of 2018 put up worthy competition to native applications?" TL;DR: yes, they can. You can build a web entity that works offline and opens directly from the smartphone home screen (and much more).
There are several approaches to mobile adaptation:
- Using a separate domain for mobile version (m.yoursite.ru).
- Redirection to different versions through dynamic views (more in Google instructions)
- Unified URL and different layouts: the smaller the screen, the more primitive interface elements are. A website is represented by a vertical series of narrow blocks.
A unified URL allows for adaptation with the best results and minimum errors. It also guarantees equal indexing, with no risk of losing positions to duplicate issues.
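In practice, the unified-URL approach is one HTML document plus CSS media queries, roughly like this (a minimal sketch; the class name is illustrative):

```css
/* mobile first: blocks stack as a vertical series by default */
.card {
  width: 100%;
}

/* wider screens get a multi-column layout from the same URL */
@media (min-width: 768px) {
  .card {
    width: 33%;
    float: left;
  }
}
```

Because the URL never changes, crawlers index one page regardless of screen size, which is exactly what removes the duplicates risk.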
8 typical SEO development mistakes
Now let’s take a look at pitfalls of web-dev, which can seriously throttle your web promotion.
In the perfect world, teams working on web development know every requirement and recommendation search engines have and act accordingly. In reality, nobody really cares about SEO development, considering it to be an “important-but-not-urgent” type of task.
As a result you get a massive bunch of petty, annoying mistakes. They are easy to avoid at the start, but fixing them all later is very, very expensive. Here are the 8 major ones:
1. Content loads dynamically
There is an opinion that dynamic loading should be avoided by default. The realities of modern development, however, are such that you have to do both: serve content statically at first, and once the user starts interacting with it (for example, with filters), switch to dynamic delivery (AJAX).
So it just requires some extra development work and proper guidance.
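As a sketch of that "both" approach (framework-free pseudo-handlers with made-up product data; a real site would wire these to actual routes):

```python
import json

# hypothetical catalogue data
PRODUCTS = [
    {"name": "Case A", "color": "red"},
    {"name": "Case B", "color": "blue"},
]

def render_page(products):
    """Initial request: the full list is rendered into the HTML the
    server sends, so crawlers see real content without running JS."""
    items = "\n".join(f"<li>{p['name']}</li>" for p in products)
    return f"<ul>\n{items}\n</ul>"

def filter_endpoint(color):
    """Subsequent interactions (e.g. a colour filter) answer with JSON
    for the client-side script; indexability is already covered."""
    return json.dumps([p for p in PRODUCTS if p["color"] == color])

html = render_page(PRODUCTS)  # what the first page load contains
```

The crawler gets the static markup; the user gets snappy filtering. Both sides are happy.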
2. Beautiful pages without content
Refer to the section above to remind yourself why content is important and how websites without content are born. It is important to understand that websites are made for clients: no matter how beautiful a website looks, it won't work without proper content. No content, no SEO gains.
Solution. All participants, both the customer and the developers, should understand that content must be taken into consideration as early as possible. This SEO-centric approach will quickly get you a convenient website plus a pleasant bonus of SEO traffic. At the same time, avoid walls of text and don't obsess over keyword occurrence counts: thanks to modern algorithms, a short text with photos or a video is enough. Headings and lists are also worth using: they increase the readability of the material and improve the behavioural metrics of your page (more live visitors, longer time spent on the page, etc.).
3. Mobile version for later
See the "Responsive / Mobile" section. Mobile traffic is growing while desktop traffic stagnates or declines. In addition, the mobile version is one of the important ranking factors, and from the SEO perspective you have no right to ignore it.
4. “Let’s shut down a website and open a new one on a new domain”
It's unclear why, but this idea occurs to every second website owner, and it is a dangerous one. By shutting down the old domain, you lose:
- the precious legacy the domain has accumulated in its segment. Static ranking factors like domain age are a luxury that cannot be swept under the rug. Even if the domain is banned, it is easier to lift the ban and solve the problem than to start from scratch.
- direct traffic from business cards, bookmarks, old emails and way more. It will deliver a great deal of discomfort to old customers. Don’t treat existing customers like that.
Often the "abandoned" website is not removed from the index and eventually begins competing with the new one, and often wins. Why? Because domain age is a massive advantage on its side. So you get a domain-driven mess instead of a renovation: the old website eventually loses its positions while the new one never gains any.
Solution. Think twice if you plan to change your domain. Never change it just for the sake of it. But if it’s inevitable you need to perform a neat transfer:
- Set up (!) a 301 redirect.
- Set up the main mirror.
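As a sketch, the transfer can look like this nginx fragment (the domain names are placeholders; Apache users can achieve the same with a `Redirect 301` directive in .htaccess):

```nginx
server {
    # every request to the old domain is answered with a permanent
    # redirect that preserves the path, so accumulated authority
    # and old bookmarks carry over to the new domain
    server_name old-site.com www.old-site.com;
    return 301 https://new-site.com$request_uri;
}
```

The permanent (301) status is the part that tells search engines to transfer the old domain's standing rather than treat the new site as a stranger.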
5. Changing the structure with no regard for old addresses
Another great problem is an unconsidered change of structure, which leads to losses in SEO results. This usually comes along with a change of CMS. Developers trying to replace poorly written pages with better ones have only positive intentions. Unfortunately, as a result, all pages seem to be in place but now have new addresses. If the update is more radical, you may even get entirely new pages attached to new addresses.
A robot cannot find the pages at their old addresses and throws the website out of the index for those queries. One day it may find them again, but you never know when, and the result is unpredictable. The lesson: you are going to lose SEO traffic because of someone's inability to think things through.
One abstract example can illustrate the problem. There is a page at "www.company.com/kittens-for-sale-without-registration/" containing the text: "Kittens for sale. No registration and SMS". The new CEO prefers minimalism, so after the redesign and update the content moves to "www.company.com/kittens-for-sale/". The content stays the same, but Google perceives it as a new page, and its climb to the top of the SERP starts from scratch.
Solution. Either set up (!) a 301 redirect to the new addresses or keep the previous URLs for the corresponding pages. Keep in mind that you don't have to save every page: your user agreement is unlikely to gain you search traffic and customers.
The structure can and should be changed, but do it carefully. You should clearly understand that the change is done to increase SEO coverage, not just for the sake of a new CMS and minimalism.
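One way to keep a restructuring under control is to maintain an explicit old-to-new URL map and generate the redirect rules from it. A sketch with made-up paths echoing the example above (the output format targets Apache's mod_alias; any server works the same way):

```python
# every page whose address changes gets an entry: old path -> new path
REDIRECTS = {
    "/kittens-for-sale-without-registration/": "/kittens-for-sale/",
    "/old-about-us/": "/about/",
}

def to_htaccess(redirects):
    """Render the map as permanent (301) redirect rules."""
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in redirects.items()
    )

rules = to_htaccess(REDIRECTS)
```

Keeping the map in one reviewable file makes it obvious which old addresses are covered before launch.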
6. The test version remains in the indexing pool
The test version gets indexed and begins to compete with the main version. The search engine identifies the test version's content as the primary source (because it was published earlier), so the primary domain receives a ranking penalty as a site with duplicated content. It is a classic example of a penny mistake that may take months to eliminate.
Solution. It is necessary to close the debug domain from indexing via “robots.txt” file or by using “http auth”.
If you are debugging a new site on the main domain, don’t forget to open it to indexing after launch. “I forgot” is the first answer to the question why the new site is working but is not indexed.
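A quick pre- and post-launch sanity check can be automated with the standard library's robots.txt parser (the rules and URL here are illustrative):

```python
from urllib.robotparser import RobotFileParser

def indexing_open(robots_txt, url):
    """True if this robots.txt body lets any crawler fetch the URL."""
    rules = RobotFileParser()
    rules.parse(robots_txt.splitlines())
    return rules.can_fetch("*", url)

staging_lock = "User-agent: *\nDisallow: /"  # the lock everyone forgets
launch_rules = "User-agent: *\nDisallow:"    # an empty Disallow opens all
```

Running a check like this against the live domain right after launch catches the "I forgot" scenario in seconds.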
7. Layout mistakes: using <h1>–<h6> headings solely for typographic purposes
The h1–h6 headings are important semantic elements of the layout: they tell the search engine what the page is about. Unfortunately, developers sometimes don't bother with separate styles for sidebars and decorative elements that do not contain the main query, so they simply reuse heading styles. If these tags highlight parts of the page that carry no key information, your chances of proper indexing drop.
Solution. h1-h6 headings should only be used in the content part or for the headings of recurring elements like reviews, comments, related products, etc.
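A minimal illustration (the class name is made up): decorative blocks get a styled neutral element, while heading tags stay reserved for content:

```html
<!-- wrong: a decorative sidebar label occupying a heading tag -->
<h3>Follow us</h3>

<!-- right: style a neutral element for decoration... -->
<div class="sidebar-title">Follow us</div>

<!-- ...and keep headings for what the page is actually about -->
<h1>Kittens for sale</h1>
```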
8. The site is available via five addresses after deployment
If you didn't take the necessary measures, after deployment you get a set of competing copies (mirrors). Site versions can all sit on the same hosting yet be indexed independently: with www or without, http or https, site.ru or site.nichost.ru. And if you bought several domains for one website, just imagine the hell of duplicate versions you are going to deal with. With a mess like that, bad rankings are guaranteed.
Solution: the website has to be available for indexing via one address only. All other versions should redirect the visitor to the main mirror with a 301. Set this up as soon as possible.
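A sketch of that setup in nginx (the hostnames are illustrative): every secondary hostname collapses onto the single main mirror:

```nginx
server {
    # all alternative addresses answer with a permanent redirect
    server_name www.site.ru site.nichost.ru;
    return 301 https://site.ru$request_uri;
}
```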
Hope our guide is going to help you evaluate your SEO specialist’s competence and avoid common mistakes.
GRIN tech provides SEO development solutions. Give us a try.
Get in touch via Telegram or use an Intercom chat (bottom right).