How it works

  1. Form your cart
  2. Checkout with invoice
  3. We'll send you a link to book a discovery call
  4. Once we've clarified everything together and the invoice is paid, we start working on your project
Full service agency

For those who need design, coding, or digital marketing, GRIN tech is a full-service agency that delivers turn-key solutions.

🤙 Book a discovery call

👋 Can a full service agency build you a business?

White label

For agencies & industry professionals who want to focus on strategy and business growth, GRIN tech offers a white-label solution that covers nearly the full cycle: design, code & marketing (including PPC, SEO, link building, content, and lead generation).

🤙 Book a discovery call

Prospecting platform

For sales professionals engaged in B2B prospecting, Hound is a prospecting platform that helps fill the pipeline with qualified leads. Unlike the competition, our product features both data discovery & message delivery tools.

🤘 Give it a try

Outreach

For companies that need backlinks, Launcher is a productized outreach service that delivers opportunities to your Trello board on autopilot. Unlike the competition, our service is transparent & pricing is performance-based.

👋 Watch explainer video

🤙 Book a discovery call

SEO Web Development for Beginners – Complete Guide

A commonly overlooked way to benefit from organic traffic is to address Search Engine Optimization (SEO) at the very stage of website building, whether it's a first build or a redesign (update: check out how we recently redesigned the Buschman Store).

New search engine algorithms have made links unsuitable as a central and reliable promotion tool. Link building is not entirely obsolete, but its ROI has dropped significantly. The primary tasks SEO specialists perform today are web analytics and ongoing consulting on content and website improvement.

A bit of theory: internal and external SEO factors

What do specialists and clients mean by search engine promotion? Everyone wants top positions, but how do you achieve the actual result? That's where the arguments get heated. In short, SEO is an attempt to influence a search engine's algorithm proactively: at its worst, an attempt to artificially inflate the metrics; at its best, a series of orderly tweaks and adjustments. Search engines keep track of three groups of metrics:

  • First, external factors: who links to your website and how;
  • Second, internal factors: the quality of your website and how satisfying it is for users;
  • Third, click factors: how many clicks your snippets get compared to your competitors'. You can improve your snippets with structured data markup, comments, rating widgets via external websites, and more.

You can influence both external and internal factors. An example of internal optimization: rich snippet markup and proper use of the “title” and “description” meta tags are important, though they can only add a few percent to your traffic volume. Let's take a closer look.
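To make the tag limits concrete, here is a small Python sketch. The 60/160-character limits are common rules of thumb rather than official search engine numbers, and the helper names are invented for illustration:

```python
# Hypothetical helper: trim "title" and "description" meta tags to the
# lengths search engines typically display (~60 and ~160 characters).
# These limits are industry rules of thumb, not official Google numbers.

TITLE_LIMIT = 60
DESCRIPTION_LIMIT = 160

def trim_meta(text: str, limit: int) -> str:
    """Cut text at a word boundary so it fits within the display limit."""
    if len(text) <= limit:
        return text
    cut = text[:limit].rsplit(" ", 1)[0]  # avoid chopping mid-word
    return cut.rstrip(",;:") + "…"

def build_meta_tags(title: str, description: str) -> str:
    """Render the two tags as they would appear in the page <head>."""
    return (
        f"<title>{trim_meta(title, TITLE_LIMIT)}</title>\n"
        f'<meta name="description" content="{trim_meta(description, DESCRIPTION_LIMIT)}">'
    )
```

A CMS that lets you edit these fields per page (see the technical factors below) makes this kind of control trivial; hardcoded templates do not.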

A bit of history: the time of working with external factors has long gone

Working with external factors was vital at the rise of search engine technology. It allowed search engines — and later SEO specialists — to succeed.


Larry Page and Sergey Brin of Google borrowed the revolutionary concept of PageRank from academia. The more often scientists cite a colleague, the higher his or her citation index and authority becomes. Since the 1960s, academic journals have had their reputations weighed by citation: the more often a journal's articles are cited, the higher its impact factor. It all directly correlates with money: the higher your citation index and the more articles you publish in high-impact journals, the better your chances of receiving a grant or subsidy. That's why scientists pay close attention to these factors.

Young postgraduates from Stanford adapted the old idea to a new objective: automated web search. Judging resources by how often they are referenced proved a viable hypothesis, and Google took off. Russia's Yandex independently came to a similar conclusion.


When people realized that search engines are a source of potential clients, orders, and profit, a struggle for a place in the sun began. Unlike in academia, there were no ethical barriers in business. Soon people figured out that links and references can be manipulated to create an appearance of authority. Direct purchase and exchange of backlinks became a popular method of promotion. Buying links was easy, relatively cheap, and effortlessly automated. It was easy to analyze your competitors, and your ROI could be thousands of percent.

A rad story to read about those times:

Later, several backlink purchase services like Sape emerged and turned into platforms with colossal revenue streams, making their founders multimillionaires.

That was quite a blow to organic search results Google provided (Bing & Yahoo included). Search engines started making their algorithms more complex, while SEO experts were continually looking for workarounds to purchase backlinks efficiently. It was like a “shell vs. armor” competition.


Search engines saw the ultimate solution in either stopping measuring link mass at all or at least diminishing its priority. Google’s Penguin algorithm made that possible.


External factors still affect rankings, but they're much harder to manipulate these days. Instead of factoring in raw link mass, search engines specifically target quality links. For example, links from social media content posted by real users (search engine AI can quickly tell real users from bots). Another example: mentions on high-authority online media websites that are outside the backlink trading game by default (though there are cases when sites like Forbes have been caught selling links).

You can still pay online magazines and other media to include your links in articles, or run a social media promotion that requires users to mention your website, but the ROI won't come close to what it was in the backlink purchase era. So forget mass backlink purchasing: it's ineffective. Targeted link placement (paid or not), however, is still worth working on.

What else should you do? The answer is both complicated and straightforward: follow search engines' recommendations and build great websites.

Internal factors — the foundation of a modern SEO success

Ridiculous SEO texts, like “XXX smartphones became ingrained in our everyday lives. Visit us to buy XXX smartphones in New York,” were a go-to optimization feature for a long time. Back then you could optimize your website in a month and deal with links once a year or so. Now the emphasis is different: you have to build a reliable, quality website and continuously improve and develop it. Each year this process becomes more complex and expensive, and that poses quite a problem. In many industries, content marketing now means building editorial media from scratch. Think of the Ahrefs and Intercom blogs.

It requires the involvement of a wide range of specialists: web designers, programmers, copywriters, content strategists, photographers, and others, which calls for more expenses and extended periods for coordination between all sides.

Link development is not dead. Today it is a separate area of expertise that only gets more sophisticated every year. Think of it as the Wild West: every year there are more people, less space, and tougher competition.

When to turn to internal factors

For bureaucratized businesses, the right time comes when they are redesigning or building a website from scratch and (almost) every member of the company is ready for changes.

Ideally it should be an endless cycle like this: gather feedback — improve existing solutions — introduce new hypotheses — gather feedback.

Practice shows that SEO strategy is often overlooked and SEO efforts are sporadic at best. Why is that?

First of all, clients tend to be inert and still nurture the idea of SEO as a combo of primitive page optimization and backlink purchases. Such attempts to game search engine results end in no effect at best and penalties at worst. Second, everyone wants traffic in the future, but no one is ready to work hard for it now, right at the web development stage. What do they want? Let's look at the worst-case scenario:

  • CEO: wants an excellent website on a minimum budget with maximum functionality. The CEO is often put off by additional SEO-related tasks, which increase both development time and budget.
  • Marketer: wants an excellent website to report to his boss and add one to his resume. The marketer also wants popular features like call-tracking, concealed user data gathering, automatic recommendations, and many more introduced to the website. The marketer is not sure if SEO works and is not ready to pay the increasing price today for a vaguely predictable result in the abstract future. Pay Per Click (PPC) is a more predictable thing for him.
  • Corporate IT specialist: based on his (or her) SEO experience in 2008 knows for sure that all you need is links. The more, the better. Too skeptical about any outsiders intervening in his area of responsibility.
  • Contractor (company/freelancer): dreams of making good money, shipping an amazing project to production, and including it in a portfolio.
    • PRO TIP: Search Engine Optimisation (SEO) goes along with web design and development so naturally that many agencies offer such consultancy as a separate service (i.e., they ensure the newly built or redesigned website follows best practices). Check out our article on how to analyze a commercial offer for SEO services.

Soon after the website launches, the CEO asks the marketer: “Why are so few clients coming from search engines?” And this is important for everyone:

  • CEO needs clients from search to improve marketing ROI.
  • A marketer needs clients to free up additional funds for new exciting tests. Besides, remember that a significant share of conversions comes through search engines and eventually affects your overall CVR.
  • IT specialist needs clients too — to prove his decisions right, gather contacts and build up a CRM base.
  • The company needs clients to flow in through search engines too. Otherwise, the agency's reputation is compromised by the fact that it produces “wry” websites that are not indexed, drive no traffic, and demand continuous investment in paid channels.

That's when the employer goes to an SEO agency again. A corrupt SEO agency immediately starts purchasing links and eventually leads the website to a penalty (i.e., a ban) from search engines; an honest SEO agency shrugs its shoulders and proposes a redesign program consisting of things like a site architecture update (i.e., a new URL structure), an internal linking outline, a more prominent accent on the brand name, etc., which often means building a completely new website. So our CEO is mourning:

money has just been spent and here you go again — new expenses!

To avoid excess expenditure, you should start working on SEO before you even begin the development process. Even more: if SEO traffic is vital in your project, keyword research and striving for great content should be the first steps of web development. 

SEO on every stage of web development

There are many ways to improve SEO, but we are going to dissect them mainly for managers, rather than tech experts. What to pay attention to when you redesign or build a website from scratch?

Development cycle. When we speak of steps to take at each stage of the cycle, we mean an industry standard scheme. Principal stages are:

  • Information architecture (IA)
  • Website prototype and graphical concept
  • Design
  • Layout
  • Programming

Though this is the standard, there are variations to the cycle across the development market.

The variation used by GRIN tech: following the same steps, we skip drawing mobile version templates and rely on front-end developers to adapt the design. This gives the designer more time to work on the project while leaving the cost sheet intact.

Some agencies skip the IA and prototype stages and get straight to design. That only suits small projects.

A semantic core is the base for information architecture (IA)

A semantic core is keyword research that results in a map of search requests organized as a tree. It represents the real structure of demand and answers the question: “How do people search for a particular product or service?” It helps you understand which key categories to include in the navigation menu, how to organize an e-commerce catalog, which pages should drive traffic, and more. Search engines understand synonyms, so there is no need to use exact search request templates in your texts.

Semantic data collecting is probably the first thing you would want to execute along with information structure development. Developed architecture and prototypes must reflect actual search requests. Depending on the scale of a project it can take from one up to three weeks.

In the case of online stores, a website structure is usually identical to its catalog, and the semantic core helps to specify the inclusion order and new subsections.

Why is it important? Any marketing activity is better started with analysis. Are you sure your target audience is real? Aren't you trying to sell to clients in a way that's convenient for you rather than for them?

A website selling complex machinery equipment can be a great example: imagine the structure of a website’s catalog that reflects the manufacturer’s terminology, rather than tasks clients need to perform with those devices.

Evaluate search requests before the developing process to avoid similar mistakes. It may surprise you, but clients are likely to categorize your products/services in their way.

How it's done. You gather more or less popular search requests that suit your topic and structure them hierarchically. In SEO slang this is called a “semantic core” or just “semantics.”

You can do it manually with the help of Google’s Keyword Planner. Wide-ranging topics require the use of automated services, but it’s not that simple. Automated means of building semantics don’t always work correctly, so it is going to require a lot of staff hours later.

Or you can delegate this task to an agency or freelancer.
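If you do build the semantics manually, the result can be sketched roughly like this in Python. The keywords, volumes, and the first-word clustering rule are all simplified placeholders for real keyword planner data and real clustering:

```python
from collections import defaultdict

# Hypothetical keyword export: (phrase, monthly searches). Grouping by the
# first word is a crude stand-in for real semantic clustering, just to show
# the idea of a request tree.
keywords = [
    ("sofa", 5400), ("sofa bed", 1900), ("sofa covers", 880),
    ("table", 4300), ("table lamp", 1600),
]

def build_request_tree(pairs):
    """Cluster phrases under a head term and order clusters by total demand."""
    tree = defaultdict(list)
    for phrase, volume in pairs:
        head = phrase.split()[0]              # cluster key: the head term
        tree[head].append((phrase, volume))
    # biggest clusters first: these become the main navigation categories
    return dict(sorted(tree.items(), key=lambda kv: -sum(v for _, v in kv[1])))

tree = build_request_tree(keywords)
```

The ordered clusters then map onto the IA: top clusters become catalog categories, sub-phrases become subsections or landing pages.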

Technical factors

This combination of features improves indexing, ensures your visibility for target requests, and keeps indexing errors at bay. Here are the most important technical factors:

  • Common CMS logic flaws:
    • checks for duplicate content in the structure;
    • URLs organization;
    • XML sitemap generation & robots.txt;
    • if meta descriptions and title tags on pages and in categories can be independently edited;
  • Server settings: www/non-www pages and redirections from any old versions.
  • How fast is the load time of your page?
  • Mobile excellence since Google introduced “Mobile first indexing.”

Technical elements are implemented at the layout development and programming stages. Most technical factors correlate with CMS (Content Management System) settings. Some CMSs are more SEO-friendly and require less effort and attention; others do not meet search engines' requirements as well. At GRIN tech, WordPress and Tilda are our favorites.
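As one concrete technical factor from the list above, an XML sitemap can be generated in a few lines. A minimal Python sketch with placeholder URLs (real CMS plugins handle this automatically):

```python
import xml.etree.ElementTree as ET

# Minimal sketch: generate a sitemap.xml for a handful of hypothetical URLs.
# The namespace is the standard one from the sitemaps.org protocol.
def build_sitemap(urls):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc  # one <loc> per indexable page
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "https://example.com/",
    "https://example.com/catalog/",
])
```

The resulting string is what gets saved as `/sitemap.xml` and referenced from `robots.txt`.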

COMMERCIAL BREAK: all web development tools we use to back our solutions are well designed for SEO oriented development.

Why is it important? Because unresolved technical issues can compromise and even negate all optimization efforts, both present, and future. For example:

  • Bad addressing logic — overflowing with parameters — won’t allow a search engine to index your website correctly.
  • Duplicates: utility copies of some pages can slip into the indexing mechanism and lower your ranking since the search engine won’t be able to determine which specific text contains needed information.
  • Long load time: excessively bulky code and weak hosting service lowers your chances to outcompete those websites that load fast and smooth.
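The duplicates problem above can be approximated by hashing normalized page text. A Python sketch with made-up pages (a real audit would crawl the site):

```python
import hashlib

# Hypothetical crawl result: URL -> extracted body text. The filtered and
# unfiltered catalog pages carry identical text, i.e. a duplicate.
pages = {
    "/tshirts/?sort=price": "Cotton t-shirts in all sizes.",
    "/tshirts/":            "Cotton t-shirts in all sizes.",
    "/mugs/":               "Ceramic mugs for every taste.",
}

def find_duplicates(pages):
    """Return (duplicate_url, original_url) pairs by hashing normalized text."""
    seen, dupes = {}, []
    for url, text in pages.items():
        # collapse whitespace and case so trivial variations still match
        digest = hashlib.sha256(" ".join(text.split()).lower().encode()).hexdigest()
        if digest in seen:
            dupes.append((url, seen[digest]))
        else:
            seen[digest] = url
    return dupes

dupes = find_duplicates(pages)
```

Pages flagged this way are candidates for canonical tags or for exclusion from indexing.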

How it is addressed. Going through all the details of technical factor optimization would require a separate article. Some key examples are listed in the section about typical developer mistakes. The outline looks like this:

  • SEO specialists provide developers with documents on search engines’ requirements.
  • Programmers create a test build of a website, secured from indexing via robots.txt or “HTTP auth.” Unfortunately, 90% of developers forget to do this, and your test build gets into the indexing pool before an actual website is ready, which leads to the issue of duplicates.
  • During the server setup and CMS deployment, SEO specialists remind developers which mistakes to avoid.
  • Before publishing and after the programming part is done, SEO specialists make necessary final adjustments to a website, partially testing it, meanwhile consulting the rest of the team.

Behavioral factors

Search engines found a way to measure the quality of websites taking external links almost out of consideration. In a nutshell, it goes like this:

  • A user is redirected to a website through search request “X.”
  • With a chance of 95%, the website is hooked up to Google Analytics.
  • This “benign Trojan horse” provides a search engine with all the information about a visit: time, number of pages viewed, amount and quality of interactions (first click, button pushes, video views, etc.)
  • Your data is gathered, and the system benchmarks it against average metrics in your topic/business area.
  • If the score is above average, a website goes up in rankings through a request “X” and vice versa.
  • You also lose points if a user visits your website, then goes back and enters an alternative form of request in a search engine. That is a clear sign that a user was unable to find what was needed on your page. In a similar indirect manner website design affects SEO – if users enjoy it – they stay longer – behavioral factors improve.

So you get a temporary ranking boost through texts, metadata and, to a certain extent, links. Later engines analyze it all in the context of user behavior, and you either solidify your position or lose it. If your website is clunky, confusing, inconvenient, you’re going to lose your spot and no trick will save you from losing traffic. The more competitive your environment, the more significant behavioral factors become to rank. Technical factors have a similar impact on the subject: at the end of the day, they directly affect user experience.

The foundations of UX, UI, and website structure are laid at the prototyping stage, and partially at the IA development stage.

Why is this important? Behavioral factors are harder to manipulate than links. They cannot be substantially improved post factum, when the game is on and you're running a website that needs restructuring, redesign, and total conversion work. Behavioral optimization done poorly during the development stage will negate all other efforts in the end. It is also what makes working with links pointless.

Behavioral factors can be boosted with bots or hired people, but in the end search engines can detect and identify these “fake users” in the mass of natural ones. There are known cases of websites developed by a significant SEO agency dropping massively in their rankings due to such manipulations.

How it is addressed. The benchmarks search engines use are not publicly disclosed, so you cannot tell what score you need to make it to the top of the list. Cheating on a single parameter, like first-click time, is a dead end.

You'd better take the long but productive road of SEO-oriented development: build UX/UI that is convenient and accessible for users, so they read your texts and purchase your goods/services. The UX/UI industry tackles this problem with the “customer decision journey” approach, prototypes, testing, and continual improvement based on web analytics.

To decide what solutions are needed to achieve intended results, SEO specialists should analyze competitors and provide feedback to developers and designers (it’s essential to evaluate how work of devs and designers affects traffic).

Commercial factors

It is a group of factors affecting the rankings of online stores and services. It is not valid, e.g., for media or social network entities. The general idea is: some features make a sale proposition more attractive for a user, so it makes sense for a search engine to move such a website up in line and satisfy the user’s request. Here are some of the factors:

  • Wide and well-indexed variety of products/services
  • Contact details that are easy to index
  • Variety of payment methods
  • Free shipping and different delivery options like customer pickup, courier delivery, pickup point, etc
  • Average or below average pricing
  • Operational calculators (borderlines with behavioral factors).

Commercial factors mostly fall outside the development cycle and relate more to general business strategy. The catalog structure, though, should be developed from the start, before the IA stage (see the “semantic core” section). See Schema.org for markup, plus a checklist for the contacts section.

Why is it important? Because commercial factors are simply common sense rather than a search engine requirement useful only for robots. People love convenience, variety, and fair pricing. Businesses in the luxury segment don't fit this mold and require different strategies and approaches; SEO traffic won't be a significant source of their customer flow anyway.

How it is addressed. Business’s back office is the front line of action when it comes to commercial factors. You should use every bit of potential these specialists have to offer and implement it during the development stage.


Design and code usually take priority when a website is developed: “First, we need to make a website. We'll figure out what to fill it with later.” Please don't neglect the content; from my experience, it is one of the key differentiators between you and the competition.

Over years of practice, the industry realized that this is a flawed approach. You risk getting a website with barely any space for content: texts written after development is done turn out too long, pictures too large or too numerous. Your goal is to create a website according to your content, and to always maximize its potential.

You should start working on content, considering SEO requirements at the stage of IA (information architecture) development. That is going to make your website a sensible container for your content and present information to users in a clear and intuitive way. Among other things, developers won’t have to adjust and improve database structure due to lost parameters continuously.

SEO perspective is:

  1. build semantic core (i.e., keyword research),
  2. organize structure,
  3. review the content plan and analyze how all three correlate.

For large projects, SEO development often forces the content plan to grow (some groups of specific search requests require independent pages). On small-scale projects, it is usually enough to use the pages proposed by the IA specialist; you only need to specify block placement requirements for those pages.

Why is it important? Well, it's a rhetorical question. Content is the purpose of your website; it is too early to think about SEO if your website is not about content. Also, do not limit your thinking to text, as there are many content types.

From an SEO development standpoint great content is vital for two reasons:

  • It provides initial indexing through preferred search terms.
  • It sustains behavioral metrics after indexing is done.

Responsive / Mobile

An adaptive approach is the best way to create a mobile version of your website. As mobile traffic grows around the world, mobile versions are slowly becoming the primary ones.

You can start implementing adaptive elements at the stage of prototyping. The major work is done at the stage of design when interface solutions start emerging. At GRIN tech we only create desktop templates and individual complex elements. The front-end developer implements an adaptive design. That gives a designer more time to do a quality job, without increasing the budget.

It is crucial: since the spring of 2015, Google has ranked websites with adaptive design much higher, while websites that lack it lose their positions. The fact that your website is tagged in search results as “adapted for mobile devices” additionally boosts its ranking (like any other extra implementation).

Mobile versions are, of course, not just an SEO area of competence. Your website should also look good on a smartphone to attract customers. The repercussions of having no mobile version and implemented adaptive design can be drastic. At least 50% of search requests come from mobile devices (some regions and markets have a much higher percentage). If your website lacks adaptive design, it won’t participate in search results on mobile devices, and you will lose over 50% of your potential clients. That is especially upsetting if a desktop version is well ranked.

Recommended reading: GRIN tech's post with the dramatic headline MOBILE APPS ARE DYING. The post actually dives into “Can PWAs (Progressive Web Apps) of 2018 be worthy competition to native applications?” TL;DR: yes, they can. You can build a web entity that works offline and opens directly from the smartphone home screen (and much more).

There are several approaches to mobile adaptation:

  • Using a separate domain for a mobile version
  • Redirection to different versions through dynamic views (more in Google instructions)
  • Unified URL and different layouts: the smaller the screen, the simpler the interface elements; the website becomes a vertical series of narrow blocks.

Using a unified URL allows for adaptation with the best results and minimum errors. It also guarantees equal indexing with no risk of losing positions due to duplicate issues.

Eight typical SEO development mistakes

Now let’s take a look at the pitfalls of web-dev, which can severely throttle your web promotion. It is far from a complete SEO checklist but covers the most common pitfalls.

In the perfect world, teams working on web development know every requirement and recommendation search engines have and act accordingly. In reality, nobody cares about SEO development, considering it to be an “important-but-not-urgent” type of task.

As a result, you get a large bunch of petty and annoying mistakes. You can easily avoid them at the start, but fixing it all is very, very expensive. Here are eight major mistakes:

1. Content loads dynamically

There is an opinion that dynamic content hurts SEO by default. However, the realities of modern development are such that you have to do both: give content out statically at first, and once the user interacts with it for the first time (for example with search filters), engage dynamic delivery (AJAX).

So it just requires some extra development work and proper guidance.

Flash is also a non-indexable thing, but it is not alive anyway.

2. Beautiful pages without content

You can refer to the section above to remind yourself why content is essential and how websites without content are born. It is crucial to understand that websites are made for clients. No matter how beautiful a site looks, it won't work without proper content. No content, no SEO gains.

Solution. All participants, both the customer and the developers, should understand that content must be taken into consideration as early as possible. This SEO-development-centric approach will quickly get you a convenient website plus the pleasant bonus of SEO traffic. At the same time, avoid writing walls of text and don't obsess over keyword occurrence counts: thanks to modern algorithms, a small text with photos or a video is enough. Headings and lists are also worth using; they increase the readability of the material and improve the behavioral metrics of your page (more engaged visitors, increased time on page, etc.).

3. Mobile version for later

See the “Responsive / Mobile” section. Mobile traffic is growing while desktop traffic stagnates or declines. Besides, the mobile version is one of the important ranking factors, and from the SEO perspective you have no right to ignore it.

4. “Let’s shut down a website and open a new one on a new domain.”

It's unclear why, but this idea occurs to every second website owner. It is a dangerous suggestion. By shutting down the old domain, you lose:

  • The precious legacy the domain has accumulated in its segment. Static ranking factors like domain age are a luxury that cannot be ignored or swept under a rug. Even if the domain is banned, it is easier to lift the ban and solve the problem than to start from scratch.
  • Direct traffic from business cards, bookmarks, old emails and way more. It will deliver a great deal of discomfort to former customers. Don’t treat existing customers like that.

Often the “abandoned” website is not removed from the indexing pool and eventually begins competing with the new one and usually wins. Why? Because the age of the domain is a massive advantage on its side. So you are going to get a domain driven mess, instead of a renovation. The old website will eventually lose its position, and the new one is not going to gain any.

Solution. Think twice if you plan to change your domain. Never change it just for the sake of it. But if it’s inevitable you need to perform a transfer:

  • Set up (!) a 301 redirect.
  • Set up the primary mirror.

5. Change of structure excluding old results

Another significant problem is a thoughtless change of website structure that loses existing SEO results. It usually comes along with a CMS replacement. Developers replacing poorly written articles with better ones have only positive intentions. Unfortunately, as a result, all pages seem to be in place but now have new addresses. If the update is more radical, you may even get new pages attached to new addresses.

A robot cannot find pages at their old addresses and throws the website out of the index for those queries. One day it may find them again, but you never know when, and the result is unpredictable. The lesson: you are going to lose SEO traffic because of someone's failure to think things through.

One abstract example can illustrate the problem. There is a page with the address “,” and it contains text: “Kittens for sale. No registration and SMS”. The new employer/CEO prefers minimalism so after the redesign and update the content moves to “” The content stays the same, but Google perceives it as a new page, and its way to the top of SERP starts from the ground up.

Solution. You either set up (!) a 301 redirect to new addresses or save previous links from the corresponding pages. Keep in mind that you don’t have to save all pages — it is unlikely that your user agreement can gain you search traffic and customers.

The structure can and should be changed, but do it carefully. You should clearly understand that the change is done to increase SEO coverage, not just for the sake of a new CMS and minimalism.
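The redirect part of the solution can be sketched as a simple old-to-new map. The paths here are hypothetical, and in production this logic lives in the web server or CMS configuration rather than in application code:

```python
# Sketch: a redirect map from old addresses to new ones, as used when a site
# structure changes. Paths are invented for illustration.
REDIRECTS = {
    "/kotyata/": "/kittens/",
    "/old-catalog/": "/catalog/",
}

def resolve(path):
    """Return (status, target) for an incoming request path."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]  # permanent redirect preserves SEO legacy
    return 200, path                 # unchanged pages are served as-is
```

The 301 status is what tells the search engine the page moved permanently, so the old address's ranking history transfers to the new one.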

6. The test version remains in the indexing pool

The test version falls under indexing and begins to compete with the main version. The search engine identifies the test version content as the primary source (because it was published earlier). The primary domain receives a penalty to its ranking as a site with duplicated content. That is a classic example of a penny mistake which may take months to eliminate.

Solution. It is necessary to close the debug domain from indexing via the “robots.txt” file or by using “HTTP auth.”

If you are debugging a new site on the main domain, don’t forget to open it to indexing after launch. “I forgot” is the first answer to the question why the new site is working but is not indexed.
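A minimal sketch of the two robots.txt states involved; the whole point of this mistake is forgetting to swap one for the other at launch:

```python
# Sketch: the robots.txt body that closes a test build from indexing, and the
# one you switch to at launch. "Disallow: /" blocks everything; an empty
# "Disallow:" allows everything.
ROBOTS_STAGING = "User-agent: *\nDisallow: /\n"
ROBOTS_PRODUCTION = "User-agent: *\nDisallow:\n"

def robots_txt(is_staging):
    """Pick the robots.txt body for the current environment."""
    return ROBOTS_STAGING if is_staging else ROBOTS_PRODUCTION
```

HTTP auth on the staging host is the sturdier alternative, since a stray crawler cannot ignore it the way it can ignore robots.txt.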

7. Layout mistakes: using <H1> – <H6> headers solely for typographic purposes

The h1-h6 headers are important layout elements that give the search engine information about the contents of the page. Unfortunately, developers sometimes don't bother creating new styles for sidebars and decorative elements, so they reuse header styles. If these tags highlight parts of the text that carry no key information, your chances of proper page indexing drop.

Solution. h1-h6 headers can only be used in the content part or for the headings of typical elements like reviews, comments, related products, etc.
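A quick way to audit heading usage is to count h1-h6 tags per page. A Python sketch using only the standard library (the sample HTML is invented):

```python
from html.parser import HTMLParser

# Sketch: count h1-h6 tags on a page to spot headers misused for styling.
# A decorative sidebar rendered as <h3> would inflate these counts.
class HeadingCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.counts = {}

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.counts[tag] = self.counts.get(tag, 0) + 1

html = "<h1>Kittens for sale</h1><div><h3>Sidebar menu</h3></div><h2>Reviews</h2>"
parser = HeadingCounter()
parser.feed(html)
```

Here the `<h3>Sidebar menu</h3>` would be the red flag: a navigation element styled with a content heading tag.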

8. The site is available via five addresses after deployment

If you didn't take the necessary measures, after deployment you get a set of competing copies (mirrors). Site versions can all sit on the same hosting but be indexed independently: with www or without, over HTTP or HTTPS. If you bought several domains for one website, imagine the hell of duplicate versions you'll have to deal with. With a mess like that, bad rankings are guaranteed.

Solution: the website must be available for indexing via one address. Other versions should redirect visitors to the primary mirror. To do this, set up redirects as soon as possible.
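The canonicalization rule can be sketched as follows; `example.com` is a placeholder, and real setups do this with a server-level 301 redirect rather than application code:

```python
from urllib.parse import urlsplit, urlunsplit

# Sketch: collapse mirror variants (www/non-www, http/https) to one primary
# address. The primary host is a hypothetical example.
PRIMARY_HOST = "example.com"

def canonical(url):
    """Map any mirror variant of the primary host to its canonical form."""
    parts = urlsplit(url)
    host = parts.netloc.removeprefix("www.")
    if host == PRIMARY_HOST:
        # force https, drop www, keep path and query
        return urlunsplit(("https", PRIMARY_HOST, parts.path or "/", parts.query, ""))
    return url  # foreign host: leave untouched
```

Every variant resolving to the same canonical URL is exactly the “one address” property the solution above calls for.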

To sum it up on SEO development

I hope our guide is going to help you evaluate your SEO specialist’s competence and avoid common mistakes.

GRIN tech provides SEO development solutions. Give us a try.