We recently had a client who wanted to update their website themselves, but the site was outdated and had no content management system. After an extensive consultation, the project evolved into building a whole new website on the WordPress platform, with the existing copy and content migrated over. Along with giving the client this new self-editing capability, there was a noticeable side effect: a massive improvement in SEO results. After the new site launched with the updated code, traffic from Google more than doubled. Does the quality of a website’s code really matter? Let’s break down the variables and the analytical data to see exactly what triggered this surge in traffic.
This website is mostly informational and non-e-commerce, for a small business that offers local services and products. To maintain client privacy, we’ll just show the numbers and percentages from the host without specific URLs, IP addresses, or names. All data came from the host, startlogic.com.
Timeframe of New Site Launch
The new website launched in late October, with the old website running in parallel until the switch. We’ll compare data from September (a full month of data from the old website) and November (a full month of data from the new website). December is also included to show that the surge held steady.
With five columns of statistical data, it’s important to understand the difference between these metrics. To keep from digressing too much, we’ll focus on the first two columns as the best indicators of website traffic: “Unique visitors” and “Number of visits”. “Unique visitors” counts each individual IP address that visits the website, whereas “Number of visits” counts every time someone (or something) visits the website. For example, if Sally visits the website on Monday, then visits again on Tuesday from the same device, she counts as 1 unique visitor but her activity accounts for 2 visits.
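To make the distinction concrete, here’s a minimal Python sketch of how a log analyzer could derive these two metrics from raw visit records. The IP addresses and days below are hypothetical placeholders, mirroring the Sally example above:

```python
# Hypothetical visit records as (ip_address, day) pairs.
visits = [
    ("203.0.113.7", "Monday"),   # Sally's first visit
    ("203.0.113.7", "Tuesday"),  # Sally returns on the same device/IP
    ("198.51.100.4", "Monday"),  # a different visitor
]

number_of_visits = len(visits)                   # every visit counts
unique_visitors = len({ip for ip, _ in visits})  # distinct IP addresses only

print(unique_visitors, number_of_visits)  # -> 2 3
```

Sally’s two visits from one IP show up once in the unique-visitor count but twice in the visit count.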
The graph shows that the number of unique visitors jumped from 801 in September to 2,127 in November, rising to 265% of the September figure. The number of visits rose at a similar rate, from 1,122 to 2,906, or 259% of the original.
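A quick calculation shows where these percentages come from: the November figure expressed as a percentage of the September baseline, truncated to a whole percent.

```python
def pct_of_baseline(before: int, after: int) -> int:
    """Express `after` as a whole-number percentage of `before`."""
    return int(after / before * 100)

print(pct_of_baseline(801, 2127))   # unique visitors -> 265
print(pct_of_baseline(1122, 2906))  # visits -> 259
```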
Let’s look at the sourcing data available:
The traffic source channels give us only two metrics to read from. Looking at the better indicator, the “Pages” metric shows more than a tripling in direct traffic (users going straight to the website), in organic search traffic (Google, Yahoo, etc.), and in referrals from external site links. In search engine traffic, pages viewed from Google alone rose to 389% of the September figure.
Regarding the other columns in the first monthly history graph, you may notice a large decrease in the number of “Hits”. This isn’t a concern (a “hit” isn’t a great metric for analyzing visitor traffic; see the definition of a “hit”): the new website makes far fewer requests per page thanks to more efficient coding, so comparing the number of “Hits” across the two sites would be comparing apples to oranges. There were also some spam IPs pinging the server in September that inflated this metric; a single IP address accounted for almost 10,000 hits alone that month. You can see further evidence of spam in the later sourcing images showing external site links: the number of “Hits” decreased (from 2,554 to 1,348) even though the number of “Pages” rose (from 55 to 752). We were able to clearly identify these as spam after reviewing the full source list, since most URLs were unrelated to this business and some IPs are listed on spam blacklists.

You can also see the efficiency gained through the updated code in the large decrease in bandwidth usage (566.38 MB down to 384.23 MB), which means fewer server requests and, as a major benefit, faster load times.
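As a rough illustration of how spam traffic gets separated out, here’s a hypothetical sketch. The IP addresses, blacklist, and hit counts are made up for the example; they are not the actual log data:

```python
# Hypothetical raw hit counts per source IP, loosely modeled on the
# September pattern where one address dominated the log.
hits_by_ip = {
    "192.0.2.10": 9800,    # a single IP inflating the total, as in September
    "203.0.113.7": 120,
    "198.51.100.4": 85,
}

# Placeholder blacklist; in practice this would come from a spam
# blacklist lookup plus a manual review of the full source list.
spam_blacklist = {"192.0.2.10"}

legit_hits = sum(count for ip, count in hits_by_ip.items()
                 if ip not in spam_blacklist)
print(legit_hits)  # -> 205
```

Filtering a single blacklisted address removes the bulk of the raw hit total, which is why “Hits” alone can badly overstate real visitor activity.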
“the number of unique visitors jumped from 801 in September to 2,127 in November, rising to 265% of the September figure”
The code behind your website matters, because how a website is coded directly affects how search engines like Google find you. Their goal is to display the results most relevant to a search term, so you want your website’s code and content to be semantically relevant to what your site is about (or to the services and products your business offers), so the search engine can read it correctly and surface it in relevant searches. This new website’s code nearly quadrupled the page-view traffic coming from Google alone and more than doubled unique visitors overall: excellent indicators that the code update boosted SEO and was well worth it all around.