Identifying underperforming product pages at scale

[Guest post by Ben Henderson]

For an Ecommerce store owner, maximising the profitability of your site is of paramount concern. Google Analytics plays a key role in determining which of your pages are performing well and which are underperforming. However, when you have an Ecommerce site with thousands of pages to look through, it can be difficult to scale the analysis.

Under most circumstances, at PureNet we recommend clients use standard Google Analytics with Ecommerce tracking (and Tag Manager if appropriate), as the majority of the time this works well enough. However, it can be a good idea to perform a more detailed analysis of an Ecommerce store's pages from time to time.

As it happens, it's possible to do this at the same time as auditing your site for SEO errors, using the Screaming Frog tool (available at screamingfrog.co.uk). We'd highly recommend purchasing the full version, as it only costs £149/$184 a year and allows detailed insights to be drawn across your whole site.

Screaming Frog is a site crawler, typically used by technical SEO professionals to diagnose and correct on-site errors. A recent, and very useful, update has integrated Google Analytics access into the software, allowing a number of key metrics to be collected across potentially thousands of pages in one place. Let's get started!

What you will need:

  • A Screaming Frog licence
  • Access to your domain’s Google Analytics profile
  • Access to Google Search Console (optional)

Setting up the crawl

First, purchase, download and install Screaming Frog, and enter your licence key. Once you've done this, go to 'Configuration', then 'API Access' > 'Google Analytics'.

[Screenshot: screaming-fog-1]

[Screenshot: screaming-fog-2]

Simply follow the steps to connect your Google account. Once this is done you’ll be prompted to select the account, property and view that you wish to analyse. After you have selected this, you can change the date range and also select the metrics that you’re interested in. We’re going to tick the Ecommerce box, as well as leaving the other boxes selected.

[Screenshot: screaming-fog-3]

If you wish, you can also connect your Google Search Console account in the same manner.

If you have a large domain, it can help to increase the number of connections to 10 or 15 under Configuration > Speed (feel free to go slightly higher if you have a powerful PC and a fast internet connection), and to untick 'Check Images', 'Check CSS', 'Check JavaScript' and 'Check SWF' (Flash) under Configuration > Spider to speed things up. Lastly, put your domain in the 'Enter URL to spider' box (in this example we're using Fisherscateringsupplies.co.uk) and press Start.

[Screenshot: screaming-fog-4]

If you encounter issues with Screaming Frog running out of memory during the crawl, Screaming Frog's guide to increasing its memory allocation should help.

Analysing the results of the crawl

Once the crawl is finished (which may take some time, depending on the size of the site), we'll be presented with the following (the rest of the columns sit to the right of the displayed URLs).

[Screenshot: screaming-fog-5]

The easiest way to work with the gathered data is to export it into Excel. Select the 'Internal' tab if it isn't already selected, then export all the data.

Once all of the data is in Excel, we'll need to delete some unnecessary columns – in this example there are still 45 (!) columns' worth of data (a short script for trimming the export is sketched after the screenshot below) – but this means we now have a great deal of information on each page, including:

  • Basic information such as status code, meta title and meta description
  • Google Analytics sessions, session durations, bounce rate and page views
  • Goal completions and conversion rate, plus numbers of transactions, revenue and revenue per transaction
  • Additional information that can help with page optimisation, such as word counts, the text-to-code ratio, the page's response time (speed) and the number of internal links pointing to the page
  • Google Search Console data such as clicks, impressions, click-through rate and average position, which can help to identify pages underperforming in Google organic search specifically

[Screenshot: screaming-fog-6]
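If you'd rather trim the export programmatically, here's a minimal sketch using Python and pandas. The file name and column headers are assumptions – Screaming Frog's export headers vary between versions – so adjust them to match your own file's header row.

```python
import pandas as pd

# Some versions of the export put a one-line report title above the headers;
# skiprows=1 handles that (drop it if your file starts with the headers).
df = pd.read_csv("internal_all.csv", skiprows=1)

# Keep only the columns relevant to this analysis, ignoring any that are
# missing from this particular export.
wanted = [
    "Address", "Status Code", "Title 1", "Meta Description 1",
    "Word Count", "Response Time", "Inlinks", "Level",
    "GA Sessions", "GA Bounce Rate", "GA Transactions", "GA Revenue",
    "Clicks", "Impressions", "CTR", "Position",
]
df = df[[c for c in wanted if c in df.columns]]
print(df.head())
```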

Now it's possible to find, in one place, all pages on the site that aren't bringing in revenue, have low engagement or are otherwise struggling for traffic, together with additional information (such as meta data) that may help in diagnosing page-level problems. To go into more detail, we could upload the URLs into a backlink tool such as Majestic's bulk backlinks checker to see which pages generate the most backlinks, or check the number of social shares each page has acquired using something like Scrapebox's social checker add-on.
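Continuing the sketch above, a couple of simple filters can surface those pages. The column names and thresholds below are illustrative assumptions rather than recommendations – tune them to your store's traffic levels.

```python
# Depending on the export, percentage columns may arrive as strings ("75.2%");
# convert them first if needed, e.g.:
# df["GA Bounce Rate"] = df["GA Bounce Rate"].str.rstrip("%").astype(float)

# Pages with meaningful traffic but no sales.
no_revenue = df[
    (df["Status Code"] == 200)
    & (df["GA Sessions"] > 100)      # enough sessions to judge fairly
    & (df["GA Transactions"] == 0)   # ...but no transactions at all
]

# Pages where a large share of visitors leave immediately.
high_bounce = df[(df["GA Sessions"] > 100) & (df["GA Bounce Rate"] > 80)]

no_revenue.to_csv("pages_no_revenue.csv", index=False)
high_bounce.to_csv("pages_high_bounce.csv", index=False)
```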

To more easily locate product pages if there is no linear directory structure, filtering by ‘level’ (how deep the pages are on the site) should help.
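For example, assuming the export includes Screaming Frog's 'Level' column, and assuming (hypothetically) that your product URLs share a '/product/' path segment, either of these filters would do:

```python
# By crawl depth: keep pages three clicks from the homepage (adjust to taste).
products = df[df["Level"] == 3]

# Or by URL pattern, if your product URLs share one.
products = df[df["Address"].str.contains("/product/", na=False)]
```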

Additional options

For the more adventurous, there is also the option of using the custom source code search function to isolate certain types of pages, or to check details of schema tags across the site. This is of particular use where high-value product pages are missing schema markup that would otherwise help with click-throughs and conversions in organic search.
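Screaming Frog's custom search handles this during the crawl itself, but as a rough standalone illustration (using the third-party requests library, and reusing the no_revenue frame from the earlier sketch), something along these lines would flag pages with no sign of Product schema. The substring test is deliberately crude and won't catch every JSON-LD variant.

```python
import requests

# Check the pages flagged earlier for any trace of Product schema markup.
for url in no_revenue["Address"].head(20):
    html = requests.get(url, timeout=10).text
    # Crude check: matches microdata itemtypes and many (not all) JSON-LD blocks.
    if "schema.org/Product" not in html and '"@type": "Product"' not in html:
        print(f"No Product schema found: {url}")
```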

In conclusion, Screaming Frog is a great tool for any Ecommerce site owner, capable of delivering a detailed level of analysis at a reasonable cost.
