Lars Larsson, CEO of Varnish Software, reveals how technology can help speed shoppers’ online searches and prevent them defecting to a retail competitor
Only musicians like The Kinks benefit from waiting. In 1965 the band turned the dreaded experience into the hit song “Tired of Waiting for You”. Today’s average consumer, who wants to buy an item online fast, is less romantic and simply goes elsewhere if they can’t find what they’re looking for.
A large part of the online shopping experience comes down to search and navigation. A consumer who visits an online shop either knows what they want to buy or wants to be inspired. In both cases the consumer will very likely use onsite search and expect the results to be relevant, accurate and fast to appear.
Relevant and accurate search results
Relevance and accuracy depend on artificial intelligence (AI) or machine learning, whereas search query performance depends on a technology known as intelligent caching. Enterprise search tools used for onsite search usually handle relevance and accuracy very well. They are built to lead site visitors to relevant products that matter to them. Beyond basic search, these tools use AI to understand shopper behaviour and to define exactly which items out of a comprehensive product range to show. For a retailer’s site, a successful search occurs when a shopper’s input is interpreted to pinpoint exactly what he or she would like to buy.
Getting search query performance right
The user experience, though, is a two-pronged challenge: the accuracy and relevance of the search contribute heavily to a positive user experience and probably help to “close the deal”. But what if the user has to wait too long to see results? After as little as a couple of extra seconds, the shopper will abandon the search and leave the site. Search query performance is therefore crucial for retailers, and that’s where caching comes in.
Caching works by creating a temporary storage area (or cache) to hold information so that the application does not have to keep re-fetching it from the central server. Most search tools use older caching methods that return search results fast and work reasonably well with smaller datasets. These caching systems need to track how much memory and disk space they can use and move content back and forth between the two storage areas. This costs time and processing power and can quickly exhaust disk capacity.
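The idea described above can be sketched in a few lines of Python. This is a deliberately minimal illustration of a cache with a time-to-live (TTL) per entry, not how any particular search tool or Varnish product implements caching; the class and method names are invented for the example.

```python
import time


class SimpleCache:
    """A minimal in-memory cache: each entry expires after a fixed TTL."""

    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (value, expiry timestamp)

    def get(self, key):
        """Return the cached value, or None if absent or stale."""
        entry = self.store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.time() > expires_at:
            del self.store[key]  # entry has gone stale; drop it
            return None
        return value

    def put(self, key, value):
        """Store a value with a fresh expiry time."""
        self.store[key] = (value, time.time() + self.ttl)
```

While an entry is fresh, a lookup is a single dictionary read instead of a round trip to the central server; the TTL bounds how long a shopper can see out-of-date results.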
For very large datasets with more than two million records, these internal caching systems struggle. One record in a database doesn’t necessarily equate to one item for sale on your website, say a shirt. The same shirt can appear under several categories, for example ladies wear, clothing and sale items. Each listing needs its own record, which makes the number of individual records escalate quickly.
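The multiplication effect is easy to see in a toy example. The field names below are invented for illustration; real product catalogues vary.

```python
# One product for sale on the site...
product = {"sku": "SHIRT-001", "name": "Oxford shirt"}

# ...listed under three categories...
categories = ["ladies wear", "clothing", "sale items"]

# ...becomes three separate records in the search index.
records = [dict(product, category=c) for c in categories]
```

One shirt has already become three records; across a full catalogue and many categories, two million database rows can describe far fewer than two million actual products.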
This is where the challenge lies for onsite search tools: repeatedly aggregating records is processing-intensive. To do this processing, the tool must fetch all the records from the database, then aggregate them, compute the results and show them to the shopper as pages. Even the fastest processors cannot do this fast enough: it typically takes about two seconds, which is unacceptably slow in e-commerce. The bottom line is that response times in search need to be faster. The most refined search won’t matter if the shopper has already left a site that was too slow.
Using intelligent caching in front of enterprise search tools
Fortunately there is a solution that helps retailers solve this problem: deploying intelligent caching in front of these search tools. Intelligent caching intercepts search requests before they reach and overload the database. This solution can be easily handled by any IT sysadmin, as they are likely to already be using intelligent caching for the website. According to Quocirca, “Intelligent data caching can significantly minimise the inefficiency of retrieving the same data repeatedly from the original server”.
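The “cache in front of the search tool” pattern can be sketched as a thin wrapper around any slow backend search function. This is a hedged illustration, not Varnish’s actual implementation; the wrapper, the normalisation step and the function names are all assumptions made for the example.

```python
def make_cached_search(search_fn):
    """Wrap a slow backend search function with a query-keyed cache.

    Repeated queries are answered from memory; only the first
    occurrence of a query reaches the backend.
    """
    cache = {}

    def cached_search(query):
        # Normalise the query so "Blue Shirt" and "blue shirt "
        # share one cache entry.
        key = query.strip().lower()
        if key not in cache:
            cache[key] = search_fn(query)  # cache miss: hit the backend
        return cache[key]

    return cached_search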
As a result, intelligent caching can serve search queries faster, significantly reducing latency and boosting performance to deliver near-instant results. In testing scenarios, responses have been up to 1,000 times faster with intelligent caching, and the performance gain actually increases with the size of the dataset.
Retailers have been testing disruptive technologies like AI and machine learning to match search results to shoppers’ behaviour and preferences. Now it’s time to serve up those results much faster. Intelligent caching helps retailers improve the search query performance of their onsite search.
After all, your shopper is not the star in the Kinks’ song so won’t think –
“You keep a-me waiting
All of the time
What can I do?”
But simply search somewhere else!
(A Retail Times’ sponsored article)