CMG in the Clouds
April Fools' Day this year brought with it the Rocky Mountain CMG conference. Best view of any CMG show I have been to.
David Halbig from First Data Corp. (also the guy who organizes the show) kicked off with a presentation that was surprisingly refreshing for a CMG regional. He talked about the cloud in an extremely practical way, and laid out how to manage performance in it.
He gave some very compelling examples of issues that can emerge when moving to the cloud, and broke the monitoring approach into:
1. Continuous Monitoring
2. Specialized, intermittent-use monitoring (the fire hydrant)
3. Middleware monitoring (application deep dive)
4. Business Transaction Monitoring, for a cross-tier understanding of what is going on
David noted that many vendors say they are "BTM capable" but do not really have the "real thing". His "sure-fire" list of requirements included:
• Horizontal view of aggregate and single transactions across all tiers of interest
• Resource consumption information from each monitored tier (basically saying that a network-tap-based solution cannot cut it)
• Auto-discovery of the transaction path (he said that some products need to be "trained" on transaction paths)
• Capture path to non-monitored tiers
• Continuous operation at volume
• Low transaction path overhead
We (Correlsense) also presented, focusing on how cloud (public/private), Agile, SOA, and virtualization create an ever-changing environment in which the only things that stay the same are the user's expectation of sub-second response times and the transactions that run the business.
Although, as usual, the vast majority of the audience was not anticipating any migration to the cloud in the near future, the sessions were very informative because they dealt with managing performance in a constantly changing environment, something that everyone has to deal with.
Tuesday, April 5, 2011
Improving Conversion Rates
There is a clear and obvious correlation between the time it takes a landing page to load and the conversion rate of the call to action on that page. We now know that the time it takes to load your site affects not only your conversion rate but your ranking on Google as well. If you agree with these statements, continue reading... If not, I suggest you read this article and this one as well.
Today, the main optimization tools deal with questions like "where is the traffic coming from" and "what are people doing on my site before they convert". The traditional toolset available to the web analyst includes tools like Google Analytics, which tells you where visitors are coming from; funnel analytics tools, which show the steps in your acquisition funnel; A/B and MVT testing tools for running experiments on your site's traffic; and user recording and life-cycle tools that let you watch your users' flow.
None of the above deals with site performance, even though we all agree it carries significant weight in your conversion rate.
A few months ago, one of our clients decided to put his site on a CDN. His optimization manager told him that, like any other change to the site, it should be tested so they could evaluate its ROI. They agreed to run a simple A/B test on their Brazilian display traffic (traffic from banners): 50% would be directed to the regular site and 50% to the CDN version.
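A 50/50 split like this is typically done by bucketing each visitor deterministically, so repeat visits land on the same variant. Here is a minimal sketch, assuming a stable visitor ID from a first-party cookie; the function name and CDN hostname are made up for illustration:

```typescript
// Minimal sketch of a deterministic 50/50 split.
// Assumes 'visitorId' is a stable ID, e.g. read from a first-party cookie.
function variantFor(visitorId: string): "original" | "cdn" {
  let hash = 0;
  for (const ch of visitorId) {
    // Simple 32-bit rolling hash; any stable hash works here.
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  }
  return hash % 2 === 0 ? "original" : "cdn";
}

// Example: send half the visitors to the (hypothetical) CDN hostname.
const id = "visitor-12345"; // in practice, read from a cookie
if (variantFor(id) === "cdn") {
  window.location.href = "https://cdn.example.com" + window.location.pathname;
}
```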
Before starting the test, they went to their IT manager and asked him to check, using his monitoring tool, how long each version took to load. The IT manager did not have a robot/agent in Brazil, so he bought a virtual server there, installed the agent, and came back two weeks later with the following results: the original site took 25 seconds to load, while the new version took 6 seconds. Even though 19 seconds is clearly a huge difference, they decided to run the test anyway.
A few days later, they got the results: the conversion rate of the new version was 65% LOWER than that of the original version... Taken at face value, it appeared that the longer the user waited, the better the chance he would convert (original version: 25 seconds to load, 6.4% conversion rate; new version: 6 seconds to load, 2.2% conversion rate)...
This is obviously wrong. Looking at the number of pageviews for each variation, they noticed that the original version had far fewer pageviews than the new version (about 1:3, where it should have been 1:1). This led them to the conclusion that, because of the load time, most users left the page before it fully loaded, and before the web analytics tool sent the pageview event to the server. The users who did wait 25 seconds for the page were the ones most likely to convert...
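To see how this survivorship bias distorts the numbers, here is a back-of-the-envelope sketch. The traffic figure is made up for illustration; only the conversion rates and the 1:3 pageview ratio come from the test:

```typescript
// Hypothetical: 30,000 visitors sent to each variant (made-up number).
const visitorsPerVariant = 30_000;

// Observed: the original recorded about 1/3 as many pageviews as the CDN
// version, because roughly 2/3 of its visitors left before the analytics
// beacon fired.
const cdnPageviews = visitorsPerVariant;          // nearly everyone was counted
const originalPageviews = visitorsPerVariant / 3; // the 1:3 ratio

// Conversion rates as reported by the analytics tool (per *recorded* pageview).
const originalCrPerPageview = 0.064; // 6.4%
const cdnCrPerPageview = 0.022;      // 2.2%

// Conversion rate per visitor actually *sent* to each variant.
const originalCrPerVisitor =
  (originalPageviews * originalCrPerPageview) / visitorsPerVariant; // ~2.1%
const cdnCrPerVisitor =
  (cdnPageviews * cdnCrPerPageview) / visitorsPerVariant;           // ~2.2%

console.log({ originalCrPerVisitor, cdnCrPerVisitor });
```

Counted per visitor sent rather than per recorded pageview, the "65% worse" CDN version is actually on par with the original, before even accounting for the visitors the slow version drove away for good.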
They also found that when they measured the actual time it took the visitor to load the page, it was much longer than 25 seconds (closer to 40 seconds). The IT manager's measurement did not include rendering time or the actual time it took the visitor to download all the files (relying on the network speed as seen from the server will never be accurate).
You can assume that visitors who wait 40 seconds for a page are really interested in its content...
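As a rough illustration of the gap between the two measurements, here is a minimal browser-side sketch using the standard Navigation Timing API. This is a generic example, not the client's monitoring tool:

```typescript
// Minimal sketch: measure the full client-side load time, including download
// and rendering, as the visitor actually experienced it.
window.addEventListener("load", () => {
  const [nav] = performance.getEntriesByType(
    "navigation"
  ) as PerformanceNavigationTiming[];
  if (!nav) return;

  // From the start of the navigation until the load event: the visitor's view.
  const fullLoadMs = nav.loadEventStart - nav.startTime;

  // From sending the request until the first response byte: roughly what a
  // server-side measurement sees. Everything after that (downloading assets,
  // parsing, rendering) is invisible to it.
  const serverSideMs = nav.responseStart - nav.requestStart;

  console.log(`full load: ${fullLoadMs} ms, server view: ${serverSideMs} ms`);
});
```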
The main things you should take from this story are:
- Page load time and abandonment rate during page load should be part of the basic metrics your analysts use (a rough way to count abandonment is sketched after this list)
- You should have a real site-performance analytics tool that captures 100% of your traffic, rather than counting on robots or other sniffing tools
- You have all kinds of KPIs giving you a clear view of all your SLAs and response times through the funnel, except at one of the most important steps: when the visitor arrives at your store.
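For the first takeaway, one rough way to count abandonment during load, assuming you can add two tiny beacons to your pages (the /beacon/* endpoints are hypothetical; substitute your own collector):

```typescript
// Sketch: count visitors who *start* loading a page vs. those who *finish*.
// The gap between the two counts approximates abandonment during load.

// Fire as early as possible (from a small inline <script> in <head>), so it
// is sent even when the visitor abandons mid-load.
navigator.sendBeacon("/beacon/page-start", location.pathname);

// Fire only once the page has fully loaded.
window.addEventListener("load", () => {
  navigator.sendBeacon("/beacon/page-loaded", location.pathname);
});

// Server side (per page): abandonmentRate = 1 - loadedCount / startCount
```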
RUM measures the load time of each and every request, for 100% of your traffic, from all around the world, in the most accurate way. It does not rely on robots and synthetic requests, and it does not estimate load time from server processing time, response size, or a guess at your visitors' network speed. Instead, it uses both client and server agents to measure the actual time from the moment the user requested the page until the page was fully rendered. It also tells you how many visitors left before getting the page (and before your web analytics tool received a pageview event). RUM has a simple yet powerful user interface, designed not only for IT people.
Please contact us to schedule a demo or download the free edition of the tool.