Archive for the ‘analytics’ Category

4 Tools To Measure The Effectiveness of Your Web Site

Wednesday, May 29th, 2013

If you’re new to web marketing then you may not be aware of the different options you have for measuring the effectiveness of your web site. There are many different tools out there that measure everything from the pages that people visit all the way down to what sections of a page visitors are focusing on.

Each of these four tools provides different insight into what is happening on your web site. The first two tools should be part of every site. For those looking for deeper information, I highly recommend integrating heat map and A/B testing tools into your site.

Web Analytics

Analytics software tells you how many visitors come to your site, what pages they are viewing, and where they are going. You can view the paths that people follow through your site, see which pages are causing your visitors to leave, and measure the performance of your campaigns. If you have a web site, you need to have a web analytics tool set up.

Google Analytics is a great free option and one of the best web analytics tools on the market. Setting up an account is simple and adding the necessary code to your site requires little to no coding knowledge depending on how your pages are built.

Webmaster Tools

Adding Google Webmaster Tools will give you insight into how Google views your site. With Webmaster Tools you can diagnose problems on your site to ensure your pages are being crawled and that there are no errors with any of your pages. Webmaster Tools also shows you which search queries your site appears for and how many click-throughs your search results receive. Google allows you to optimize what is being crawled by submitting a sitemap. This is something every site owner should do to ensure all of their pages are being crawled. To get started, visit Google Webmaster Central.
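If your site doesn't already generate a sitemap, it's just an XML file listing your pages. Here's a minimal Ruby sketch of building one by hand; the domain and page list are placeholders for your own site, not anything specific to Webmaster Tools.

# Minimal sketch of building a sitemap.xml by hand. The domain and page
# list below are placeholders for your own site.
domain = 'http://www.example.com'
pages  = ['/', '/about', '/blog']

xml = %(<?xml version="1.0" encoding="UTF-8"?>\n)
xml << %(<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n)
pages.each do |path|
  xml << "  <url>\n"
  xml << "    <loc>#{domain}#{path}</loc>\n"
  xml << "    <lastmod>#{Time.now.utc.strftime('%Y-%m-%d')}</lastmod>\n"
  xml << "  </url>\n"
end
xml << "</urlset>\n"

File.write('public/sitemap.xml', xml)

Once the file is in place, submit its URL in Webmaster Tools so Google knows where to find it.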

Heat Maps

Heat map software tells you where a visitor's attention is focused on a page. It measures the mouse movements of the people visiting your site. The combined mouse movements from as few as 50 visits can give an amazing amount of insight into how a page is performing. Heat maps that rely on mouse movement aren't as accurate as eye tracking, but they can give a close approximation of the most important areas of a page.
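To give a rough sense of how these tools work under the hood, here is a toy Ruby sketch of the aggregation idea: recorded mouse positions are bucketed into a coarse grid and each cell's hit count becomes its "heat". The sample points are made up for illustration.

# Illustrative only: bucket recorded mouse positions (x, y) into a 50px grid
# and count how many samples land in each cell. The points are made up.
points = [[120, 80], [130, 85], [600, 400], [125, 90], [610, 395]]
cell_size = 50

heat = Hash.new(0)
points.each do |x, y|
  heat[[x / cell_size, y / cell_size]] += 1
end

heat.sort_by { |_cell, count| -count }.each do |(col, row), count|
  puts "cell (#{col}, #{row}): #{count} mouse samples"
end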

There are several heat map tools available. Two of my favorites are:

Crazy Egg: For as little as $9 per month, Crazy Egg allows you to measure up to 10 pages on your site for up to 10,000 visits. Crazy Egg was the first heat map tool that I used, and I have found it to be very effective.

ClickTale: Depending on the type of site that you run and the amount of data you want, you may want to give ClickTale a try. ClickTale is similar to Crazy Egg, but where they differ is that ClickTale allows you to view individual visitor sessions. You can go back to a recorded visit and see exactly how individual users are browsing your site. This can be a very effective learning process, but it can also be overkill for the average person managing a web site. ClickTale offers a free plan which records up to 400 page views per month.

A/B Testing

Testing applications allow you to set up A/B tests on your site to measure how changes on a page impact visitor actions. Testing is a more advanced practice, but it's one that can have significant results, especially if you run an eCommerce site or a site where form completion is important.

Some examples of tests that you could run:

  • Long form vs. short form
  • Text Ad vs. Image Ad
  • Different link text to see what gets more click-throughs
  • Positioning of calls-to-action on a page

To get started, I recommend setting up Google Content Experiments. If you have a Google Analytics account in place then there is no additional code setup required. If you don’t have a Google Analytics account, then you will need to set one up.
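If you're curious what a test does behind the scenes, here is a minimal, hand-rolled 50/50 split sketched as a Rails controller. This is illustrative only, not how Content Experiments works; the controller and template names are invented for the example.

# Illustrative only: a hand-rolled 50/50 split. Content Experiments handles
# the bucketing and reporting for you; this just shows the idea.
class LandingController < ApplicationController
  def show
    # Assign each visitor to a variant once and remember it in a cookie.
    cookies[:ab_variant] ||= %w[long_form short_form].sample
    @variant = cookies[:ab_variant]

    # Renders app/views/landing/long_form.html.erb or short_form.html.erb.
    render @variant
  end
end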

Twitter Growth Strong Despite Increased Competition

Monday, February 15th, 2010

Twitter has received a lot of negative publicity lately. The number of new registrations has dropped dramatically over the past couple of months. Despite this, Twitter continues to see an impressive number of “tweets,” or messages, sent across its service, and it continues to grow.

Last week Twitter crossed the 9 billion tweet mark. Incredible considering Twitter crossed the 5 billion mark just 4 months ago. Averaging over 300 million tweets per week means that Twitter will hit the 10 billion mark at the beginning of March.

Other statistics from the past week:

  • Tweets per week increased 5% over the previous week
  • 47,455,890 Tweets per Day
  • 1,977,329 Tweets per Hour
  • 32,955 Tweets per Minute
  • 549 Tweets per Second

Right now, it takes just over 3 weeks to add 1 billion tweets. Twitter didn’t pass the 1 billion tweet mark until November of 2008. It took them another year to add another 4 billion tweets. At the current pace, they’ll easily add another 12+ billion this year.
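For anyone who wants to check the arithmetic, here is a quick Ruby calculation using the per-day figure from the list above.

# Quick check on the figures above.
tweets_per_day  = 47_455_890
tweets_per_week = tweets_per_day * 7              # => 332,191,230
weeks_per_billion = 1_000_000_000.0 / tweets_per_week
weeks_per_billion.round(1)                        # => 3.0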

Clearly, Twitter is still a very popular service and it is continuing to grow. Over the next couple of weeks we'll see if they can continue to keep this momentum or if services such as Google Buzz will start to eat away at its popularity.

Mining Twitter for Gold

Tuesday, January 12th, 2010

Finding the 27% of Tweets that Have Value

A recent study by ReadWriteWeb has shown that only 27% of tweets contain information with some value. Many people will point to this and use it to dismiss Twitter as a worthwhile platform. However, this number comes from Twitter's flexibility. Some people use it to keep in touch with friends, others use it to break news. Some use Twitter for advertising and others use it for sharing information they find on blogs.

It's this last group that's the most interesting. It's the human web. It's people finding information and sharing it that adds value where search engines cannot.

The problem is finding the tweets that make up this 27% of the stream that holds information of value. Further, 27% doesn't sound like much until you realize it's 70+ million tweets per week. The best information on Twitter amounts to a needle in a haystack.

This points to the growing need for filters and recommendation engines for the real-time web. Last week I posted on micro filters and I believe this post by ReadWriteWeb further emphasizes this need.

To leverage the value that Twitter and the whole real-time web hold, we need better tools. We need more filters that go beyond the basics: Twitter lists, follower lists, and individual favorites. For example, value can be attributed to the number of people sharing the same content or the credibility and clout of those sharing it.
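As a rough illustration of that kind of filter, here is a toy Ruby sketch that scores shared links by how many people shared them, weighted by a simple stand-in for each sharer's clout. The data and the weighting are invented for the example, not pulled from any real Twitter API.

# Toy example: score each shared link by share count, weighted by a rough
# "clout" stand-in (follower count). All data here is made up.
shares = [
  { url: 'http://example.com/a', sharer_followers: 12_000 },
  { url: 'http://example.com/a', sharer_followers: 300 },
  { url: 'http://example.com/b', sharer_followers: 900 }
]

scores = Hash.new(0.0)
shares.each do |share|
  # Log-scale follower counts so one huge account can't dominate the score.
  scores[share[:url]] += 1 + Math.log10(share[:sharer_followers])
end

scores.sort_by { |_url, score| -score }.each do |url, score|
  puts format('%.1f  %s', score, url)
end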

If the web is going to evolve beyond search, micro filters will play a huge part in it but filters alone are not the answer.

Recommendation systems are the other piece of the puzzle. They're needed to understand user behaviors: what people like and don't like, what they favorite, what they read, and what they share. Recommendation systems leverage this data and combine it with filters to find the best information that people want to read. This helps us to take full advantage of the real-time web without becoming overwhelmed.

To solve the problem of finding the 27% of tweets that have some value, filters will be used to narrow the stream of information. Then recommendation systems, which have some insight into our past behavior, will be able to narrow the focus even further by taking the information output by these filters and funneling it to us based on our interests. This means that we'll all be giving up some privacy on the web, but it's a trade-off we'll need to make to keep up with the barrage of information.

Using Bit.ly and Ruby on Rails to Shorten a URL

Monday, January 4th, 2010

If you're building a Rails application and you need to add a URL shortening service, then you should look at philnash's bitly gem. Bit.ly is dominating the crowded URL shortening category. Its dashboard provides statistics for each shortened URL, which makes it a very valuable tool.

To get started simply type:

gem install bitly

In your controller you’ll want to add:

require 'bitly'

Finally, to use the service you'll need an API key. First create an account at bit.ly and then go to the following URL to retrieve your key: http://bit.ly/account/your_api_key/

To shorten a URL:

bitly = Bitly.new('your-bitly-user-id', 'your-bitly-api-key')
page_url = bitly.shorten('http://www.marketingformavens.com')
page_url.short_url # => "http://bit.ly/7BWXcQ"

The URL has been shortened and you can display or use it in your application. You can also log into your Bit.ly dashboard and see detailed statistics on each URL.

To have the shortened URL and its stats show up in your dashboard history, pass the history option when shortening:

page_url = bitly.shorten('http://www.marketingformavens.com', :history => 1)

For other options, such as expanding a Bit.ly URL, check out the documentation on GitHub.
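As one example, expanding a short URL back to the original looks roughly like the sketch below. The method names follow the gem's documentation at the time, so double-check them against the README for the version you have installed.

# Rough sketch of expanding a short URL with the bitly gem; verify the
# method names against the README for your installed version.
bitly = Bitly.new('your-bitly-user-id', 'your-bitly-api-key')
expanded = bitly.expand('http://bit.ly/7BWXcQ')
expanded.long_url # => the original, full-length URL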

Measuring Social Media ROI

Tuesday, October 27th, 2009

Christina Warren has posted an excellent article called “How To: Measure Social Media ROI”. If you are running social media campaigns this is a great place to start learning how to measure the impact of these campaigns. Christina highlights several products that will get you started. Be sure to check this out.

How the Real-Time Web Will Impact Internet Marketing

Monday, October 12th, 2009

The real-time web has become the buzzword of the day.

Sites like Twitter and Facebook have made it easier and faster to consume media, causing a shift in our expectations. We want any and all information as soon as possible and we find it difficult to extract ourselves from these rivers of information.

Articles written months or even weeks ago are considered old news, and this isn't good if you're still managing your web site like it's 2004.

The real-time web is here and growing. Don’t let your site be left behind.

You Need to Boost Quantity from Quality

With the rise of the real-time web, people are less likely to go to your corporate web site. This isn't news; it's been happening since the beginning of the decade, when RSS reduced the need to visit your favorite web sites each day looking for new content.

The real-time web leverages this even further.

If you've used Twitter or Facebook you know exactly what I mean. You're receiving updates and recommendations from hundreds of people on what to watch, read, and listen to. The human network has taken over and it's funneling everything to you without you searching on Google.

This opens up a huge opportunity for Internet marketers but it requires more work and extra planning.

Content has always been a web site's greatest asset. But now you need to think about how you can break it up into smaller pieces that can be shared and reused through other channels.

For example, that new product page you just created should also be condensed down into an interesting 140-character tweet, a Facebook update, and maybe a 30-second demo showing off the best new feature. You want as much content from your web site pushed out into your “cloud site” – your company's content that is floating in different areas of the Internet. It's photos on Flickr, articles on Digg, and bookmarks on Delicious.

This strategy helps you to effectively use the real-time web to scatter breadcrumbs across the Internet, leading people back to your web site. But how will you know if it's working?

Real-Time Web Analytics

To deal with the changes that the real-time web brings, companies are developing real-time analytics programs. Gone are the days when we waited for log files to be processed to see the previous day's statistics.

The real-time web ensures that we respond to visitors faster than ever before. Waiting days or weeks to see how a campaign performed is no longer necessary. With real-time web analytics we can respond to visitors instantly. Dashboards keep us up-to-date on how each campaign is performing and the smallest changes can be seen, corrected, and tested within minutes.

Integrated into real-time analytics will be information coming from outside of your web site. Since most real-time web applications use shorter messages – and character limits – URL shorteners have become very popular. This is fantastic for Internet marketers as these services give statistics on how your links are being shared.

Unfortunately, there are a couple of downsides to this, which will be fixed over time. Right now there are too many URL shortening services. This fragments your statistics across different services, making it difficult to see the big picture. The second is that these statistics aren't available within your current web analytics program. You're going to have to do some searching to get the information you need.

Over time, the real-time web is going to dramatically shift how we think about corporate web sites and how we build corporate content. We're already seeing this now. It's up to you to experiment and learn how to leverage real-time technology before your web site and your company are left behind.

Please feel free to share your comments. We’d love to hear them!

Web Traffic Watch Launches Today

Wednesday, September 23rd, 2009

I'm very excited to tell you that the Web Traffic Watch beta has gone live. Web Traffic Watch allows you to view visitor activity on your web site in near real-time. As visitors come to your site, they pop up in Web Traffic Watch. For each visitor you will see the URL of the page they are visiting along with their lead score, page view total, and referring site information.

You can join the Web Traffic Watch beta today. Since this is a limited beta we will email you when you’ve been selected to participate. We have chosen to limit the beta in the beginning to ensure that we don’t overload our servers. The last thing we want is a poor user experience but we hope to have all of our beta participants using the app as soon as possible.

Why Content Management Systems Need a Brain

Thursday, September 10th, 2009

If I Only Had a Brain…

Web analytics is the brain behind your web site. It knows exactly what people are doing.

So why is it that web analytics isn't built into every content management system? I've worked with several different CMSes – large and small – over the years and none of them included statistical tracking and visitor analysis. CMSes are missing their brain!

Why Do Content Management Systems need Web Analytics?

CMSes are great at managing and publishing content for static web sites. But static web sites are like a body without a brain: they just exist. Instead, they should be dynamic, and by dynamic I mean personalized to each person visiting your site. If someone comes to your site 5 times, they shouldn't see the same home page banner each time. The CMS should be smart enough to show a home page based on the information it learned from the person's previous visits. This goes for all of your web pages.

Completing the Web Content – Visitor Loop

To accomplish this, a CMS should include analytics data – a brain – that it uses to control content. This would allow it to discover visitor trends and display related content and promotions based on this data.

A continuous feedback loop should be created between the CMS and the visitor. This can be done by packaging web analytics within the CMS. The CMS pushes content to the visitor and the web analytics feature reports back to the CMS on the results. Now it knows what the visitor clicked on, and the next page they visit is personalized based on their previous history and the CMS's knowledge of what content and promotions past users in the same situation responded to.
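To make the idea concrete, here is a tiny illustrative Ruby sketch of what that decision might look like once analytics data is available inside the CMS. The method, data, and banner names are invented for the example; no particular CMS is implied.

# Illustrative only: pick a home page banner from what analytics knows about
# this visitor. The data and banner names are invented for the example.
def banner_for(visitor)
  if visitor[:visits] <= 1
    'welcome_banner'
  elsif visitor[:viewed_sections].include?('pricing')
    'free_trial_banner'
  else
    'new_content_banner'
  end
end

returning_visitor = { visits: 5, viewed_sections: ['blog', 'pricing'] }
banner_for(returning_visitor) # => "free_trial_banner"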

Closed Loop, Found Brain

The loop between the CMS and the visitor is now closed and several new opportunities have opened. Your site is now engaging to your visitors. It’s personalized based on their needs and it’s easier for them to find what they want. New content is recommended based on an intelligent CMS that is constantly learning with each new visitor.

Removing Bias from Your Web Statistics

Friday, September 4th, 2009

The State of Web Analytics

Web analytics applications are great at helping internet marketers and executives understand how their web site and online campaigns are performing. But there's a huge problem. They overload us with too much information and, worse, they provide no context for the data, leaving the statistics open to interpretation.

Less is More. It’s True for Statistics Too.

First, we don’t need more information. We need the right information analyzed in an unbiased context.

If you have a seasoned web analytics person on your team then this isn't a problem. But most of us don't have this luxury. In the age of downsizing and the growth in the number of small businesses, internet marketers are performing multiple jobs. This includes sending emails, building AdWords campaigns, publishing content, updating social networks, and analyzing statistics.

With all of these tasks to perform, who has time to go through the mountain of statistics that analytics applications provide?

In addition to this, having the same person or people working on campaigns and analyzing statistics creates a conflict of interest. This isn’t deliberate but we all want to perform our jobs well and this usually means that we’re overly optimistic about the results we think we see.

How People Interpret Statistical Reports

When people look at statistics they put them into their own context, which is biased based on their experience and their relationship to the data. If the data coincides with projects they're responsible for then you can imagine how they're going to interpret the results.

I’ve seen this happen over and over again. It’s human nature to try to reason that the email campaign that bombed didn’t really bomb because the open rate was fantastic – hint, open rates mean nothing. Again there are too many statistics and it’s far too easy to steer the eye toward the data that looks good. So how do we find a baseline to work from that minimizes our biases?

Putting Statistics into Unbiased Context

To prevent biased reports, it's important that rules and/or goals are established. These should be unique to each of your web sites and marketing campaigns. For example, your Google AdWords campaign is measured against conversions, your email campaign against new registrations, and your blog posts against the number of comments received. Choose the rules that match your business goals and stick to them.
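As a simple illustration, the rules can even be written down as data so every report is judged the same way. The campaign names, metrics, and targets in this Ruby sketch are made up for the example.

# Illustrative only: encode each campaign's rule so every report is judged
# against the same goal. Names and numbers are made up.
goals = {
  'adwords-spring'   => { metric: 'conversions',   target: 150 },
  'newsletter-april' => { metric: 'registrations', target: 400 },
  'blog-posts'       => { metric: 'comments',      target: 50 }
}

results = { 'adwords-spring' => 180, 'newsletter-april' => 320, 'blog-posts' => 65 }

goals.each do |campaign, goal|
  status = results[campaign] >= goal[:target] ? 'met its goal' : 'needs improvement'
  puts "#{campaign}: #{results[campaign]} #{goal[:metric]} (#{status})"
end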

Now it's easy to determine the success of your campaigns and you've removed the bias from your reports. Everyone knows which campaigns performed well and which ones need improvement. Plus, there are fewer statistics to interpret because you know exactly what you're looking for and everyone is working from the same set of rules.

Rules leave nothing to biased interpretation. They provide a baseline for comparing your online campaigns. Best of all, they allow you to focus on what matters most to your business and your visitors.

Monitor Your Web Site Traffic in Real-Time

Wednesday, August 19th, 2009

Join the Wait List – Beta Starts in September

We are fast approaching the Web Traffic Watch application launch. Soon you will be able to watch your visitor traffic in real-time in what we like to call a lite-app – a 320px wide application which runs in its own window or on your iPhone.

The benefit of a “lite-app” is you can push it to the side of your screen so you can work and watch your site traffic at the same time.

With Web Traffic Watch you will see:

  • Visitors interacting with your site in near real-time (refreshes within 10 seconds)
  • How many pages each visitor has visited
  • What site referred each visitor
  • What the last page was that each visitor visited
  • Level of visitor interest – lead tracking color codes each profile to show “hot” and “cold” leads
  • Total number of visits for the current day
  • Total number of conversions for the current day

Web Traffic Watch will be a limited beta when it launches in September. To get on the wait list, please visit the Web Traffic Watch beta page and sign up today.
