In my last blog I explored the world of digital advertising and the revenue it generates without anyone walking door to door to promote or sell a product. In this blog I will discuss Google Analytics, a service by Google for analyzing digital footprints and planning a business strategy.
Google Analytics is a service provided by Google to analyze the web statistics of a website. The basic package is free and available to all, though some premium features require a fee. Google Analytics gives insight into visitors' demographic information: geographical location, browser used, time spent on the website, referral link, areas of interest, keywords searched, and other details that can be inferred from a user's browsing pattern. Google Analytics is deeply integrated with AdWords (Google's own pay-per-click digital advertising tool). It can produce up to 70 different types of reports to help you make prudent decisions about your strategy and monitor its performance in real time. This greatly helps in managing a business. The following hypothetical scenario explains the need for real-time data:
Consider a company that starts manufacturing a new shoe model, expecting it to do wonders and take sales to new heights. The model gets a positive response and sells 10,000 pairs in the first two weeks; seeing the demand, the company ramps up production aggressively and makes 100,000 pairs. Unfortunately, demand for the model then drops, and the company only learns a month later that sales have fallen sharply. This could have been avoided if the right information had been available at the right time; remedial action could then have been taken to prevent such a loss.
Google does not share your information with anyone, and Google Analytics is free of cost for basic features; it only requires a Google account. So, if you are interested in some serious analysis, just log in or create an account at http://www.google.com/analytics/
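Under the hood, the basic setup is just a small piece of JavaScript pasted into every page. The sketch below shows the classic asynchronous pattern from the ga.js era; `UA-XXXXX-X` is a placeholder property ID, not a real account:

```javascript
// Classic asynchronous Google Analytics pattern (ga.js era):
// before the library loads, _gaq is just a plain array that
// queues commands; ga.js replays them once it arrives.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-X']); // your property ID (placeholder)
_gaq.push(['_trackPageview']);            // record a pageview for this URL

// The queue is inspectable until ga.js consumes it:
console.log(_gaq.length); // 2 commands waiting
```

Because `_gaq` starts life as an ordinary array, the snippet is safe to place anywhere in the page even before the library has loaded.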
We all head straight to Lord Google seeking answers to our day-to-day problems. Like most people, we simply type the question and expect the Omniscient to provide the solution. Yet we often find ourselves struggling to find what we are looking for. But don't give up: the answers are all there. As they say, if you need the right answer, you need to ask the right question.
Wondering what I mean by asking the right questions? There are certain hidden tricks you can use to easily find what you are looking for on Google and avoid those irritating scrolls through page after page of results.
The best part about Google is its clear focus on what it offers: you enter your query, hit Enter and get your answers. There are advanced functions you can use to further refine your results, but who uses them anyway?
So, to make your search faster and smarter, here are some cool tricks for getting the best out of Google.
Use these queries to make your search better:
For instance, if you search for 'I want it * way', where the asterisk acts as a wildcard for a missing word, Google will show you 'I want it that way'.
And if you want to access a website that is down or blocked on your network, you can see a cached version of it by searching for it on Google and opening the "Cached" link in the results.
So use these quick tips and make your search smarter and better.
According to Alexa, Google is the most visited site in the world. Not coincidentally, the site created by two college friends changed not only the way we look for pages, but also the way we build our sites: most SEO companies virtually ignore the existence of other search engines, focusing only on making their clients' sites appear in a good position, ideally on the first page of Google. But there was a time when the Internet giant simply did not exist. In this article, we will see how search engines worked in that era, how Google managed to impose itself, and the controversies surrounding the site over the privacy of its users.
The first search engines
As you may know, the Internet emerged in the mid-1970s. At that time, it was restricted to military and academic institutions, and its users relied on services such as Telnet, FTP and e-mail. In the 1980s, BBSs (bulletin board systems) became popular: computer systems that allowed their users to read news, exchange messages, and download and upload files.
It was only in the early 1990s that Tim Berners-Lee created the World Wide Web, which allowed the exchange of information through the hypertext transfer protocol, HTTP. It is important to note that the Internet is much greater than the web; the web is simply the part of the Internet that is most accessible around the world.
With the opening of the Internet to commercial use, websites began to appear everywhere: pages created by companies and by ordinary users across the world wide web. Since the number of sites soon surpassed the tens of thousands, some sort of "phone book" was needed to let users quickly find the information they sought. Thus appeared the search sites, which at first were basically of three types: directories, "crawlers" and meta search engines.
Directories are websites that specialize in collecting, storing and categorizing links to other sites. They rely on three elements: title, keywords and description, all of which can be found in the <head> section of a web page. On these sites, you type the keywords you want to search and they return each page's title, description and address. Yahoo! was one of the first search engines to appear on the Internet. Currently, DMOZ is one of the few remaining directories edited by humans.
Yahoo! Homepage In Mid-1997
Crawlers functioned similarly to Google. Instead of storing just the title, keywords and description, they also kept the content of the pages, making searches more accurate. AltaVista was one of the major crawlers and search engines of the late 1990s.
AltaVista homepage in December 1998
Meta search engines, such as HotSpot, differed from the first two because they were "leeches" in the best sense of the word. Unlike directories and crawlers, they had no database of their own, but returned to the user results gathered from other search sites.
Thus we see that, before Google, the market for web search was dominated by basically three types of sites: directories, crawlers and meta search engines. All of them, however, had serious flaws.
Since both directories and crawlers stored links to other pages, the key question was: who appears first when you do a search? The ranking had to be fair to all, so it could not be done alphabetically or by most recently registered site. Thus, most search sites ranked their results by the keywords contained in each page, and that is where things got complicated.
Say you are looking for a data center in Bangalore: you would type something like "Data Center Bangalore" in the search box, and the search engine would return all pages containing those words. The problem is that this system is very easy to deceive, in directories and crawlers alike.
Directories searched only the keywords and description of a site, not its content. So it was common to "spam" the keywords to get more clicks, and chances were that, looking for a site about data centers, you would land on a page with no such content or, at worst, a malicious page or one with adult content.
Crawlers partly solved the directories' problem by searching the content of the page itself, not just its keywords. But many webmasters stuffed pages with keywords, or hid them in text the same color as the page background, to push their sites up the rankings, leaving users at risk of finding only garbage in their searches.
So, with the two main methods compromised, a new way to rank search results was needed.
In 1998, Ph.D. students Larry Page and Sergey Brin launched the project they had already been working on for two years: BackRub, later renamed Google, a reference to "googol", the number 1 followed by 100 zeros.
The unassuming college project would see exponential growth in its early years, soon leaving behind all the market-leading search engines of the time. This achievement is mainly due to two factors: its simple design and its powerful algorithm.
Google Homepage In 1998
As you can see from the images in this article, the home pages of 1990s search sites were packed with links to site categories, user services such as email and chat, and advertisements. Google, however, decided to bet on simplicity, which would become its trademark, putting before the user only its main tool: the search form. The clean look made users fall in love with the search engine and was a major factor in the site's popularity; it was eventually copied by competitors.
While most search engines ordered results based on the keywords of each page, which invited fraud, the folks at Google decided to follow another path: ordering pages by their importance. To this end, they developed an algorithm called PageRank, which assigns each site a value from 1 to 10; the higher the value, the more important the site. A link from site A to site B counts as a vote from A for B, and the higher the PageRank of the linking page, the more weight its vote carries. The links a page receives thus determine its position in the search results.
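The idea can be sketched as a small iterative computation. The link graph and function below are a toy illustration only; the real algorithm also handles dangling pages, personalization and enormous scale:

```javascript
// Toy PageRank: repeatedly redistribute each page's score along its
// outgoing links, with the standard damping factor d = 0.85.
function pageRank(links, iterations) {
  var d = 0.85;
  var pages = Object.keys(links);
  var n = pages.length;
  var rank = {};
  pages.forEach(function (p) { rank[p] = 1 / n; });

  for (var i = 0; i < iterations; i++) {
    var next = {};
    pages.forEach(function (p) { next[p] = (1 - d) / n; });
    pages.forEach(function (p) {
      var out = links[p];
      out.forEach(function (q) {
        next[q] += d * rank[p] / out.length; // p casts a "vote" for q
      });
    });
    rank = next;
  }
  return rank;
}

// Hypothetical graph: A, B and C all link to D; D links back to A.
var rank = pageRank({ A: ['D'], B: ['D'], C: ['D'], D: ['A'] }, 50);
// D, which receives three votes, ends up with the highest score.
```

Note how D outranks everyone simply because more pages vote for it, and A outranks B and C because its single vote comes from the important page D: exactly the weighted-vote idea described above.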
Soon, Google revolutionized the web not just with its simple design, but mostly with its innovative way of ranking pages, which silently dictated new rules for building them. Where before, given the nature of search engines, the great concern was the internal organization and layout of pages, the need to appear in the top positions of Google opened a new market: SEO.
Privacy: the Achilles heel of the Mountain View giant
But the success story of the search giant is not all roses. From its earliest days, the company faced problems over its treatment of users' data, and the list has only grown.
As Google became popular around the world in the early 2000s, there was great controversy over its so-called "immortal cookie". Cookies are small pieces of information that websites store on your computer to remember your preferences. They allow you, for example, to log into a page, close the browser, and find yourself still logged in when you return, without re-authenticating. The problem is that Google's cookie was originally set to expire in 2038, 40 years after the founding of the company! Add the facts that the cookie assigns a unique ID to each computer and that the company records all the searches you make, and it means Google could, in principle, track your whole life through your searches. (Currently, Google's cookie is set to expire in 2014; check your browser.)
Undoubtedly, Google has managed to impose itself and change the web for the better. But the question remains: are we ready to live without it?
Online statistics have always been something of a mystery for the people and companies that set out to monitor a website and understand its popularity over time. Today there are thousands of statistics services, in two broad families: hosted services like Google Analytics, and webserver-based tools that analyze the web server's logs. The most famous of the first category is surely Google Analytics, a formerly paid product (produced by Urchin) that Google has made free. Among the log-based solutions, the best known are certainly AWStats and Webalizer, which, however, require installation on the web server hosting the site.
These analysis tools let you understand almost exactly what is working on your website and where its weaknesses are: they go beyond page views and unique users, with values and parameters that can be very useful to the operator of the website.
In this article we will look at a number of good solutions and parameters with which you should monitor your website. In particular, it covers the advanced tools of Google Analytics that improve the overall view of how the site is doing.
Beyond visits and page views
Visits and page views are usually the two parameters everyone looks at. Although they are the starting point for understanding what is popular on our website, these two numbers say nothing about how to increase them. Other values, reported directly by Google Analytics, are therefore of fundamental importance:
Pages per visit : this parameter is closely related to the loyalty of our users. It indicates how many pages each visitor viewed on our website. A high number means our site is able to "drive" the visitor through its content; a low number usually means our website lacks links that take the user on to further pages, limiting visitors, most of whom arrive from online searches, to a single page.
Bounce rate : an equally important value, tied to page views per user. The bounce rate is the percentage of single-page visits, i.e., visits in which the user left the site from the landing page. It is an indicator of the quality of visits: a high bounce rate generally indicates that the entry pages are not relevant to your visitors. The more effective the landing page, the more visitors will remain on the site and go on to convert.
New visits : new visits are people who had never seen the site before. This value gives an idea of how well our website attracts new visitors, but it can also be a warning sign: if it is too high, it suggests we are not managing to retain the users we already have.
Unique visits : this parameter expresses the number of distinct users who visited the website, without double counting, and in particular without counting visits from bots.
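As a quick illustration, the first two metrics reduce to simple ratios over the visit log. The visit data below is invented and the helper is hypothetical, not part of the Analytics API:

```javascript
// Compute pages-per-visit and bounce rate from a list of visits,
// where each visit records how many pages were viewed.
function summarize(visits) {
  var totalPages = visits.reduce(function (sum, v) { return sum + v.pages; }, 0);
  var bounces = visits.filter(function (v) { return v.pages === 1; }).length;
  return {
    pagesPerVisit: totalPages / visits.length,
    bounceRate: bounces / visits.length // share of single-page visits
  };
}

// Four invented visits: two of them saw only the landing page.
var stats = summarize([{ pages: 1 }, { pages: 5 }, { pages: 3 }, { pages: 1 }]);
// stats.pagesPerVisit is 2.5 and stats.bounceRate is 0.5
```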
Set goals for your website
Once we have added the tracking code to the pages of our website, we can move on to the site's objectives. Google Analytics encourages site managers to think in these terms so they can tell whether they are working in the right direction; every website has different goals, which vary with the type of site and its business model.
As you can see in the screenshot, goals must be set manually in Google Analytics. It is essential to create the actual rules that define whether a goal has been reached, or to what percentage, based on the number of visits, pages viewed, time spent and other parameters.
When we then view the progress of the website, it will also show the trend against the goals we set. Obviously, an e-commerce website will have different objectives from an informational one.
Connecting Google Analytics to Webmaster Tools
Google Webmaster Tools is a useful way to monitor how your website is doing within the Google search engine. The company has made it possible to link this tool with Google Analytics in order to cross-reference the data between the two. Once the connection is made, you can see, for example, the number of impressions for each query and the number of clicks that brought visitors to your website.
The following image shows how the page will appear:
In this way you will also understand which keywords should be developed on your website: the statistics give a clear idea of which ones generate the most impressions and which generate the most clicks.
The speed of the site and our pages
For some time now, this parameter has directly affected the ranking Google assigns to our pages, so it is worth keeping an eye on the load time of every single page. To do this, you can enable additional features such as Site Speed in Google Analytics and get an overview of the slowest pages. Note that the pages with the longest load times are usually also those with the highest bounce rates: users do not wait for a slow page to finish loading, they leave the site.
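The report essentially boils down to sorting pages by load time and focusing on the worst offenders. A minimal sketch, with invented page timings and a hypothetical helper name:

```javascript
// Return the URLs of pages slower than a threshold, worst first.
function slowPages(pages, thresholdMs) {
  return pages
    .filter(function (p) { return p.loadMs > thresholdMs; })
    .sort(function (a, b) { return b.loadMs - a.loadMs; })
    .map(function (p) { return p.url; });
}

// Invented timings: only /gallery and /blog exceed two seconds.
var worst = slowPages([
  { url: '/home',    loadMs: 800 },
  { url: '/gallery', loadMs: 4200 },
  { url: '/blog',    loadMs: 2600 }
], 2000);
// worst is ['/gallery', '/blog']
```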
Track events in Google Analytics
This is another important tool, often overlooked or underused by webmasters. It allows you to monitor any type of activity within your website: for example, the download of an ebook, a registration in a specific area of the site, or clicks on a particular link that represents an important action for us.
Using events does not require great effort: you just "hang" a small line of code on the link you want to monitor.
Here is the line of code to add:
onclick="_gaq.push(['_trackEvent', 'category', 'action', 'opt_label', 'opt_value']);"
Category : indicates which type of content or action we are going to track, for example "ebook", "advertising" or whatever name we give it.
Action : defines the type of interaction with our users, and can take different values: click, button, play, stop.
Label : an optional label that identifies the specific event being tracked.
Value : an optional integer to associate with the event when the action is completed. It is useful because we can then use this value directly when defining the goal or report to monitor.
Once these parameters are defined, all you have to do is add this string to the link in your HTML.
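A small wrapper can keep the markup tidy. The helper name below is hypothetical; the `_gaq.push` call is the standard ga.js one, and because `_gaq` starts life as a plain array, the queued command can be inspected before the library loads:

```javascript
var _gaq = _gaq || [];

// Hypothetical convenience wrapper around the ga.js event queue.
// label is optional, and value, if given, should be an integer.
function trackEvent(category, action, label, value) {
  _gaq.push(['_trackEvent', category, action, label, value]);
}

// e.g. attached to a download link:
// <a href="/guide.pdf" onclick="trackEvent('ebook', 'download', 'guide', 1)">Guide</a>
trackEvent('ebook', 'download', 'guide', 1);
// _gaq now holds ['_trackEvent', 'ebook', 'download', 'guide', 1]
```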
Defining a monitoring strategy for your website takes time and knowledge. In large companies and the most popular Internet businesses, this role is usually played by a specific person on the team, or more than one, who deals exclusively with tracking what happens on the site and defining the reports that guide how pages are modified or enriched.
Using Google Analytics can be simple, and the framework integrates all the technology needed to monitor any event that happens within our website, giving us the chance to see long-term trends and understand what actions to take.
Since Google opened up the ability for businesses to create pages on Google+ two weeks ago, it has provided only basic tools for companies entering the network. While industry analysts believe that Google+ Pages can find a profitable niche among business users, for now the perception is that Facebook outdoes Google+ when it comes to giving businesses a place to reach the hearts of customers.
For now, there is a list of features that Google+ Pages does not offer, including the ability for companies to run promotions or coupons, or to host contests or sweepstakes. Companies on Google+ Pages also cannot sell products. Many of these features are available on Facebook, and users now want them on Google+ Pages.
In July, shortly after Google launched Google+, Facebook released Facebook for Business, essentially a guide to help businesses use Facebook's business-oriented features, social plug-ins and ads. Facebook made it clear that it wanted to lure companies, and it took advantage of the delay in the launch of Google+ Pages. Now, months later, analysts say it is clear that Google+ has yet to mature.
Right now, Facebook has the more complete offering, but all this could change quickly. Google and Facebook are not exclusive choices: many companies have a blog, a Twitter account and a Facebook page. Now they will add a page on Google+.
Many companies have entered the world of social networking and view Google's audience favorably. If Google improves the product, it will be genuinely useful and valuable for companies.
But first, Google needs to address the problems users have noticed recently. There are some real limitations that make it harder for companies to use Pages the way they want, and that could significantly reduce adoption. For example, only one person can manage the account, so either a single individual is responsible for the company's page, or the login and password must be shared among several people, which is not recommended.
Considering these points, Facebook has the advantage when it comes to a social network aligned with business, and Google will need to work on changing that. On the other hand, no tool is "everything to everybody", and in recent weeks Google has said that this is exactly the idea it is pursuing.
Google CEO Larry Page has said he wants to "transform" the company by integrating its various services with Google+. Google has already taken a huge step in this direction by integrating Google+ with Google Apps, its suite of cloud-based enterprise applications.
As many of you know, the world of SEO changed dramatically with the Google Panda update: Google's algorithm is constantly adopting new techniques to prevent websites from ranking on the strength of duplicate content, poor-quality content, poorly designed pages, or pages whose only function is to grow backlinks.
For this reason, when positioning our website we must take this new behavior into account. One of the key measures is to increase the time visitors spend on our pages viewing our content. Until now it was very common to find well-positioned sites with poor content.
With the arrival of Google Panda this is harder, because the new algorithm introduces measures to detect content of no interest to the visitor. To do so, it leans heavily on social networks (Facebook, Twitter and Google+) as a social signal. Google had valued this for a while, but presence on social networks has now become much more important: if you have no presence there, and receive no visits and no "likes", you will lose a lot of standing with Google.
Another measure concerns more functional website design. Sites with a simple, careful design, whose code remains accessible to Google, tend to be evaluated well. We must also be careful about the placement of advertisements on our site: they should fit into an orderly design and not become an obstacle to visitors.
Among the steps we can take: avoid publishing content of no interest, and use resources such as embedded videos, played directly on the website, to keep visitors on the page longer. Opting for quality written content is equally valid. What we must be clear about is that direct assessment by Google matters less than before: it is the social value of our website that, directly or indirectly, will most affect our ranking.
As mentioned, a key point in the Panda era is presence on social networks and the use of social tools in general. One of these is Google's "+1" button. We can add the Google+, Facebook and Twitter buttons to our website so that our visitors can share and rate our content. We should also allow comments wherever possible and, in general, any feature that lets users interact and feel part of the site.
In any case, Google Panda will continue to advance and expand its capabilities, but the trend toward social value seems clear.