Monthly Archives: November 2011

ABC of Cloud Computing

“Cloud computing” – is it the next leap toward a digital-world “matrix”, or a heavy step towards total control over information? What can we expect in the near future, and what is happening right now on the other side of the monitor?

How It All Began

Let us go back a generation, to the days when many of us still walked under the table, the trees were big, and computers occupied entire rooms at research institutes. In the era of vacuum-tube machines, a computer was a set of huge cabinets that calculated, recorded and stored information.

Access to these so-called “mainframes” was provided through devices for input (a keyboard or punch-card reader) and output (a monitor and/or printer) at the user's workplace, known in common parlance as “terminals”. One mainframe could serve several such terminals, depending on the system's performance. When a user needed to calculate some data, he received permission to use the computer, entered his program through the terminal, and waited for it to be processed and the result to be output.

Decades have passed since then; computers kept shrinking as they developed, and today the computing power of a mainframe fits into an ordinary cell phone. What more could we want? What is left to improve?

It turned out there was plenty more to aspire to. As computing power grew, so did the volume of tasks computers were expected to perform. A large computer center was once needed to plan a flight to the Moon; today millions of navigation devices calculate your route through urban traffic jams.

Now imagine combining the power of these devices in a large computing center that calculates not only the route of a specific vehicle but the movement of all vehicles in the city, and that, if a traffic light fails, dispatches the problem to a repair team. For such purposes we once again need computer centers with huge capacities.

Cloud Computing

But that full capacity is needed only during peak hours; at night, for example, much of the equipment sits idle without any real workload. This is where a new trend in the computer industry, “cloud computing”, comes to help. Cloud computing is a term describing a group of powerful server computers that are not permanently dedicated to one particular task, but instead provide their resources to different “virtual” services according to their needs.

For example, during the day 90% of the resources may be given to regulating road traffic, while at night priority shifts to batch tasks such as processing exam results.
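As a rough illustration of that idea, here is a minimal sketch, not tied to any real cloud platform, of shifting a pool of CPUs between two hypothetical services by time of day (the service names, hours and shares are invented):

    # Minimal sketch of time-of-day resource allocation; the services, hours
    # and shares below are invented placeholders, not a real scheduler.
    DAY_PLAN = {"traffic_control": 0.90, "batch_reports": 0.10}
    NIGHT_PLAN = {"traffic_control": 0.20, "batch_reports": 0.80}

    def allocate(total_cpus, hour):
        """Split a pool of CPUs between services depending on the hour (0-23)."""
        plan = DAY_PLAN if 7 <= hour < 22 else NIGHT_PLAN
        return {service: round(total_cpus * share) for service, share in plan.items()}

    print(allocate(total_cpus=1000, hour=14))  # {'traffic_control': 900, 'batch_reports': 100}
    print(allocate(total_cpus=1000, hour=3))   # {'traffic_control': 200, 'batch_reports': 800}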

Cloud computing can be applied in almost any field, from systems that regulate the vital functions of a metropolis to large enterprises that need to process whatever data has priority at a given moment.

What’s the difference between conventional computers and cloud computing services?

In the cost! The “mainframe and terminals” model brings substantial savings: the user's computer serves only as a terminal, so accessing the mainframe does not require an expensive system unit under the desk. A miniature box connected to a monitor and keyboard is enough; over the local area network it transmits the keyboard input to the server and sends the graphical output back to the monitor.

Since in this scheme all data is stored on the server, keeping that data safe becomes a much simpler task for the staff. More importantly, it is not only the data that lives on the server but also the software used to work with it, so the user is no longer tied to a particular workstation; he only needs to remember the username and password for the company's server.

As a result, he can happily go on vacation, knowing that in case of an emergency he can reach his “workspace” from anywhere with an Internet connection and run the necessary programs. The company also saves money on software licenses: there is no need to install the programs on every user's computer, it is enough to purchase the number of licenses corresponding to the maximum possible number of simultaneous users.

For Home Use

This scheme has undeniable advantages for home use as well. A modern high-speed connection already lets you watch movies online without downloading them to your computer. With the cloud, all the movies you have bought are stored on the provider's servers instead of taking up space on your shelves, and can be accessed from anywhere; the same goes for playing games or working with texts.

Many well-known companies have already taken up the idea of cloud computing and are deploying it: for example, services that let you collaborate on documents with people in different locations, or eNlight Cloud from ESDS, which stores and hosts your files.

At the same time, some leading bookkeeping firms use such systems to store data in a safe place, often abroad, without losing the ability to work with it. Using such an offshore mainframe lets company executives stop worrying about the computers that hold all the accounting information.

Myths and Truths About Cloud Computing

It is undisputed that the cloud is spreading and has recently been a constant topic of discussion, so it is natural that IT executives and business people want answers to their questions and want their fears alleviated.

I will not address the issue of security. We have talked a lot about that here, so I will focus on other questions that always arise in these discussions. The first relates to costs.

Do Cloud Hosting Solutions Actually Decrease IT Costs?

To answer, let us analyze the differences between the cloud and traditional IT infrastructure. In the traditional model, the physical resources (dedicated servers, storage, etc.) are owned and managed by the company's IT department. In general, usage levels are low and a significant portion of the computing capacity sits idle. As a result, we have machines and data centers that are not fully used, with a correspondingly high cost per unit of work. In a virtualized environment the physical resources are still owned by the company, but they are carved into multiple logical resources, which raises utilization and lowers the cost per unit of work.
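A back-of-the-envelope sketch of that argument: the same monthly server cost spread over more useful work gives a lower cost per unit of work (all figures below are invented):

    # Hypothetical figures only: cost per unit of useful work at two utilization levels.
    def cost_per_unit(monthly_cost, capacity_units, utilization):
        """Cost of one unit of useful work, given utilization between 0 and 1."""
        return monthly_cost / (capacity_units * utilization)

    dedicated = cost_per_unit(monthly_cost=1000.0, capacity_units=720.0, utilization=0.15)
    virtualized = cost_per_unit(monthly_cost=1000.0, capacity_units=720.0, utilization=0.70)
    print(f"dedicated: ${dedicated:.2f}/unit, virtualized: ${virtualized:.2f}/unit")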

Of course there is a difference between private clouds, where the company still owns the resources being “cloudified”, and public clouds. A public cloud operates at a much larger scale and, in general, at unit costs far lower than an infrastructure dedicated to a single company. It is the model that best exploits economies of scale, achieving a cost per unit of work much lower than the other alternatives.

The bottom line is that, in general, the cloud model, private or public, tends to offer lower costs than the traditional one.

Another interesting point is a question that arises now and then: can a private cloud be considered a true cloud? To build a private cloud, a company needs to invest in the hardware and software that make up the intelligence of the cloud layer, the components that implement virtualization, standardization and automation. It is also a finite cloud, because its limits are the capacity of the company's own data center. On the other hand, it reduces the fear of entering the cloud, because it operates under the company's own policies and security controls.

Another question is where to start. There are no ready-made answers, but for any cloud initiative the prerequisite is executive support and an allocated budget. After that, select a proof-of-concept project or even an actual implementation. A POC can often cost as much as a real project, so why not start by showing what benefits a cloud can generate with a real project?

The result of a successful pilot cloud project is evidence of the popular saying “seeing is believing.” It is remarkable to watch a skeptical executive see an application deployed and computing resources allocated through a portal in minutes rather than the several days he is accustomed to. One good candidate is the development and testing environment; often about 50% of IT effort is spent on these activities. Do not forget that there is a whole migration process to the cloud, which requires extra effort to maintain interoperability between systems in the cloud environment (public and/or private) and the systems still on-premise. The change is gradual, and this coexistence can last for many years.

Another doubt that arises is whether IT must change to support the cloud. Cloud is not just a technology; it is a new computational model that changes the rules of how IT is used, affecting providers of IT services and products as much as their customers. It is therefore clear that many processes will be affected, from the producer-consumer relationship (new business models and contracts) to the governance models and processes established in the IT field. So it is absolutely essential to build cloud skills.

In the end, the cloud is already here. IT departments cannot ignore this trend and should lead the process. The cloud model encourages the proliferation of “shadow IT”, initiatives triggered by users without IT involvement. The uncontrolled spread of this invisible IT can cause future problems in terms of security and interoperability. IT can and should therefore take advantage of the cloud model to remain an important actor and to transform the IT organization.

Google+ vs. Facebook: Which is better for business?

Since Google opened up the ability for businesses to create pages on Google+ two weeks ago, it has started to provide basic tools for companies to enter this network. While industry analysts believe that Google+ Pages can find a profitable niche among business users of social networks, for now the perception is that Facebook still outpaces Google+ when it comes to giving businesses a place to reach the hearts of customers.

For now, there is a list of features that Google+ Pages lacks, including the ability for companies to offer promotions or coupons and to host contests or sweepstakes. Companies with Google+ Pages also cannot sell products. Many of these features are available on Facebook, and users now want them on Google+ Pages too.

In July, shortly after Google launched Google+, Facebook introduced Facebook for Business, which is basically a guide to help businesses use Facebook's business-oriented features, social plug-ins and ads. Facebook made it clear that it wanted to lure companies and took advantage of the delay in launching Google+ Pages. Now, months later, analysts say it is clear that Google+ has yet to mature.

Right now Facebook has the more complete offering, but that could change quickly. Google and Facebook are not mutually exclusive choices: many companies already have a blog, a Twitter account and a Facebook page, and will now add a page on Google+.

Many companies have entered the world of social networking and view Google's audience favorably. If Google adds processes to improve the solution, it will become genuinely useful and valuable for companies.

But first, it needs to address the problems that users have noticed recently. There are some real limitations on Pages that make it harder for companies to use the solution the way they want, and that could significantly reduce adoption. For example, only one person can manage the account, so either a single individual is responsible for the company's presence, or the login and password must be shared among several people, which is not recommended.

Considering these points, Facebook has the advantage when it comes to a social network aligned with business needs, and Google will need to work on changing that. On the other hand, no tool is “everything to everybody.” In recent weeks, Google has said that this is exactly the direction in which it is moving.

Google CEO Larry Page has said that he wants to “transform” the company by integrating its various services with Google+. Google has already taken a huge step in this direction by integrating Google+ with Google Apps, its suite of cloud-based enterprise applications.

Virtualization – A Little History

Although server virtualization technology is currently receiving much attention, the concept is not new. The idea actually emerged in the mid-1960s, when the giant, expensive computers of the day had reached high processing speeds but could not make full use of that expensive computing time, because many management tasks had to be performed manually by an operator. To get the most out of expensive computer time, it was necessary to run multiple processes in parallel. Thus arose the concept of time sharing, which culminated in the idea of virtualization.

Time sharing: this concept means sharing processor time, i.e., time during which one process is idle is given to other processes to keep the system busy. Multiple jobs execute concurrently, and the CPU serves each job for a short while, one by one in sequence. The time slices dedicated to each job are small enough that each user can interact with his own program without noticing that other programs are running.
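A toy sketch of that round-robin idea, with invented job names and an arbitrary time quantum, just to show how the CPU cycles through the jobs:

    # Toy round-robin scheduler illustrating time sharing: each job runs for one
    # small quantum in turn until it finishes. Job names and sizes are invented.
    from collections import deque

    def round_robin(jobs, quantum=2):
        """jobs maps a job name to its remaining work units; returns the run order."""
        queue = deque(jobs.items())
        order = []
        while queue:
            name, remaining = queue.popleft()
            order.append(name)                   # the job runs for up to one quantum
            remaining -= quantum
            if remaining > 0:
                queue.append((name, remaining))  # not finished, back to the end of the line
        return order

    print(round_robin({"payroll": 5, "editor": 3, "report": 4}))
    # ['payroll', 'editor', 'report', 'payroll', 'editor', 'report', 'payroll']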

In 1972, the American computer scientist Robert P. Goldberg laid out the theoretical basis of the architecture of virtual computer systems in his dissertation at Harvard University. In the same year, IBM introduced a mainframe that was able to run different operating systems simultaneously under the supervision of a control program, the hypervisor.

The IBM System/370 was the first commercial computer designed entirely with virtualization in mind: its CP/CMS operating system allowed multiple instances to run simultaneously. It was later followed by IBM z/VM, which takes advantage of hardware virtualization. VM/CMS was highly regarded and widely used in industry and academia, and several modern approaches to virtualization still trace their roots to IBM mainframes.

Over the years, virtualization began to fall by the wayside with the creation of new client/server applications and the decline of the mainframe platform, which lost ground to the rising x86 platform. According to VMware, the widespread adoption of Windows and Linux as server operating systems in the 1990s eventually established the x86 architecture as the industry standard.

Because buying a mainframe was so expensive, companies began to acquire x86 servers according to demand, a move to the low end (several small machines doing the work of one large dedicated server). In this scenario, instead of facing the high up-front cost of a mainframe, they purchase smaller servers as the need arises.

The effect of this new strategy was to guarantee plenty of headroom against hardware scaling problems, and most of these servers were used for a single application. Thus, according to International Data Corporation, in a typical x86 server deployment peak CPU usage was between 10% and 15% of the server's total capacity.

The servers were oversized for the applications they would run and, as a result, ended up suffering the same problem as the mainframes of the 1960s: they did not take full advantage of their computing power and were underutilized.

Then, in 1999, VMware Inc. introduced virtualization on the x86 platform as a more efficient way to operate this equipment, using x86 servers to provide a computing structure that enables full utilization of those servers' resources.
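A back-of-the-envelope consolidation estimate shows why this matters; the utilization figures below are invented, not measured:

    # Hypothetical consolidation estimate: physical hosts needed once many
    # underused servers of equal size are packed onto busier virtualized hosts.
    import math

    def hosts_needed(num_servers, avg_utilization, target_utilization):
        total_load = num_servers * avg_utilization         # aggregate useful work
        return math.ceil(total_load / target_utilization)  # spread over busier hosts

    print(hosts_needed(num_servers=100, avg_utilization=0.12, target_utilization=0.70))  # 18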

From 2005 on, processor manufacturers such as Intel and AMD have paid more attention to hardware support for virtualization in their products: Intel with Intel VT and AMD with AMD-V. These hardware features can be exploited by hypervisors to implement full virtualization more easily and with better performance.
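On a Linux machine, one quick way to see whether a processor exposes these extensions is to look for the vmx (Intel VT) or svm (AMD-V) flag in /proc/cpuinfo; a minimal sketch:

    # Linux-only check: the "vmx" flag signals Intel VT, "svm" signals AMD-V.
    def virtualization_support(cpuinfo_path="/proc/cpuinfo"):
        flags = set()
        with open(cpuinfo_path) as f:
            for line in f:
                if line.startswith("flags"):
                    flags.update(line.split(":", 1)[1].split())
        if "vmx" in flags:
            return "Intel VT (vmx) available"
        if "svm" in flags:
            return "AMD-V (svm) available"
        return "no hardware virtualization flags found"

    print(virtualization_support())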

Choosing a Hosting Service – The Choice of the Company

In this episode we will see how to choose the company that will provide the hosting service.


In the first part of our guide we spoke about requirements analysis, the fundamental step in understanding what service is needed and at what level. We will talk about the different types of hosting to choose from (shared hosting, virtual servers and dedicated servers) in the next episode, but first I want to dwell for a bit on the choice of the company that will provide the service, whatever our final choice turns out to be.

This topic is always being discussed in our forum, and many different opinions surface there: the choice of a company actually depends on several factors, yet users often choose simply because a company was recommended to them. We have talked about word-of-mouth publicity on this blog several times; when choosing a company it is definitely the preferred route, second only to targeted advertising.

How do you choose a company that offers web hosting services? By weighing various factors, some more important than others, which should be considered even when the offer is tempting enough to push us straight to the purchase.

Let’s look at a short list of the factors we should consider:

  • Company size (2/5)
  • Type of business (sole proprietorship, company or other) (1/5)
  • Location of the web farm (3/5)
  • Number of carriers and connectivity options (3/5)
  • Type of hardware used for its servers and the security tools employed (4/5)
  • Methods of support provided and the language in which it is provided (5/5)
  • Opinions, reviews and testimonials on industry forums and portals (4/5)
  • Number of customers of the company (1/5)
  • Number of years on the market (2/5)
  • Types of services offered (4/5)
  • Price list (4/5)
  • Partnerships or membership in industry associations (3/5)

As you can see, each factor has been given a rating from 1 (least important) to 5 (most important): not all of the factors mentioned should “weigh” the same way in our choice of hosting provider.
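One simple way to use these ratings is as weights in a small scorecard; the sketch below compares two hypothetical providers whose per-factor scores are entirely invented:

    # Weighted scorecard for comparing hosting providers. Weights follow the
    # ratings above; the per-provider scores (0-5) are purely hypothetical.
    WEIGHTS = {
        "support": 5, "hardware_and_security": 4, "reviews": 4,
        "services_offered": 4, "price_list": 4, "webfarm_location": 3,
        "connectivity": 3, "partnerships": 3, "company_size": 2,
        "years_on_market": 2, "business_type": 1, "customer_count": 1,
    }

    def weighted_score(scores):
        """Weighted average (0-5) of a provider's per-factor scores."""
        total_weight = sum(WEIGHTS.values())
        return sum(WEIGHTS[f] * scores.get(f, 0) for f in WEIGHTS) / total_weight

    provider_a = dict.fromkeys(WEIGHTS, 4)
    provider_a.update(support=5, price_list=3)
    provider_b = dict.fromkeys(WEIGHTS, 3)
    provider_b.update(support=2, price_list=5)
    print(f"A: {weighted_score(provider_a):.2f}  B: {weighted_score(provider_b):.2f}")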


Having reached the end of this list of important factors for choosing a good hosting company, we should always keep in mind that they are only a starting point: every choice must also take into account other factors that vary according to your needs. As you can see, there is no surefire way to tell whether a business is completely reliable; there are, however, as we have seen, factors whose evaluation reduces the chance of a wrong choice or a disappointment.

Choosing a Hosting Service – The Requirements Analysis

This opens a series dedicated to choosing a hosting service. In this first guide we talk about analyzing the basic requirements for selecting a service, that is, how to understand what we need and at what level.

Choosing a hosting service – a guide dedicated to the basic steps for deciding which service is best for us.

This is the first part of an article that seeks to clarify which aspects are essential when choosing a web hosting service. These questions have actually been asked many times, so this article will be a summary of what has already been said and repeated; writing it down has, for our staff at ESDS as well as for me personally, inevitably enriched the knowledge of our users and our own.

This article deals only with the factors involved in choosing a web hosting service, so we will explicitly cover only the aspects related to hosting services dedicated to the web.

Choosing a hosting service – Requirements analysis

Before thinking about which hosting service to buy, it is important to understand the reasons why we are choosing a web hosting service: what we are going to put in our space and what features it should have. This is not a trivial step; we could even say it is one of the most important, because once we have defined the specifications of our project (as they are called in the jargon), we will have a much clearer idea of our real needs.

There are hosting services for virtually any type of website or application, ranging from the more standard to the more personalized, with different costs and, of course, different management models. So ask what kind of website we are going to need, trying to figure out first whether it is a static site (static web pages and images) or a dynamic one (made of pages that exchange data with a database or use a server-side programming language, such as PHP with MySQL). It is important to understand this, because a static site receiving thousands of hits will hardly require resources equivalent to those of the same site built with a scripting language on top of a database.

In the first case, a static site, there are few other variables in play, but even here we find web space and traffic: you must figure out how many megabytes of space the site will occupy, and how many megabytes or gigabytes of traffic your website will produce every month.
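A rough way to estimate that monthly traffic is average page weight times expected page views; the sketch below uses invented figures purely as placeholders:

    # Rough monthly transfer estimate: average page weight times expected views.
    # Page weight and visit numbers are invented placeholders.
    def monthly_traffic_gb(avg_page_kb, pages_per_visit, visits_per_day, days=30):
        """Approximate outbound transfer per month, in gigabytes."""
        kb_per_month = avg_page_kb * pages_per_visit * visits_per_day * days
        return kb_per_month / (1024 * 1024)  # KB -> GB

    print(f"{monthly_traffic_gb(avg_page_kb=300, pages_per_visit=4, visits_per_day=500):.1f} GB per month")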

In the second case, a dynamic site, it is not enough to know how much web space and bandwidth will be consumed; it is also essential to know how many users visit our site daily and, even more important, how many of them arrive at the same time. A community, for example, must focus on these figures in order to understand what resources it will really need. A blog, unlike a community, requires fewer resources despite being built on a dynamic platform, but here too a proper analysis of visit statistics and of users' online presence is important.

To put numbers on the project: if our website is already hosted somewhere, we should rely first of all on its statistics, especially those for daily visits and for the traffic (measured in megabytes or gigabytes) consumed each month. Having some initial statistics to start from is always the luckier case; if this is our first experience and the project is new, it is better to rely on parameters such as how the site will be advertised on the network, and to estimate the audience based on sites or projects similar to ours.

Finally, let us consider two other aspects that will strongly affect our final choice: will our website produce income for us, or is it just a hobby? Does it provide an important service intended for a customer? These two questions are often underestimated. If our site or project is just a hobby, we can afford to “try out” any service, knowing that it will remain a secondary matter; but if our site is an asset for us and therefore brings earnings, we must carefully consider the hosting we are going to choose. This is the case, for example, with an e-commerce portal, where the reliability of our website, minimal downtime and a dependable service provider are essential, because our revenue comes from that activity.

Very often you see sites of some importance on the network that entrust their data to entry-level solutions or amateur providers, with obvious risks if something serious happens.