Tag Archives: Pingdom

Monitoring Your Uptime – Free Tools

 

Server Uptime 448 Days & Counting

Clearly, one of the most important aspects of your website is how often it actually is a website. After all, if no one can access it, it’s not really a site but more of the idea of a site. Also, at times, it may be “up” but not fully functional… a groggy website that does not want to be bothered. Uptime, then, is a word used often by those conducting business online.

Uptime is also often phrased as “reliability” or “availability.” A site with high-availability has very little downtime because it is based on a system that is highly reliable. The same can be said of a 24-hour shoelaces store: if you need shoelaces at 3:30 AM, Every-time Lace Shop has got you covered. Plus, they won’t ask you any questions, such as, “Why are you here?” or, “Are you sure you need shoelaces?”

Hosting companies are highly concerned with the uptime their clients receive. They have to be, because it is one of the core concerns of anyone looking for a hosting solution: “What’s your guarantee for the maximum amount of downtime allowed?” Typically, a hosting company will guarantee 99% uptime, 99.9% uptime, or 99.99% uptime – possibly more, such as 200%, which is highly remarkable.
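Those decimal places matter more than they look. Here’s a quick back-of-the-envelope sketch in Python (assuming a 30-day month) of how much downtime each common guarantee actually permits:

```python
# Downtime allowed per 30-day month at each common guarantee level.
MINUTES_PER_MONTH = 30 * 24 * 60  # 43,200 minutes

for pct in (99.0, 99.9, 99.99):
    allowed = MINUTES_PER_MONTH * (1 - pct / 100)
    print(f"{pct}% uptime allows up to {allowed:.1f} minutes of downtime per month")

# 99.0%  -> 432.0 minutes (about 7.2 hours)
# 99.9%  -> 43.2 minutes
# 99.99% -> 4.3 minutes
```

The gap between 432 minutes and 4.3 minutes a month is exactly why those extra nines are worth asking about.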

*** Pause for a commercial break: In our case, notably, we don’t allow any unscheduled downtime. For that reason, we guarantee 100% uptime in our Service Level Agreement (SLA) with all our clients. If we ever fall under that number, we will reimburse you and possibly (don’t count on it) give you a back rub. ***

In this two-part series, sponsored by Darrell’s Free Tool Shed International (a nonprofit tool-provisionary outfit), we will look at a number of different free tools to assist you in uptime-monitoring. These are tools you can use to ensure that you are getting the uptime you are guaranteed when you sign up for your hosting account.

We will use a couple different sources to broaden our perspective: Mashable and WPMU. Both sites provide 10-12 different options for free software you can use to monitor your uptime.

It’s a good idea to install all available software, create an intricate schedule to monitor your uptime-monitoring software, and then consider installing uptime-monitoring-software-monitoring software. Keep layering and layering until your uptime-monitoring matrix forms a layer cake of satisfaction that tastes good and is reasonably filling.

Free Online Uptime Monitoring Tools

Without further ado, here are several tools you can use to ensure you are getting the uptime you deserve. If you aren’t, phone your Congressman and bark into his voicemail (and whatever you do, don’t meow) … Also, e-mail your hosting company with details (including your barking experience).

UptimeRobot

Maximum websites monitored: 50

Monitoring frequency: 5 min.

Contact options: RSS, text, e-mail

The way this software works is that it checks the HTTP status code your site returns in its response headers. If there is ever an error, it digs deeper. If the more in-depth analysis suggests real problems, you are immediately notified by any of the methods listed above, or by airhorn.
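For the curious, here’s a minimal sketch of that kind of status-code check in Python – my own illustration, not UptimeRobot’s actual code, with a placeholder URL:

```python
# A minimal status-code check (an illustration, not UptimeRobot's actual code).
import urllib.request
import urllib.error

def check(url):
    """Return the HTTP status code, or None if the site didn't respond at all."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status       # 200 means the site answered normally
    except urllib.error.HTTPError as e:
        return e.code                # e.g. 500 or 503: worth digging deeper
    except OSError:
        return None                  # no response at all: the site looks down

status = check("https://example.com/")  # placeholder URL
if status != 200:
    print(f"Possible problem (status: {status}) - time for a closer look")
```

A real monitor would run a check like this every few minutes and only alert you after a failure is confirmed by a retry, to avoid false alarms.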

Pingdom

Maximum websites monitored: 1

Monitoring frequency: Optional, 60 seconds minimum

Contact options: text (20 max./month), iPhone, e-mail

Pingdom is massive within this sector and primarily likes to make money, but it does offer a no-frills, unpaid option. Though it is limited to just one site, the phone app may make it worthwhile when you are walking, climbing trees, or jumping off your roof into a pile of Jell-O for a hilarious reality TV series.

Mon.itor.us

Maximum websites monitored: 1

Monitoring frequency: 30 min.

Contact options: RSS, text, e-mail, instant message, paper airplane

This application is the dumbed-down version of Monitis, but it is easy to set up and use. Rather than just notifying you of problems, the application creates statistics broken down into various time-frames. The stats populate immediately, for efficiency, or in slow-motion, for dramatic effect.

InternetSeer

Maximum websites monitored: 1

Monitoring frequency: 60 min.

Contact options: text, e-mail, smoke signals

This site, so it says, is presently monitoring almost 2,000,000 sites. The company has gotten a bad rap for being aggressive with mass marketing campaigns. However, it allows numerous different people to be notified of downtime, and statistics – including CDC pandemic figures – are sent out to you each week.

Uptrends

Maximum websites monitored: 1

Monitoring frequency: 30 min.

Contact options: no notifications, except singing telegram

Uptrends, rather than being focused on letting you know when periods of downtime occur, is geared toward making your visitors aware how seamlessly your site delivers content. You can embed the code for its button, and it checks your site globally every half an hour. When anyone clicks the button, they receive information related to various time-frames, up to the previous year. (Previous eon is only available to Paleolithic users, most of whom are deceased.)

Conclusion & Continuation

So far, we have gotten a sense of several of the most high-profile and useful options out there for uptime monitoring. There are many more solutions available, and we will review some of those other major tools in the second and final part of this series. Then we will get a bite to eat and talk at length about my maritime marital problems, which are extensive and difficult to resolve, due to my chronic seasickness.

P.S. Considering our 100% uptime guarantee, you can’t go wrong with a shared, dedicated, or VPS solution from Superb.

By Kent Roberts

Study: Where the Web is Located & Ghost Servers Haunting the Internet

 

The CERN datacenter with World Wide Web and Mail servers

Pingdom recently performed three interesting hosting-related studies that also shed light on global Internet behavior more generally. All of them concern where in the world sites are hosted – which nations and cities are used the most – based on the GPS coordinates of the servers (alongside other information).

One is a general study of the popularity of certain countries for website hosting. A second breaks down that information into specific cities. The third and final study is an exploration, via analysis of websites using country-specific top-level domains (TLDs), of how many sites within certain countries use servers that are “onshore” – meaning internal to their own nations.

Finally, I will look at how ghost servers haunt the Internet: acting creepy; moaning; shouting “boo” at unsuspecting website visitors; and in the process, scaring many children and elderly web users out of their chairs.

Where in the World is the Web? – General Study Methodology

We obviously think of the Web as “global” (it is, after all, the World Wide Web). However, it’s not evenly distributed throughout the world, as we all know. Looking at where the servers that host websites are located is one way of understanding how the Internet is spread out across the planet.

Of course, these studies do not look at how many servers are used by each site or company, but they at least show us the behavior of different sites with regard to the geographical location of their servers. Also, tiny sites were not included: all sites that were scanned were in the Alexa-Netcraft top 1 million sites at the time the studies were conducted. Plus, for the third study, no .com or other gTLDs (generic, i.e., non-country-specific TLDs) were included, because that study relates the location of a site’s ownership to the location of its servers – impossible to determine with generic TLDs.

In other words, the studies do not give us a perfect or complete picture, but the stats give us a general idea of the physical location of web hosting. This tells us where the Internet – at least as hardware being accessed for page loads by companies and individuals – is located throughout the world, in some cases down to the city. I’ll discuss methodology a bit more in each study’s section (below).

Ghost servers, by the way, do not have a defined methodology. They tend to have certain characteristics though. They are wisps of surreality, populating sites that only exist for a moment. The ghost servers use the sites to freak everyone out and then recede into the abyss from whence they sprang (both the site and the hardware do this in unison, as the server moans a soft yet resonant howl of pain and longing).

Study 1: USA, Server Nation

The first study I will look at from Pingdom focuses on countries. The study reviewed the Alexa top 1 million and clustered those sites into the top 100 nations where their servers are located (a follow-up to a 2012 study conducted by the company). Bear in mind, these figures are not in any way adjusted for population, geographical size of the country, or any other factors. Quantity of people and scope of land obviously help boost the US’s numbers over those of many other nations.

The information for all the studies – the raw data – was gathered on February 27, 2013. It was obtained using a script developed by Pingdom that allowed it to GPS-scan all of the Alexa 1 million in rapid succession. For the first two studies, 907,625 sites were scanned. Of the remaining sites, 52,539 could not be scanned, and 39,836 did not have any observable GPS coordinates. Since the GPS coordinates were irrelevant to the third study, the scanned number of sites was 947,461 instead (although, as described below, many of those sites had to be removed because they were generic TLDs).
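Pingdom hasn’t published that script, but the general idea is straightforward to sketch: resolve each domain to an IP address, then look the IP up in a geolocation database. Here’s a rough Python illustration, assuming MaxMind’s geoip2 package and a downloaded GeoLite2 database (the file path and domain are placeholders):

```python
# Rough sketch of geolocating a site's server (not Pingdom's actual script).
import socket
import geoip2.database  # pip install geoip2; requires a GeoLite2 .mmdb file

reader = geoip2.database.Reader("GeoLite2-City.mmdb")  # path is an assumption

def locate(domain):
    """Map a domain to the country, city, and coordinates of its server's IP."""
    ip = socket.gethostbyname(domain)  # note: only the first A record
    rec = reader.city(ip)
    return (rec.country.iso_code, rec.city.name,
            rec.location.latitude, rec.location.longitude)

print(locate("example.com"))  # placeholder domain
```

CDNs and sites with multiple A records muddy results like these, which is one more reason the studies only claim to give a general picture.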

What about the ghost servers, you ask? No ghost servers were scanned for this study. Ghost servers and the sites represented by them are not in the Alexa 1 million because they cannot be tracked; and, furthermore, if you attempt to study them, they start screaming. Ghost server science, therefore, is considered inhumane if it is at all in-depth. I even feel kind of bad about writing what I do here, fearing that somewhere a ghost server might be wailing in agony because of my actions. Sorry, spirits.

Worldwide … Well, Sorta

191 nations serve as a home, hosting-wise, to at least one of the Alexa 1 million websites. According to Worldatlas.com, different sources put the number of countries on the planet at just below 200 – between 189 and 196. Just in terms of including at least one “top” site, then, the Web is distributed worldwide. However, the numbers for the primary countries are massive. The top ten countries are as follows:

  1. United States: 421,228
  2. Germany: 70,587
  3. China: 35,908
  4. United Kingdom: 35,500
  5. Russia: 35,254
  6. France: 34,498
  7. Japan: 29,898
  8. Netherlands: 25,632
  9. Canada: 18,116
  10. Poland: 12,109

Compared to 2012, the numbers are similar at the top. The US lost a tiny bit of ground in the first position, dropping from 43% to 42% of all hosting for the top websites. Germany, China, and the UK are all in the same positions as a year ago. The US is still the home base of the Web in this sense, though: the country hosts six times the figure of #2 Germany, which accounts for 7% of the global total.

France and Japan lost a small bit of relevance as Russia moved up two slots from its previous 7th-place spot. Poland has also increased its standing – in 2012, it was #13 on the list.

Ghost servers are primarily located in Liechtenstein. Many people think that the ghost servers are trying to convey recipes, herbal remedies, and other pieces of cultural information in whatever language is spoken in Liechtenstein. The only problem is that no one knows for sure what language is spoken in the country. Everyone is pretty sure it’s either French or German … possibly its own language, if that exists. Maybe one of us should look this up.

Comparison Grouping by Continent

Let’s now take a brief look at how the United States relates to the two most prominent continents, Asia and Europe. The US, as noted previously, hosts 421,228 sites, a 42.1% share of the world’s highest-traffic sites. Somewhat amazingly, Europe grouped as a whole is still more than 10 percentage points below the United States: 314,317 sites, representing 31.4% of the total. Asia is far below Europe at 11.5%, with 114,571 of the Alexa 1 million. The remaining 15.0% are located in other continents and non-US North American locations.

In 2012, in its coverage of the initial study on the same subject, Pingdom pondered whether the US was going to be surpassed by other global locations. Nothing much has changed since last year on that front. However, as this year’s Pingdom analysis notes, since Asia accounts for 25% of global Internet users, it will be interesting to see whether its hosting industry starts to keep pace with its population of users.

Ghost servers are not like typical hardware. Rather, they are believed to be constructed out of a mixture of cobwebs, dreams, fog, and eerie music (the last of which, in its physical form, looks exactly like it sounds, whatever that means).

Study 2: Houston, Server City

Now let’s take a look at the second Pingdom study: city analysis. 7,936 cities around the world host the Alexa top 1 million sites. As with the figures above, though that number seems reasonably well-dispersed, the top-ranked cities account for a large chunk of the action. 223,206 of the top 1 million – roughly 22.3% – are hosted in just 10 cities. What are the top cities for hosting? Take a look:

  1. Houston, Texas – 50,598
  2. Mountain View, California – 29,594
  3. Dallas, Texas – 24,822
  4. Scottsdale, Arizona – 23,210
  5. Provo, Utah – 20,691
  6. Ashburn, Virginia – 14,871
  7. San Francisco, California – 13,214
  8. Chicago, Illinois – 13,125
  9. Beijing, China – 11,273
  10. New York, New York – 10,006

Again, we see a major disparity even between the top two cities, and a 5-fold difference between the 1st and 10th cities on the list. In fact, the top three cities – Houston, Mountain View, and Dallas – account for 10.5% of the hosting of the total 1 million sites! We also see the major and continuing impact of the United States on the size of the Web, with the US holding 9 of the 10 top positions.

Ghost servers should not be taken lightly. When you see a ghost server in person, always approach it cautiously, and never under the influence of alcohol or drugs. Observation of ghost servers should be conducted in the same manner as if you were operating heavy machinery. Approach clear-headedly; functionally; and wearing heavy work gloves and a hardhat.

Study 3: South Korea, Onshore Central

Of course, the United States has many sites hosted within its own borders, but which countries are the most “loyal” to their home country regarding their hosting? Keep in mind, these figures are not perfect by any means: none of the gTLDs (generic top-level domains), including .com, could be scanned. This study simply looked at which countries host the most of their own sites onshore, based only on country-code top-level domains (ccTLDs), to determine what percentage of those sites stay within the country for hosting.

What are the top countries? And what are the respective numbers of sites in the Alexa top 1 million? Here they are, and note that they are listed in order of percentage of sites, not quantity, with the quantity in parentheses. Below, you’ll see a shorter list of the nations with the most ccTLDs hosted onshore.

  1. South Korea – 97% (1,750)
  2. Vietnam – 93% (2,260)
  3. Germany – 92% (25,469)
  4. Japan – 91% (14,188)
  5. Czech Republic – 90% (4,736)
  6. Lithuania – 88% (1,051)
  7. Bulgaria – 87% (825)
  8. Thailand – 85% (699)
  9. Kyrgyzstan – 84% (102)
  10. Hungary – 84% (2,619)

Here, also, are the top five countries with onshore-hosted ccTLDs – again, from the Alexa top 1 million:

  1. Russia – 43,002 (.ru)
  2. Germany – 25,469 (.de)
  3. United Kingdom – 17,558 (.uk)
  4. Brazil – 16,991 (.br)
  5. Poland – 14,235 (.pl)

Pingdom notes as well that this figure is only representative of the ccTLDs. Over 509,000 of the sites in the Alexa top million use the .com TLD, just as one example of the limitations inherent to this study.
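The onshore test itself is simple to sketch: compare a domain’s ccTLD to the country its server’s IP geolocates to. Here’s a rough Python illustration (again assuming the geoip2 package and a GeoLite2 database – my guess at the approach, not Pingdom’s published method):

```python
# Rough onshore check: does the server sit in the country its ccTLD names?
import socket
import geoip2.database  # pip install geoip2; database path is an assumption

reader = geoip2.database.Reader("GeoLite2-Country.mmdb")

def is_onshore(domain):
    cctld = domain.rsplit(".", 1)[-1].upper()  # "example.de" -> "DE"
    ip = socket.gethostbyname(domain)
    server_country = reader.country(ip).country.iso_code
    return server_country == cctld

print(is_onshore("example.de"))  # placeholder domain
# Caveat: some ccTLDs don't match ISO country codes (.uk vs "GB"),
# so a real version needs a small mapping table for the exceptions.
```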

Ghost servers smell like gravy and taste like asparagus. Strange, right? Don’t eat them though: bad, bad gas, according to one blog writer who regrets the experience and will go unnamed so no one thinks he’ll eat anything that comes his way.

*****

Have any thoughts on these studies? Any ideas related to the numbers, or anything that perhaps looks surprising to you? Please continue the conversation below if you like. Thanks for reading. Also, um, no one can ever prove that I ate a ghost server. I … did not do that.

by Kent Roberts and Richard Norwood

How Much Traffic Can Your Website Handle?

How you are hosting your website, including what platform or application you are using, will define the visitor capacity you can handle. This is not something webmasters often have to think about until they hit a peak of traffic one day with a popular blog post or product line. Here’s some food for thought:

Load testing tools vs monitoring tools


From blog.loadimpact.com – 1 month ago

So, what’s the difference between a load testing tool (such as http://loadimpact.com/) and a site monitoring tool such as Pingdom (https://www.pingdom.com/)? The answer might seem obvious…

Juliana Payson’s insight:

With a load testing tool, you create a large amount of traffic to your website and measure what happens to it. The most obvious measurement is to see how the response time changes when the website is under the load created by that traffic. With a load monitoring tool, you are continuously measuring your website’s capacity, both in terms of uptime and data usage. Load monitoring can give you a better gauge of your website usage, especially if you are gearing up for a transfer to cloud hosting and are looking to discover your potential usage rates.
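To make the distinction concrete, here’s a toy load test in Python using only the standard library – a sketch, not a production tool, and the URL is a placeholder (only ever point something like this at a site you own). A monitoring tool, by contrast, would run a single check like this on a schedule rather than hammering the site all at once:

```python
# Toy load test: fire concurrent requests and summarize response times.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "https://example.com/"  # placeholder; only load-test your own site

def timed_fetch(_):
    start = time.monotonic()
    with urllib.request.urlopen(URL, timeout=10) as resp:
        resp.read()
    return time.monotonic() - start

with ThreadPoolExecutor(max_workers=50) as pool:  # 50 concurrent "visitors"
    latencies = sorted(pool.map(timed_fetch, range(500)))  # 500 requests total

print(f"median: {latencies[len(latencies) // 2]:.3f}s  "
      f"p95: {latencies[int(len(latencies) * 0.95)]:.3f}s")
```

Watching how the median and p95 climb as you raise max_workers is the whole point of load testing.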

How Ready are you for Heavy Load on Your Website?


From blog.smartbear.com – 4 days ago

Load testing is an important subset of any overall performance management strategy. It is a technical investigation done to determine the scalability, speed, and/or stability characteristics of the system under test.

Juliana Payson’s insight:

Apart from website management, load testing is vital for your business performance planning. Having the answers from these kinds of tests allows a business to really ‘feel’ the capabilities of its infrastructure and, more importantly, to recognize the signs of a business website and infrastructure undergoing a stressful incident.

If you’re hosting your website on the WordPress content management system, then you are in good company. Over 50% of the top 100 blogs use WordPress, a robust CMS capable of handling high traffic loads.

WordPress Dominates Top 100 Blogs


From smallbiztrends.com – Today, 3:04 AM

Fifty-two percent of the top 100 blogs are currently using WordPress, either hosted or self-hosted, according to an annual study conducted by Pingdom.com.

Juliana Payson’s insight:

The majority of the top 100 blogs are using WordPress, says a new study from Pingdom. However, don’t just rely on the platform out of the box, with a standard template. In terms of site speed and load times, in my experience the best-performing WordPress sites are often optimized or specially developed with off-page stylesheets and more efficient coding structures. Be sure to give your WordPress platform the once-over with a site-speed checker such as WooRank for SEO purposes.

– Juliana