Tag Archives: PC World

Firewalls 101: Hardware, Software & Web Application Firewalls – Part 3

Web Applications in Real Life

Okay everyone… As we are learning in this series, it turns out what our grandparents have been telling us since we were born (first conveyed to us via crudely hand-drawn pictures and a primal, baby-rattle version of Morse code) is accurate. You really can never get enough information about firewalls. For that reason, we are discussing them at length: first firewalls in general; then distinctions between hardware and software firewalls; and finally, in this post, Web application firewalls (WAFs).

The primary articles cited for this series are from the Michigan Cyber Initiative (“Hardware Firewall vs. Software Firewall”); Open Web Application Security Project (“Best Practices: Use of Web Application Firewalls”); PCWorld (“What You Should Know About Firewalls”); and PChuck’s Network (“Better Protection – Hardware or Software Firewall?”).
Continue reading Firewalls 101: Hardware, Software & Web Application Firewalls – Part 3

Firewalls 101: Hardware, Software & Web Application Firewalls – Part 2


Let’s continue our discussion of firewalls. In the first part of this series, we talked about firewalls as a general concept. Today we will discuss hardware and software firewall technology. Then, in the next post, we will look at web application firewalls (WAFs).

For this three-part series, we are reviewing the following articles: “Hardware Firewall vs. Software Firewall” (Michigan Cyber Initiative); “Best Practices: Use of Web Application Firewalls” (Open Web Application Security Project); “What You Should Know About Firewalls” (PCWorld); and “Better Protection – Hardware or Software Firewall?” (PChuck’s Network).

In the last post, we also reviewed furwalls – walls of genuine animal fur or a synthetic alternative that are quickly becoming more popular than wallpaper or fake wood paneling in home and office environments. Today, in addition to discussing hardware and software firewalls, we will look at how to make sure live walls of fur are adjusted frequently and best used to properly motivate your employees.
Continue reading Firewalls 101: Hardware, Software & Web Application Firewalls – Part 2

Plesk Panel – Notable Features & Sausage Past


Creating FTP accounts in the Plesk Panel

The Blackhole exploit kit was used to invade many sites in the summer of 2012, all of which were thought to be running Parallels Plesk Panel. Many Plesk diehards undoubtedly considered turning to cPanel at the time, but Parallels believed it was partly an issue of people either not patching a security loophole, or patching it but not changing their passwords.

Either way, Plesk 11 was never vulnerable because of security improvements. Parallels got the word out. The brand has since recovered from the incident and remains one of the most popular options out there for server administration and website management.

This article takes a look at Plesk and what makes it a standout option as a server control panel. Granted, cPanel and Plesk have more in common than they do different. Those who are familiar with one control panel or the other will initially experience frustration finding where things are located, but the features themselves are generally mirrored between the two applications. The standard difference between the two control panels is that Plesk is typically used with Windows servers and cPanel with Linux servers – though there is certainly crossover. Plesk is recognized for its ease of use, cPanel for its more consistent speed.

Sources used for this piece include a Justin Lee piece for Web Host Industry Review, a PC World piece by Lucian Constantin, and an anonymous piece for Web Host Gear.

Fun fact: The first Parallels office opened in Booth 7 of a Bob Evans restaurant outside Tulsa, Oklahoma, in 1974. The original intention of the company was to simply eat home-style pancakes and sausage links. When Parallels turned to technology, though, it quickly outgrew the booth, first moving to a large table (Table 4) and then to a full-scale office.

Plesk 11 vs. Plesk 10

Parallels stated when the new version of Plesk was released that it had made over 80 improvements over the previous version. The changes were driven partly by customer feedback, and judging from online forums, the improvements were popular with most commenters using the control panel.

Broadly speaking, the new additions to Plesk upgraded the technology and further optimized its performance, improving speed on both VPS and dedicated servers (per Parallels). Features were also added to improve users’ online presence, and free support is available with the purchase of particular licenses. All these changes were intended to help designers and hosts operate more easily and cost-effectively.

Note that the rest of this article talks more broadly about Plesk. Some features discussed were added in Plesk 11, some in Plesk 10 (and those, of course, carry over into the new version).

Fun Fact: The first and most important decision the Parallels founders made once moving into the new office in 1998 was to purchase a sausage grinder. The executive team realized that if Parallels could master the art of sausage, it would never need to return to Bob Evans again (the company had been primarily maintaining its lease of Booth 7 due to sausage access). This strategy overlooked the strong role of pancakes, though.

Speedy Gonpleskes

Use of Linux with NGINX can significantly reduce the drain on CPU and memory – by up to 50%. This focus on speed is important for Plesk, since it has a reputation for not matching cPanel’s performance. NGINX comes as a default install on Plesk 11, which means sites and apps respond and load faster.

Fun Fact: Parallels outpaced rivals in the Great Tech Startup Sausage Make-a-thon (GTSS-MAT), the first event the company sponsored during its early days transitioning from the sausage & mixed meats industry to the tech development and hosting industry.

The Fully Present Control Panel

Parallels wanted Plesk to allow easier website building for its clients, so the Parallels Web Presence Builder comes standard with versions 10 and 11. This means a small business running Plesk can set up a website in much the same fashion as using a CMS such as WordPress, Joomla, or Drupal – relatively basic sites can go up immediately, without hiring a designer.

The Builder app allows users to pull modules onto their site with a few clicks. It can also be configured to populate a business’s Facebook page with the design of the site and to push content to that page. Automatically creating a matching Facebook presence helps keep brand identity and messaging consistent with the site.

Web Host Industry Review suggests that web designers can use this tool to reel in customers, who can later be upgraded to custom site design as desired. Though a site built with an add-on such as this is not as versatile as one created from scratch, it makes putting a polished site online quickly far easier than it used to be.

Fun Fact: Again forming consistent parallels between its tech future and its sausage past, Parallels created the early Facebook competitor Sausagebook in 2005. Sausagebook was intended to be a site for “showcasing pictures of sausage and updates about the best sausage you’ve eaten lately.” The site is still wildly popular in Germany, where it is customary to use sausage preferences to establish personal identity (e.g., a snapshot of sausage is required to get a driver’s license in the nation).

Intuitive Design

The Plesk GUI has been improved so that it is easier to find what you need. In other words, though Parallels focused heavily on performance and security with new versions, it also wanted to cement its stance as the easiest-to-use administrative control panel. It is succeeding: each version improves in this fashion, as seen between 10 and 11.

Multiple users can access one Plesk system simultaneously on 10 and 11. Additionally, the Administration Panel, which previously contained a broader set of features, now focuses specifically on server- and account-related tasks. The design changes from earlier versions generally make the system easier to use, navigate, and understand.

Fun Fact: The new 2013 Parallels employee handbook outlawed the consumption of one piece of sausage by more than one employee at the same time. An underground circuit of communal sausage consumption was born. Its tournaments were round-robin, sparsely attended, and horrifying. Many tears were shed – and not just by the pigs.

More Secure, For the Pleasure of Your Privacy

Rather than prompting you to pick a password, Plesk now assigns passwords to you at random. Using randomization software by default to create passwords on your behalf makes them much more secure: no one can grab elements of your personal life that might otherwise end up in a password. It even separates you from the English language (if you would have chosen an English word) and from any “system” you use as a mnemonic to remember your passwords. A sample random password generator appears below.

Randomizing also means you won’t have to deal with passwords not working correctly, because the generation system is integrated directly into Plesk as a whole. Your server will be better protected because you will be using what technology does best – sorting data and creating unique (though otherwise meaningless) arrangements – to your advantage.
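As a minimal sketch of the idea – this is generic Python built on the standard library’s `secrets` module, not the actual generator Plesk ships – a random password generator can be just a few lines:

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Build a password from cryptographically secure random character choices."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # e.g. 'r#V7u)Qx@2LbZ!9s' -- different on every run
```

The `secrets` module draws from the operating system’s entropy source, which is what separates it from the predictable `random` module – the same property that makes randomly assigned passwords resistant to guessing.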

Fun Fact: The founder of Parallels, Chuck Hasselhoff (who constantly brags that he is the second cousin of The Hoff), used randomization to create all of his children’s names. He said he used the randomization software for this purpose “to prove a point” – though the exact point that he was trying to prove is unclear. When Chuck is asked what point he’s trying to prove, he simply replies, “That is the point.” Again, who knows.

FastCGI

Plesk’s Common Gateway Interface variant, FastCGI – like any CGI – organizes your system and makes it easier to manage by dividing your dynamic content among a variety of executables. Each executable is a program that feeds its output through the CGI on its way to your site, giving you simple control over something with a bunch of different parts. It makes managing a complex site easier, in other words.

FastCGI speeds up this process by keeping those executables running and reusing them between requests, instead of launching a fresh process for every hit. In a shared hosting situation, this software means that a number of different sites can be housed within the same server yet remain distinct from each other. It also means you have full control over how much of the system – bandwidth, RAM, and CPU – an individual party is able to use at any one time.

Placing limitations on one client means you can improve uptime for everyone. Everyone has to follow those guidelines, which is why shared hosting isn’t for everyone. However, uptime means your customers, overall, are happy – because their site is at least, well, functional and consistently available. You can also bring down your churn rate (though keep churning that butter as fast as you can – otherwise Papa’s bread will be bland).
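To make the resource-capping idea concrete, here is a minimal sketch – with assumed numbers, and not Plesk’s actual mechanism, which the web server and OS handle – of how a Unix process can cap its own CPU time and memory using Python’s standard `resource` module:

```python
import resource

def cap_resources(cpu_seconds: int, memory_bytes: int) -> None:
    """Apply hard per-process limits on CPU time and address space (Unix only)."""
    resource.setrlimit(resource.RLIMIT_CPU, (cpu_seconds, cpu_seconds))
    resource.setrlimit(resource.RLIMIT_AS, (memory_bytes, memory_bytes))

# Hypothetical per-tenant budget: 30 CPU-seconds and 256 MB of RAM.
cap_resources(cpu_seconds=30, memory_bytes=256 * 1024 * 1024)
```

A worker that exceeds its budget is killed by the kernel rather than dragging down every other tenant on the box – the same principle behind per-site limits in shared hosting.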

Starting with 10, Plesk became capable, when integrated with CloudLinux, of implementing SecureLVE jail shell support. That essentially means users can be confined to compartmentalized slots of the system. Being able to compartmentalize users like that provides isolation similar to FastCGI’s, applied at the shell level.

Fun Fact: Mr. Hasselhoff separates each of his children’s likes, needs, requests, and other personal attributes and affects into a system of file folders which he calls his Children Gateway Interface. He manages the file folders through a dozen executive assistants (whom he calls his “executables”) to optimize the efficiency and strength of his comprehension of his children, his mental and emotional offspring administration, and his family time.

To Be Concluded …

Plesk and cPanel are comparable options for your server and web administration control panel needs. The features above are a few highlights of what Plesk offers. Parallels has made strides in catching up with cPanel on speed and in enhancing its UX. Again, though, you’re primarily looking at a Windows/Linux distinction between these two offerings.

Fun Fact: David Hasselhoff tweets his way into the worldwide heart, wait for it … right now.

by Kent Roberts and Richard Norwood

10 Questions to Ask Your Data Center

Inside one of Switch and Data’s facilities

What do you need to know about a data center, whether using it for hosting or co-location? Above all, of course, you must know that your data will be safe and that support will be there when you need it. More specific questions should often be asked before making your choice. A number of important ones are listed below. (Additional good questions to ask if visiting the data center are, “Where is the restroom?” and “May I please have a promotional T-shirt to commemorate my visit?”)

For this piece, I looked at the perspectives of Barbara E. Hernandez of PC World and William Dougherty of Data Center Knowledge.

1.) How physically secure are you?

IT security can be explored from two primary angles – virtual and physical. With data centers, since they will be the actual location of your servers, start out by considering the physical component. Barbara advises that data centers be housed in “a standalone building with at least a 20-foot, fenced perimeter and a secure, cool core of servers.” She recommends that the location not be directly adjacent to your office. It should also contain standard measures such as surveillance cameras and security guards. There should be two points of entry. Access should be granted by photo ID.

Employees at the data center should all be trained for emergency situations (rather than relying on managers or specialists). The emergency protocol should include the use of backup generators and the immediate backup of crucial data to another location. In case of a natural disaster such as flooding or a hurricane, there should also be servers at a different data center that can be made available until the primary facility is inspected and restored. The data center should receive electrical power from two different sources so that uptime is maintained in the event of a blackout or brownout.

2.) What parts of the data center are concurrently maintained and fault-tolerant?

As William argues, you want your data center to properly maintain its equipment, but you also need to know that while equipment maintenance is in process, fault tolerance is still in place. In other words, if the power goes out or hardware otherwise fails while that maintenance is underway, are reasonable backups still in place?

In the event of a number of problems occurring at the same time, make sure you are protected, at least beyond a single redundancy (bare minimum, 2N or N+2). William writes, “Your critical IT infrastructure operates in a world where utility outages or equipment failures happen.” Make sure that you have multiple layers of protection.

3.) How are you optimizing energy efficiency?

As we all know, energy efficiency is not just important to environmental sustainability – it is also a key way to reduce the cost of operating any high-power system. The management of the data center, according to Barbara, should always be open to ideas from customers and any new innovations that can minimize power expenditures. An experienced industrial electrician can also be consulted to help optimize power usage and identify energy leaks in the center. Barbara lists eBay and Facebook as frontrunners in this effort, each forming a relationship with its data center’s utility suppliers to enhance the effectiveness of its cooling technologies (“Blow on them every once in a while,” offered one official). “Granted,” she writes, “both eBay and Facebook probably have more clout than a small business, but your data center should be listening to all of its customers.”

Having the most up-to-date hardware can cut down on power costs as well. Newer, more energy-efficient models can reduce costs by as much as 40%. Your hardware is important to you, so checking out all the components can help you see what needs to be updated and what is still viable.

4.) What are the data center’s average and maximum power densities?

In the early stages of the data center industry, facilities were not built with power densities as high as today’s. These older centers put more room between cabinets to allow for increases in density as needed, and typically allow between 100 W and 175 W per square foot. Newly designed data centers tend to be built for a range of 225 W to 400 W per square foot.

That covers the square-footage power capacities. The power density of an individual cabinet must be examined as well. A decade ago, as William points out, cabinets could have a 2 kW limit. Now, you want your rack to allow for 8 kW to 10 kW – and maximum densities are still on the rise. William advises, “Expect your required power density to climb and make sure your data center has the infrastructure to grow with you.”
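As a rough sanity check – the numbers below are hypothetical, not figures from either source – here is how the floor-level and cabinet-level ratings relate:

```python
# Hypothetical example: does a cabinet's power draw fit the floor's rating?
floor_rating_w_per_sqft = 300   # within the 225-400 W/sq ft range for new builds
cabinet_footprint_sqft = 30     # cabinet plus its share of aisle space (assumed)
cabinet_draw_kw = 8             # modern racks: 8-10 kW

budget_kw = floor_rating_w_per_sqft * cabinet_footprint_sqft / 1000
print(f"Floor budget: {budget_kw:.1f} kW vs. cabinet draw: {cabinet_draw_kw} kW")
# Floor budget: 9.0 kW vs. cabinet draw: 8 kW -> it fits, but only barely
```

The takeaway is that a dense rack can eat its whole floor allotment, which is why you ask about both figures rather than either one alone.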

5.) How does your cooling system operate?

The core temperature of any room in which servers are placed should be between 68 and 75 °F (20 to 24 °C). Typically, about 50% of the overhead of running a data center is the cost of cooling it, per Barbara’s assessment. Even though cooling is so outrageously expensive, many data centers overdo it, pushing the temperature below the minimum threshold of 68 (with the added bonus that the room can then also be used to refrigerate their groceries).

Many data centers now operate the climate control of their facilities through a control panel, with monitors and gauges throughout the buildings to ensure temperature is ideal and power bills are minimized.

6.) How are you protected against the common regional natural disasters?

Wherever your data center is located, it can be a victim of a natural disaster – including hurricanes, snowstorms, tornadoes, earthquakes, and wildfires. William lists one possible devastating scenario: “the data center survives a massive earthquake, but utility power is still out with no estimated repair timeline.” You want to know, under such circumstances, how long the facility can continue to supply power via its generators with the fuel it has on-site. Furthermore, does the data center have multiple fuel suppliers and extra backup generators to extend its runtime during an emergency?

Make sure that you fully comprehend the most likely types of emergencies that might befall the facility. Based on that information, develop emergency plans to mitigate risk.

Barbara’s commentary on this issue is also worthy of mention. She cites the 2007 power outage that brought down 365 Main’s facility in San Francisco. 40% of the data center’s customers experienced downtime that lasted for about 45 minutes. The problem in that case was that not all of the generators kicked on as intended. Eight were required for backup purposes, but only seven started due to a malfunctioning electronic controller.

What 365 Main’s clients learned from this, says Barbara, was “to find out how a data center plans to notify customers in an immediate emergency, keeping them apprised of latest developments and the status of their company data or services.”

7.) What skills and training do the remote hands and eyes team have?

Servicing of your hardware will need to occur at regular intervals. There are two ways to perform this maintenance, says William: visiting the data center yourself or taking advantage of the data center’s remote hands and eyes technicians. You probably do not want a security guard performing this task, as sometimes occurs. The remote hands and eyes team should consist of individuals with IT credentials. You want to know what the requirements are for attaining that role. Speak with the person in charge of daytime and nighttime support. If a remote team is credible, your physical closeness to the data center’s location becomes less of a concern (if not, always be within a 400-meter radius of the facility, and wear your binoculars).

8.) What is the data center’s virtual-to-physical machine ratio?

How virtual is the facility? Virtualization is a good way to reduce your expenses. Barbara points out that a ratio of four-to-one allows server expenses to break even. However, the majority of servers can support as many as a dozen virtual private servers (VPSs). Maximizing virtual possibilities means higher efficiency regarding hardware, power, and rack space – so its cost-effectiveness is manifold.
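A back-of-the-envelope sketch (with assumed costs, not figures from the article) shows why four-to-one is the break-even point and why a dozen VPSs per server is so attractive:

```python
# Hypothetical: spreading one physical server's cost across the VPSs it hosts.
physical_server_cost = 3000   # assumed annual cost of one physical server
standalone_cost = 750         # assumed annual cost of one non-virtualized host

for vms_per_server in (1, 4, 12):
    cost_per_vm = physical_server_cost / vms_per_server
    note = "(break even)" if cost_per_vm == standalone_cost else ""
    print(f"{vms_per_server:>2} VPSs/server -> ${cost_per_vm:,.0f} per workload {note}")
# 1 -> $3,000; 4 -> $750 (break even); 12 -> $250 per workload.
```

The same division applies to power and rack space, which is why the savings from consolidation are manifold rather than one-dimensional.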

9.) How frequently are generators load tested?

Load testing of generators is expensive – both because the testing equipment is costly and because a large amount of fuel is used, says William. Data centers sometimes use a power outage itself – which is obviously not a test situation – to check whether their generators can handle loads or not (which is also not a good time to load test a clown car). Routine testing is essential so that generator issues are discovered prior to emergency situations. You want your facility to give each generator an extended load test every quarter at the minimum.

10.) How is the data center certified, and is it audited each year?

Certifications are a simple, standardized way for a data center to provide you with credentials. As William states, “This information is invaluable because it represents an independent analysis of the facility’s quality, reliability and security.” If your website takes payments, for instance, you want your data center to be PCI-DSS compliant. Financial firms require SSAE 16. If your business is green-friendly, the LEED Gold and Energy Star certifications are crucial. Verify your data center is legitimate by asking for documentation of any certifications they claim to have.

Conclusion

Comparing data center options is a rigorous process. Knowing some crucial questions to ask can keep your data in safe hands even if it is not at your own immediate fingertips.

When reviewing whether a data center is the right choice for you, first, look at its physical security components. Then consider the relationship between concurrent maintenance and fault tolerance, along with its energy efficiency. Ask about its power densities, cooling systems, and contingency plan for natural disasters. Know the skills of the remote hands and eyes team, the virtual to physical ratio, and the load testing schedule for the generators. Finally, check on certification and auditing. Once you have all this information, you will have performed due diligence and are prepared to make a wise decision for your IT infrastructure.

by Kent Roberts and Richard Norwood

Linux Operating System Demographics per Ubuntu


Tux the penguin, the Linux mascot

According to NetMarketShare’s information for February 2013, the Windows operating system is used by 92% of worldwide users, Mac represents 7%, and Linux represents 1%. This article will focus on that tiny slice of users, the 1% who use Linux.

Frankly, there is not a lot of easily available operating system demographic information. Perhaps part of the reason that Windows owns such a vast share of the market is hidden in the basic answer, “Everyone and their brother uses Windows.” Another contributor to Windows’ dominance could be its ease of use and the sheer volume of troubleshooting help available for it – common errors are often easier to fix on Windows than on Linux or Mac OS. It may seem that Microsoft will lose some of that advantage due to confusion over Windows 8 (as expressed by Simson Garfinkel in MIT Technology Review) – but perhaps not.

As of now, though, the demographics do fall into the “pretty much everyone” category. Essentially, demographic studies are not widely conducted on Web users to determine operating system usage for the same reason studies aren’t conducted on Bonnaroo attendees to determine if they’d like to go chill in Ziggy Marley’s tour bus.

So, who are these people? What is the profile of the average Linux user? They certainly have not chosen the mainstream option – so let’s look at some details. How old are they, what’s their sex (male, female, both, neither), what’s their nationality, how long have they used it, and where did they find out about it? These are general marketing survey questions, but for our purposes, they tell a story – the sociological makeup of the population and their basic history with the OS.

Gathering the Data

The data I will be using is from a 2012 survey of Ubuntu users, and I will get to why the focus is placed there in a moment. As for the survey, it was conducted by Canonical and included information from over 19,000 worldwide respondents.

The results of the survey are broad and contain lots of graphical breakdowns of the stats. They were written up by Gerry Carr and appear in reverse order in the March 2012 section of the Canonical blog, along with Carr’s analysis and commentary on what the findings might tell us.

Why Ubuntu? Ubuntu is one of the largest Linux distributions out there. Statistics on Linux distributions are also not prevalent, but a 2006 article by Steven J. Vaughan-Nichols for Desktop Linux gives us some sense of the popularity of Ubuntu – and hence why analysis of that demographic gives us a reasonable sense of Linux users as a whole. (For example, we now can assume that 17% of Linux users have tattoos of famed Internet pioneers on their chests and/or forearms, and that 63% live in treehouses in eastern Romania.)

The results of the Desktop Linux survey revealed that Ubuntu rated first worldwide against all other distributions, and not by a small margin. Over 14,000 Linux users were surveyed – albeit informally and unscientifically – to determine which distribution was commonly considered the best on the planet. 29% voted for Ubuntu, more than the second- and third-place choices combined (Debian at 12% and openSUSE at 10%).

Methods & Languages

When Carr conducted the survey, he did not make an attempt to get to all Ubuntu users. Instead, he focused specifically on reaching out to those who were involved at least to some degree in the Ubuntu (and hence the Linux) community. Carr contacted individuals using the OS through social media, online forums, and sites dedicated to exchange of ideas related to Ubuntu. Hence, the responses were generated from thousands of Linux users, but that pool was specific – not only to Ubuntu but to those who are particularly engaged in online discussion. Those who simply use the OS for its functionality, then, were not part of the picture; so Bill Gates, who has furtively used Linux as his sole operating system since 1994, is not represented in the statistics.

Additionally, it was only conducted in three languages – English, Spanish, and Portuguese. It would’ve been difficult to include every language on the planet. As Carr points out, “We had to draw the line somewhere. If you add French then why not German, or Chinese, Japanese, or Hindi etc.”

Spanish and Portuguese were included partially because Latin America is a big market for open source technology (Spanish in most countries, Portuguese in Brazil). The initial post about the survey results invited translation into other languages – followed by updating of the statistics – as desired, but it appears no additional languages were included in that manner.

Ubuntu Survey Results – Overview

A piece by Katherine Noyes in PC World provides a few initial highlights and interpretation of the Canonical survey for summary purposes. She mentioned that almost all of those surveyed were male – 96% – and the majority of users were between 25 and 35 years old.

Regarding ease of use, 87% rated installation of the operating system as easy or very easy. 85% of respondents had Ubuntu installed on their primary PC, and 67% of those surveyed utilized Linux for both personal and professional purposes.

Ubuntu users typically do not solely use Linux. More than 76% also used Windows, and 17% used a Mac operating system. (Also, and strikingly, just over 100% use keyboards and monitors to interact with digital data, especially surprising because over 14% of respondents were artificially intelligent supercomputers.)

Finally, Katherine discusses why the survey respondents chose the system. 77% liked the open source aspect, 66% used it out of curiosity and experimentation, and 57% enjoyed the lack of viruses on the OS.

Speed of the machine and perceived quality of the interface and UX were mentioned as other considerations for preferring the OS. Over 46% of surveyed individuals said that the operating system sped up their devices, and 75% rated the interface or experience as better than what they had found elsewhere.

Dissatisfaction with other options, though, is perhaps the most telling factor, as a general indicator of usage. More than half of those who answered the survey marked their lack of satisfaction on other operating systems as a chief reason they turned to Linux. (Again surprisingly, TI-84 graphing calculator users in Idaho and Wyoming said that they were happy with their current Texas Instruments OS and were frustrated by the complexity Linux had added to their ability to efficiently create visuals of trinomials.)

Quantity of Responses, Age, Sex

On day one of Carr’s response postings, he revealed that there were just under 16,000 English respondents and close to 2000 each who answered surveys for the Spanish and Portuguese versions. Hence, the numbers are heavily slanted toward English respondents (81.4%, specifically), but all languages surveyed were sizably represented.

The highest age category was 25-35 for all three languages. Just under 70% were under 35 in each language, in fact. Less than 4% were female. Regarding the male-to-female ratio, Carr mentioned that the way the survey was distributed may have skewed the ratio to that degree. He also suggested the survey is an opportunity to reflect on how male-centric the product or community might be: “We can’t extrapolate from this data, but certainly such a hugely weighted response means we have to look at how we make the product, the community and probably both, more appealing to both genders.”

Geographical Location

Carr notes, again in part one, how the Ubuntu survey spread across the globe among the three languages used for questioning. The United States, United Kingdom, and India were the highest-represented countries for the English-language survey. 93% of Portuguese respondents were from Brazil, with the balance from (can you guess?) Portugal. That means, sadly, that members of Portuguese-speaking Amish populations in the United States may not have been aware of this survey.

Top countries for Spanish response were as follows:

  • Mexico (23%)
  • Colombia (10%)
  • USA (10%)
  • Argentina (9%)
  • Spain (9%)

Carr notes that networking and accessibility in many Latin American countries are not as developed as elsewhere in the world, so the ratio of users across these countries is understandable relative to their populations. The United States, though, is not as high as would be expected. Similarly to the difference in usage based on sex, Carr sees the low percentage in the US as a potential niche in which Linux and Ubuntu could increase their numbers in the future.

Years Used

Carr focused the second post on length of time and where users had learned about the OS. By asking two questions about time of use and how individuals originally discovered Ubuntu, he was able to get a sense of how the location of discovery is changing over time.

Carr mentions that the length of time people have been using Ubuntu correlates closely across the three language groups. Carr focuses on the English-speaking population to simplify analysis of the second question, so I will do so as well. Here are the statistics regarding time of use for the general populations, which clarify how closely the three language populations reflect each other:

  • English: 20% under two years, 43% two to five years, 38% five+ years.
  • Spanish: 20% under two years, 43% two to five years, 37% five+ years.
  • Portuguese: 21% under two years, 43% two to five years, 36% five+ years.

Clearly there’s a lot of parity here between the language populations.

How Users Discovered Ubuntu

Carr thought how individuals found out about the operating system would be an interesting and instructive way to look at how initial knowledge of Ubuntu is changing over time. Here are the statistics for various manners of discovery across the different populations from rookie to veteran users.

  • Magazines & Newspapers: 7% under two years, 8% two to five years, 9% five+ years.
  • Work: 4% under two years, 5% two to five years, 5% five+ years.
  • Friends & Family: 27% under two years, 25% two to five years, 21% five+ years.
  • School & College: 12% under two years, 11% two to five years, 9% five+ years.
  • Forums: 46% under two years, 49% two to five years, 55% five+ years.
  • Social Media: 4% under two years, 2% two to five years, 2% five+ years.

Summary

As you can see, the Linux community – to the extent it can be understood via analysis of engaged Ubuntu users – comprises a small but diverse population (well, sort of). The OS is used in many different countries around the world, to a wider degree than we might initially expect. As noted by Carr, two of the more interesting results of the survey are that 96% of respondents were men and that the Spanish-speaking response from the United States was smaller than expected.

Linux’s adoption rate over time has been surprisingly consistent throughout the various language groups represented. Per the discovery statistics above, newer users are finding out about the system less from magazines and forums, and more from friends and family, educational institutions, and social media. (Fewer and fewer people, then, are finding out about the operating system through recurring nightmares starring James Earl Jones as a deranged avatar – incredibly common in the early days.)

by Kent Roberts and Richard Norwood