Category Archives: What’s New?

How to Protect Your Business from Cloud ‘Overselling’

Cloud hosting can be hit or miss when it comes to reliability, speed, and uptime.

In reality, most Cloud Hosting Providers (CHPs) are overselling resources. By overselling, CHPs are compromising the performance of consumer applications. A direct result of overselling is application slowness and even outages.

Many CHPs use the buzzword ‘Cloud Hosting’ to sell the idea of reliability. Most consumers have the opinion that cloud hosting is easily scalable and can handle just about anything. Further, most CHPs are not transparent and hide the underlying architecture and resources available. With that in mind, it is important to note that no cloud environment has unlimited resources. For this reason alone, you must do all that you can to protect your business.

Due to overselling, it is not uncommon to read about outages in the news from major CHPs like Amazon and Microsoft.

The truth is, most clients never use the full capacity of their hosting resources. This allows the CHPs to pack a lot more customers into a cloud environment.

To shed some light on cloud hosting transparency – and why transparency is mission critical for your business – I have asked Superb Internet CEO Haralds Jass to discuss how you can protect your business from overselling.

Interview with CEO of Superb Internet, Haralds Jass

1. What is the single biggest challenge that Cloud Hosting customers face?

HJ: Comparing different providers objectively and finding out what level of performance (processing power, I/O speeds, etc.) they will actually get. The cloud industry is highly obfuscated: no Cloud Hosting Provider discloses its actual processing power – for example, how many customers share a CPU core, what percentage of it, if any, is guaranteed to each customer, which CPU it is, what actual performance to expect, and so on. Comparing providers’ simple headline specs is practically useless, as Provider A may outperform Provider B severalfold in real life, even though “on paper” Provider A’s specs appear to be less. Doing objective performance-to-cost comparisons is impossible due to the massive lack of transparency and information, and even running short-term tests on various providers to uncover the undisclosed performance indicators is of little use, as most CHPs’ actual delivered performance varies many folds – often tenfold or more – over time. That’s right, most CHPs’ performance graphs look like rollercoasters.
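To illustrate the performance-to-cost comparison described above, one could divide each provider’s guaranteed passmarks by its monthly price. The figures below are purely hypothetical – as noted, most providers do not disclose a guaranteed passmark figure at all, which is the point. A minimal Python sketch:

```python
# Hypothetical specs for illustration only -- real CHPs rarely
# disclose a guaranteed passmark figure, which is the problem.
providers = {
    "Provider A": {"passmarks": 624, "usd_per_month": 10.0},
    "Provider B": {"passmarks": 900, "usd_per_month": 25.0},
}

# Guaranteed passmarks per dollar: higher is better value.
values = {
    name: spec["passmarks"] / spec["usd_per_month"]
    for name, spec in providers.items()
}

for name, value in values.items():
    print(f"{name}: {value:.1f} guaranteed passmarks per dollar")
```

On these made-up numbers, the provider with the smaller headline spec is actually the better value per dollar – exactly why comparing headline specs alone is misleading.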

The cause of all this? Massive and rampant overselling, which no CHP wants its customers to know about, or even think about. Since not everyone is using all their resources all the time, all CHPs oversell – some to a staggering double-digit degree. So when just one of your many neighbors starts actually using the resources that they are paying for, you and everyone else assigned the same CPU core(s) and memory suffer, and performance drops massively. The result is the highly variable, rollercoaster-like performance (peaks and valleys) that most CHPs deliver.

In short, buying cloud hosting service these days is like buying a mystery box of chocolates: you have no idea what you’re getting.

2. How does Superb Internet go about solving this challenge for customers?

HJ: No overselling and full transparency.

We do not engage in the rampant overselling that practically all other CHPs do. In fact, we do not oversell our cloud service at all. Instead, we provide fully guaranteed, unshared resources, which are secured and dedicated to each customer (for example, one client per core for our dedicated core cloud accounts). As a result, our cloud performance is consistent and predictable – quite unlike the other CHPs.

We are also fully transparent: we disclose the exact CPU processing power, expressed in objectively measurable and verifiable passmarks, that each core and each cloud account gets – which is always fully guaranteed and dedicated to each client, and we likewise disclose the actual real-world performance of the other components, such as Disk, RAM, and Network. All the CPU power, RAM, and disk that a customer buys are theirs, and theirs alone – guaranteed, unshared, dedicated. In fact, it is theirs two-fold over, given the built-in High Availability.

Therefore, with us, there is no “mystery box of chocolates.” Instead, every customer knows exactly what they are getting and exactly the consistent performance they can expect – and that will be delivered to them – day in and day out, consistently. In short, we take the obfuscation out of the cloud and replace it with full disclosure, openness & accountability.

Oh, and did I mention that we also have, by far, the industry’s strongest and most stringent cloud SLA? I am very proud of the fact that no other CHP comes even close to our cloud SLA – which is again not just a guarantee, but what we deliver. Check out the executive summary of our SLA here:

3. How many years has Superb Internet been involved in Cloud Hosting?

HJ: We were the very first web host worldwide to bring the predecessor of cloud, the Virtual Private Server (VPS), to the commercial hosting market back in 1999 – 18 years ago. Even today, what some mistakenly call cloud is just a VPS. (Keep in mind that the main difference between VPS and Cloud is High Availability. A VPS is single-server based and thus has a single point of failure, while a proper Cloud runs on a distributed architecture with no single point of failure and multiple layers of high availability built in.)
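The single-point-of-failure distinction can be sketched in a few lines of Python. The node names and the trivial health check are hypothetical illustrations, not Superb’s actual failover logic:

```python
class Node:
    """A server that is either up or down."""
    def __init__(self, name):
        self.name = name
        self.up = True

def serve(nodes):
    """Return the first healthy node, or None if all are down."""
    return next((n for n in nodes if n.up), None)

vps = [Node("single-server")]                    # VPS: one machine
cloud = [Node("replica-a"), Node("replica-b")]   # cloud: redundant replicas

vps[0].up = False    # the lone VPS host fails -> outage
cloud[0].up = False  # one cloud replica fails -> failover

print(serve(vps))         # None: no machine left to serve
print(serve(cloud).name)  # replica-b picks up the traffic
```

With one machine, a single failure means downtime; with replicas and no single point of failure, the same failure is absorbed transparently.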

4. 18 years is a long time for the Internet world. Can you tell me the differences in reliability between 1999 and 2016?

HJ: The differences are massive. I’ll make no ifs or buts about it: back in 1999, 2000, 2001, 2002, the VPS service was in its infancy. It was a good option for those who couldn’t afford a dedicated server, but it was nothing close to the 100% uptime that our modern, fully distributed cloud platform delivers now. Back then, VPS was a viable option for SMBs on a tight budget, but not an option for an enterprise. In fact, the less-than-stellar reliability of the early VPS platforms led us to enter the low-cost dedicated server market in 2002, as we felt that our customers would be better served by a low-cost dedicated server than by a comparably priced VPS.

Still, we persevered with continually improving the VPS service, and in 2007 set out on what ended up being a six-year path of research and development of our Superb Cloud platform. Our goal, as set out in 2007, was that the cloud service had to be better than dedicated in every way. For us, the Superb Cloud had to have fully guaranteed and dedicated resources like a dedicated server, none of the noisy-neighbor issues of VPS, no compatibility, OS, or software limitations, and, most important of all, the true 100% uptime and High Availability of the cloud. We were determined never to release or sell anything less than this, which would be just a VPS misrepresented as a ‘cloud.’ Essentially, our definition of cloud was – and is – that a single cloud server is equivalent to a set of two redundant load balancers and two redundant servers with real-time replication built in, in an equivalent dedicated environment. That is exactly what one gets with every Superb Cloud VM: fully guaranteed resources with guaranteed, consistent, predictable performance, along with High Availability – true 100% uptime, which in a dedicated environment would take at least four physical devices to achieve.

5. Doesn’t Superb Internet have a 100% guarantee for resources? How would a customer check that they are getting 100% of the resources they are allocated?

HJ: We invite all of our customers to run performance tests and see for themselves that their resources are always there and available, dedicated to their sole, exclusive use, and that their performance is consistent and non-variable. We also publish a benchmarking report online, where we show our sample cloud VM’s performance vs. the AWS, Microsoft, and IBM clouds, from data collected over an extended period and still collected each day.
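One simple way to act on that invitation is to time a fixed CPU-bound workload repeatedly and look at the spread: on unshared, guaranteed cores the timings should cluster tightly, while an oversold host shows rollercoaster variance. A minimal sketch in Python (not an official Superb tool):

```python
import time
import statistics

def cpu_benchmark(n=200_000):
    """Time one fixed CPU-bound workload (a sum of squares)."""
    start = time.perf_counter()
    sum(i * i for i in range(n))
    return time.perf_counter() - start

# Repeat the run; a noisy neighbor shows up as outliers.
samples = [cpu_benchmark() for _ in range(10)]
mean = statistics.mean(samples)
cv = statistics.stdev(samples) / mean  # coefficient of variation

print(f"mean {mean * 1000:.2f} ms, variation {cv:.1%}")
```

For a meaningful picture, run a loop like this at different times of day over several days; disk and network throughput deserve the same repeated-sampling treatment.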

6. I suppose this Guarantee is part of the ‘Transparent Cloud’?

HJ: Exactly. And it is also a part of our Cloud SLA. So it’s not just a marketing promise; it is an actual, legally binding guarantee. As you can see, we have, by far, the most comprehensive and complete cloud SLA in the marketplace.

We are calling the Superb Cloud the Transparent Cloud because of our full disclosure of all the performance indicators that a customer will be delivered – and guaranteed.

7. I know there are a lot of mass-market offerings, such as those from AWS, Azure, and IBM SoftLayer. How does Superb Internet compare against those major players?

HJ: Simply put, we outperform them severalfold, as our benchmarking report clearly illustrates. Not only that, but we also do it for less. That is, we deliver a higher level of consistent performance for a lower price, thus providing the hands-down best value in cloud hosting. We also guarantee practically everything in our SLA, while others’ cloud SLAs are, let’s just say, rather lacking.

Now, we are not perfect, and in the areas where we don’t beat the others, we readily admit to that in our benchmarking report. We are currently working on Generation 2 of our cloud, which we expect to unveil in the coming few months, and you can be assured that in the few areas where our Generation 1 cloud does not already beat the rest, Gen. 2 of the Superb Cloud will.

8. How can a customer of another host compare their performance against Superb Internet?

HJ: The best way is to buy our popular starter CS1-312 account, where every new customer gets 2 cores @ 312 passmarks, 2GB of RAM, and 20GB of SSD disk space for only $2 for 2 whole months. There are no tricks, and there is no catch. We make starting in the cloud easy. The only reason we charge the symbolic $2 for 2 months is to ensure the legitimacy of every new order and account.

In fact, the main reason for this offer is to invite all prospective customers to run their own tests and assure themselves of our performance advantage over the competition before they upgrade to a larger cloud account. We believe that the best customer is an informed customer.

9. Lastly, why should someone choose Superb Internet over one of those big guys?

HJ: In my mind, there are so many reasons. Some of the top ones that come to mind are:

* Our company-wide Customers First policy: in every department, in every discipline and every instance, our customers always come first; it’s as simple as that. That has been one of our core guiding principles ever since day 1 back in July of 1996.

* With us, a customer is more than just a number. We see ourselves as a key partner in success to each and every customer. We succeed when our customers’ business grows and succeeds. Each and every customer has a personal account executive, whose job is to know and understand the customer and their needs and to ensure that we do all we can, in all disciplines, to help the customer reach – and exceed – their goals.

* We operate our own data centers, staffed 24x7x365. All support is provided directly out of the data center, and customers can always talk live to a technician at the same data center where their systems are located. We do not outsource any part of our support or operations. Every customer is serviced right out of the data center, not some remote support center.

* Our coast-to-coast IP backbone reaches more than half of the Internet’s routes directly, completely free of intermediary transit networks. Our ultra-low-latency, most-direct routes with a minimal number of hops are unmatched. No wonder we serve many real-time financial traders, VoIP providers, and game server hosts, where every millisecond matters. In addition to the unsurpassed performance, we not only guarantee but deliver true and uncompromised 100% uptime of our network. There hasn’t been a single SLA claim on that since 2002 – that speaks louder than anything else I could say.

* Over the last two decades we have firmly established ourselves as an industry innovator and leader. We have many industry firsts under our belts and continue to lead the industry in innovation. In other words, we are, and always will be, Ahead of the Rest®.

* Our SLA is one of the strongest in the industry, if not the strongest – not just on cloud, but on dedicated, colocation, and IP transit: on all the services that we offer. All of our data centers, network, platforms, and services are built with full redundancy and concurrent maintainability in place, and as such we offer a 100% uptime, compliance, and conformance guarantee in our SLA – which is not just a guarantee, but something that we deliver.

* Our certifications, such as ISO 9001 Quality Management and ISO 27001 Information Security Management, on top of more common standards such as SSAE-16 SOC-1 Type II audits and PCI-DSS, HIPAA, FISMA, and FedRAMP compliance, are unparalleled in the field. Our facilities, network, and services are built – and continuously audited and certified by multiple third-party auditors – to the world’s highest standards, as demanded by the world’s largest multinational enterprises, international organizations, and the U.S. Federal Government. Our SMB customers benefit from our high level of investment in achieving and maintaining these numerous certifications, which no competitor of ours has been able to match in full, and from the drastically improved service quality and security that results. In short, our SMB customers receive an enterprise level of service for a fraction of the cost, benefitting from our massive investment in facilities, network, platform, services, policies, and procedures to adhere to these demanding global standards for our largest federal and enterprise clients.

* And last but not least, simply because We Care.

How to Test Drive Superb Internet’s Transparent Cloud for Only $2

Why perform a no-risk test of Superb Internet’s Transparent Cloud? You can test drive a cloud server featuring 2GB of RAM, 2 cores, and a 20GB HDD for just $2 for 2 MONTHS!

Better yet, every 20th customer will get 20 YEARS HOSTING at NO COST.

Click the Link below to lock your cloud server down for just $2 for 2 months:


And to be clear, you’ll get a 2GB RAM, 20GB HDD, 2 CORE Cloud Server (CS1-312) for just $2 for 2 MONTHS… that you can use as a:

… backup server
… dev / test server
… replacement for your current server at your existing host
… or upgrade from shared hosting to a FASTER and more reliable platform
… emergency cut-over server

Twenty Years: The Story of One of the Internet’s Longest-Running Hosting Companies


Every well-run company keeps itself squarely focused on solutions that alleviate common pain-points. Superb Internet was created in 1996 to treat a problem encountered by its founder when developing a game with a collaborative global team. It’s grown exponentially over the years and is now one of the longest-running web hosting companies in existence – celebrating a full 20 years in operation on July 23, 2016.

  • 1994: Origins in treating pain points
  • 1996: Formation
  • 1999: Industry front-runner
  • 2000: First data center
  • 2002: Expansion
  • 2005: To the West
  • 2009: And then there were four
  • 2011: Security focus
  • 2013: Cloud embrace
  • 2014: Enterprise-grade compliance
  • 2015: First fully government-authorized cloud provider
  • Today: Starting a third decade of hosting service

1994: Origins in treating pain points

As is often the case in business, Superb Internet arose from a pain-point experienced by its founder. Like any online entrepreneur, Haralds Jass wanted to see his project come to fruition without any performance problems. What was a little atypical about Jass’s experience is that he was only fourteen years old!

Superb Entertainment was composed of Jass and a group of talented developers, graphic artists, story-tellers and other creators from all corners of the world. The team set out to create a game that was revolutionary, going far above and beyond the current offerings. Called Woodlands, it was fully immersive, highly realistic, and included an original musical score packaged using newly minted MP3 technology.

The game attracted the attention of an established publisher who could potentially fund its further development and handle the global distribution.

Almost as quickly as the dream seemed headed toward realization, though, everything began to fizzle. Commercial web hosting was just getting started, and the shared hosting service the developers were using subjected them to crashes, power outages, and frequently slow data transfer. Superb Entertainment tried numerous hosting providers, and there was no respite from these problems. Poor reliability was clearly an industry-wide problem.

Fast Company indicated in 2012 that “customer pain is your most important resource” because it drives you toward meeting genuine user needs. The same is often true with business leaders as they seek to solve their own problems, and in turn, those of others. Jass started looking at the hosting industry closely and determined that a company was needed to provide service that was “more responsive, more reliable, and more customer-centric; a service that customers could always depend and rely upon.”

1996: Formation

During the summer of 1996, Haralds was ready to move forward and solve the problems he’d personally experienced with web hosting by becoming a provider himself. In so doing, he intended to provide what no one else in the market could: reliable, dependable service. Jass studied O’Reilly’s Essential Systems Administration for several weeks, lined up an initial group of customers, and incorporated Superb Internet Corporation (July 23, 1996). Through these initial steps, he had created an Internet Presence Provider – the original term for a web hosting company. To get Superb underway, Jass purchased a Sun SPARCstation 2 dedicated server for $600 per month via a loan from his family dentist; the initial investment was paid back only ten days after launch. To this date, that remains the only loan or financing that Superb has ever had.

By the end of the year, Superb Internet was already using seven dedicated Sun SPARC servers to host over a thousand websites. The growth was nothing short of phenomenal.

1999: Industry front-runner

By summer 1999, just three years later, Jass had become a college student with a company that owned 400 servers and hosted over 10,000 unique sites. Superb Internet was regularly rated as the best hosting provider by various industry analysts, especially in terms of service quality and reseller programs. Plus, Superb was the first provider to offer name-based virtual hosting (reducing client costs) and customer-controlled virtual hosting with unlimited third level domains. These innovations carved out a solid place for the firm as a hosting industry bellwether.

2000: First data center

By 1999, it was becoming obvious that the company would greatly benefit from Haralds’ full-time management. He left school and took on a huge immediate project: transitioning Superb from colocation to supporting and maintaining its own datacenter. After conducting a thorough search of the United States, Jass decided on a datacenter in the highly networked hub of Washington, DC. DCA1, in Georgetown, Washington, DC, was opened in early 2000.

In 2000, Superb was also the very first provider worldwide to offer commercial VPS (Virtual Private Server) service, the predecessor of today’s cloud; Superb called it the Superb Power Server (SPS). The VPS (SPS) service was designed as a step in-between shared hosting and a full-fledged dedicated server. In this way, it filled a gap in customers’ service demand and made owning a fully customizable operating system and set-up environment more affordable than ever. Superb was, as always, true to its motto of “Ahead of the Rest”® – leading the industry, while others followed years later.

2002: Expansion

In 2002, Superb Internet added another data center – DCA2 in McLean, Virginia. The same year, the IP backbone of HopOne Internet Corporation, the datacenter and network operator that Jass founded in 1999, reached the west coast. Since Superb Internet was then its largest customer, HopOne effectively became the first ever coast-to-coast web-hosting IP network. To this day, it remains one of the world’s best-connected networks, reaching the majority of Internet routes directly (free of intermediary transit networks). These two changes more than quadrupled the company’s server capacity and improved network speed and reliability. Dedicated servers and colocation, which had been an ever-rising part of the business for several years, became the primary business in 2002, further broadening the client base and better positioning the brand as a major player in reseller hosting.

2005: To the West

Wanting to better meet the needs of companies with West Coast demographics, Superb opened a third datacenter in 2005: SEA2, located in Seattle, Washington. This location, which offered multiple diverse-path fiber-transport circuits for premium connectivity, also allowed high-demand customers and those with mission-critical needs to perform geo-load-balancing and replication on both of the coasts.

2009: And then there were four

Just a few years later, a fourth datacenter was added – DCA3. This one was located in Springfield, Virginia, and was upgraded to meet the strict redundancy expectations at Superb. Like the other datacenters, this one was staffed with certified engineers working around the clock to make sure all systems and data were kept uncompromised.

2011: Security focus

Superb Internet became increasingly recognized as one of the most secure web hosting companies globally, winning awards from several respected security and pro-Internet institutions.

2013: Cloud embrace

In 2013, the Superb Cloud platform was introduced, the result of a full six years of design and development. The Superb Cloud was the culmination of tens of thousands of hours spent researching, testing, and developing, resulting in a next-generation system with 100% high availability and impeccable performance.

Distinguishing characteristics of the Superb Cloud included a modern distributed storage system and an underlying 40 Gb/s InfiniBand network. The former offered the absence of any single point of failure, with performance equivalent to local storage. The latter had previously been used only in supercomputing applications; this was one of the first commercial hosting implementations of the technology. Using InfiniBand in the cloud architecture resulted in real-world performance that was lightning-fast – many times better than even the theoretical maximums of the lower-cost and much more rudimentary 10 Gigabit Ethernet protocol favored by others. Plus, it was completely free of packet loss and jitter and had a completely decentralized high-availability architecture.

Additionally, Superb Internet differentiated itself from the market by guaranteeing and always allocating resources exclusively to each customer. This decision provided for fully predictable performance, a first in the cloud hosting field.

Through these customer-centric technological approaches, the Superb Cloud delivered on what Superb set out to do back in 2007 when the R&D process started: deliver a cloud that was in every way better than a physical dedicated server. In other words, the Superb Cloud offered the same or greater performance than dedicated systems: comprehensively guaranteed resources; a fully distributed architecture, free of single points of failure; and the predictability, reliability, high availability, and 100% uptime that only the modern cloud allows.

RELATED: So that we can avoid any single point of failure for optimal reliability, Superb Internet’s cloud infrastructure uses distributed rather than centralized storage. Along similar lines, we opted for InfiniBand over 10 Gigabit Ethernet for guaranteed always-zero packet loss. Explore our cloud.

2014: Enterprise-grade compliance

In 2014, Superb announced that it was building on its cloud offerings and centering itself on better meeting the needs of enterprises. The enterprise growth that the company started to experience was in two different areas: infrastructure-as-a-service (IaaS), i.e. cloud hosting, and core underlying infrastructure, i.e. network/datacenter services. Working closely with companies in the law, finance, and healthcare verticals gave Superb a better sense of the needs and expectations of enterprises with sophisticated compliance expectations.

One thing that Superb Internet knew would attract more of these types of clients was getting certified and audited as meeting various international standards. That way, potential clients could know that a third-party organization had assessed and verified the company’s architecture and processes. One major form of compliance adopted in 2014 was SSAE 16 (Statement on Standards for Attestation Engagements 16), a set of guidelines developed by the American Institute of Certified Public Accountants (AICPA), the world’s largest association of accounting professionals, which continues to amend the standard, entitled “Reporting on Controls at a Service Organization.” Other standards and certifications Superb attained in 2014 include PCI-DSS, ITIL (the Information Technology Infrastructure Library), and two of the world’s toughest international standards: ISO 27001:2013 (Information Security Management) and ISO 9001:2008 (Quality Management) Certification & Registration. It should be noted that practically no other hosting provider has been able to achieve both ISO 27001 and ISO 9001 Certification & Registration.

Finally, in November, Superb started a key strategic partnership with OnApp called CloudPOD, making it a lot easier for anyone to design and provision their own custom-built private cloud IaaS architecture. This out-of-the-box, turn-key solution is delivered through collaboration with OnApp and hardware manufacturer Supermicro. It features Superb’s world-exclusive InfiniBand networking technology, put in place to outperform the 10 Gigabit Ethernet protocol used by many providers, and highly efficient distributed storage technology.

2015: First fully government-authorized cloud provider

Last year, Superb continued to pivot toward compliance standards so that it could more clearly indicate its ability to supply secure, mission-critical services to enterprises and government institutions. This shift was essentially a broadening of the firm’s approach since it was still completely dedicated to the huge and loyal SMB base that kept the company in business for 20 years.

The transition toward better meeting strict public-sector requirements was enabled by the company’s attainment of a GSA Information Technology Schedule 70 contract with the US federal government in May. This approval was granted by the US General Services Administration, the federal government’s procurement agency. With this federally approved status, Superb Internet was Ahead of the Rest® again as one of a small number of facilities-based providers of enterprise-equipped web hosting, colocation, and cloud to federal, state, and local government institutions. This contract allows Superb to be offered through the federal online shopping portal, GSA Advantage.

In September, Superb Internet again made a name for itself as an industry bellwether with an award from the General Services Administration to sell pre-authorized Cloud Computing Services to government agencies at all levels. This became possible through an invitation for Superb to sell SIN 132-40, the most recent update to GSA IT Schedule 70. With this award, Superb became the very first organization to offer high-demand cloud solutions through the official preapproved federal platform that speeds up the procurement process for government IT buying.

These types of approvals, along with the various compliance mechanisms, have won the company clients such as the United Nations, World Health Organization and various agencies at all levels of government.

Today: Starting a third decade of hosting service

Now, this story may have sounded like it was all about us; but keep in mind, all of the above steps were taken to provide better service. Since Superb Internet Corporation’s formation in 1996, our company has been built on the satisfaction and loyalty of our customers, as directed by the leadership of Haralds Jass.

When we initially transitioned to our own datacenters and network, we had 400 servers. Today we have more than 10,000. In other words, we have quite literally expanded more than twenty-five-fold just in terms of the number of machines we own. Through our infrastructure, we host hundreds of thousands of websites.

As our upward trajectory continues, it is our commitment to provide the best possible service that keeps Superb Internet Ahead of the Rest®. In fact, “Customers First” is a core operational principle at Superb; as Jass often says internally, “Without our customers, there is no us.” On July 23, we will celebrate twenty years of treating customer pain-points just as we alleviated our own.

How Open Government Data and Cloud Computing Create Value

Saving Money

  • Innovations in Power and Information
  • Better Insights
  • The Human Side of Technology
  • Continuing the Push
  • The Right Cloud

Innovations in Power and Information

In recent years, the federal government has moved to adopt open data and cloud technology. Open data makes it easier for governmental offices: data access is more affordable, and it is simple to make information publicly available. Cloud computing renders the costs of IT infrastructure more manageable and creates an environment within which big data analytics can allow agencies to technologically address complex issues.

“Cloud computing and open data take two previously costly inputs—computing power and information—and make them dramatically cheaper,” explained Center for Data Innovation analyst Joshua New. “Government agencies invest large amounts of capital and time to build and manage their own data centers and IT infrastructure.”

In other words, the cloud makes planning for the future dramatically more flexible. When only traditional IT was available, it was necessary to establish capacity by predicting how many resources would be needed in the coming months and years. Changing the capacity was complicated, so organizations ended up setting up systems with extra resources so that they would not run into a wall.

The cloud makes it possible for the federal government to adjust as it goes, scaling up and down in tune with demand, which is both more energy-efficient and more cost-effective. And since government offices now have to make data available on the Internet in machine-readable formats, the public sector has drastically reduced the time it spends transferring data out and taking it in: there is no need to individually transfer outgoing data, and no need to request data that is immediately accessible.

Better Insights

Since both open data and cloud computing enhance the government’s information-sharing capabilities, these two technologies have made it easier to work with the data and learn from it.

One great example is the Consumer Sentinel Network. The network, managed by the Federal Trade Commission, is a coalition of dozens of governmental agencies at all levels. They share information when individual complaints are made about companies, such as fraudulent telemarketing offers, do-not-call violations, and credit scams.

Cloud systems don’t just make intergovernmental sharing easier; they also make it easier for the public sector to share information with businesses and American citizens.

One example of a major cloud migration is being conducted by the National Oceanic and Atmospheric Administration (NOAA). The agency expects its total data storage to increase by 90,000 TB annually beginning in 2020, and that data could not be publicly available without the cloud.

“NOAA expects the scalability and ease of deployment of these cloud solutions will help reduce the bottleneck effect that limited government IT infrastructure can have on organizations and businesses that rely on government data,” said New.

The Human Side of Technology

With the increase in data capacity, governmental offices can improve their position related to tech talent. After all, private industry often outpaces the public sector because business has historically been able to pay more than the government has for individuals who are highly skilled at data. There are relatively few people who are experts in data science, and every organization wants the top people so that they can benefit from processing their data in meaningful ways.

With cloud computing and open data, the public sector is better able to compete. In 2014, the General Services Administration created 18F, an office charged with enhancing federal IT services.

“18F hosts the competitive Presidential Innovation Fellows program,” New commented, “which attracts highly skilled technologists to improve government services with open data, such as making education [more] accessible and improving opportunities for private sector entrepreneurs.”

The US government has also been trying to better connect with top talent through conferences and events, such as Health Datapalooza and hacking events geared toward finding solutions for public challenges. The 2014 National Day of Civic Hacking addressed more than three dozen national and international problems by leveraging open data.

Continuing the Push

As you can see, both open data and cloud have much to offer the public sector and the American people. Many federal systems have been migrated at this point.

“The US Government is spending a considerable amount of its budget on cloud services,” said technology journalist David Hamilton. “US agencies are expected to spend about US$3 billion on cloud projects in fiscal 2014 (which began October 1, 2013), which is around $800 million more than officials predicted in 2013.”

However, the transition to cloud is still far from complete. Many agencies have only transferred their email and storage systems at this point, for instance. Two initiatives, Cloud First and the Open Government Directive, have proven incredibly beneficial, but the benefits will multiply as these technologies continue to see broader use.

The Right Cloud

Cloud computing has many advantages for government and business. However, it's important to remember that cloud technology is not uniform. You want a cloud service provider that offers Passmark-rated performance. Passmark is the only objective comparison you can use to determine actual CPU performance, since gigahertz and other headline specs are not comparable across CPU generations.
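The kind of objective comparison described above can be sketched as a simple Passmark-per-dollar calculation. The provider names, scores, and prices below are hypothetical:

```python
# Hypothetical provider specs: Passmark score per VM and monthly price (USD).
providers = {
    "Provider A": {"passmark": 9500, "price": 40.0},
    "Provider B": {"passmark": 6200, "price": 25.0},
}

def passmark_per_dollar(specs: dict) -> float:
    """Objective price-performance: Passmark points per dollar per month."""
    return specs["passmark"] / specs["price"]

# The provider with the most benchmark points per dollar wins.
best = max(providers, key=lambda name: passmark_per_dollar(providers[name]))
```

Note that the cheaper, lower-scoring Provider B wins here (248 points per dollar versus 237.5), which is exactly why headline specs alone can mislead.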

Spin up your Passmark-rated cloud VM today.

By Kent Roberts

Pluses and Minuses of India’s Interoperability of Things


  • As Cloud Computing Rises, Interoperability of Things Advances in India
  • Gameplan Lacks International Charm
  • India of Things Caves In on Itself
  • The Road to True Interoperability

As Cloud Computing Rises, Interoperability of Things Advances in India

The business world is great at innovating to create particular devices and systems that will support intelligent technology in residential, governmental, and industrial environments. However, the headway made by tech companies is disorganized. Of course it is: the free market rewards competition, not the cooperation needed to advance interoperability. In fact, lack of interoperability is already a major problem worldwide, as evidenced by American healthcare: 50% of registered nurses say that they have witnessed a medical error because devices were not integrated.

India is attempting to meet this challenge head-on by creating an established national plan to build the Internet of Things in a manner that will enhance the ability of data to flow seamlessly and securely between devices.

“In what amounts to the world’s first national strategy for the Internet of Things,” explains Center for Data Innovation analyst Joshua New, “the roadmap outlines the framework for a comprehensive, systematic approach to support digitization efforts in India, particularly the recently approved plan to build 100 smart cities across the country.”

Gameplan Lacks International Charm

The Indian plan establishes guidelines that would increase the pace at which smart machines would be ready for broad use – such as widening the bandwidth that the technologies could use, devising standards for integration, and going easy on regulations. The downside is that a number of isolationist “India first” rules delineated in the strategy would make it more difficult to incorporate devices from other countries and necessitate that the Indian version of IoT would only be powered by India-based servers.

That India is moving forward with the first-ever Internet of Things national roadmap is essentially positive for data innovation. However, partitioning itself off from the rest of the world is a major mistake if the nation wants to take as much advantage as it can of the financial and civic potential of connected devices.

The gameplan correctly targets integration challenges as paramount in the development of the IoT market. It identifies various methods to simplify the ability of networks and smart machines to interact. One simple step that’s being taken is that the country’s Telecommunication Engineering Center is developing standards in order to certify any technologies that are built ready-made for interoperability. This move is a good thing, provided that international standards are used to build the India-specific ones so that companies building machines can make sure they are useful worldwide.

In addition, says New, the plan advises that all IoT services used by local government, such as mass transit and trash removal, operate through Internet Protocol (IP) to avoid data lock-in. “The roadmap also identifies the need to ensure that wireless spectrum is available for the increasing amount of devices in the Internet of Things,” he adds, “though at this point the roadmap only commits to exploring the issue further and allocating some licensed spectrum bands for experimentation purposes on a limited basis.”

A significant amount of the guidelines are dedicated to building an Internet of Things specific to India, as indicated above. Venture capital and incubation entities will be created to spark more private interest in the industry. The Telecommunication Engineering Center will create testing centers to accelerate the rate at which products are certified. Since any technology also requires people who understand it for development and support, the country's National Telecom Institute for Policy Research, Innovation, and Training will write training manuals specifically for the connected environment.

India won’t just jump headfirst into the Internet of Things but will instead start with 15 smart city test projects. Additionally, the Center of Innovation (created in 2012) will be tasked with management – handling regulations, tying in international interests, and fostering research.

India of Things Caves In on Itself

The protectionism exhibited by India is a serious problem. It really goes against the entire idea of interoperability while supposedly attempting to tackle that issue.

“Under the guise of encouraging interoperability, India plans to require import licenses for short distance and low power transmitting devices,” says New, “which could potentially allow the Indian government to charge foreign companies extortive fees to access Indian markets, or block them entry altogether.”

The governmental plan also states that specific types of smart machines, including those with GPS and PGHD (patient-generated health data) capabilities, should be drawn within the framework of the Preferential Market Access (PMA) policy, which represents another instance of “India-first” over interoperability.

It makes sense that the government of India is interested in building up businesses within its own borders. However, PMA is not likely to build the Indian economy because, due to the lack of competition it allows, it will promote more expensive products that don't work as well. Confining Internet of Things infrastructure within Indian borders is also problematic, preventing industry from finding the best and most affordable ways to store and process data.

The Road to True Interoperability

The road to interoperability is fundamentally cooperative. Companies with the expertise to build the Internet of Things must work in tandem. One of the simplest and most reliable ways to move toward interoperability is with standards, so that technologies are all moving toward the same level playing field.

That’s why it’s important to build your IoT project with a cloud provider that is fundamentally dedicated to national and international IT standards.

By Kent Roberts

House Goes Orwellian in Response to Recent Hacks

Three perspectives on the PCNA and CISA, which many believe are simply ways to broaden the powers of the government to collect digital information and advance a 1984-ish, Orwellian agenda.

  • PCNA Passes
  • TechCrunch – They had no choice, folks
  • ACLU/Wyden – “Cybersecurity” bills just sneaky ways to expand spying
  • EFF – Not buying the propaganda
  • Prioritizing Security AND Privacy

PCNA Passes

On April 22, the House of Representatives voted in favor of the Protecting Cyber Networks Act, which passed by a wide bipartisan margin: 307-116. The stated intention of the bill is to get pesky laws out of the way, facilitating transfer of security details between American businesses. In turn, the idea goes, we can reduce vulnerabilities and prevent breaches – such as the ones perpetrated against Sony, Anthem, global banks, and the US State Department.

The bills are backed by President Obama and some professional organizations, a few of which are IT-specific. Here are some perspectives.

TechCrunch – They had no choice, folks

“Privacy advocates have criticized information-sharing bills as surveillance bills by another name,” explained TechCrunch. “They worry that sharing cyber threat information with the government will give surveillance agencies even more access to citizens’ personal information.”

Those who really think the bill is just absolutely great and not a horrible threat to personal freedom say that an amendment was tacked to the bill so that individual rights are better protected – in other words, it pays better attention to the issue of data privacy than does the Cybersecurity Information Sharing Act (CISA).

The Protecting Cyber Networks Act removes certain data related to individual users on both ends, both within the business and within the government system collecting the information. Advocates also say that the legislation is better designed to prevent government abuse. The National Security Agency ostensibly won’t see the information, for instance.
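In code, that kind of data minimization might look like the following sketch. The field names are hypothetical, not the bill's actual schema:

```python
# Hypothetical set of fields that identify an individual user.
PERSONAL_FIELDS = {"name", "email", "home_address", "ssn"}

def scrub(record: dict) -> dict:
    """Drop personally identifying fields before a threat record is shared.

    Keeps only the technical indicators (IPs, hashes, signatures, etc.)
    that are actually useful for defending against an attack.
    """
    return {k: v for k, v in record.items() if k not in PERSONAL_FIELDS}
```

Under the PCNA, a step like this would notionally happen twice: once before a business shares a record, and again at the government portal receiving it.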

While the PCNA takes privacy into better account than CISA does, TechCrunch cited critics who believe it’s still going to mean that the government is getting a green light to be a peeping Tom in the window of our digital lives. Those opponents argue that it will be easy for the federal government to misuse the data. CISA didn’t get to the floor of the Senate for a vote in 2014 because privacy was the fundamental concern.

Now, everyone is filled with fright about hacking – and for good reason. In the last year, Sony Pictures was viciously attacked; the White House and State Department were both hacked; and almost 80 million users’ information was stolen from the nation’s second-largest health insurer, Anthem.

“In the past, Congress overlooked the issue of cybersecurity because it faced no public pressure to address it,” argued TechCrunch. “But after these high-profile hacks, it has backed itself into a corner where it has no option but to pass legislation that will address them.”

Were they really backed into a corner, though? Let’s look at a couple of other perspectives.

ACLU/Wyden – “Cybersecurity” bills just sneaky ways to expand spying

No one in Washington or anywhere could really disagree that data security is a serious threat that must be handled immediately. However, as the ACLU sees it, simply stating that a bill is intended to help with security as a publicity stunt isn’t the same thing as passing a security bill. While national leadership could focus on truly improving security, according to the ACLU, this bill is a bait-and-switch that runs roughshod over personal privacy rules, possibly sending huge amounts of personal data to federal agencies – such as the NSA.

Remember the Cybersecurity Information Sharing Act, the bill that's even worse than the PCNA? That bill passed out of the Senate Intelligence Committee with flying colors as well, 14-1. Oregon Democratic Sen. Ron Wyden (remember, these people are technically allowed to go against the party line) cast the only vote against the bill.

“I am concerned that the bill the U.S. Senate Select Committee on Intelligence reported today lacks adequate protections for the privacy rights of American consumers,” commented Wyden, “and that it will have a limited impact on U.S. cybersecurity.”

The basic message shared by Wyden and the ACLU is that there is no reason for us to have to give up our privacy in the name of security.

EFF – Not buying the propaganda

The Electronic Frontier Foundation, like the ACLU, is fundamentally dedicated to protecting the civil liberties of the individual user. The organization banded together with more than four dozen like-minded groups that submitted letters to Congress in opposition to the two bills.

As indicated in the letter signed by the EFF, CISA and the PCNA are spying bills, not security bills; and the former legislation is particularly troubling.

“CISA would significantly increase the National Security Agency’s (NSA) access to personal information,” stated the EFF, “and authorize the federal government to use that information for a myriad of purposes unrelated to cybersecurity.”

Prioritizing Security AND Privacy

Security and privacy aren’t at odds.

At Superb Internet, we care deeply about meeting the compliance and security needs of our clients, as indicated by our various national and international certifications. However, we are also fundamentally dedicated to the privacy of our users – just take a look at how much they like us.

By Kent Roberts

Photo via Wikipedia

“Black Lives Matter” Pushes Forward the Body-Camera Cloud

  • Oakland – Building the Body-Camera Cloud
  • Why Body Cameras?
  • Why Cloud Storage?
  • Everyday Compliance with Body Cameras
  • The Obvious Choice

Oakland – Building the Body-Camera Cloud

Oakland is one of the primary strongholds of the “Black Lives Matter” movement. The civil rights project started in response to a pair of grand jury decisions in New York and Missouri to let police officers walk after the deaths of two black men at the hands of police were captured on video and distributed online. Most recently, 80 to 100 protesters shut down the northbound lanes of Interstate 80 as part of a coordinated, nationwide response to the death of Walter Scott, an unarmed African-American man shot by white police officer Michael Slager in South Carolina.

Now, many police officers are good people and aren’t out to get anyone. My cousin is a police officer in Colorado, for instance. I also have family serving in the Ohio State Highway Patrol. But clearly, accountability is needed.

One change that many believe could help improve accountability among police officers, as well as exonerate those unfairly accused of wrongdoing, is body cameras. In other words, we have technology that prohibits people from misleading us on either side, so why don’t we use it? Oakland is actually one of the cities at the forefront of the transition to body cameras for greater collection of evidence and monitoring of law enforcement.

The Oakland Police Department is currently testing out a cloud storage solution that can serve as an archive for the department, so that the city isn't buried under an avalanche of on-site video.

Why Body Cameras?

Body cameras have become more prevalent in 2015 as the Black Lives Matter efforts continue in response to news of additional, questionable black deaths. In December 2014, President Barack Obama requested funding from Congress for 50,000 body cameras to be used by police departments around the country.

According to Oakland police officer Dave Burke, body cameras are helpful for gauging the behavior not just of officers but of people being arrested as well.

Former Oakland Mayor Jean Quan said that the police department was awash in more than 2200 complaints of excessive force in 2009. In 2014, however, with more than 600 body cameras in operation, those complaints plummeted to 572. The statistics suggest that roughly three out of every four excessive-force complaints (a 74% drop) are being prevented simply by turning on cameras. Another stat is similarly compelling: before body cameras, the Oakland Police Department was involved in an average of eight shootings annually. Last year, that number was zero.
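The 74% figure can be checked with simple arithmetic:

```python
# Excessive-force complaints before (2009) and after (2014) body cameras.
before, after = 2200, 572
drop = (before - after) / before
# drop == 0.74: roughly three out of every four complaints disappeared.
```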

Why Cloud Storage?

Think about walking around all day with a camera capturing everything that you do. Now imagine everyone on your workforce having the same technology that feeds in their own shift-long perspectival shots. What we are talking about is a huge amount of data. In fact, 12 months of video from just one camera can be in the terabyte range (in other words, thousands of gigabytes).
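The terabyte estimate above can be sanity-checked with a back-of-envelope calculation. The bitrate, hours per day, and number of workdays below are assumptions for illustration, not actual body-camera specs:

```python
def yearly_storage_tb(bitrate_mbps: float = 2.0, hours_per_day: float = 8.0,
                      days: int = 260) -> float:
    """Rough yearly footage volume for one body camera, in terabytes."""
    seconds = hours_per_day * 3600 * days          # recorded seconds per year
    bytes_total = bitrate_mbps * 1e6 / 8 * seconds # bits/s -> bytes, times duration
    return bytes_total / 1e12                      # bytes -> TB
```

At roughly 2 Mbps for an 8-hour shift over 260 workdays, a single camera produces close to 2 TB a year, which is why multiplying by an entire force quickly overwhelms on-site storage.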

Cloud storage could offer a cost-effective method to make it easy to stow away and easily retrieve videos as needed. However, public cloud typically doesn’t have the security mechanisms to meet the requirements of the FBI’s Criminal Justice Information Services (CJIS) Division – which is necessary to allow police departments around the country to log onto the federal agency’s system.

Now, cloud providers are starting to step forward to meet this growing demand so that law enforcement offices around the country are able to prove their security.

Cloud systems can be extraordinarily helpful to police departments so that they have a robust way to immediately access distinct portions of their video libraries.

“It cuts down on time and also aids in investigations, crime trends and analytics,” explained the Oakland PD's Burke.

Everyday Compliance with Body Cameras

How about this scenario: A police officer goes to a hospital to collect information related to possible child abuse of a young boy. She talks to the boy, the physicians treating him, an official from the school, and the mother and father. The conversations range from classroom behavior to a potential pattern of abuse to IRS wage garnishment that's removing funds directly from the father's paycheck. Essentially, the officer is gathering as much information as she can so that the responsible parent can be arrested and face an airtight, evidence-rich effort from the prosecution.

If the officer is wearing a body camera, video must meet the parameters of numerous laws enacted at various levels of government.

“If that video is not properly stored, managed or disclosed,” argued Government Technology, “its value to the investigation can be compromised, which in turn can have devastating consequences for the people involved in the case.”

The Obvious Choice

Not everyone is gung ho about body cameras. Lynne Martinez, president of the ACLU branch in Lansing, Michigan, commented, “We must make sure these cameras don't violate the rights of victims and are anxious to have a conversation about that.”

However, most people view body cameras as a necessary evil – and the decrease in force complaints at the Oakland PD backs up that perspective.

In terms of storage, cloud is the obvious choice – fast, accessible, and affordable – for the massive amount of data generated by body cameras. However, police departments must consider compliance with stringent security rules.

Just as cloud is the obvious choice for police video storage, we want to be your obvious choice for cloud. Our compliance audits and certifications are wide-ranging. Talk to us today about crafting a solution to meet your needs.

By Kent Roberts

Image via Wikipedia