Category Archives: Technology

How Data Scientists Can Turn Your Big Data into Marketing Magic

Big Data

  • Data Rapidly Takes Over the Earth
  • The Science and Art of Data
  • The Customer is Sooooo Right
  • Listening with Apps and People
  • Big Data Magic Tricks

Data Rapidly Takes Over the Earth

These days, data is large and in charge. It grows and grows, and it can give us extraordinarily valuable insights. Big data is allowing for more intelligent choices, better efficiency, stronger interaction with customers, disease prevention, and stock market predictive models. Someday big data analytics could replace us behind the steering wheel, and it can already trump us in the Daily Double.

Businesses now have an extraordinary amount of information. Data is so prevalent that it is almost overwhelming. Look at it this way: the amount of data created in 2020 is expected to be 44 times the amount created in 2009. How do we turn all that data into meaningful insights?

The Science and Art of Data

Analytic tools are becoming more sophisticated, but businesses that want skilled data scientists should open their safes and get ready to start handing over some dough. Smart businesses will invest in both technology and human capital so they can access the right data, gathered and sorted for maximum impact. You don’t just need insights, though; you also need the capacity to explain what you found succinctly and engagingly. What’s your data story? You want your data analyst to be both a scientist and an artist. Combine the two, and you have marketing magic.

McKinsey forecasts that in just three years, “the United States alone could face a shortage of 140,000 to 190,000 people with deep analytical skills as well as 1.5 million managers and analysts with the know-how to use the analysis of big data to make effective decisions.”

Is your office looking for a data scientist? The Oval Office already did. Early in 2015, the Obama administration welcomed the newly appointed US Chief Data Scientist. Also this year, the University of Rochester created its own Institute for Data Science. A poll by recruitment agency Burtch Works confirms the obvious: demand for data scientists is higher than ever, and employers are willing to pay big bucks to get them.

The first place that you want a data scientist to specialize is marketing, argues Brian Kardon, CMO of marketing analytics company Lattice Engines: “In marketing,” he says, “effectively using data to understand customers and predict buying behavior can make the difference between a winning customer experience and a failing one that lives forever on social channels.”

The Customer is Sooooo Right

Okay, customers are often wrong. However, increasingly sophisticated digital environments mean that they are more empowered than ever before to find solutions that match their expectations:

  • 57% of the buying process happens through buyers’ own independent research
  • 70% of the sales funnel is traversed before a prospect ever speaks with a salesperson

It’s all about the content, and it’s critical that the content is customized (as research has shown repeatedly). Data analytics will allow you to find the best prospects and customize for them in the most effective ways.

“Big data is perhaps the most important way to create a user experience that treats customers in the way that they want to be treated – like individuals,” says Distilled outreach director Adria Saracino. “[I]f you listen properly, your data will tell you vital information about your customers and clients.”

Listening with Apps and People

Clearly you want to leverage automation with predictive apps. Software helps to filter your leads and gives you immediate intelligence. That intelligence is meaningless without expertise on staff, though.

A well-trained and thoughtful data scientist can straddle the line between IT and business. They should be grounded in math, of course, so that they are aware of all elements of the analytic algorithms. Their background should help them tweak equations to fine-tune campaigns and build sales.

Data scientists live in the land of intent data, behavioral data, and fit data:

  • Intent data – keywords and website tracking
  • Behavioral data – content and emails users access
  • Fit data – company characteristics and credit score

Integrating these three elements creates incredible customer profiles.
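
To make that integration concrete, here is a minimal, hypothetical Python sketch that folds intent, behavioral, and fit signals into a single profile and computes a naive composite score. The field names, weights, and scoring rules are illustrative assumptions, not any vendor's actual model.

```python
from dataclasses import dataclass, field

@dataclass
class CustomerProfile:
    """Toy profile combining intent, behavioral, and fit data (illustrative only)."""
    company: str
    intent: dict = field(default_factory=dict)     # e.g. keywords searched, pages tracked
    behavior: dict = field(default_factory=dict)   # e.g. content downloads, email opens
    fit: dict = field(default_factory=dict)        # e.g. company characteristics, credit score

    def score(self) -> float:
        # Naive weighted score; a real model would be fit to historical conversions.
        intent_score = min(len(self.intent.get("keywords", [])) * 10, 40)
        behavior_score = min(self.behavior.get("emails_opened", 0) * 5
                             + self.behavior.get("whitepapers_downloaded", 0) * 10, 40)
        fit_score = 20 if self.fit.get("credit_score", 0) >= 650 else 5
        return float(intent_score + behavior_score + fit_score)

profile = CustomerProfile(
    company="Acme Co",
    intent={"keywords": ["cloud hosting", "HIPAA compliance"]},
    behavior={"emails_opened": 3, "whitepapers_downloaded": 1},
    fit={"employees": 120, "credit_score": 700},
)
print(profile.company, profile.score())   # Acme Co 65.0
```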

Data scientists can also devise and interpret rigorous split tests for better results.
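
As an illustration of the statistics behind split testing, the sketch below runs a standard two-proportion z-test on made-up conversion counts, using only Python's standard library.

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical results: variant A converts 58/1000 visitors, variant B converts 85/1000.
z, p = two_proportion_z_test(58, 1000, 85, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p suggests the difference is unlikely to be chance
```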

Again, according to Kardon, you need both the software and the human touch to create marketing magic. “Without technology, we would lack the clouds of data that inform outreach decisions,” he says. “Without data science, we would lack the insights that escape algorithmic automation, and the ability to translate data insights into effective decisions.”

Big Data Magic Tricks

As big data becomes more fundamental to business success, you need to hire someone with a cape and a magic wand: a data scientist (a data magician?). You also need the technology to set the stage for that individual to perform.

Our cloud stage offers major performance and reliability advantages (true 100% HA) over others that use mainframe-era centralized storage and inferior Ethernet-based networking technology.

By Kent Roberts

Pluses and Minuses of India’s Interoperability of Things

India

  • As Cloud Computing Rises, Interoperability of Things Advances in India
  • Gameplan Lacks International Charm
  • India of Things Caves In on Itself
  • The Road to True Interoperability

As Cloud Computing Rises, Interoperability of Things Advances in India

The business world is great at innovating to create particular devices and systems that will support intelligent technology in residential, governmental, and industrial environments. However, the headway made by tech companies is disorganized. Of course it is: the free market rewards competition, not the cooperation needed to advance interoperability. In fact, lack of interoperability is already a major problem worldwide, as evidenced by American healthcare: 50% of registered nurses say they have witnessed a medical error because devices were not integrated.

India is attempting to meet this challenge head-on by establishing a national plan to build the Internet of Things in a manner that lets data flow seamlessly and securely between devices.

“In what amounts to the world’s first national strategy for the Internet of Things,” explains Center for Data Innovation analyst Joshua New, “the roadmap outlines the framework for a comprehensive, systematic approach to support digitization efforts in India, particularly the recently approved plan to build 100 smart cities across the country.”

Gameplan Lacks International Charm

The Indian plan establishes guidelines that would increase the pace at which smart machines are ready for broad use – such as widening the bandwidth available to these technologies, devising standards for integration, and going easy on regulations. The downside is that a number of isolationist “India first” rules delineated in the strategy would make it more difficult to incorporate devices from other countries and would require that the Indian version of the IoT be powered only by India-based servers.

That India is moving forward with the first-ever Internet of Things national roadmap is essentially positive for data innovation. However, partitioning itself off from the rest of the world is a major mistake if the nation wants to take as much advantage as it can of the financial and civic potential of connected devices.

The gameplan correctly targets integration challenges as paramount in the development of the IoT market. It identifies various methods to simplify how networks and smart machines interact. One simple step already underway is that the country’s Telecommunication Engineering Center is developing standards to certify technologies built ready-made for interoperability. This move is a good thing, provided that the India-specific standards are built on international ones so that device manufacturers can ensure their machines are useful worldwide.

In addition, says New, the plan advises that all IoT services used by local government, such as mass transit and trash removal, operate through Internet Protocol (IP) to avoid data lock-in. “The roadmap also identifies the need to ensure that wireless spectrum is available for the increasing amount of devices in the Internet of Things,” he adds, “though at this point the roadmap only commits to exploring the issue further and allocating some licensed spectrum bands for experimentation purposes on a limited basis.”

A significant portion of the guidelines is dedicated to building an Internet of Things specific to India, as indicated above. Venture capital and incubation entities will be created to spark more private interest in the industry. The Telecommunication Engineering Center will create testing centers to accelerate the rate at which products are certified. Since any technology also requires people who understand it for development and support, the country’s National Telecom Institute for Policy Research, Innovation, and Training will write training manuals specifically for the connected environment.

India won’t just jump headfirst into the Internet of Things but will instead start with 15 smart city test projects. Additionally, the Center of Innovation (created in 2012) will be tasked with management – handling regulations, tying in international interests, and fostering research.

India of Things Caves In on Itself

The protectionism exhibited by India is a serious problem. It really goes against the entire idea of interoperability while supposedly attempting to tackle that issue.

“Under the guise of encouraging interoperability, India plans to require import licenses for short distance and low power transmitting devices,” says New, “which could potentially allow the Indian government to charge foreign companies extortive fees to access Indian markets, or block them entry altogether.”

The governmental plan also states that specific types of smart machines, including those with GPS and PGHD (patient-generated health data) capabilities, should be drawn within the framework of the Preferential Market Access (PMA) policy, which represents another instance of “India-first” over interoperability.

It makes sense that the government of India is interested in building up businesses within its own borders. However, PMA is unlikely to strengthen the Indian economy: by limiting competition, it will promote more expensive products that don’t work as well. Requiring that Internet of Things infrastructure stay within Indian borders is also problematic, since it keeps industry from finding the best and most affordable ways to store and process data.

The Road to True Interoperability

The road to interoperability is fundamentally cooperative. Companies with the expertise to build the Internet of Things must work in tandem. One of the simplest and most reliable ways to move toward interoperability is standards, so that all technologies are working from the same level playing field.

That’s why it’s important to build your IoT project with a cloud provider that is fundamentally dedicated to national and international IT standards.

By Kent Roberts

Robot Report: The Children of the Cloud are Coming to Get You

Robots

NOTE: This is the second part of a two-part article…to read Part 1, please click HERE.

  • Fast, Cheap, and Out-of-Control? [continued]
  • Hide Your Kids from the Children of the Cloud
  • Can We Stop Robots-Gone-Wild?
  • Aligning Yourself with the Robot Future

Fast, Cheap, and Out-of-Control? [continued]

It’s becoming more apparent all the time that security practices at many organizations cannot withstand the increasing sophistication of the threat landscape. If companies take the same approach with robots, the negative possibilities will be much more substantial (again, think of the self-driving car).

The reason that the Internet of Things is such a dicey climate for security has to do with the many points of access it allows.

“An Internet-connected robot is still a secure control environment,” says Cooper.

However, the sensors that gauge temperature throughout a manufacturing facility are not nearly as sophisticated and are easier to trick. Just as with a hacker going through a coffeepot to get to a homeowner’s PC, cybercriminals could go through the sensors to make the robot perform incorrectly. A hacker could send inaccurate temperature readings to a robot that would cause it to weld for a longer or shorter period of time, botching the task.

IT security has not completely figured out how to know when this type of threat is active – in other words, when a robot or any computer should and should not trust data feeding in from the Web and Web-enabled sensors.

To determine whether one sensor’s information is trustworthy, the first step is to integrate data from many of them.

“If one sensor records a drastically different temperature than the other sensors do, or if that one sensor is supposed to be in the US, and all of a sudden its DNS registry is in Romania,” explains Cooper, “attackers may be spoofing it.”
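
As a rough sketch of that cross-checking idea, the snippet below compares each sensor against the median of its peers and flags large deviations or unexpected regions. The thresholds, field names, and readings are all hypothetical.

```python
from statistics import median

# Hypothetical sensor feed: (sensor_id, temperature_c, registered_region)
readings = [
    ("s1", 212.4, "US"),
    ("s2", 211.9, "US"),
    ("s3", 212.7, "US"),
    ("s4", 380.0, "RO"),   # drastically different reading, registry now in Romania
]

EXPECTED_REGION = "US"
MAX_DEVIATION_C = 10.0     # tolerance before a reading is treated as suspect

baseline = median(temp for _, temp, _ in readings)

for sensor_id, temp, region in readings:
    suspect = []
    if abs(temp - baseline) > MAX_DEVIATION_C:
        suspect.append(f"reading {temp} deviates from median {baseline}")
    if region != EXPECTED_REGION:
        suspect.append(f"registered in {region}, expected {EXPECTED_REGION}")
    if suspect:
        print(f"possible spoofing on {sensor_id}: " + "; ".join(suspect))
```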

Hide Your Kids from the Children of the Cloud

The insightful information flowing through environmental sensors and shared between numerous robots, many of which will be built by different companies with their own proprietary code, isn’t going to be easy for companies to safely handle.

Even our living spaces will become aware with the advent of the smart home. The smart home is made up of various devices, such as the Roomba, coming to certain conclusions based on their programming and the data available. The home is quickly becoming smarter, with Cooper saying that it will be aware by 2025.

For instance, the smart home will take its standard services, along with an understanding of the location of household members and their current activities, to let the Roomba know it should get out of the room, or to inform the assisted living system that items should be taken out of the way of a patient, or to instruct a robot butler to bring you your pipe and slippers.
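
A toy sketch of that kind of orchestration logic follows; the rooms, occupants, activities, and device commands are all invented for illustration.

```python
# Combine occupancy and activity data to direct household devices (illustrative only).
occupancy = {
    "living_room": {"person": "Avery", "activity": "watching_tv"},
    "kitchen": {},   # empty room
}

def route_devices(occupancy: dict) -> list:
    commands = []
    for room, status in occupancy.items():
        if status:  # someone is in the room: keep the Roomba out, adjust the environment
            commands.append(f"roomba: stay out of {room}")
            if status.get("activity") == "watching_tv":
                commands.append(f"lights: dim {room}")
        else:       # empty room: safe to clean
            commands.append(f"roomba: clean {room}")
    return commands

print(route_devices(occupancy))
# ['roomba: stay out of living_room', 'lights: dim living_room', 'roomba: clean kitchen']
```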

In some ways, the home and the workplace will become further integrated over the next 10 years. Young creative professionals will be supported by a virtual assistant both in the workplace and at home that will handle administrative tasks, said the chief executive of an AI firm. “That … collection of distributed software … will answer phones, schedule appointments …, manage the care and maintenance of that person’s living quarters and work environment, do the shopping and (where appropriate) be responsible for managing that person’s financial life,” she said.

In order for the smart home and smart office to anticipate the needs of their occupants, base services must be accessible and flow through the system as a kind of artificial awareness. That need for awareness also makes the system more vulnerable both to cybercrime and to interoperability glitches.

The only way to reasonably approach this challenge is to create a complex environment within which to determine and manage data provenance – where the data has been, how it has been manipulated, and how other systems have processed it from a security standpoint. By understanding the source of data and essentially giving it a credibility check, the vast majority of sensor spoofing could be rendered ineffective.
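
Here is a minimal sketch of what such a provenance and credibility check could look like, assuming each hop in the data's path appends a keyed signature. The key handling, record layout, and helper names are illustrative assumptions, not a reference design.

```python
import hmac, hashlib, json

SECRET_KEY = b"shared-provenance-key"   # in practice each trusted hop would have its own key

def sign(record: dict) -> str:
    payload = json.dumps(record, sort_keys=True).encode()
    return hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()

def append_hop(chain: list, hop: str, payload: dict) -> None:
    """Record where the data has been and how it was handled, with a signature."""
    record = {"hop": hop, "payload": payload, "prev": chain[-1]["sig"] if chain else None}
    chain.append({**record, "sig": sign(record)})

def verify(chain: list) -> bool:
    """Re-derive every signature and confirm each entry points at its predecessor."""
    prev_sig = None
    for entry in chain:
        record = {k: v for k, v in entry.items() if k != "sig"}
        if record["prev"] != prev_sig or sign(record) != entry["sig"]:
            return False
        prev_sig = entry["sig"]
    return True

chain: list = []
append_hop(chain, "drill-head-sensor", {"temp_c": 212.4})
append_hop(chain, "factory-gateway", {"filtered": True})
append_hop(chain, "cloud-analytics", {"anomaly": False})
print(verify(chain))                       # True
chain[1]["payload"]["filtered"] = False    # tamper with the middle hop
print(verify(chain))                       # False
```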

Can We Stop Robots-Gone-Wild?

The robots and artificial intelligence that characterize the Internet of Things will be omnipresent in just 10 years, according to Pew Internet. The question is how companies will build systems to address the new landscape.

David Geer thinks that many firms will build their data provenance models in the cloud. By spreading that awareness through the cloud, every individual element of the physical environment, such as a drill head on a manufacturing floor, would benefit. “The cloud could take that drill head data output, perform some additional intelligence analysis on it, and provide that back to the cloud and down to the drill head to capture and provide provenance about data.”

In this way, a data provenance model could provide security right at the points where information is captured.

Aligning Yourself with the Robot Future

The Internet of Things will be a further evolution of the third platform – using the technologies of mobile, cloud, and big data to make every aspect of our lives easier to manage.

To start building and testing your own data provenance system, choose a hosting provider with a broad spectrum of independent certifications for your PassMark-rated cloud server.

By Kent Roberts

Robot Report: The Children of the Cloud Are Coming to Get You

Robots

Robots are about to see their heyday, operating through the cloud-served Internet of Things. Wait, is this the climax of their master plot to tear apart the fabric of our civilization? In Nebraska, the nightmare has moved from the corn to the cloud.

  • The Robots are Our Cloud Saviors
  • The Nightmare is in the Cloud
  • The Internet of Autonomous Control Loops
  • Fast, Cheap, and Out-of-Control?
  • Hiding in Your Bunker or Ahead of the Robot Curve

The Robots are Our Cloud Saviors

Many people expect artificially intelligent, big-data-driven robots to become much more prevalent over the next decade.

“By 2025, artificial intelligence will be built into the algorithmic architecture of countless functions of business and communication,” argues City University of New York entrepreneurial journalism director Jeff Jarvis. “If robot cars are not yet driving on their own, robotic and intelligent functions will be taking over more of the work of manufacturing and moving.”

The designers of these robots are building them with Internet of Things capabilities to improve how they operate. They connect with Wi-Fi, take advantage of big data analytics, integrate with open-source systems, and exhibit machine learning, said UC Berkeley Prof. Ken Goldberg. IoT sensors allow the machines to gauge temperature and sense vibrations, achieving better control of their actions so that they can perform the tasks for which they were created (such as surgery, home cleanup, and autonomous transportation).

The Nightmare is in the Cloud

It all sounds pleasant and helpful. But in Nebraska and elsewhere, some say that the nightmare has moved from the corn to the cloud.

As with any new technology, security is a central concern. The Internet of Things is particularly disconcerting to data protection advocates because the attack surface is broad and includes consumer products. For instance, if your coffeepot is connected to the Internet of Things, you don’t want a hacker to be able to get in through your coffeepot and use that access to steal account information from your PC. You also don’t want someone to mess around with your steering wheel – in fact, all it takes is a 14-year-old with $15.

Security is such an unproven element that it has made it difficult for the Internet of Things industry to build momentum, said Prof. L.A. Grieco of Italy’s Politecnico di Bari, who is one researcher bringing the security discussion to the forefront so that the Internet of Things can be responsibly applied to robotics and other fields.

The Internet of Autonomous Control Loops

Many people with labor careers think of robots primarily as a threat to jobs, since these sometimes anthropomorphic “beings” are becoming more prominent in the manufacturing sector. However, applications are much wider than the Industrial Internet.

“We see IoT creating autonomous control loops where components that aren’t considered traditional robots are automated,” explains M2Mi engineering director Sarah Cooper, “delivering closed-loop intelligence on the floor, generally through a connection with the Internet.”

In other words, the Internet of Things will have Midas-like powers that allow it to turn anything into a robot.

The sensors on these robots, built into closed autonomous loops, will gather information about the machine and its surroundings in real time. Fog computing, which brings the cloud down to the ground by building it seamlessly into household and industrial objects, will allow robots to adjust their activities with knowledge about other actors within the Internet of Things and the space in which they operate.

Sophisticated robots will take advantage of sensors that are distributed within the environment in a similar manner to the servers of the cloud. These machines and any computers that control them remotely have three specific needs to function coherently that are still not completely met:

  • More robust interoperability
  • Better distribution of control mechanisms
  • Improved data safeguards

“As IoT matures, we see the industry adding more robotic and AI functions to traditional industrial and consumer robots,” says Cooper. That maturation process will allow the machines to move beyond automation to utilize predictive modeling, machine learning environments, and intricate solutions to immediate issues that arise, she notes, adding, “The autonomous nature of these systems and their often critical function in the larger system make them of particular concern when it comes to security.”

Fast, Cheap, and Out-of-Control?

The 1997 Errol Morris documentary Fast, Cheap & Out of Control focused in part on robotics designer Rodney Brooks. Brooks, a professor with the MIT Artificial Intelligence Lab, provided the film’s title with his paper “Fast, Cheap and Out of Control: A Robot Invasion of the Solar System.”

Security is of such great concern because the Internet of Things will allow hackers to take the reins of robots that are moving around in the world, such as the self-driving car. These fast and cheap cloud devices could easily get out of control.

The Internet of Things is so broadly distributed that patching can become difficult, according to Minnesota Innovation Lab fellow James Ryan: “The ‘patch and pray’ mentality that we see inside many organizations won’t work here,” he says.

Hiding in Your Bunker or Ahead of the Robot Curve

It’s no secret that security is paramount when exploring the Internet of Things. However, exploring IoT while it is still emergent will give frontrunners a competitive advantage.

Get out of your bunker and ahead of the curve with a cloud host that knows what it’s doing – as proven by standards, certifications, and guaranteed, PassMark-rated performance.

NOTE: This is Part 1 of a two-part article…to read Part 2, please click HERE.

By Kent Roberts

Supercomputing vs. Cloud Computing

Supercomputer

What do people do when they have a difficult problem that is too big for one computer processor? They turn to a supercomputer or to distributed computing, one form of which is cloud computing.

  • Processor Proliferation
  • Why People Choose Super vs. Cloud
  • Applications
  • Cloud as a Form of Distributed Computing
  • Cloud is Not All Created Equal

Processor Proliferation

A computer contains a processor and memory. Essentially, the processor conducts the work and memory holds information.

When the work you need to conduct is relatively basic, you only need one processor. If you have many different variables or large data sets, though, you sometimes need additional processors.

“Many applications in the public and private sector require massive computational resources,” explained Center for Data Innovation research analyst Travis Korte, “such as real-time weather forecasting, aerospace and biomedical engineering, nuclear fusion research and nuclear stockpile management.”

For those situations and many others, people need more sophisticated systems that can process the data faster and more efficiently. In order to achieve that, these types of systems integrate thousands of processors.

You can work with a large pool of processors in two basic ways. One is supercomputing. Supercomputers are very big and costly. In that scheme, the computer sits in one location with all of its many processors, and everything flows through the local network. The other way to incorporate many processors is distributed computing. In this scenario – the most widely adopted form of which is cloud computing – the processors can be located in diverse geographic locations, with all communication taking place over the Internet.
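
To make the contrast concrete, here is a minimal Python sketch of splitting a job across several processors with the standard library. Whether the workers live in one supercomputer chassis or on cloud servers in different data centers, the programming idea is the same; the workload here is a trivial placeholder.

```python
from concurrent.futures import ProcessPoolExecutor

def simulate_cell(cell_id: int) -> float:
    """Stand-in for a compute-heavy task, e.g. one grid cell of a weather model."""
    total = 0.0
    for i in range(1, 200_000):
        total += (cell_id % 7 + 1) / i
    return total

if __name__ == "__main__":
    cells = range(32)
    # Each cell is handed to a separate worker process; with more processors
    # (or more machines behind a distributed scheduler), more cells run at once.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(simulate_cell, cells))
    print(f"processed {len(results)} cells, sample result: {results[0]:.2f}")
```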

Why People Choose Super vs. Cloud

Since information moves so quickly between processors in a supercomputer, they can all contribute to the same task. They are a great fit for any applications that require real-time processing. The downside is that they are often prohibitively costly. They are made up of the best processors available, rapid memory, specially designed components, and elaborate cooling mechanisms. Plus, it isn’t easy to scale a supercomputer: once the machine is built, it becomes a project to load in additional processors.

In contrast, one reason that people choose the distributed computing of the cloud is that it is much more affordable. The design of a distributed network can be incredibly elaborate, but hardware components and cooling do not need to be high-end or specially designed. It scales seamlessly: processing power grows as additional servers (with their processors) are added to the network.

On the downside, Korte commented that supercomputers have the advantage of sending data a short distance through fast connections, while distributed cloud architecture requires the data to be sent through slower networks.

However, that is at odds with what supercomputing expert Geoffrey Fox of Indiana University (home of Big Red II) told the American Association of Medical Colleges: “Fox … says the cloud’s spare capacity often enables it to process a researcher’s data faster than a supercomputer, which can have long wait times.”

Applications

When you check the weather ahead of time and are expecting clear skies, it’s easy to be irritated with the meteorologist. However, weather is extraordinarily complex and notoriously difficult to predict.

Often, weather forecasting systems use supercomputers, said Korte. To properly determine how the weather might evolve in a given area, a supercomputer simulation will look at huge datasets containing readings for temperature, wind, humidity, barometric pressure, sunlight, and so on across time. Furthermore, you don’t just want to look at this information locally but globally. To get reasonably accurate answers in real time, you have to process all that data very quickly. Korte argued that a supercomputer is necessary if you want updates in real time, but there are millions of real-time applications hosted in the cloud.

Continuing this line of thinking, Korte said that distributed computing such as cloud is useful particularly for projects that “are not as sensitive to latency.” He continued, “For example, when NASA’s Jet Propulsion Laboratory (JPL) needed to process high volumes of image data collected by its Mars rovers, a computer cluster hosted on [a cloud provider] was a natural fit.”

Cloud as a Form of Distributed Computing

A discussion thread on Stack Overflow addressed the differences between cloud computing and distributed computing.

“[W]hat defines cloud computing is that the underlying compute resources … of cloud-based services and software are entirely abstracted from the consumer of the software / services,” commented elite user Nathan. “This means that the vendor of cloud based resources is taking responsibility for the performance / reliability / scalability of the computing environment.”

In other words, it’s easier since you don’t have to handle the maintenance and support.

Cloud is Not All Created Equal

We will continue this discussion in a second installment; before moving on, consider that describing these categories requires broad strokes. The truth is that there is a lot of disparity in quality between different cloud systems. In fact, many “cloud” providers aren’t actually distributed. That also means they don’t offer true 100% high-availability.

Benefit from InfiniBand (IB) technology and distributed storage, highly preferable to the centralized storage and Ethernet used by many providers. This technological upgrade typically allows you to process data 300% faster with Superb Internet than with Amazon or SoftLayer when comparing VMs with similar specs.

Note: Part Two will be coming soon…stay tuned!

By Kent Roberts

Internet of Things Could Defend Against Climate Change

Climate Change

By cutting down the amount of waste generated and power used by businesses and individuals, the Internet of Things can make a positive impact on climate change.

Key Points:

  • When we don’t focus on making systems efficient, we consume power excessively.
  • Almost 2 billion gallons of gasoline are burned annually by people whose cars are stopped (traffic jams, intersections, etc.).
  • The Internet of Things is not just an investment opportunity but a structured chance to make a difference.

What do you get when you put together cloud computing, big data analytics, mobile technology, and social networks? Answer: the “third platform,” which was essential to the development of the emergent Internet of Things (IoT). Often called machine-to-machine (M2M) communication in technical circles, this broadening of computing to objects throughout the physical world includes mobile apps that monitor your health, cars talking to each other, and homes adjusting lighting and climate control based on the current location of residents.

“At its very core, machine to machine communication is the ability to connect everything, I mean everything, through a vast network of sensors and devices which can communicate with each other,” explained tech finance writer Tyler Crowe.

By taking advantage of cloud technology, M2M is able to process and integrate data in real time, sometimes beating supercomputer speeds. One way that it will be particularly impactful is on the extent to which we use energy and the way in which we interact with those systems. Light switches could become less essential, and we can also slow our release of human-created greenhouse gases.

No matter how substantially you think these gases are impacting our environment, it should be obvious to all of us that carbon pollution is negative, Crowe argued – if you are unconvinced by melting glaciers, consider the smog that’s produced. When we talk about this issue, we are often comparing fossil fuels to wind and solar, but alternative tech alone is insufficient: it won’t be enough to cut emissions to the level that most climatologists believe is necessary to prevent catastrophe.

The focus on wind and solar is therefore ill-advised. We need to think about consumption rather than production of energy, and the Internet of Things – powered through cloud computing – should be central to our strategy.

Changing the Way We Use Energy

How are we inefficient? The list is long, but two primary examples are the gas guzzling from traffic congestion (fuel burned to slow down, idle, and get back up to speed) and oversupply of the power grid. We aren’t trying to overuse energy. Our devices just need to be integrated rather than piecemeal: our technology must become one big, brilliant brain powered by cloud virtual machines.

“In the U.S. alone, 1.9 billion gallons of fuel is consumed every year from drivers sitting in traffic,” said Crowe. “That’s 186 million tons of unnecessary CO2 emissions each year just in the U.S.”

If all cars were able to get data to each other, drivers could be redirected to avoid congestion. As our devices become interoperable, everything changes.
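
A toy sketch of that kind of vehicle-to-vehicle data sharing: cars report their speed on road segments, and a naive router steers new drivers toward the faster route. All segment names, speeds, and defaults are invented for illustration.

```python
# Cars report average speed per road segment; a simple router compares candidate routes.
segment_speeds_kmh = {}   # segment -> list of reported speeds

def report(segment: str, speed_kmh: float) -> None:
    segment_speeds_kmh.setdefault(segment, []).append(speed_kmh)

def route_time_min(route: list, length_km: float = 2.0) -> float:
    """Estimated minutes for a route, using crowd-reported speeds (30 km/h default)."""
    total = 0.0
    for seg in route:
        speeds = segment_speeds_kmh.get(seg, [30.0])
        avg = sum(speeds) / len(speeds)
        total += length_km / avg * 60
    return total

for car_speed in (8, 11, 9):          # cars stuck on the highway segment
    report("highway-A", car_speed)
report("side-street-B", 42)

routes = {"via highway-A": ["highway-A"], "via side-street-B": ["side-street-B"]}
best = min(routes, key=lambda name: route_time_min(routes[name]))
print(best)   # the router steers new drivers around the congestion
```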

A whitepaper published by the Carbon War Room argued that the impact of M2M on our public and private operations would be incredible: we could cut down CO2 emissions by 9.1 gigatons. That’s equal to the current emissions of both India and the United States, and three times the drop estimated by the most optimistic studies on transitioning to clean energy.

Through the Internet of Things, the real-time interoperability possible in the age of cloud computing will not just improve travel but streamline water use, prevent deforestation, create an exponentially more intelligent energy grid, and conserve when rooms aren’t in use.

The UN Environment Program has stated that anthropogenic carbon emissions would need to drop 15% to keep the temperature from rising more than 2 degrees centigrade, considered the threshold beyond which planetary catastrophe could threaten life on earth. If the Carbon War Room forecast is accurate, we could get emissions down 19%; we could succeed.
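
As a back-of-envelope check on those percentages, here is a quick calculation; the global emissions baseline is implied by the article's own figures rather than stated directly.

```python
# Figures from the article: a 9.1-gigaton cut corresponds to roughly a 19% reduction,
# while the UN says a 15% reduction is needed. The baseline below is implied, not stated.
cut_gt = 9.1
cut_share = 0.19
baseline_gt = cut_gt / cut_share            # implied global emissions, ~47.9 Gt
required_cut_gt = 0.15 * baseline_gt        # the UN's 15% target, ~7.2 Gt
print(f"implied baseline: {baseline_gt:.1f} Gt")
print(f"15% target: {required_cut_gt:.1f} Gt vs. forecast cut: {cut_gt} Gt")
```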

M2M is skyrocketing as an opportunity for investors, software developers, and technology companies. Google purchased home energy use company Nest Labs for $3 billion. We already have billions of devices that are capable of interacting, although they aren’t interoperable. The sector could eventually rise to a market value of $948 billion.

The emergence of the IoT will reshuffle the deck in many ways with new opportunities for startups; however, huge tech names such as Google, Cisco, and Intel are throwing themselves into the mix too. Cisco is investing $30 million in a Barcelona-based Internet of Things center. Google’s Open Auto Alliance project is an effort to create the integration necessary for a self-driving future.

Cloud Virtual Machine for Your IoT Project

The Internet of Things could be a wise place to put your money. More importantly, it will make the environment more sustainable. For your Internet of Things project, get one of our Flex Cloud servers and pay only for what you use.

By Kent Roberts