While I don’t find myself hopping from city to city, with commuting, contract work on weekends, and a pseudo-nomadic lifestyle, I find myself constantly on the go. Between fighting to find an Internet connection (preferably a free, wireless one), worrying about who’s going to try to grab my laptop from my car, and trying to keep files synced and up to date across different systems, I have more than enough concerns – and that doesn’t even involve crossing borders and contending with changing copyright laws. To help make things easier for those always on the go, Computerworld has come up with a list of 8 tools for road warriors:
…dropped cell phone calls in hotel rooms, faint Wi-Fi signals in airports, data files that vanish without a trace and power-sucking gadgets that invariably go dead just when you need them most.
Overwhelmed? Take heart — adding a few key gadgets to your arsenal could help. The eight handy devices and services detailed below can help you charge, connect, boost and protect your array of mobile devices and the data they contain. With some of these you might find yourself carrying fewer cables, chargers and other travel detritus.
Click here for the full article.
It’s been five years since Microsoft last made a release in the Windows Server family. As with many of the company’s other software offerings, the development cycle has stretched out as Microsoft looks to make significant changes to usability and security. Windows products are often bogged down with unnecessary features and are a favourite target of those looking to exploit security holes; with more important data tending to be stored on corporate servers than on individual desktops, more security is expected and needed, but the server still needs to run efficiently.
The core code base of Windows Server 2008 is derived directly from the secure development model (SDM) that was used to develop Windows Vista. Initially that was a point of concern, having been less than impressed with the stability of Vista in my personal experience, but it does make sense. The thinking at Microsoft is that these new releases are being built on a more secure backbone of code, which makes future upgrades and enhancements less likely to create security headaches. With the two OSes sharing the base code, it should make it easier to identify major flaws and address them before they are exploited, too.
The most notable feature of Windows Server 2008 is probably the new Server Core installation option. This minimal installation allows for a faster-running, more secure server that is perfect for many basic roles: domain controller/Active Directory Domain Services, AD LDS (formerly ADAM), DNS server, DHCP server, file server, print server, Windows Media Server, Terminal Services Easy Print, TS Remote Programs, TS Gateway, IIS 7 web server, and Windows Server Virtualization (to be available in approximately six months). A Server Core installation is a great way to make use of older hardware at branch offices, for example.
From all the reviews I’ve browsed, this is a strong release by Microsoft; probably too long a wait, but it seems to have been done much better than Vista. It’s obviously not perfect, but the changes that have been made are genuine improvements, and they don’t overcomplicate things or alienate longtime users. For more on the release, see the following articles and reviews:
Official Site for Windows Server 2008
Wikipedia: Windows Server 2008
Tom’s Hardware: Windows Server 2008 Reviewed
Computerworld Review: Much to like in Windows Server 2008
Computerworld Image Gallery: Much to like in Windows Server 2008
For many companies, the idea of the Internet going down is beyond a disaster. Multi-national, multi-office enterprises and mom-and-pop e-commerce sites could be hit equally hard if there were a multi-car pile-up on the Information Superhighway, and what separates those that survive from those that fade off into the sunset may be determined by the preparations made just in case. Whether it be a planned attack, an accident, or an act of God, The Business Roundtable, a Washington-based public policy advocacy group, suggests that in the next 10 years there is a 10% to 20% chance of a “breakdown of the critical information infrastructure.”
With so many pipelines delivering the Internet across the globe, it’s easy to dismiss the idea of a significant loss of access, but what is the cost-benefit trade-off of that kind of thinking? It’s easier and less costly to be prepared than to react; are you willing to gamble with your company?
Wikipedia defines disaster management as the discipline of dealing with and avoiding risks. That includes keeping several (secure) back-ups of important documents, applications, and web content – and confirming that those back-ups can actually be restored to a fully working state. Since the chaos of Y2K, most companies have probably thought less and less about disaster management, but with heavier and heavier reliance on the Internet, it’s important to consider the worst-case scenario: no Internet access at all. While it’s easy to overlook, being prepared beats going out of business.
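Confirming that a back-up actually restores is the step most often skipped. As a minimal sketch of one way to automate that check (the directory names here are placeholders, and a real restore test would also cover databases and permissions), you could compare checksums between the originals and a trial restore:

```python
import hashlib
from pathlib import Path

def file_digest(path):
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_restore(original_dir, restored_dir):
    """Compare every file under original_dir against its restored copy.

    Returns a list of relative paths that are missing from the restore
    or whose contents differ from the original.
    """
    original_dir, restored_dir = Path(original_dir), Path(restored_dir)
    problems = []
    for src in original_dir.rglob("*"):
        if not src.is_file():
            continue
        rel = src.relative_to(original_dir)
        dst = restored_dir / rel
        if not dst.is_file() or file_digest(src) != file_digest(dst):
            problems.append(str(rel))
    return sorted(problems)
```

An empty result means every file made the round trip intact; anything else is a back-up you cannot trust yet.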
For more on the topic, see The Internet Is Down — Now What and Disaster Recovery Made Easy (Well, Sort of).
I’d never actually heard the term domain tasting until recently, when a Slashdot article was forwarded to me about domains disappearing after individuals had searched for them. There’s a dispute between those who practice the art of domain tasting and those who oppose it, with questions surrounding its legitimacy as a business. For the average user, myself included, a search for a domain should remain (for the most part) confidential; that information shouldn’t be available to be exploited, at least in my opinion.
Ultimately, this is part of the dark side of the Internet. Technically, I’m sure no one is doing anything illegal; they are simply exploiting the unsuspecting. Sticking with larger, reputable sites for your searches can probably lower the chances that your domain search information will be exploited, but it’s definitely not a complete solution. While looking into the topic, I found an article that gives some useful advice on how searchers can help protect their potential domains:
Delay searching for available domains until you’re actually prepared to follow through with the registration. Better still, search for and register new domain ideas immediately whenever inspiration strikes you.
If one of your domain searches is registered by a domain taster shortly after you checked availability of the domain, and you still want the domain, wait five days and it might become available again. Do not visit the domain during these five days, otherwise the domain taster will believe that the domain gets enough traffic to warrant adding it to his permanent portfolio!
If you’re thinking of several domains for a project and are undecided which one to use, register all of your domain ideas immediately. If you use a registrar like Moniker or Dynadot, you’ll have 4-5 days to decide if you actually want to keep a domain once you have registered it. This practically eliminates the danger of impulse registrations that you might regret later.
Taken from the following article: http://www.dailydomainer.com/200775-domain-tasting-monitoring-searches.html
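One further way to cut intermediary lookup sites out of the picture is to query the registry’s own whois server directly over port 43. As a rough sketch (the “No match for” heuristic applies to Verisign’s responses for .com/.net; other registries word things differently, and direct queries still don’t guarantee confidentiality):

```python
import socket

def query_whois(domain, server="whois.verisign-grs.com"):
    """Ask a whois server directly (TCP port 43) about a .com/.net
    domain, instead of going through a third-party lookup site."""
    with socket.create_connection((server, 43), timeout=10) as sock:
        sock.sendall((domain + "\r\n").encode("ascii"))
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

def looks_available(whois_response):
    """Heuristic: Verisign's whois answers 'No match for ...' when a
    .com/.net domain is unregistered."""
    return "No match for" in whois_response
```

Following the article’s advice, the best time to run a check like this is when you are ready to register on the spot.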
Security relating to computers and networks has always been a concern for IT managers tending to Enterprise-class operations. Despite all their efforts to keep their networks free from intruders – be it a hacker, a worm, a trojan, or a virus – the biggest security risk to these systems is most often the users themselves. Over time, more and more businesses have come to depend on technology and their hardware infrastructure for daily operations, and as these aspects of a business have become more critical, hackers, worms, and trojans have become more targeted, again typically focusing on the users. Instead of coming from a 13-year-old computer aficionado looking for some fun and fame, attacks now come from organized teams set up with a specific target, which, more often than not, is data.
In the last year, security flaws and breaches at large corporations have put individuals at risk. Consumer information, from search histories to personal credit and debit card numbers, has been compromised; sometimes it’s a simple (albeit costly) mistake by an individual, but with multi-national corporations and millions of dollars potentially at stake, insider breaches are also a concern. But what does this mean for the average business?
All companies with sensitive data need to be aware that they are a target. Unless the proper steps are taken to ensure that your networks are secure, your data and your systems will be susceptible to attack. The average webmaster or designated ‘IT guy’ in the company will not have the ability to maintain this level of security, and you may need an outside resource to perform security audits on your systems. It’s also important to note the difference between network and server security.
For those who take advantage of co-location or a hosted server from companies like Superb, network security isn’t the issue; instead, keeping up to date with patches and updates for the server’s operating system and maintaining solid, secure coding practices are key to preventing unauthorized access. To help prevent unnecessary risk, we (the Superb Team) are putting together an unofficial checklist for self-managed servers, but it is definitely recommended that a professional review your server security regularly.
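To give a feel for the kind of check a self-managed checklist might include, here is a toy sketch of a port-exposure audit: scan a list of ports on your own server and flag anything open that isn’t on your expected allow-list. The host name and port lists are placeholders, and this is no substitute for a professional security review.

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def unexpected_ports(open_ports, expected_ports):
    """Ports found open that are not on the expected allow-list."""
    return sorted(set(open_ports) - set(expected_ports))

def audit(host, ports_to_check, expected_ports):
    """Scan ports_to_check on host and flag anything open but unexpected."""
    found = [p for p in ports_to_check if port_open(host, p)]
    return unexpected_ports(found, expected_ports)

# Example (placeholder values): audit("myserver.example.com",
# range(1, 1025), expected_ports=[22, 80, 443])
```

An unexpected open port usually means a service you forgot about – and a service you forgot about is a service you aren’t patching.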