Traditionally, the US has been the industry leader in Web hosting. But since the Patriot Act was passed on October 26, 2001, there have been growing concerns, and most experts no longer consider online privacy to be secure in the United States. For a better understanding, one only has to look at a recent example in which data thought to be protected was actually available under the scope of the Patriot Act.
Europe, long known to have strict security and privacy requirements, has traditionally been a location for businesses to store their data, and it was thought that European datacentres were protected from the Patriot Act. In June 2011, Gordon Frazer, managing director of Microsoft UK, confirmed that “Office 365 cloud data stored at European datacentres could still be handed over to American officials,” and went as far as saying that “Microsoft cannot provide guarantees and neither can any other company.” The issue stems from the fact that Microsoft’s headquarters are located in the US and that it is obligated to abide by American regulations, meaning that any data stored on its servers is subject to seizure and inspection.
Because of this, many businesses are now looking for alternatives, including “offshore hosting,” often scaled out to include Third World locations, on the perceived notion that these offshore marketplaces will prevent their data from being captured under the rules of the Patriot Act. But that notion is often flawed: offshore hosting is often priced well above its US counterparts, and providers are often located in areas with a poor technological infrastructure, meaning slow speeds and slow resolution of technical problems.
So what choices does a business owner have to protect their customers’ data and their business? Staying with a US hosting company is increasingly risky from a privacy point of view, and Europe no longer provides the safe haven it once did. More and more businesses are turning to their neighbors to the north. Canadian web hosting companies, like Canadian Web Hosting Ltd. (www.canadianwebhosting.com), are located in areas with comparable capacity to deliver infrastructure solutions and premium web hosting services.
Cities like Vancouver and Toronto have excellent staffing and infrastructure, with high-speed connectivity to major US population centres that is comparable to, and in some cases better than, that of popular US hosts in major metropolitan areas like Los Angeles or Dallas. In addition, service providers like Canadian Web Hosting Ltd. are now offering services like Cloud Hosting, Dedicated Servers, VPS and shared hosting at very competitive rates. So if Canadian and US web hosting services are similar, what separates a Canadian web hosting company from its US counterparts?
To fully understand this, you first have to look at how Canadians view privacy. On the Treasury Board of Canada Secretariat’s website (a government website), you will find the following statement: “Privacy has long been considered a fundamental right in Canada. The Government of Canada is recognized internationally as a leader in the creation of privacy laws and policies.” The site goes on to say that the chances of personal information being accessed under the Patriot Act are “remote” and that to date the federal government “is not aware of any such cases where information about a Canadian was accessed.”
Moreover, Canadian privacy laws protect individual Canadians. In one recent example, the Privacy Commissioner of Canada forced Facebook to amend its privacy practices and policies, not only for Facebook users in Canada but around the world, because Facebook had breached Canadian privacy law. This example shows the reach of these laws and how all companies, including Facebook, can be required to improve their privacy practices.
What really separates Canadian web hosting companies from their counterparts is the safeguards that the Government of Canada has implemented to protect personal information. The primary method has been the Personal Information Protection and Electronic Documents Act (PIPEDA), which requires web hosting organizations engaged in commercial activities to obtain individuals’ consent to the collection, use, or disclosure of their personal information. As an example, Canadian Web Hosting Ltd. maintains full compliance with PIPEDA and ensures that the following privacy requirements are met:
Consent must be garnered for collection of personal information
Collection of personal information limited to reasonable purposes
Limits use and disclosure of personal information
Limits access to personal information
Stored personal information must be accurate and complete
Designates the role of the Privacy Officer
Policies and procedures for breaches of privacy
Measures for resolution of complaints
Special rules for employment relationships
Beyond PIPEDA, provincial governments have implemented their own standards: even where disclosure might otherwise be permitted under PIPEDA, regional statutes such as Ontario’s Business Records Protection Act essentially provide that business records cannot be removed from Ontario.
Until the US government provides additional clarity on the uses of the Patriot Act, web hosting customers concerned about the privacy and security of their data should look north of the border. Companies like Canadian Web Hosting Ltd., which is 100% Canadian-owned and operated, not only deliver industry-best web hosting services but are also legally obligated to protect the data stored in their datacentres.
Every year at HostingCon we look for new and innovative products that can add real value to our product line and provide a competitive advantage for customers who engage us for web hosting services. Listed below are three products that stood out from the pack and seem to be a step ahead of their competition. In putting this article together, I realized that what ties them together is something I didn’t actually see a whole lot of at this year’s show: security. From conversations with industry experts, I had expected to see many more security products on the floor supporting various cloud initiatives or hosting ecosystems like OpenStack. Interestingly, the focus this year was more on information management and automation, but we will save a review of that topic for another article.
Black Lotus Mitigation Pro
As many web hosting companies can confirm, DDoS attacks are a part of the business that we have all had to learn to live with. Depending on your bandwidth provider or the strategic direction of your company, how you deal with DDoS is a company decision. Over the years, we have relied on the null-route approach, and to help protect our network we were not always lenient with customers under attack. As the company has changed, we have continued to look for new ways to protect our networks and customers from DDoS, and we think the Mitigation Pro might finally be a tool we are willing to utilize. It could give us a level of control, since it is built into our network, while allowing us to partner with a company that has the expertise and services to support our initiatives.
In our case, we looked specifically at the Mitigation Pro 1008 appliance (manufactured by Intruguard), which Black Lotus defines as a “robust mitigation platform with advanced features that accurately detect and mitigate attacks on layers 3 through 7.” It works by actively protecting against attacks such as HTTP GET and POST floods, SYN floods, UDP floods, spoofed packets at Layer 7 and many more. One of the biggest benefits of the tool is the ability either to integrate it into your existing network and offload attacks onto a separate line, or to offload attacks to Black Lotus’ network. For us, this provides a level of control we are comfortable with, while still having options and expertise to back us up. One side note: if you are not a DDoS expert, you will definitely want to consider using Black Lotus’ services to set up your boxes, as there appears to be a very steep learning curve.
Incapsula
One of our areas of focus this year at Canadian Web Hosting was security. We’ve heard a lot about Cloudflare, and this was a great opportunity to talk to the team there to get a better understanding of their service offering. While doing that, we also took some time to meet one of our existing vendors, Imperva, who introduced us to Marc Gaffan, co-founder of Incapsula. He gave us a quick overview of the product and talked a bit about how Incapsula came to be. Similar to Cloudflare, Incapsula offers end-users the ability to improve performance, security and uptime. Essentially, your site traffic is routed through a network of high-powered servers that analyze all incoming traffic to keep threats out, while caching and optimizing your content and serving it directly from Incapsula’s globally distributed servers.
One of the things that stood out about Incapsula is just how easy the service is to use. To get started, you just need to make a couple of changes to your DNS, including CNAME and A records, and everything is good to go. Marc walked us through an example with one of our domains, and it took maybe all of three minutes to complete the process. You start by simply entering your domain URL, let Incapsula analyze your settings, and then you are given directions on what changes need to be made. It really is that easy.
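As a rough illustration, the DNS changes involved look something like the following zone-file entries. These are placeholders we made up for this sketch, not Incapsula's actual endpoints or addresses; the service tells you the exact records to use during setup:

```
; Hypothetical zone-file entries of the kind the setup asks for:
www.mysite.com.   IN  CNAME   abc123.hypothetical-proxy.example.net.
mysite.com.       IN  A       192.0.2.10   ; address supplied by the service
```

Once the records propagate, visitor traffic resolves to the provider's network instead of your origin server, which is what lets it filter and cache requests on your behalf.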
Getting back to the original reason we stopped by the booth: one of the biggest differentiators Incapsula offers is a built-in Web Application Firewall that protects end-users against SQL injection and cross-site scripting, included in their free personal plan. At this time, Cloudflare charges $20 for a similar service.
Global Sign One Click SSL
Probably the biggest surprise (though it shouldn’t have been) of HostingCon 2011 was Global Sign’s recently introduced One-Click SSL technology. Historically, applying for and installing SSL certificates has always been an area where we sought improvement. Whether it is the length of time to complete a request or the verification requirements, SSL certificates have always been a hosting product that we have to offer but couldn’t execute seamlessly, because it was always a manual process.
After spending some time with Frank Romito at Global Sign, we were directed to a recent press release that describes the issue and how One-Click SSL technology addresses it: “Automation of the SSL Certificate lifecycle (CSR generation, application, approval and Certificate installation) is essential to avoid unwanted and expensive customer support issues. Issues can range from customers needing assistance with CSR generation, approvals and/or Certificate installation.”
What makes it unique is that we can move away from manual provisioning of SSL certificates and automate all aspects of the SSL lifecycle, allowing us to focus on what we do best: exceptional service for our customers. The technology works by automatically creating the keys and the Certificate Signing Request (CSR), validating control of the domain, and installing the issued certificate on the appropriate website within minutes.
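To see what is being automated, here is a sketch of the manual key and CSR generation step using the standard openssl tool. The domain, filenames and organization details are placeholders of our own, not anything specific to One-Click SSL:

```shell
# Generate a 2048-bit RSA private key and a certificate signing request (CSR)
# in one step -- the manual work that an automated SSL lifecycle removes.
# The subject fields below are placeholder values.
openssl req -new -newkey rsa:2048 -nodes \
  -keyout www_example_com.key \
  -out www_example_com.csr \
  -subj "/C=CA/ST=BC/L=Vancouver/O=Example Co/CN=www.example.com"

# Inspect the request before it goes off to the certificate authority:
openssl req -in www_example_com.csr -noout -subject
```

After the CA issues the certificate, it still has to be installed and bound to the right website, which is the remaining step the One-Click approach handles for you.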
On occasion, you might need to create a scheduled task for your site. A good example is if you install AWStats on your site, or are running a content management system like Drupal or Joomla that requires a background program to run at a certain time. If this is the case, the documentation usually asks you to schedule a cron job on your web server.
Before we look further at how a cron job is set up, we first need to look at cron commands and how they are used. For those of you interested in history, the name cron is derived from the Greek word for time, chronos. Cron can be used to execute a command or script at a scheduled time; I like to think of it as an automated alarm clock for your server. Essentially, cron is a Linux daemon that lies dormant in memory until a scheduled task comes due. It is used primarily for system administration and syncing emails, but its uses are vast.
Cron is the daemon that executes scheduled commands, and it is typically initiated via /etc/rc or /etc/rc.local. Once started, cron wakes up every minute, examines all stored crontabs, and checks each command to see if it should run in the current minute. In addition, cron checks whether its spool directory’s modification time has changed; if it has, cron examines the modification time of every crontab and reloads those that have changed. Typically, cron emails the output of each job to the job’s owner.
As an example, if you are running CentOS or Red Hat, you would use the following commands to start, stop and restart the cron service.
Task: Start cron service
To start the cron service, use:
# /etc/init.d/crond start
Task: Stop cron service
To stop the cron service, use:
# /etc/init.d/crond stop
Task: Restart cron service
To restart the cron service, use:
# /etc/init.d/crond restart
The first thing to decide is the schedule. Do you want your cron job to run hourly, weekly, or just once? It is important to note that Canadian Web Hosting monitors shared hosting accounts that run cron jobs, so they do not impact the other sites hosted on the same server. If you need to run jobs at a higher frequency, we recommend a VPS or Dedicated Server where the resources are dedicated to you.
Now that you have decided on your schedule, you need to express it in a way that crontab will understand. A crontab entry consists of six fields, separated by spaces, in the following format:
“minute hour day month day-of-week command-line-to-execute”
The acceptable values for each field are: minute (0-59), hour (0-23), day of month (1-31), month (1-12), and day of week (0-6, with 0 representing Sunday); the sixth field is the command line to execute. An asterisk (*) in any field matches all values for that field.
Based on that, let’s look at a few examples that follow the format described above:
“15 9 5 1 * /your/directory/dogtired.pl”
This entry runs a Perl script called dogtired.pl on January 5th at 9:15 AM.
“30 5 * * * /usr/bin/wget http://www.mysite.com/citizen.php”
This entry runs every day at 5:30 AM and uses the command-line tool wget to request a URL, causing the web server to run the script citizen.php. This is a common way to trigger scripts, for example in a Drupal installation, that are normally called through a web browser.
“30 18 * * * rm /home/imtheuser/tmp/*”
This example removes the files in /home/imtheuser/tmp each day at 6:30 PM.
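Entries like the three above go into your crontab. The usual route is the interactive editor (`crontab -e`), but as a minimal sketch, you can also prepare the entries in a file and install them all at once. The file name here is our own placeholder; the entries are the article's examples:

```shell
# Write the schedule to a file; each line is one job in the format
# "minute hour day month day-of-week command".
cat > mycron <<'EOF'
15 9 5 1 * /your/directory/dogtired.pl
30 5 * * * /usr/bin/wget http://www.mysite.com/citizen.php
30 18 * * * rm /home/imtheuser/tmp/*
EOF

# Review the file before installing it:
cat mycron

# Installing replaces the current user's entire crontab, so do this with care:
# crontab mycron
# Verify what is installed with:
# crontab -l
```

Note that `crontab mycron` replaces everything previously scheduled for that user, which is why reviewing with `crontab -l` first is a good habit.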
Working with our clients, there are several common requests that we receive on a regular basis. Oftentimes, clients ask how to disable the automated email that is generated when a cron job executes, and how to capture the job’s output in a log file.
To disable the automated email feature, redirect the job’s output so that cron has nothing to mail.
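A sketch of the usual approach, shown with the article's third example: the `>/dev/null 2>&1` redirection discards both normal output and error messages, so cron has nothing to send. (Most cron implementations also honour an empty `MAILTO=""` line at the top of the crontab, which silences mail for every job.)

```shell
# In the crontab entry itself, the silenced job looks like:
#   30 18 * * * rm /home/imtheuser/tmp/* >/dev/null 2>&1
#
# The redirection behaves the same on any command; nothing is printed,
# and the command's exit status is unaffected:
echo "this output is discarded" >/dev/null 2>&1
echo "exit status: $?"
```

Be aware that this hides error messages too, so for jobs you care about, logging to a file (covered next) is often the better choice.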
Using our third example above, we can generate a log file with the following entry:
30 18 * * * rm /home/imtheuser/tmp/* > /home/imtheuser/cronlogs/clean_tmp_dir.log
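One refinement worth knowing: a single `>` overwrites the log file on every run, while `>>` appends to it, and adding `2>&1` captures error messages as well as normal output. A small demonstration of the append behaviour (the crontab line in the comment follows the article's example; the demo file name is our own placeholder):

```shell
# In a crontab entry, appending and capturing errors looks like:
#   30 18 * * * rm /home/imtheuser/tmp/* >> /home/imtheuser/cronlogs/clean_tmp_dir.log 2>&1
#
# Two runs with >> accumulate in the file instead of replacing it:
echo "first run"  >> demo.log 2>&1
echo "second run" >> demo.log 2>&1
wc -l < demo.log
```

An appended log grows without bound, so on a long-lived server you would pair this with periodic cleanup or log rotation.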
San Diego, California, treated us well. Everyone was very excited to attend again this year, and the event delivered as expected. Yes, we're finally back from HostingCon, and in the next few weeks we're excited to share the experiences and connections we made at this fantastic three-day conference from different perspectives. Our team in attendance included both new and long-time attendees, so we're sure you'll find some interesting insights in our upcoming blog posts, along with introductions to several new products and services that may be incorporated in the months ahead.
As a first-time attendee, drawing on my own experience and on observing others in the audience, I wanted to share four tips for HostingCon attendees in the years to come.
1. Review the Conference Schedule
With three packed days at the conference, it's quite helpful for a first-time attendee to scan the schedule ahead of time. There are many sessions across different tracks, and selecting them ahead of time will help you manage your time more effectively during the conference, so you're not wasting time figuring out where to head next.
2. Pre-Connections using HostingCon Connect
This was a useful tool prior to, and even during, the conference for connecting with other registered attendees. I received a few pings from other attendees, and even though our schedules didn't align for a meet-up, I was still able to make new connections online for the future. It's useful, and sometimes more efficient, to start introductions online, since it allows you to research one another; then, once you meet face-to-face, you already have something to discuss. This brings us to the next point.
3. Networking during the Conference
With around 1,800 attendees, it can get overwhelming very fast when you're walking through a sea of people. The pre-connections mentioned above can ease the introductions, but it's during unscheduled moments that you can connect with your peers on a more personal level: waiting in line at the lunch buffet, sitting near someone at a session, spotting a nifty HostingCon badge in the bathroom, or chatting at the good ole water cooler in the back of the session rooms. The prime networking area was, of course, the exhibition hall, where we quickly found out that you need a big stack of business cards to hand out or drop off, or else you'll run out quickly. It was fun to see the creativity of the different booths, especially the way each company engaged us through simple conversations or, better yet, interactive games; we'll have more to share on the games.
4. Connecting and Learning
Thanks to all of the open slots for networking before and at the end of each day, not to mention the after-parties, we gathered a pile of business cards throughout the conference, and it was especially great to see how many countries were represented. What diversity! During the informational sessions, I was able to connect with the speakers by reporting on their talks live, using the #hostingcon hashtag, to other attendees and to those who couldn't be on site. It even worked for keynote speakers. Twitter can be a very powerful tool; we love being social with our peers and customers, and at conferences it gives you a new venue to connect in a more conversational, informal way.
Overall, I thoroughly enjoyed my first HostingCon because of the new connections I made online and offline, the new insights I picked up during the marketing and sales sessions, and the endless stream of freebies and prizes (t-shirts, an iPad 2, an AR Drone, a $100 VISA card, $5 Starbucks gift cards and much more) that our team gathered and won. From talking to past attendees, each year brings improvements, and we can only expect Boston in 2012 to be even bigger and better.
Were you there? What did you think?
CTO / SEO Guru