A data center is the infrastructure a business uses to house its IT assets — space, power, cooling, network connectivity, wiring, etc. Depending on the business's size, it may be a spare closet, a dedicated building or space leased at a public data center.
“The data center itself is infrastructure and doesn’t generate revenue or create differentiated business value,” says Mike Tighe, executive director, Data Products at Comcast Business. “So, the CFO frequently says, ‘Rather than utilize precious capital to build or expand a data center, there are other options including great public data centers where we can lease space.’”
Smart Business spoke with Tighe about data center best practices, including network and bandwidth considerations.
Why are data centers so important today, and what’s in store for the future?
The function of a data center is to ensure availability of IT applications and data. If employees don't have access, they can't be as productive, and in some cases the business can't run. The trend to place IT assets — applications, servers and storage — in public data centers is rapidly evolving for businesses of all sizes, either as a main data center or as part of a business continuity strategy.
Over the next five years, the trend of renting rather than owning IT infrastructure will accelerate as businesses utilize cloud-based infrastructure and applications. This is not just because of better economics; the ‘cloud’ enables rapid deployment and the ability to scale applications that drive better productivity.
When should you look at outsourcing a data center?
When IT becomes an important component of how you run your business, you have to ensure high availability. If, for example, you install specialized applications used for resource planning and creation of content, but the server starts going down because of power or network connectivity loss, it impacts your business’s ability to run.
Another factor is economic. As businesses make IT decisions, they may not have the capital to build or upgrade data centers, so they’ll look at alternatives.
What are some options to consider with public data centers?
By their very nature, there are more capabilities in a public data center because everyone is sharing the cost of the generator, the physical security monitoring, having multiple network providers, etc. However, some things to consider are:
- Physical security procedures.
- Redundancy of critical components.
- The ability to expand as your IT infrastructure requirements increase.
- Network for primary and backup connections. What providers have extended their network into the data center to provide connectivity and ensure access?
- Location. Regional events including loss of power and natural disasters dictate that the backup site be located far enough from the main data center so as not to be affected by a single incident. Hurricane Sandy certainly brought home the point that a redundant data center far enough inland on a separate power grid helps ensure application availability.
How can companies build the right network?
Strong network connectivity becomes more important as IT assets are put into public data centers. Know how fast your company's bandwidth requirements are growing, and whether your network can scale to meet future requirements. On average, over the past decade, a business's bandwidth requirements have grown around 50 percent per year. Look at network technologies that can cost-effectively scale — from 10 megabits per second, an average site requirement, to one gigabit per second, for example. Ethernet technology, which local-area networks are built on, is one solution that businesses are leveraging for their networks.
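As a rough illustration of the growth rate cited above, the sketch below estimates how long it takes bandwidth demand to climb from a 10 Mbps average site requirement to a 1 Gbps circuit at 50 percent compound annual growth. The function name and figures are illustrative, not from the article.

```python
import math

def years_to_reach(current_mbps, target_mbps, annual_growth=0.50):
    """Years until demand compounds from current_mbps to target_mbps."""
    return math.log(target_mbps / current_mbps) / math.log(1 + annual_growth)

# From 10 Mbps (average site requirement) to 1 Gbps (1,000 Mbps)
print(round(years_to_reach(10, 1000), 1))  # → 11.4
```

Under that assumption, a site sized for today's average demand would need a hundredfold upgrade in a little over a decade — which is why scalable technologies such as Ethernet matter.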
How do data center solutions impact a business’s bottom line?
With the economic downturn, use of company capital became a focus. Executives decided that the data center, while important, doesn't produce any intrinsic value, and that leasing space preserves capital for projects that improve the bottom line. Companies can rent space by the square foot rather than having to build another data center as IT needs expand.
Mike Tighe is executive director, Data Products at Comcast Business. Reach him at (215) 286-5276 or email@example.com.
Insights Telecommunications is brought to you by Comcast Business
Cloud computing may evoke images of an abstract technological force floating in the atmosphere, but the term itself is misleading. The term originated because on technical diagrams, a cloud was drawn around any mixture of resources that made a particular application work, says Pervez Delawalla, president and CEO of Net2EZ.
“Cloud computing means so many different things to so many people, and there is a lot of confusion,” says Delawalla. “It’s cloudy out there in the cloud. An easier way to explain it is by using the utility computing concept. This resonates the most with people because they can compare it to how they use gas or electricity as a utility, so now you can use computing power as a utility.”
Smart Business spoke with Delawalla about what cloud computing is and how to apply its advantages to your business.
How does cloud computing work?
You have to envision that the physical architecture itself is vast. Data centers all over the world house servers, and each server or set of servers is designated for a certain type of application or resource. Servers, routers, switches and security devices combine with network connectivity and an operating system to form the cloud. You could compare it to an electrical grid, in which power comes from substations and power generation points before electricity goes over wires to provide power to households or businesses.
Some examples of cloud computing include Apple products that back up data to iCloud, and Microsoft products with which data is backed up on SkyDrive, as well as Google Drive, Salesforce.com and Dropbox. A business may approach a data center for complete automation of its infrastructure while taking care of the software itself.
The data center is then responsible for ensuring that all of the hardware pieces are working in harmony with each other and have redundant capabilities within that physical layer.
What are the advantages of cloud computing?
A major advantage of cloud computing is time. As a startup business you can get up and running online reasonably quickly and with minimal investment because you are not buying servers, routing or switching equipment. Instead, you are plugging into a utility that doesn’t require any setup, that is already functioning, and you simply pay for it on a monthly basis.
Cloud computing also offers more versatility and capacity. For example, say your company has a new e-commerce site and a product becomes an overnight sensation. If your website is on a cloud computing platform, you can scale up and sustain a high volume of traffic without performance degradation for users of your site.
Cloud computing can also improve your ability to be agile and nimble because the monthly fee for service includes taking care of your hardware and resources. As a user of the cloud, you don’t need an army of IT personnel or consultants, freeing you up both financially and staffing-wise to concentrate on your target business.
Additionally, a minimal amount of software is installed on the personal computer or device, so instead of downloading the entire Microsoft Office suite, for example, you can sign up for Microsoft Office 365, which allows you to subscribe to the cloud-based service on a month-to-month basis to access all Microsoft Office products.
What is the difference between public and private cloud services?
On a public cloud, you don’t know where your data is stored or who has access to it, but you are able to increase your capacity more quickly. An example is Amazon Web Services, which hosts websites on hundreds of thousands of servers, allowing users to increase capacity as needed.
A private cloud can be established for businesses that know their growth plans and that want extra security. The business can then control who has access to that data and knows that it is stored in a secure location.
How are security and privacy handled with cloud computing?
Security and privacy are the main reasons businesses are hesitant to move to the cloud. However, with a private cloud, you manage your environment so closely that the security is as good as with conventional computing. Because of privacy issues, companies that must comply with HIPAA, and credit card processors that must comply with PCI standards, will always have to use private cloud services.
For extra security, you can do an automation deployment with a private cloud, but that will result in higher costs because you have dedicated resources just for your company. For data that isn’t as sensitive, a public cloud offers more versatility and nimbleness.
What should businesses think about when considering cloud computing?
Ask yourself exactly what it is you want out of the cloud. What are your needs and what do you want to accomplish? With so many different products, you have to ascertain what you will use it for. If you require word processing or Excel, you could use Google Drive, Microsoft Office 365 or Google App. For massive data storage needs, there’s Box.com or Amazon Web Services.
For private cloud service, you need to find a data center company to meet those requirements. Examine what the various companies are providing as their feature set for cloud computing, then choose which company best suits your business needs.
If your focus and expertise is not in IT, the more you can outsource to a cloud computing environment, the lower your costs will be for computing needs and data storage. Then your company can focus on its strengths, knowing that the rest is being taken care of up in the cloud.
Pervez Delawalla is president and CEO at Net2EZ. Reach him at (310) 426-6700 or firstname.lastname@example.org.
Insights Technology is brought to you by Net2EZ
For many companies, a data center has become a necessity. Whether your company is considering building its own stand-alone data center or renting space in a colocation center, there are some recent trends to consider.
“The trends happening fall along different lines,” says Tim Chadwick, president of Alfa Tech. “There are some commonalities that happen on both sides, but things that work for the enterprise market don’t always work in the colocation world.”
Smart Business spoke with Chadwick about what companies should watch for in the changing market for data centers.
What are the latest data center trends in the market?
The major trend right now in designing, building and operating data centers is improving energy efficiency — reducing the carbon footprint. The fringe benefit of this environmentalism is saving on utility costs.
In moving toward energy efficiency, what we’re able to do for enterprise customers is look at the specific equipment to be used — the specific types of computers, storage devices and networking equipment — and really tailor the space to that equipment. The best example of that is the air that enters the computer to cool it. With an enterprise customer who may know specifically what equipment he is going to buy, the designer of the data center can let that temperature go a lot hotter or colder, or the air more humid or dry than past data center designs allowed.
In the colocation market, the colocation facility’s owner rarely knows what his tenants will require. The whole point of being a colocation landlord is you want a facility that appeals to everybody. If you were to build a facility that could only house a type of computer that can take hotter air, you are limiting your market. So builders and designers of data centers don’t always have as much flexibility in colocation spaces. Enterprise spaces have more innovation, more things happening. But what’s happening in the enterprise market is starting to find its way into the colocation market.
How is the enterprise market driving the colocation market?
For instance, Facebook has proven that you can use most servers at higher inlet temperatures. Once you start to get people to realize that, there is a larger clientele that is willing to go into a data center (like a colocation space) with higher temperature ranges. Instead of always seeing facilities using 75-degree air, they might see 80 or 85 degrees Fahrenheit. When you have those higher temperatures coming in, that is where you can achieve big energy savings. You are spending less to cool your computers and equipment. And sometimes you’re not spending anything; the data center could just be bringing in fresh air from outside.
Because it’s already happening in the enterprise market, more people are willing to try it in the colocation market. Ten years ago, the only lease people would sign up for guaranteed 75-degree air going into your computer. Now more and more people regard that as the old way of thinking. As long as people are open-minded enough to go to a higher inlet temperature, the colocation owner can pass along the savings. If the facility is cheaper to build and operate, he can charge you less for rent.
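To make the savings argument above concrete, here is a minimal sketch of how a raised inlet temperature might translate into a lower cooling bill. The 2-percent-per-degree figure and the 1 GWh baseline are purely hypothetical assumptions for illustration; actual savings depend on the facility and climate.

```python
def cooling_energy_after_raise(baseline_kwh, inlet_delta_f, savings_per_degree=0.02):
    """Estimate annual cooling energy after raising inlet air temperature.

    savings_per_degree is a hypothetical illustrative figure (2% per °F),
    not a measured or vendor-published value.
    """
    factor = max(0.0, 1 - savings_per_degree * inlet_delta_f)
    return baseline_kwh * factor

# Raising inlet air from 75°F to 85°F on an assumed 1 GWh annual cooling load
print(cooling_energy_after_raise(1_000_000, 85 - 75))  # → 800000.0
```

Under these assumed numbers, a 10-degree raise trims the cooling load by a fifth — the kind of saving a colocation owner could pass through as lower rent.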
What other trends are occurring?
There have been changes on the electrical side in regard to incoming power. We are starting to see more variations in the voltage coming into servers. That opens up opportunities for energy efficiency, and it has led to a similar situation, with enterprise customers driving the market for those sharing and leasing data center space.
Once one enterprise customer buys a thousand servers at a previously uncommon voltage, the price point comes down. When there is no demand, the supply price is high. If demand increases to the point where there is a viable market, Dell, HP, IBM — the big computer manufacturers — start to see a market in these different voltages, and the price point comes down. If you can buy cheaper equipment and save energy, that is the best of both worlds.
How would a company know what voltage is necessary for its data center?
You can buy the same computing platforms at different voltages. It’s a matter of what type of infrastructure you are plugging into. If you are in the U.K., for example, the voltage is different than in the U.S. So you have to make sure your server can plug into that and absorb the different voltage. If you’re in a colocation market, you have to talk to your provider and find out the available voltages at his facility. If you’re in a purpose-built data center, you work with your consultant to figure it out and design around your needs.
However, companies need to find a balance. Everyone is trying to wring every last cent out of the design and construction. Getting the end customer to buy different-voltage equipment might save 5 percent in annual energy costs, for example, but that equipment might also cost 10 percent more in capital costs.
It might still be worth it. But when you compare 10 percent more in capital costs against a 5 percent annual savings on energy, sometimes it just doesn’t pencil out. You have to be careful to look at the whole picture. Quite often, data centers are not designed holistically; they aren’t designed knowing what equipment is going to be in them.
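The trade-off described above reduces to a simple payback calculation. The sketch below uses the article's 10 percent capital premium and 5 percent annual energy saving, but the dollar figures ($2 million build, $400,000 annual energy bill) are hypothetical assumptions chosen only to show the arithmetic.

```python
def simple_payback_years(capital_premium, annual_saving):
    """Years for an annual operating saving to repay an up-front capital premium."""
    return capital_premium / annual_saving

# Hypothetical: $2,000,000 equipment spend with a 10% voltage-equipment premium,
# against a $400,000 annual energy bill reduced by 5%.
premium = 2_000_000 * 0.10   # $200,000 extra capital
saving = 400_000 * 0.05      # $20,000 saved per year
print(simple_payback_years(premium, saving))  # → 10.0
```

A 10-year payback, ignoring the time value of money, is exactly the kind of result that "doesn't pencil out" — which is why the whole picture, not just the energy line, has to be examined.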
The key to a good design is a good understanding of the equipment that will be used in the space.
Tim Chadwick is president of Alfa Tech. Reach him at (408) 487-1278 or email@example.com.
When Cisco was looking to move its production data center facility out of San Jose, Calif., it considered more than 140 locations nationwide.
The clear winner was Allen, Texas, says Tony Fazackarley, IT project manager in charge of the Cisco data center in Allen.
“There were a number of factors that were key to selecting a site for our data center, including the availability and cost of power, the availability of high-tech staff and the availability of carriers for carrying data around the rest of the world,” says Fazackarley. “There were also geological considerations, temperatures throughout the year and a whole host of other factors that fed into that calculation.”
Smart Business spoke with Fazackarley about how the Allen Economic Development Corporation helped Cisco settle on Allen and what the company is doing in the community.
What other things factored into the consideration to move the facility from San Jose to Allen?
San Jose posed a number of challenges, including geological challenges. The San Andreas fault is not a good environment in which to have your production center, and Texas is far more stable from a geological perspective.
The cost and availability of power was also a factor in both the move out of California and the choice of Allen. San Jose constantly has rolling brownouts, and the cost of power is very high. In addition, there were more and more regulations coming out about how you can operate your facility, and regulations generally equate to adding costs to your business. Cisco is fully behind being corporately responsible, but if you have the opportunity to save costs for your business, you owe it to yourselves to do so.
In Allen, power is more abundant and less costly.
Another factor is that we had an existing campus in Richardson with a shell building that we were able to convert into a high-tech data center. Once we selected Richardson as the first data center, we chose a greenfield site in Allen for our production data center facility, where we consult and assist customers in building data centers to the standard we built in Allen. The facility allows us to show best practices for building a modern data center and show how others can optimize their data centers and their energy use.
Why is energy use such a concern?
At some point, there is going to be regulation around data centers. We’re already seeing it in areas of California. If you build a data center in San Jose, you have to have the ability to generate your own power on site. We know those regulations will roll out as the energy squeeze becomes more constrictive. What we’re trying to do is anticipate what those regulations are going to be and put into place best practices to not only meet those regulations but also to be as corporately responsible as we can. As a result, most of the new technology we’re showcasing in the Allen data center is around energy savings.
How did the Allen Economic Development Corporation facilitate Cisco’s move?
It was key to our decision to move to Allen. They were very helpful not only on the economic side of things but also in giving us assurance on how the land around us would be used. When you have a facility that is running the top applications that run your business, there are some businesses, such as a munitions factory, that you don’t want going up around you. Getting those assurances out of the city was vital.
You need support from the local area, and that was a big part of what went into the short list for selecting a site. And having an ongoing relationship with them, speaking with them regularly and keeping abreast of what is going on is vital to us, as well. They work to keep businesses happy.
How do you work with the community?
One of the things we do in Allen is have a lot of customers come into the facility, where we showcase Cisco’s data center products. Most of the Fortune 100 companies have come through that facility, as have companies from around the world. And when they come through, they stay in local hotels, eat at local restaurants and use local facilities. My project manager deals with scheduling those resources and works with local businesses, restaurants, hotels and catering to support our needs.
We also do tours with local school districts. Most of those who come through are never going to build a data center like ours, but if they can take away even one best practice, we’ve done our job.
How does the data center impact the environment?
The Allen data center is a LEED (Leadership in Energy and Environmental Design) gold-certified building, one of very few in North America. At the moment, the LEED program is a general building program; whether you build an office, garage or stadium, you build to the same categories for certification. But LEED is working on data center-specific criteria, and we believe we can improve on the gold standard that we have for the building. Once data center-specific LEED criteria become available, we hope to be able to meet the platinum standard.
Also, we hope to be part of the EPA Energy Star program, but you need to have 12 months of continuous operations data, and we only went into full production in March. But we are collecting that data to that end.
Tony Fazackarley is IT project manager in charge of the Cisco data center in Allen, Texas. Reach him at (408) 894-4149 or firstname.lastname@example.org.
Insights Economic Development is brought to you by the Allen Economic Development Corporation, strategically positioned in the Dallas/Fort Worth metro.
When the time comes for a corporation to decide on a data center location, there are a number of inputs that must be addressed to properly validate the decision. A wrong projection, estimate or even a simple oversight can be financially devastating. The initial step of a mission-critical services team is to understand the requirement. You need to consider current capacities, the age of existing equipment, new technological strategies such as virtualization and private/public cloud offerings, and how they apply in the ever-changing world that resides on the raised floor. Once you understand your existing capacities, you can predict your growth, or contraction, over the course of the data center’s life. This will dictate the type of solution the IT organization will want to pursue.
Next, you identify criteria that eliminate sites with “fatal flaws.” A data center site search is basically a process of elimination. In identifying the optimal data center location, key factors would include latency, cost of power, likelihood of natural disaster, tax considerations, legislative issues, regulations, disaster recovery, business continuity, availability of specific fiber providers, cost of construction, proximity to labor and suppliers, and types of power. In terms of power, green companies might want to stray from coal while some companies are hesitant to rely on nuclear power after Japan’s recent crisis.
Most IT activity occurs in one of two types of data centers: colocation/wholesale and enterprise. An enterprise user manages centralized data shared by many users throughout the organization. A colocation/wholesaler furnishes the space, power and high-speed Internet links for a customer’s Web servers. Typically, the colocation/wholesaler will pursue major metros where the users exist or want to be — places like New Jersey, Chicago, Dallas and Allen in Texas, Northern Virginia, Northern California and Southern California. Some colocation/wholesale users are investigating emerging markets, or second- and third-tier cities, like Phoenix, Nashville, Seattle, Atlanta, Salt Lake City or Denver that have moderate client demands and fairly strong fiber connectivity. The very large enterprise users who build their own facilities will tend toward more rural areas, like Kansas City, Charlotte and Omaha, that are able to provide location-based economics, such as cheap power, a lower tax burden and favorable incentives.
The fifth annual IDC Digital Universe study projected that overall data will grow by 50 times by 2020, driven in large part by more embedded systems such as sensors in clothing, medical devices and structures like buildings and bridges. In 2011 alone, 1.8 zettabytes (or 1.8 trillion gigabytes) of data will be created — the equivalent of every U.S. citizen writing three tweets per minute for 26,976 years. And the number of servers managing the world’s data stores will grow by 10 times over the next decade. The cost to build a data center is measured in terms of kilowatts, with the price ranging from $8,000 to $12,000 per kilowatt. The equivalent annual construction growth in dollars is $12 billion to $15 billion.
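Using the $8,000-to-$12,000-per-kilowatt range above, the construction-cost band for a facility of a given capacity is straightforward to bracket. The 1 MW facility size below is an illustrative assumption, not a figure from the study.

```python
def build_cost_range(capacity_kw, low_per_kw=8_000, high_per_kw=12_000):
    """Return the (low, high) construction cost band for a data center
    sized in kilowatts, at $8,000-$12,000 per kW of capacity."""
    return capacity_kw * low_per_kw, capacity_kw * high_per_kw

# A hypothetical 1 MW (1,000 kW) facility
low, high = build_cost_range(1_000)
print(low, high)  # → 8000000 12000000
```

So even a modest 1 MW build lands somewhere between $8 million and $12 million before the equipment that will sit on the raised floor is counted.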
Per a recent McKinsey report, data centers are making an impact in corporate planning. IT assets represent 50 percent of total corporate assets, and data centers represent half of that 50 percent. Expenses are higher than ever, with IT budgets growing at 6 percent annually. Operational expenditures for facilities came in at 8 percent of IT spending, with a growth rate of 20 percent. Intensive data center users are facing meaningfully reduced profitability.
Growth in demand at 11 percent is outpacing supply at 6 percent. Growth-related factors include additional storage for security replication, governmental regulations, financial information, medical records and retail, as well as the popularity of handheld devices and gaming and increased use of streaming video. Use of colocation models and speculative purpose-built data center building shells is increasing. Primary reasons for moving a data center are the need for additional power, cooling and space, and lease expirations. According to a Gartner Group study, 70 percent of Global 1000 corporations must move or modify their data centers. The average age of today’s data center is 12 years. Today’s enterprise users’ existing facilities are space constrained, in the wrong geographic location due to changes in demographics or corporate service areas, unable to support high-density loads, insufficiently fault tolerant, and overly exposed to external threats. Nevertheless, until the credit market changes, the lack of capital will continue to result in postponed improvements and construction of few new centers.
Bo Bond is a managing director/regional director in the Dallas office of Jones Lang LaSalle and co-leader of the Data Center Solutions team. With more than 16 years of experience in the commercial real estate industry, Bond has successfully negotiated more than 15 million square feet of real estate transactions in multiple states. His knowledge of technical issues, infrastructure and labor assessment has also allowed him to develop the unique skill set required for mission-critical and contact center requirements.