Steve Trusty

Friday, 24 November 2006 19:00

Keeping abreast

Interest in professional development programs continues to grow. As new technologies, ideas and laws emerge, companies are looking for ways to keep up. Individuals, too, are seeking ways to advance their careers. With more than 4,000 colleges and universities in the U.S., it is important to choose the right program for your needs.

“This is a huge industry and, while most schools do a good job at what they do, some do not,” says Tom Green, Ph.D., associate provost at National University in La Jolla. “The first areas potential students must identify are their own needs and expectations from a professional development program. Then they can determine which schools might fulfill those needs.”

Smart Business talked with Green about how companies and individuals can best choose the right program for their needs.

What options should be considered when looking at professional development programs?

The first area to consider is the importance of a degree. If a degree is important, then you need to find the programs that lead to the degree you wish to attain and then determine which of those programs is best for you. If a degree is not as important as the skills or knowledge desired, then you look for the program that provides you the most information in your area of need for the time and funds expended.

While some people are just looking for extra initials after their name (in that case almost anything will do), most people are returning to school to broaden their knowledge or to gain additional skill sets.

What criteria should be used to find the right program?

If you are looking for a degree, you need to find out if the college or university is regionally accredited. If the degree is a precursor to licensing, then you need to look for specific accreditation. It is then important to look at the specific program for the degree and the individual courses offered. How does it all fit with what you want to do next? How open is the faculty to discussing your needs and how its offerings fit them? Are graduates available to provide additional insight? Talk to others in your organization about their experiences.

If a nondegree program will fit your needs, it is important to have a sharp focus on just what those needs are. Is the purpose to refresh skills or gather new knowledge? Will the course make a difference in your organization? Will you be more effective in what you are doing or will this course open new doors?

How do you know if the program is right for you?

Review the courses. Does the delivery method match your particular learning style? Visual learners have different needs than verbal or participative learners.

Do the instructors have real-world experience? The teachers should be academically qualified and can be most helpful if they have experience in the field as well. A good combination of practicality and theory is especially important for adult learners. Some programs are now adding the component of application, which also comes from real-world experience.

Another area to consider is who else is taking this program. The educational content may not differ much among several programs, but the other participants might vary widely. Networking can be a very important component of learning and advancing.

If you are looking for specific knowledge, is the supplier willing to customize a program to fit your needs? Some universities are set in their ways and cover broad interests. Some smaller schools may have the ability to tailor their program to fit your specific needs.

Look at the oversight. What are its standards? What are the important elements, and how are they met?

Are there any other considerations?

It is so important to be a smart consumer. You have many choices, so you need to be a savvy shopper. It is up to you to analyze and match your needs with what is offered. The more you can learn about the programs that are available, who is teaching them, and how others view them, the better your ability to make the right decisions to move ahead in the competitive environment in which we live and work.

TOM GREEN, Ph.D., is associate provost at National University. Reach him at tgreen@nu.edu or (858) 642-8493.

Friday, 24 November 2006 19:00

Toward broader coverage

You face all sorts of exposure to liability as you go about your daily life. While your personal liability policies may offer some coverage, they may not be sufficient and won’t extend to all potential exposures.

Nationally, nearly one in every six jury awards now totals $1 million or more. In one recent five-year period, the average award in personal negligence cases rose from $264,765 to $2,959,047. That excludes defense costs, which can run into the hundreds of thousands of dollars, even when the defendant wins in court.

“As your income and assets increase, so does your exposure to larger jury awards,” says Loretta Kirchhoff, personal lines risk manager at DLD Insurance Brokers Inc. in Irvine. “The best way to protect yourself against the rising exposure is through a Personal Umbrella policy.”

Smart Business asked Kirchhoff for more details about these types of policies.

Why should business professionals consider a Personal Umbrella policy?

People need to protect their personal assets above what is provided through their standard auto, homeowners or watercraft policies for catastrophic losses. Besides higher liability limits, you are buying broader coverage in case you are sued. Today’s lawsuits are filed over everything from serious injuries resulting from tragic accidents to seemingly silly disputes. If you are served with legal papers, you must have enough insurance to cover your legal liabilities.

What kinds of broader coverage do you mean?

Your umbrella policy agrees to cover you if you cause bodily injury, property damage, or personal injury. While your homeowner’s policy will provide coverage for bodily injury and property damage, many homeowner policies won’t cover personal injury without an extra endorsement. Generally, personal injury encompasses false arrest, false imprisonment, malicious prosecution, defamation, invasion of privacy, wrongful entry or eviction.

If your leisure activities involve rental of a moped, dune buggy or boat, you may want to explore coverage under a Personal Umbrella policy for protection from exposure to liability should your negligence cause a severe accident involving multiple injuries. You should familiarize yourself with possible exclusions in any of your policies — including a potential umbrella policy.

Another area of exposure is the company vehicle. If your company car is your only vehicle, you have access only to the limits the company has. If you have $3 million in assets and your company carries a typical $1 million policy, your personal assets could be at risk without higher limits. When a ‘drive-other-car’ endorsement is added to the commercial policy, you can request the company car be added to your Personal Umbrella policy. The umbrella policy limits would be available once the underlying policy limits were exhausted.

Certain umbrella policies also provide coverage if you face liability arising from your service on the board of a civic, charitable, or religious organization. While the organization may have a directors-and-officers policy, it typically may only cover $1 million per occurrence. All the board members share this limit and, when that limit is exhausted, board members are personally liable for the judgment beyond that amount, putting personal assets at risk.

Are all Personal Umbrella policies the same?

No. Some highly specialized Personal Umbrella policies are available, offering excess uninsured/underinsured-motorists coverage; an endorsement that will pick up limited domestic employment-related lawsuits; and an endorsement that will broaden the directors-and-officers coverage for those on not-for-profit boards.

Talk to your agent about all of your potential exposures, your assets and other coverage needs.

What limits should you consider for your Personal Umbrella policy?

As with any type of insurance, you don’t want to buy unnecessary coverage.

You should have enough liability coverage to resolve a claim or lawsuit that will take into consideration your current assets, your current earnings and future earnings. Even if you buy the top-of-the-line personal liability umbrella policy, you can’t protect yourself against every possible claim or lawsuit. There will be exclusions in the umbrella policy, as with every insurance policy. Work with your agent to fully understand what your Personal Umbrella does and does not cover.

As the CEO of your own business, you may be allowed to schedule your Personal Umbrella under your commercial umbrella. This can be an economical way to gain access to very high limits. If this is a route you would like to explore, be aware that when you schedule your Personal Umbrella under your commercial umbrella, you will be sharing limits with your company, and the commercial umbrella will be subject to an annual aggregate limit rather than a per-occurrence limit.

LORETTA KIRCHHOFF, CIC, is the personal lines risk manager at DLD Insurance Brokers Inc. in Irvine. Reach her at (949) 553-5690 or lkirchhoff@dldins.com.

Saturday, 28 October 2006 20:00

An Internet backbone

The digital age is rapidly changing how we operate in a number of different areas. Digital telephone service is one of those areas that is not only evolving at the speed of sound, but is providing direct cost savings and greater efficiencies. This is especially true for companies with multiple locations.

“By installing readily available equipment that converts the signal of traditional phone service to a digital signal, you can eliminate virtually all usage charges for local and long distance service,” says John Curry, owner of Curry IP Solutions.

Smart Business asked Curry for additional insight on cutting costs with Internet protocol (IP) digital phone service.

How can digital phone service reduce costs for organizations with multiple locations?

We are all used to the telephone service model of obtaining local service when opening up each new branch of an organization. Phone systems would be installed to handle the expected traffic, and a long distance carrier would be selected to handle calls between headquarters, branches and customers. Basic service alone could average $300 per month or more and continue to rise with each additional minute of long distance conversation.

By converting to digital service, you can use the same Internet access that you rely on for inventory monitoring and ordering to provide the backbone for the new telecommunications technology that is sweeping across the country. Data and voice communications are moving simultaneously over the same channels.

What kind of savings are you talking about?

I know of one national chain with 1,000 locations across the country. The communications service between locations averaged at least $300 per month per location. Eliminating that charge saved the company $300,000 per month, or $3.6 million annually. Each company’s savings would depend on the number of locations and the average monthly costs. Savings for companies with international locations can be even greater.
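The arithmetic behind that example can be sketched in a few lines; the figures (1,000 locations, roughly $300 per month each) come from the example above.

```python
# Savings arithmetic from the example above: 1,000 locations each
# paying roughly $300/month for traditional inter-location service.
locations = 1_000
monthly_cost_per_location = 300  # dollars

monthly_savings = locations * monthly_cost_per_location
annual_savings = 12 * monthly_savings

print(monthly_savings)  # 300000
print(annual_savings)   # 3600000
```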

Would a company that is about to see large growth benefit more than an established company?

I would say yes. With traditional phone service, one may need a small phone system at each location. Let’s say the system costs about $4,000 for each location, with expected growth to 100 sites in the first year, for a total cost of $400,000. A newer digital system would only require a one-time investment for the system and the phones at each location, with an overall investment of about $130,000, resulting in a savings of $270,000.
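The upfront-cost comparison works out as follows, using the figures from the example (a $4,000 traditional system per site, 100 sites, and a one-time digital investment of about $130,000).

```python
# Upfront-cost comparison from the example above.
sites = 100
traditional_cost = 4_000 * sites  # a phone system at each location
digital_cost = 130_000            # one-time system-wide investment

savings = traditional_cost - digital_cost
print(traditional_cost)  # 400000
print(savings)           # 270000
```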

Can 911 services be a problem?

There have been concerns about this, but there are ways to work out the technical issues. With new developments and solutions arriving daily, E911 services soon will mirror the traditional services already in place. An improved E911 service is being developed to effectively use GPS (global positioning systems) to locate the calling party.

What about mobility?

Mobility is another great benefit of these systems. When a person goes out of town on business, he or she can take an IP-programmed telephone, plug it in to any high-speed Internet access and be seamlessly connected to the company’s phone system. So when clients call your office, they are actually reaching you at your hotel, or wherever you might be, without knowing it.

If a business in New Orleans had been equipped with this type of system when Katrina hit, it could have had calls immediately redirected to another location either permanently or temporarily.

How does this system work?

It uses the Internet, which is accessible almost anywhere in the world. Each telephone is programmed with its very own alphanumeric code, a signature that identifies that phone to the system. So when a client calls, the system searches for that signature, authenticates it, and completes the call if the phone is plugged in. If the phone is not plugged in, or if you’re unable to answer or choose not to answer, the call can go to your cell phone and then to voicemail.
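The failover chain described above can be sketched roughly as follows. This is an illustration only: the names (`route_call`, the signature format, the registry shape) are invented for the sketch, not a real VoIP API.

```python
def route_call(registry, signature):
    """Return the first reachable endpoint registered for a phone's
    signature, or None if the signature fails authentication."""
    endpoints = registry.get(signature)
    if endpoints is None:
        return None  # unknown signature: authentication fails
    for endpoint, reachable in endpoints:
        if reachable:
            return endpoint
    return "voicemail"  # nothing answered: fall through to voicemail

# Example: the desk phone is unplugged, so the call reaches the cell.
registry = {"phone-42a7": [("desk IP phone", False), ("cell phone", True)]}
print(route_call(registry, "phone-42a7"))  # cell phone
```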

What about quality of service?

The quality of service is not much different from that of traditional phone service. In some cases, our customers indicate that quality is better than with traditional services. Improvements in technology over the last couple of years continue to make this possible.

JOHN CURRY is owner of Curry IP Solutions (www.curryip.com), which caters to business clients. Reach him at (412) 307-3600, ext. 9007, or john@curryip.com.

Tuesday, 26 August 2008 20:00

Will you be tax exempt?

More than seven years ago, President George W. Bush signed into law the Economic Growth and Tax Relief Reconciliation Act of 2001. This legislation was designed to reduce federal estate taxes by increasing the exemption amounts and reducing estate tax rates over time, with a complete repeal of the estate tax scheduled for 2010. However, it is expected that Congress will act in 2009 to head off the scheduled repeal of the estate tax.

Against this backdrop, it is crucial to review existing estate plans and prepare for potential future scenarios.

“Estate plans should be revisited whenever estate tax laws are about to change significantly,” says Sally Larson Sargent, senior vice president and head of personal trust administration for MB Financial Bank in Chicago.

Smart Business spoke with Sargent about the current estate tax laws, what changes may be in store for the future and the importance of staying abreast of changes to the tax code.

What are the current laws in regard to estate taxes?

The federal estate tax exemption has increased dramatically since 2001, when it was $675,000. This year, the federal estate tax exemption is $2 million. Next year, the exemption will increase to $3.5 million, with a complete repeal of the federal estate tax scheduled to take place in 2010. Federal estate tax rates have dropped over the same period of time, from 55 percent in 2001 to 45 percent this year and in 2009.

If Congress takes no further action, the federal estate tax will disappear in 2010, only to be reinstated in 2011 with an exemption of $1 million and a top estate tax rate of 55 percent. It’s a strange situation, and few estate planners believe the federal estate tax regime will play out over the next few years as initially enacted back in 2001.
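The figures quoted above can be laid out as a simple schedule. This is a deliberately crude sketch: it applies only the top rate named in the article to the amount above the exemption, and ignores graduated brackets, deductions, credits and state death taxes, all of which matter in real planning.

```python
# (exemption, top rate) by year, per the 2001 act as described above.
SCHEDULE = {
    2001: (675_000, 0.55),
    2008: (2_000_000, 0.45),
    2009: (3_500_000, 0.45),
    2010: (None, 0.0),        # scheduled repeal
    2011: (1_000_000, 0.55),  # scheduled reinstatement
}

def rough_estate_tax(estate_value, year):
    """Top-rate-only estimate; real calculations are far more involved."""
    exemption, rate = SCHEDULE[year]
    if exemption is None:
        return 0.0  # repeal year: no federal estate tax
    return max(0.0, (estate_value - exemption) * rate)

print(rough_estate_tax(5_000_000, 2009))  # 675000.0
```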

What changes in the estate tax law are being discussed?

Various bills have been introduced over the last few years to address the pending repeal of the estate tax, but none has made it into law. Today, there doesn’t seem to be any real expectation that the estate tax will be abolished as the law currently anticipates — both because of budget deficits and the belief that such a move would not be supported by a majority in Congress. Assuming that the estate tax is not abolished in 2010, there have been recent proposals that would freeze the federal exemption at the 2009 amount of $3.5 million and others that would increase the exempt amount to $5 million. At recent Congressional hearings, estate planning experts have suggested modernization of the rules that apply to installment payments of estate taxes for closely held businesses. Experts also have suggested that the exemption amount should be ‘portable’ between spouses, meaning that if an exemption is not used fully in the estate of a deceased spouse, the unused portion should be available to the estate of the surviving spouse.

How do you anticipate the upcoming presidential election will impact estate tax laws?

All that we know for sure is that the estate tax laws are unlikely to be touched until after the November elections. After the election, Congress likely will make the estate tax a priority, given the looming repeal of the estate tax in 2010. Neither of the presumptive presidential candidates supports the full repeal of the estate tax. Sen. John McCain’s tax plan favors a permanent $5 million exemption and a 15 percent estate tax rate, while Sen. Barack Obama’s plan favors a freeze at the 2009 estate tax parameters of a $3.5 million exemption and a 45 percent rate. The likelihood of either plan becoming law will depend on political, economic and deficit-related considerations.

Why is it so important to stay on top of changes to the tax code?

A carefully crafted, flexible estate plan that anticipates changes in the estate tax laws is the key to minimizing taxes and maximizing what will pass to beneficiaries. The federal estate tax is only one component of taxes imposed at death, however. State taxes also can be significant. As the federal estate tax exemptions have increased over the last several years, many states have enacted new or revised state death taxes, because states no longer are receiving state death tax credits that previously were part of the federal estate tax regime. With many states ‘decoupling’ from the federal estate tax system and creating their own death tax regimes, planning for state taxes can be critical, as well.

SALLY LARSON SARGENT is senior vice president and head of personal trust administration for MB Financial Bank in Chicago. Reach her at (847) 653-2158 or ssargent@mbfinancial.com.

Saturday, 26 July 2008 20:00

Complacent versus resilient

Businesses in the Midwest didn’t expect the floods. Others didn’t know a tornado was going to hit them. Companies don’t know when hackers are going to get into their databases. One Florida company didn’t expect that a former employee would wipe its hard drives clean before she left (and it didn’t have a backup).

“It is not a matter of if, but when,” says Bill Douglas, a Certified Business Continuity Planner (CBCP) and Program Director for Houston-based DYONYX. “Two out of five businesses without a continuity plan never come back from a disaster. One in five goes away within five years. Those that have a plan have the tools to survive. The first place to start with a continuity plan is with a Business Impact Analysis (BIA).”

Smart Business asked Douglas for his input on the need for and development of a BIA.

What is a BIA, and why is it important?

The BIA is the first step in any recovery plan. You need to know the critical processes and what it is going to take to keep them going. The BIA is a procedure that helps you prioritize the importance of every one of your business processes in each of your business units. Unless you analyze each process and its importance to the company, you may be spinning your wheels on things that are not cost-effective.

You need to assign each process to one of three categories: mission critical, business sensitive and business tolerant. Mission-critical processes are those your business can’t operate without; answering the phone and access to customer records may be two examples. Business-sensitive processes are needed but may be able to be delayed for a short time until the critical things are handled. Business-tolerant processes are those that make someone’s job easier. That might be certain spreadsheets that are used from time to time and can be recreated once everything else is up and running.
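The three-tier triage might be sketched like this. The downtime thresholds are illustrative assumptions for the sketch, not figures from the article; a real BIA would set them per business.

```python
def classify(max_tolerable_downtime_hours):
    """Assign a process to a BIA category by how long the business
    can tolerate losing it. Thresholds are assumed for illustration."""
    if max_tolerable_downtime_hours <= 4:
        return "mission critical"
    if max_tolerable_downtime_hours <= 72:
        return "business sensitive"
    return "business tolerant"

# Hypothetical processes and their tolerable downtime in hours.
processes = {
    "answering the phone": 1,
    "customer records access": 2,
    "payroll run": 48,
    "occasional spreadsheets": 240,
}
for name, hours in processes.items():
    print(f"{name}: {classify(hours)}")
```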

What is the process in developing a BIA?

Start with each department and find out what its critical processes are. You know your applications; build questionnaires to identify which of the three categories each one falls into. Drill down into how necessary they are. Talk with business managers and IT managers to determine what would happen to the business if each of the applications went away.

A full BIA takes time and money. Is there a way to shorten the process?

Typically, a full BIA takes a minimum of two to three weeks to analyze business work-flow practices, depending on the size of the business. It is important to involve both business managers and IT managers. The business units know what would happen if each application was not available. IT knows what resources are available and what would have to be replicated at alternate locations. If you work with both IT and business at the same time, you can more quickly get the whole picture. Time is money. You may find that some of your applications are not needed. It is possible that you could save upfront by eliminating applications or processes that bring no value.

If all my systems are designed to be fault tolerant, why do I even need a recovery plan?

Fault tolerant means that the system is always available. If some portion of your system becomes contaminated, you need to find, isolate and recover. You need a plan in place to be able to keep part of the system going while the source of the problem is handled. Things may be going very well, but what are you going to do if the whole building or data center disappears or is inaccessible for a period of time? Since Sept. 11, it is easier to recognize that ‘the smoking hole’ can happen to anyone.

How do I know if I am paying too much or too little for recovery?

Business recovery is insurance. Like everything else, there is some price you are willing to pay. Looking at impacts over time, you have to determine what is essential. Can you get by for a day, a week or a month? Looking at cost over time, you can view different solutions and the internal impact. You have to consider the return on the investment. You hope you never have a disaster, but what if you do? What is critical? Maintaining customer contact may be most critical. If customers can contact you and receive assurance that you’re going to be back online within a reasonable time frame, there is a much higher likelihood they will wait.

BILL DOUGLAS, CBCP, is Program Director at DYONYX. Reach him at (713) 293-6314 or bill.douglas@dyonyx.com.

Wednesday, 25 June 2008 20:00

A well-run service desk

It’s all about customer service. Today’s expectation of IT organizations is that customer support will be friendlier, faster and more thorough, with the technology available to restore service quickly to prevent loss of productivity.

“The focus should be on what the customers’ needs are and ensuring that you have the knowledge and right tool sets for supporting them,” says Sandy Barr, ITIL, Service Center Manager for Houston-based DYONYX. “When the IT organization has a solid understanding of the customers’ needs and they work together to define their service levels, this develops a strong relationship and creates actual, documented measurements of success.”

Smart Business talked with Barr for further insight into a well-run service desk and how this impacts overall operations.

Why is new technology alone not the answer to improved productivity?

New technology is inevitable and does provide efficiencies that we didn’t have 20 years ago, but along with new technology comes new support issues. New technologies may give us an edge and/or provide us with direct, constant communication, allowing us to quickly respond to customers, co-workers or other key people. Technology does fail from time to time. Having a professional and highly skilled service desk up to speed with such technologies can alleviate loss of productivity by restoring service immediately. Training your service desk staff members and keeping them on the technology fast-track is key. The more knowledge they have, the quicker, better, more accurate service is provided to the customer. In addition, the service desk, while providing restoration of service, can educate the customers to empower them with the knowledge, as well.

How do you reduce costs?

Providing a single point of contact for first-call resolution is a good start. By streamlining support staff and arming your service desk with today’s technological resources, such as remote control capability, software distribution and a robust ticketing and asset management system with a dynamic knowledge base, the ‘single point of contact’ takes on a whole new meaning. Within this environment, you suddenly realize that your more technical resources that typically get tapped for assistance are now able to focus on true problem management and root cause analysis. With a strong service desk, you’re building the most valuable information repository for your organization. You’re able to identify, track and document known issues and their resolutions. This increases the first-call resolution rate of the service desk and, if made available to the customers, allows them to diagnose their own issues and lessen the call volume, thus reducing costs and decreasing downtime.

How do you avoid the challenges with the battle of ‘business versus IT’?

Keeping the business and IT organization in sync requires frequent communication regarding the company’s standards for desktop hardware, software and PDAs or other devices. The IT organization has a responsibility to keep the standards current with technology and to not only identify where there is business justification for change but also ensure that the technology has been thoroughly tested and will not cause a drop in productivity. It is the responsibility of the IT organization to educate the business users for proper synergy. The key is in the promotion of the culture from the top level down to promote acceptance of standards without the organization feeling as if the standards have been mandated without business consideration.

Another key factor is to ensure that the IT organization is not just a nameless, faceless voice on the phone. It’s important to rotate support roles so that the business users can always put a name with a face. This helps to promote comfort and familiarity and develops strong relationships and trust with the IT organization. While the trend over the years to reduce costs has been to offshore support to India or China, that strategy has not proven to be a true ‘service-minded’ approach. Providing a higher quality of support with a stronger focus on customer service is preferable.

What is ITIL and is it the best standards-based approach?

ITIL stands for Information Technology Infrastructure Library, a framework developed in the late 1980s in the U.K. Before ITIL, IT organizations were more internally concentrated on technical issues and less on providing quality service and developing customer relations. ITIL provides IT organizations with a methodology that will enable them to align their focus on service delivery and service support. There are basically three levels of central point of contact: a call center, a help desk and a service desk. The natural evolution for most call centers and help desks is to become a service desk that allows business processes to be integrated into the service management organization.

SANDY BARR, ITIL, is Service Center Manager for Houston-based DYONYX. Reach her at (713) 293-6322 or sandy.barr@dyonyx.com.

Wednesday, 26 March 2008 20:00

Eye on the prize

What are the most pressing challenges facing your organization today? You want your business to prosper and grow. The more attention you pay to the right details, the better your chances are. How do you determine the most important areas on which to focus the most effort?

“It is important to regularly define what business you are in,” says Chuck Orrico, president and co-founder of Houston-based DYONYX. “Understand the key business drivers that fuel the business. Set your priorities based on these drivers to move the business forward to accomplish what you have set out to do.”

Smart Business talked with Orrico for his insight into focusing on the right priorities.

What are the top business challenges facing executives in today’s business environment?

Clients all over the U.S. and the U.K. consistently identify their top five priorities as the following:

1) They want to sustain a steady top-line growth.

2) They want to grow their profits.

3) They seek speed and flexibility in dealing with customers’ needs in today’s business environment.

4) They want sound managerial talent.

5) They seek excellence in execution.

The top two are almost always the same, although they may be reversed from time to time. And, excellence in execution is almost always fifth or sixth.

Are these challenges prioritized correctly?

Excellence in execution should be the first priority. It is more difficult to achieve growth or increased profits without excellence in execution. It is how you execute your key business processes that determines the speed and flexibility with which key decisions are made.

You also need to know how you are going to gain any efficiencies from any processes you implement. Will these processes increase efficiency or provide more layers that detract from your core business?

What role does IT play in addressing these challenges?

IT should be an enabler of the business, not a disabler. In an effort to increase productivity and efficiency, one of the most common mistakes organizations make is to let technology dictate to the business units by focusing on emerging technologies that are advertised to streamline work functions. As a result, the business units are forced to re-engineer key business processes to accommodate the new technology. In many instances, the business unit suffers a productivity downturn, finds the technology difficult to use and views IT as an obstacle to achieving its stated business objectives. IT must have the mindset to understand the key business drivers and how these drivers generate profit and keep the company in business. Then IT must focus on the processes that support these business drivers and how best to optimize them. Once optimization occurs, IT can implement technologies that will automate these processes, bringing an additional level of productivity improvement and efficiency to the business unit and, ultimately, increasing its ability to properly execute.

What about outsourcing certain functions?

Productivity can be further increased by outsourcing commodity-type functions, such as network infrastructure and desktop and help desk support. This allows more time to focus on the key business functions.

What does continuous improvement (CI) really mean, and how can it help overcome challenges?

As the term implies, CI is a never-ending process. It is much more than specific activities, such as answering the phone by the third ring if you are in the service business. It is reducing variations. It’s eliminating situations that do not add value. It’s improving customer satisfaction. It is important to determine what causes situations to occur in the first place and focus on the causes rather than just fighting fires. When you engage in process improvement, you seek to learn what causes things to happen in the first place, and then you can use this knowledge to implement the correct solution.

Please explain process definition.

Building a task and event relationship is called process definition. Being able to define that process has several advantages. If you properly determine how processes and technology affect each other and what your customers are really looking for, you have a much better chance of supplying their needs in a timely, profitable manner. When you define any barriers to customer satisfaction, you can eliminate them. Examining a process can give you more insight into its pros and cons, allowing you to make adjustments that lead to improvement in your overall operation and customer satisfaction. Process accounts for 80 percent of all problems while people account for only 20 percent. If you have the right processes, your people can act more efficiently.
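The idea of defining a process and stripping out steps that add no value can be sketched in a few lines. This is a toy illustration only; the process steps and function name below are hypothetical, not drawn from the interview.

```python
# A toy sketch of process definition: a process is an ordered list of
# (task, adds_value) pairs, and improvement means dropping the steps
# that add no value for the customer.

process = [
    ("receive order", True),
    ("re-key order into second system", False),  # duplicate work, no value
    ("ship order", True),
]

def value_adding(steps):
    """Keep only the steps that add value for the customer."""
    return [name for name, adds_value in steps if adds_value]

improved = value_adding(process)
```

Examining the process this way makes the non-value-adding step visible before any technology is applied to automate it.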

CHUCK ORRICO is president and co-founder of Houston-based DYONYX. Reach him at (713) 830-5603 or chuck.orrico@dyonyx.com.

Tuesday, 29 January 2008 19:00

A normal housing market

Perhaps no sector of the commercial real estate market has seen more volatility over the past three years than multihousing. From a market position grounded in strong fundamentals to a euphoric run to conversions, the apartment market is again beginning to stabilize for owners, potential investors and tenants.

“The good news is that the effects of ‘conversions’ to ‘reversions’ over the last two quarters of 2007 are being addressed and we are seeing an increasing degree of equilibrium return to the market,” says John Selby, a multihousing specialist and senior vice president with CB Richard Ellis in Tampa, Fla. “As such we are encouraged to enter 2008 with some clarity as we seek to provide guidance to our clients.”

Smart Business talked with Selby for his insight on this segment of the real estate market that truly hits home for many.

Can you quantify what the ‘conversion’ effect has been on the Tampa Bay apartment market?

Between 2004 and 2006, approximately 35,000 apartment units were purchased for the purpose of converting to condominiums. That number represents about 18 percent of the total multihousing inventory in the Tampa Bay area. Most were Class A properties, but many were older buildings that required extensive renovation and upgrades. Initially, sales were very strong as buyers made very aggressive offers, capital was readily available for 100 percent financing, and qualification and income standards were relaxed. In the midst of this buying frenzy was always the question of real demand by buyers intending to occupy their homes versus artificial demand from investors or speculators.

By early 2007, the weaknesses suspected in this overheated sector became evident, forcing some property owners to start leasing unsold units by offering reduced rent and substantial concessions. These new offerings of upgraded condominiums, as well as investor-owned single-family homes, began to affect the traditional apartment properties, which had been benefiting from high occupancies from tenants previously displaced by conversions. In the end, approximately 25,000 of the original 35,000 units purchased for conversion will remain as fee ownership.

How did this scenario in Tampa compare to other Florida markets?

On a stand-alone basis, the area fared very well, especially as much of the growth via new construction served to foster opportunities for residential living in Tampa’s central business district. Statewide, the comparison that draws the most interest is to the phenomenal plan to deliver nearly 60,000 units in Dade County alone. While there are a variety of opinions attempting to assess that market, it is interesting to follow the current activity taking place. Foreign investors seeking to take advantage of quality product, softening prices and a weak U.S. dollar are now aggressively buying in the international city of Miami.

Overall, Tampa’s underlying strong fundamentals have served to bring a quicker sense of stabilization to its market.

Is ‘work force housing’ viable in today’s market?

Yes, if there is a will to make it happen. While there are a variety of incentives potentially available to deliver affordable inventory, the only true variable factor is the cost of land. For instance, if publicly controlled sites can be made available, and there is some flexibility in issues such as density and parking requirements, then the numbers can work. This is a very important component to the overall housing picture in any major city.

From an investment standpoint, what is the status of the multihousing market as we begin 2008?

With equilibrium returning to the market, there are still opportunities for both sellers and buyers. For the seller, cap rates have not increased in proportion to the rise in interest rates and loan spreads. Consequently, values have held up well and the demand for quality product is still strong. For the buyer, a recovering market that should be back to full strength by the third or fourth quarter, combined with moderated but solid employment and job growth, as well as limited sites for new development, bodes well.

An additional key factor is the increased demand from renters. Many simply choose the flexibility of renting as a lifestyle and now have more attractive options. Additionally, there is an expanded pool of tenants who will rent by necessity because of the adjustable-rate and subprime mortgage fallout.

JOHN SELBY is a multihousing specialist and senior vice president with CB Richard Ellis in Tampa. Reach him at (812) 273-8413 or by e-mail at john.selby@cbre.com.

Tuesday, 29 January 2008 19:00

Your virtual infrastructure

Virtualization is a rapidly growing technology used by many organizations today. Most companies understand the purpose and value of integrating a virtual infrastructure but may not understand how to properly integrate this new technology into their traditional IT management framework and processes.

“Although virtualization technology may be complex, many management tasks related to virtualization are simplified compared to traditional systems,” says Brian Capoccia, disaster recovery practice manager at Agile360, A Division of Entisys Solutions, Inc.

Smart Business talked with Capoccia for his insight into the best practices in managing your virtual infrastructure.

Do I manage my virtual machines in the same way I manage my physical systems?

In many ways, yes. Software updates, security patches and service packs are all applied in the same manner as they would be to a physical machine. However, there are also many management tasks that relate to virtual machines (VMs) that do not exist for physical machines. Tasks such as rapid provisioning, live migrations between physical systems and snapshot capabilities are performed with VM management software. This software also provides administrators performance and resource information about the host system and the VMs that run on it.
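The VM-specific tasks mentioned above can be sketched in miniature. This is a minimal, hypothetical model, not any vendor's actual management API; the class, method and host names are all invented for illustration.

```python
# Minimal sketch of the VM-specific management tasks described above:
# rapid provisioning from a template, snapshots and live migration.

class VirtualMachine:
    def __init__(self, name, host, template="base-image"):
        self.name = name
        self.host = host            # physical host currently running the VM
        self.template = template    # image the VM was cloned from
        self.snapshots = []

    def snapshot(self, label):
        """Record a point-in-time snapshot that can later be reverted to."""
        self.snapshots.append(label)

    def migrate(self, new_host):
        """Live-migrate the VM to another physical system."""
        self.host = new_host

def provision(name, host, template="base-image"):
    """Rapid provisioning: clone a new VM from a prebuilt template."""
    return VirtualMachine(name, host, template)

vm = provision("web01", host="esx-a")
vm.snapshot("pre-patch")    # snapshot before applying updates
vm.migrate("esx-b")         # move to another host with no downtime
```

None of these operations has a direct equivalent on a physical machine, which is why VM management software exists alongside traditional tools.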

Does virtualization increase the complexity of my IT environment?

The initial implementation will add to the overall level of complexity; however, many administrative tasks will be simplified. Virtualization brings new and simple ways to accomplish tasks that are traditionally more complex on physical systems. For example, implementing virtualization can offer protection against server hardware failure that is comparable to clustering, yet implementing and maintaining a clustered system is much more complex than maintaining the same system in the virtual infrastructure.

Virtualization can also simplify the recovery process for failed systems. VMs are recovered as easily as restoring deleted files. In contrast, the recovery process for physical systems may require that the system be recovered to identical hardware, which usually includes the installation of a base operating system and applications, followed by the restoration of critical files from full and incremental backups.

Can I monitor virtual machines with my current system management software?

Yes, most system management software packages have the capacity to manage and monitor VMs in the same way as physical systems. HP Insight Manager, Dell OpenManage and IBM Director all have this capability. Administrators who are already familiar with these tools can leverage their existing knowledge to manage host machines in the same way.

In addition to these system management tools, virtualization solution providers have management products that are designed specifically for VM management.

What kinds of tools and processes can I use to prevent ‘server sprawl’?

Virtualization itself combats server sprawl by allowing for the consolidation of many physical servers. However, compared to physical servers, VMs are relatively easy to build, duplicate and deploy. This leads to a new phenomenon known as ‘virtual machine sprawl.’

The provisioning process for physical machines has built-in controls that manage machine sprawl. Cost and the procurement process are probably the biggest factors. The VMs should be viewed similarly. The key to managing VM sprawl is to manage the provisioning process just as the procurement process is managed for a physical server. A provisioning process should include the requestor of the resource, the reason for the request and an approval process. Only then should a VM be created.

Controls exist in the virtual infrastructure to prevent unauthorized personnel from creating VMs. Only administrators directly responsible for managing the virtual infrastructure should be assigned this right.
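The provisioning controls described above can be sketched as a simple gate in front of VM creation. All names here are hypothetical; the point is that a request carries a requestor and a business reason, must be approved, and is executed only by an authorized administrator.

```python
# Illustrative sketch of a VM provisioning process with the controls
# described above: justification, approval, and restricted execution.

VM_ADMINS = {"alice"}    # only these accounts may actually create VMs
inventory = []           # VMs that get built

def provision_vm(requestor, reason, approved, operator):
    """Create a VM only when the request passes every control."""
    if not reason:
        return "rejected: no business justification"
    if not approved:
        return "rejected: approval not granted"
    if operator not in VM_ADMINS:
        return "rejected: operator not authorized"
    inventory.append((requestor, reason))
    return "vm created"
```

Treating VM creation like the procurement process for a physical server, as the interview suggests, is what keeps the ease of cloning VMs from turning into sprawl.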

Can I back up VMs the same way I back up physical servers?

Yes, VMs can be backed up just like any other server in the environment. However, VM backups require careful planning. Traditional methods can put a lot of strain on the system if multiple VMs are backed up simultaneously on the same host, because the backup agent is designed to utilize the processor of the server being backed up.

Several methods exist that can augment traditional backup methods. They include SAN-based backups, disk-to-disk-to-tape backups and virtualization-specific consolidated backup frameworks that allow the CPU processing to be off-loaded to a backup proxy.
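One simple way to avoid the simultaneous-backup strain described above is to schedule backups so that no two VMs on the same host run at once. The scheduler below is a hypothetical sketch of that idea, not a real backup product's behavior.

```python
# Hypothetical sketch: group VMs by physical host and run backups in
# rounds so that no two VMs on the same host are backed up at once.

from collections import defaultdict

def backup_schedule(vms):
    """vms: list of (vm_name, host) pairs. Returns a list of rounds;
    within each round, every VM sits on a different physical host."""
    by_host = defaultdict(list)
    for name, host in vms:
        by_host[host].append(name)
    rounds = []
    while any(by_host.values()):
        # take at most one pending VM from each host per round
        rounds.append([queue.pop(0) for queue in by_host.values() if queue])
    return rounds

schedule = backup_schedule([("web01", "esx-a"), ("db01", "esx-a"),
                            ("app01", "esx-b")])
# web01 and app01 can run together; db01 waits for the next round
```

A backup proxy or SAN-based approach removes the constraint entirely, but staggering like this helps where traditional agents remain in use.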

What effect does server virtualization have on my backup window?

Certain virtualization technologies have the capacity to allow for a ‘hot backup’ of VMs. This means that the VM is briefly put on pause to allow for a backup to occur with little impact on performance. This provides the ability to back up VMs during production hours, virtually eliminating the traditional ‘backup window.’

BRIAN CAPOCCIA is disaster recovery practice manager at Agile360, A Division of Entisys Solutions, Inc. He can be reached at (949) 278-8065 or at brian.capoccia@agile360.com.

Tuesday, 29 January 2008 19:00

Protection for D&Os

Whether you are volunteering for a small not-for-profit organization or have been appointed to the board of one of the largest companies, you need to know about potential personal liability. You also need to know how to obtain protection from potential litigation in those positions.

“Whenever you are involved in making decisions that affect other people or organizations, it is important to recognize that those decisions expose you to litigation,” says James M. Fasone, division senior vice president at Arthur J. Gallagher Risk Management Services Inc. “Directors and officers (D&O) insurance is the best form of protection, after due diligence, against claims of omission or alleged misconduct.”

Why do the majority of organizations of any size buy D&O insurance?

Most buyers of this product are looking to protect themselves against personal liability arising from their fiduciary duty to third parties as directors and officers. While most corporations allow for the indemnification of directors and officers, both the corporation and its directors and officers appreciate the added security of insurance protection against personal litigation for any alleged mismanagement.

Many buyers merely look at the economics involved and the potential exposure to high-net-worth independent directors, and migrate to the lowest-cost alternative. Because this is a very personal protection, more savvy buyers look to benchmarking studies, peer review and complex limit analysis tools to support this important decision.

There are some limited publications available that provide generic data sets that can provide some guidelines. Many buyers rely on various third parties, including insurance brokerage specialists, that have expertise in the product line.

Who are some of the common litigants against directors and officers?

Though employees bring the majority of claims against private and not-for-profit companies, shareholders bring the most claims against publicly traded companies. Litigation can be brought from numerous other sources, including debt holders, customers, suppliers, competitors and regulatory agencies, to name a few.

What are the current market conditions for purchasing D&O insurance?

This is an excellent time to be a buyer of D&O liability insurance. Pricing has softened significantly over the past few years. Increased capacity, more sophisticated underwriting tools and a recent decreasing trend in claims have all led to more aggressive pricing for buyers.

The average cost of defense of litigation is about $600,000. While settlement amounts vary, employment-related claims can exceed $2 million and securities litigation settlements average over $18 million.

For privately held and not-for-profit organizations, the policy is purposely written in a broad fashion. Typically, directors and officers, employees, temporary staff, and the organization itself are protected under the policy. The policy can be negotiated to protect independent contractors, volunteers and even certain third parties in some instances.

Is any particular industry sector more at risk for D&O liability than others?

While underwriters look to assess the risk of each individual company on its own merits, some industry sectors have a higher risk profile and are prone to more litigation. Among these sectors are technology companies, energy-related corporations and those involved with the delivery of health care services. These particular sectors can have highly volatile financial results due to changing technology, shifting competitive landscape and unmanageable regulatory influence.

Does an initial public offering or the raising of public debt increase the potential liability of directors and officers?

Any public filing with the Securities and Exchange Commission, whether to raise debt or equity, significantly increases the potential for personal liability of those involved. The Securities Act of 1933 and the Securities Exchange Act of 1934 have very specific provisions regarding the types of representations that can be made about the sale of securities to the general public. Any and all disclosures made to the public are often cited in misrepresentation cases to support securities litigation when a company fails to meet its financial projections. Securities litigation represents the single largest cost of claims to the insurance companies, making the cost of premiums for public companies significantly higher than that of their privately held peers.

JAMES M. FASONE, ARM RPLU, is division senior vice president at Arthur J. Gallagher Risk Management Services Inc. Reach him at (303) 889-2516 or at Jim_fasone@ajg.com.