Lack of Best Practices Is Hurting Virtualization

At the end of the day, virtualization is something that you do, as well as something you buy and implement

Those who approach virtualization these days are often taken aback by the complexity of the architecture and the number of layers they must deal with to deploy it. Most consider virtualization in the context of data center optimization or private clouds, and find it difficult to get their bearings as they wade into their first virtualization project.

While the virtualization vendors do render assistance, the core issue is that those who deploy virtualization often need to deal with many domain-specific issues, such as the architectural complexities of data, service, and process sharing and synchronization across virtualized environments. While there is a lot of information out there about what virtualization is, few understand how to approach it or the prevailing best practices. As a result, I hear about project after project that ends badly, not because of issues with the technology, but because the wrong solution was built and now has to be lived with. As one person put it, "We did not know what we were doing and it showed."

Core to this issue is the belief that the technology itself has magical powers, and that implementing VMware or Xen is the one and only step. Truth be told, this is an architectural exercise, and many factors should be considered, including the data and how it is used, the services and how they are used, and the processes and how they are used.
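As a rough illustration of that exercise, the sketch below (in Python; every name is hypothetical and not drawn from this article) shows the kind of catalog of data, services, and processes an architect might assemble before any hypervisor is installed.

```python
from dataclasses import dataclass
from typing import Dict, List

# Hypothetical inventory of the architectural components named above:
# data, services, and processes, plus how each is used. All names are invented.

@dataclass
class DataEntity:
    name: str
    owner: str            # business owner of the data
    consumers: List[str]  # services and processes that read or write it

@dataclass
class Service:
    name: str
    data_used: List[str]      # DataEntity names the service depends on
    business_function: str    # the business need the service meets

@dataclass
class Process:
    name: str
    services_invoked: List[str]  # services the process orchestrates

inventory: Dict[str, list] = {
    "data":      [DataEntity("orders", "sales", ["order_service", "billing"])],
    "services":  [Service("order_service", ["orders"], "order capture")],
    "processes": [Process("billing", ["order_service"])],
}

print(inventory["services"][0])
```

The point is not the tooling but the discipline: the hypervisor decision comes after you understand what these components are and how they interact.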

For those who seek best practices in the world of virtualization, it may be best to look toward existing approaches and best practices around SOA. Core to SOA is the notion that we're going to break systems down to their functional primitives and build them back up as sets of architectural components, such as services, and align those services to meet the needs of the business.
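To make that SOA-style decomposition concrete, here is a minimal, hypothetical sketch of a system broken into functional primitives and re-composed as services aligned to a business process; none of the names come from the article.

```python
from abc import ABC, abstractmethod

# Hypothetical decomposition of a monolithic order system into functional
# primitives, re-exposed as services and composed into a business process.

class InventoryService(ABC):
    @abstractmethod
    def reserve_stock(self, sku: str, quantity: int) -> bool: ...

class BillingService(ABC):
    @abstractmethod
    def invoice(self, order_id: str, amount: float) -> str: ...

class OrderProcess:
    """A business process built from the services above, aligned to a business need."""

    def __init__(self, inventory: InventoryService, billing: BillingService):
        self.inventory = inventory
        self.billing = billing

    def place_order(self, order_id: str, sku: str, qty: int, amount: float) -> bool:
        if not self.inventory.reserve_stock(sku, qty):
            return False
        self.billing.invoice(order_id, amount)
        return True
```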

What's critical here is to gain a much better understanding of the business, the problem domain, and the underlying architectural components (e.g., data, services, and processes) before proceeding to the step that covers how all of that will work and play well in a virtualized environment. This intermediate step could be called: how to configure the virtualization environment to meet the needs of the system or systems. It's a bit more work than you may have expected, but it's well worth the effort, considering the risk it removes.
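One way to picture that intermediate step, purely as an assumption-laden sketch (the article prescribes no tools, numbers, or thresholds), is to derive the virtualization environment's configuration directly from the documented needs of the systems.

```python
# Hypothetical mapping from documented workload needs to a virtualization
# configuration. Names, numbers, and the headroom factor are all invented.

workloads = [
    {"name": "order_service", "peak_vcpu": 4, "peak_mem_gb": 8,
     "shares_data_with": ["billing"]},
    {"name": "billing", "peak_vcpu": 2, "peak_mem_gb": 4,
     "shares_data_with": ["order_service"]},
    {"name": "reporting", "peak_vcpu": 8, "peak_mem_gb": 32,
     "shares_data_with": []},
]

HEADROOM = 1.25  # keep 25% headroom over observed peaks

def vm_spec(workload):
    """Translate observed requirements into a VM specification."""
    return {
        "name": workload["name"],
        "vcpu": round(workload["peak_vcpu"] * HEADROOM),
        "mem_gb": round(workload["peak_mem_gb"] * HEADROOM),
        # Workloads that share and synchronize data are candidates for
        # co-location on the same host or cluster to limit cross-host traffic.
        "colocate_with": workload["shares_data_with"],
    }

for w in workloads:
    print(vm_spec(w))
```

Even a crude mapping like this forces sizing and co-location decisions to flow from the architecture rather than being guessed after the fact.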

In moving beyond the core architectural issues, we also need to consider governance, security, and testing, the best practices for approaching each, and, most important, how you define success. Many of those who look to leverage virtualization have no idea what the core benefit will be for their existing IT infrastructure, and thus need to establish a business case and goals that should be reached when implementing virtualization.
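Defining success then becomes a measurement exercise. As a hedged example (all figures are invented for illustration), a business case might baseline the current environment and set explicit, checkable targets:

```python
# Hypothetical business case for a virtualization effort: baseline the current
# environment, set explicit targets, and measure against them.

baseline = {"physical_servers": 120, "avg_cpu_utilization": 0.12, "power_kw": 60.0}
target   = {"hosts": 24, "avg_cpu_utilization": 0.55, "power_kw": 18.0}

consolidation_ratio = baseline["physical_servers"] / target["hosts"]
utilization_gain = target["avg_cpu_utilization"] - baseline["avg_cpu_utilization"]
power_savings_pct = 100 * (1 - target["power_kw"] / baseline["power_kw"])

print(f"Consolidation ratio: {consolidation_ratio:.0f}:1")
print(f"CPU utilization gain: {utilization_gain:.0%}")
print(f"Power savings: {power_savings_pct:.0f}%")
```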

At the end of the day, virtualization is something that you do, as well as something you buy and implement. Those who do it wrong will fail, no matter what the technology can do. Those who define a healthy process around virtualization implementation will most often succeed. Who do you want to be?

More Stories By David Linthicum

Dave Linthicum is Sr. VP at Cloud Technology Partners, and an internationally known cloud computing and SOA expert. He is a sought-after consultant, speaker, and blogger. In his career, Dave has formed or enhanced many of the ideas behind modern distributed computing, including EAI, B2B Application Integration, and SOA, approaches and technologies in wide use today. In addition, he is the Editor-in-Chief of SYS-CON's Virtualization Journal.

For the last 10 years, he has focused on the technology and strategies around cloud computing, including working with several cloud computing startups. His industry experience includes tenure as CTO and CEO of several successful software and cloud computing companies, and upper-level management positions in Fortune 500 companies. In addition, he was an associate professor of computer science for eight years, and he continues to lecture at major technical colleges and universities, including the University of Virginia and Arizona State University. He keynotes at many leading technology conferences, and has several well-read columns and blogs. Linthicum has authored 10 books, including the ground-breaking "Enterprise Application Integration" and "B2B Application Integration." You can reach him at david@bluemountainlabs.com, follow him on Twitter, or view his profile on LinkedIn.

Most Recent Comments
davidcaddick@gmail.com 06/09/09 05:31:41 PM EDT

David,

So if you think it's bad for Virtualization (and you do appear to be discussing just Server Virtualization?) then think how bad it is for VDI? As there are so many more components and dependencies?

I recall how we had the same issues with introducing Citrix WinFrame/MetaFrame? It was so cool, and it just worked - yet typically, if not done correctly with a bit of prior planning, it can easily end up becoming the stick that is used to beat up the IT dept?

Sound familiar? ;-)

Dave Caddick