In one of the best Slashdot threads I’ve seen in ages, a number of posters chime in with their personal experiences of virtualization. (Usage hint: Set the general threshold to 5, via Advanced Context Controls, to filter out the dreck.) The rough consensus appears to be:
- Virtualization has overhead, but probably a lot less than the 43-50% sometimes claimed.
- Just to be safe, don’t virtualize apps that are already I/O-bound or otherwise running flat-out. (So this doesn’t contradict my support for dedicated security, networking, and data warehouse appliances.)
- Big enterprises have lots of production servers that are old, unreliable, and/or idle most of the time. Virtualize those.
- If a server’s usage is particularly spiky, it may be a great candidate for virtualization.
- Most development servers can and should be virtualized.
Makes sense to me.