I love Hyper-V 3.0… particularly compared to earlier versions. It comes packed with some very nice new features, several of which are geared around the idea of thin provisioning. One such feature is Dynamic Memory. Dynamic Memory lets you set a base “Startup” amount of RAM for a server (say something low like 512 MB) and also a maximum it can grow to (say 8 GB). The idea is that you can over-provision RAM on a host server and still be okay as long as the majority of your VMs are just sitting there idle, which in most cases they are. The problem is that inside the guest, if you are watching Windows Task Manager at least, you will almost always see 90–95% memory utilization, and the total shown will be whatever maximum the VM can scale to (say 8 GB).
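
For reference, here's a minimal sketch of what that configuration looks like, driven from Python by calling the Hyper-V PowerShell cmdlets on the host. The VM name "AppServer01" is just a placeholder, and this assumes an elevated session on a host with the Hyper-V module installed:

```python
import subprocess

VM_NAME = "AppServer01"  # hypothetical VM name

# Set-VMMemory enables Dynamic Memory and sets the startup, minimum,
# and maximum RAM for the VM: start at 512 MB, allow growth up to 8 GB.
subprocess.run(
    [
        "powershell.exe",
        "-NoProfile",
        "-Command",
        f"Set-VMMemory -VMName '{VM_NAME}' "
        "-DynamicMemoryEnabled $true "
        "-StartupBytes 512MB -MinimumBytes 512MB -MaximumBytes 8GB",
    ],
    check=True,
)
```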

This really threw me off recently. I had one VM that was misbehaving because its VHDX file sat on a slow share on a storage array. Initially, not knowing what was broken, I took a look at Task Manager on the VM (which was running Server 2012) and noted that it was showing nearly maxed-out RAM usage. Further investigation, though, showed that it couldn't possibly be using more than 1 GB (of the 8 GB shown in Task Manager) at any one time.
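
The check that made this clear was looking at memory from the host side rather than inside the guest. Here's a quick sketch of that, again assuming the hypothetical "AppServer01" VM and an elevated session on the Hyper-V host, pulling MemoryAssigned and MemoryDemand from Get-VM:

```python
import subprocess

VM_NAME = "AppServer01"  # hypothetical VM name

# MemoryAssigned is what the host has actually handed the VM right now;
# MemoryDemand is what the guest is really asking for (both in bytes).
result = subprocess.run(
    [
        "powershell.exe",
        "-NoProfile",
        "-Command",
        f"Get-VM -Name '{VM_NAME}' | "
        "Select-Object Name, MemoryAssigned, MemoryDemand, MemoryStatus | "
        "Format-List",
    ],
    capture_output=True,
    text=True,
    check=True,
)
print(result.stdout)
```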

After some more digging I learned that this is common behavior on VMs with dynamically allocated memory and nothing to be concerned about. That VM still has Dynamic Memory enabled today and still shows about 95% usage pretty much all the time, but it runs just fine now that the VHDX file has been moved to faster storage. Anyhow, hope this helps someone else out!
