I've recently moved to an environment that has drastically over-provisioned its virtual machines. I believe this is causing performance issues: when I run a PowerCLI script to work out the average CPU Ready value over 30 days (script attached), I'm getting values around 350%, which is something I've never seen before!
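For reference, here is a minimal sketch of the calculation the script performs (the attached script differs in detail, and "ExampleVM" is a placeholder name): it pulls cpu.ready.summation, which vCenter reports in milliseconds of ready time per sample interval, and converts each sample to a percentage by dividing by the interval length.

```powershell
# Sketch only; the actual attached script is more involved.
# cpu.ready.summation = milliseconds the VM spent ready-but-not-scheduled
# per sample interval, so value / (interval in ms) * 100 = ready %.
$vm    = Get-VM -Name "ExampleVM"
$stats = Get-Stat -Entity $vm -Stat cpu.ready.summation -Start (Get-Date).AddDays(-30)

$readyPct = $stats | ForEach-Object { ($_.Value / ($_.IntervalSecs * 1000)) * 100 }
($readyPct | Measure-Object -Average).Average
```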
One example is a VM with 24 vCPUs and 64 GB of RAM that has an average CPU Ready value of 353% over the 30-day period.
What sort of performance impact would this have on the virtual machine? Would a process take 100 milliseconds longer to run, or 100 seconds longer? I'm trying to quantify the delay so I can explain it to the business more clearly.
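My working assumption (please correct me if it's wrong) is that the aggregate cpu.ready.summation value sums ready time across all vCPUs, so the per-vCPU figure would be the aggregate divided by the vCPU count. A back-of-envelope version:

```powershell
# Assumption: the 353% average is the aggregate summed across all 24 vCPUs,
# not a per-vCPU value.
$aggregateReadyPct = 353
$vCpuCount         = 24
$aggregateReadyPct / $vCpuCount   # ~14.7% ready per vCPU, still well above
                                  # the 5-10% range commonly cited as problematic
```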
Thank you for your assistance.