For this value at risk question, I'm wondering why we can't simply multiply the annual standard deviation of $800,000 by 5 to arrive at the 5-year standard deviation.
I understood the calculation from your lecture, but is it mathematically wrong to multiply the annual standard deviation by 5, instead of converting it to variance first, multiplying that by 5, and then taking the square root of the product? Hope my wording makes sense, and sorry for overcomplicating things.
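Just to make sure I'm comparing the right things (assuming annual returns are independent, so it's the variances that add): the variance approach gives sqrt(5 × 800,000²) = 800,000 × √5 ≈ $1,788,854, whereas multiplying the standard deviation directly gives 5 × 800,000 = $4,000,000.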