Here is my problem: I have to use a voltage source that supplies a load through a resistor. The voltage source has a rating of 1000 volts and a power rating of 10000 watts. The resistor is 5 ohms with a power rating of 118.975 watts. The resistance of the load may vary randomly between 200 and 600 ohms. Using 100000 samples, calculate the number of times the power rating of the resistor is exceeded, the average power delivered by the source, the maximum power delivered by the source, and the minimum power delivered by the source.

The output of the program should consist of 4 numbers, each output on a separate line in the order listed above. The numbers should be output with a precision of 2.

I'm hoping someone can help me out and explain what I need to do.
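One way to approach this is a Monte Carlo loop: for each sample, draw a load resistance, solve the series circuit for the current, and accumulate the statistics. Here is a minimal sketch in Python, assuming the source and resistor are in series with the load, the load resistance is uniformly distributed over [200, 600] ohms, and all variable names are my own. The power in the 5-ohm resistor is I^2 * R, and the power delivered by the source is V * I.

```python
import random

V = 1000.0          # source voltage, volts
R_SERIES = 5.0      # series resistor, ohms
P_LIMIT = 118.975   # resistor power rating, watts
N = 100_000         # number of Monte Carlo samples

exceeded = 0                 # times the resistor's rating is exceeded
total_power = 0.0            # running sum of source power, for the average
max_power = float("-inf")
min_power = float("inf")

for _ in range(N):
    # Load resistance drawn uniformly between 200 and 600 ohms
    # (assumption: "varies randomly" means a uniform distribution).
    r_load = random.uniform(200.0, 600.0)
    current = V / (R_SERIES + r_load)     # series-circuit current, amps
    p_resistor = current ** 2 * R_SERIES  # power dissipated in the resistor
    p_source = V * current                # power delivered by the source

    if p_resistor > P_LIMIT:
        exceeded += 1
    total_power += p_source
    max_power = max(max_power, p_source)
    min_power = min(min_power, p_source)

# Four numbers, one per line, printed with a precision of 2.
print(f"{exceeded:.2f}")
print(f"{total_power / N:.2f}")
print(f"{max_power:.2f}")
print(f"{min_power:.2f}")
```

Note that the resistor's rating is only exceeded for loads barely under about 200 ohms (5 * (1000/205)^2 is roughly 118.98 W), so with a uniform draw over [200, 600] the count will almost always be 0. If your course expects a different distribution or circuit topology, adjust the draw and the current equation accordingly.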