Here is my problem: I have a voltage source that supplies a load through a resistor. The voltage source is rated at 1000 volts and 10000 watts. The resistor is 5 ohms with a power rating of 118.975 watts. The resistance of the load may vary randomly between 200 and 600 ohms.

Using 100000 samples, I have to calculate the number of times the power rating of the resistor is exceeded, the average power delivered by the source, the maximum power delivered by the source, and the minimum power delivered by the source. The output of the program should consist of four numbers, each on a separate line in the order listed above, printed with a precision of 2. I'm hoping someone can help me out and explain what I need to do.
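If it helps, here is my understanding so far: I think the source, the 5 ohm resistor, and the load form a series circuit, so for each sample the current is I = V / (5 + R_load), the resistor dissipates I^2 * 5 watts, and the source delivers V * I watts. Below is a rough sketch of the loop I think I need. I'm assuming Python and a uniformly distributed load resistance, since the assignment doesn't specify a language or a distribution; am I on the right track?

```python
import random

V = 1000.0          # source voltage, volts
R_SERIES = 5.0      # series resistor, ohms
P_RATING = 118.975  # resistor power rating, watts
N = 100000          # number of samples

exceed_count = 0
total_power = 0.0
max_power = float("-inf")
min_power = float("inf")

for _ in range(N):
    # Assumption: load resistance is uniform on [200, 600] ohms
    r_load = random.uniform(200.0, 600.0)
    current = V / (R_SERIES + r_load)     # series circuit: I = V / R_total
    p_resistor = current ** 2 * R_SERIES  # power dissipated in the 5-ohm resistor
    p_source = V * current                # power delivered by the source

    if p_resistor > P_RATING:
        exceed_count += 1
    total_power += p_source
    max_power = max(max_power, p_source)
    min_power = min(min_power, p_source)

# Four numbers, one per line, two decimal places
print(f"{exceed_count:.2f}")
print(f"{total_power / N:.2f}")
print(f"{max_power:.2f}")
print(f"{min_power:.2f}")
```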

Which bit are you stuck on - the electronic theory that you have to implement in code, or the code itself? Suppose you had to do this manually with, say, 10 samples rather than 100000. How would you go about it?

For the sake of argument, let's say that the resistance varies pseudo-sinusoidally, something like 400, 500, 600, 500, 400, 300, 200, 300, 400, 500. What results would you get?
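To make the manual process concrete, here is a sketch of how those ten samples could be tabulated, assuming the same series-circuit model as above (I = V / (5 + R_load), P_resistor = I^2 * 5, P_source = V * I). This is just an illustration of the per-sample arithmetic, not the assignment's required 100000-sample run:

```python
V, R_SERIES, P_RATING = 1000.0, 5.0, 118.975
samples = [400, 500, 600, 500, 400, 300, 200, 300, 400, 500]

source_powers = []
for r_load in samples:
    current = V / (R_SERIES + r_load)     # I = V / (R_series + R_load)
    p_resistor = current ** 2 * R_SERIES  # power in the 5-ohm resistor
    p_source = V * current                # power delivered by the source
    source_powers.append(p_source)
    print(f"R={r_load}: I={current:.3f} A, "
          f"P_resistor={p_resistor:.2f} W, P_source={p_source:.2f} W")

print(f"average: {sum(source_powers) / len(source_powers):.2f} W")
print(f"max: {max(source_powers):.2f} W, min: {min(source_powers):.2f} W")
```

Work one of those rows by hand first (e.g. R = 400 gives I = 1000/405, about 2.469 A) and check that the program agrees; then the count/average/max/min over all ten follow the same way.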