I need assistance writing this program. Using a priority queue, simulate the following queueing system, which models a packet switch.

Packets arrive at the switching facility with average rate Lambda packets/sec; the time between arrivals is exponentially distributed with mean 1/Lambda seconds. The packets are served by a common communication link; each needs T = (packet length / link capacity) seconds of service. The task is to generate packet-arrival and service-finish events and insert them into an EVENT queue (implemented as a priority queue). Start by inserting the first packet arrival into an empty EVENT queue. At any instant you should be able to see how many packets are waiting in the queue to be served by the link (note that the event queue is different from the packet queue). The program should behave as follows:

While (Time < MaxTime) {
    Dequeue from Events Queue and update current system time to event time
    if (event is packet arrival) {
        generate next arrival into system and enqueue into events queue
        if link server is empty, generate a service-finish event at current time + T and enqueue into events queue
    } else {  // event is service finish
        update statistics (see below) and, given that the packet queue is not empty, generate the next service-finish event as above
    }
}

Display statistics about the system (mean queue length, mean packet waiting time, mean fraction of time the system is idle).

Try several values of Lambda and T. What happens if Lambda > 1/T? What if T is also random, say exponentially or uniformly distributed, with average value T? Generalize to the case where the system has priorities for packets: high-priority packets are always served before low-priority packets. Generalize to a system with N input ports.
Sounds reasonably straightforward (once you've ploughed through the unnecessarily overcomplicated jargon). Where are you stuck? Do you understand the requirements? Have you written any code yet? Have you determined algorithms for (a) sourcing the simulated packets and (b) handling the packets? If Lambda > 1/T the result should be fairly obvious: what do you think will happen if packets arrive faster than the system can serve them?
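If it helps to see the shape of it, here's a minimal sketch of the event loop in Python, using heapq as the EVENT priority queue. The parameter values (LAMBDA, T, MAX_TIME) and all names are my own illustrative choices, not part of the assignment; service time is deterministic here, and only the mean-queue-length and idle-time statistics are shown:

```python
import heapq
import random

LAMBDA = 0.8        # average arrival rate (packets/sec) -- example value
T = 1.0             # deterministic service time (sec)   -- example value
MAX_TIME = 100000.0 # simulation horizon (sec)           -- example value

ARRIVAL, DEPARTURE = 0, 1

def simulate(lam=LAMBDA, service=T, max_time=MAX_TIME, seed=1):
    random.seed(seed)
    events = []  # the EVENT queue: (event_time, event_type) tuples, min-heap by time
    # start by inserting the first packet arrival into the empty EVENT queue
    heapq.heappush(events, (random.expovariate(lam), ARRIVAL))
    queue_len = 0   # packets in the system (waiting or in service)
    now = 0.0
    area = 0.0      # time-integral of queue length, for the mean queue length
    idle = 0.0      # total time the link sat idle
    served = 0

    while events and now < max_time:
        t, kind = heapq.heappop(events)
        area += queue_len * (t - now)  # accumulate statistics over [now, t)
        if queue_len == 0:
            idle += t - now
        now = t
        if kind == ARRIVAL:
            # schedule the next arrival: exponential inter-arrival, mean 1/lam
            heapq.heappush(events, (now + random.expovariate(lam), ARRIVAL))
            queue_len += 1
            if queue_len == 1:  # link was idle, so this packet starts service now
                heapq.heappush(events, (now + service, DEPARTURE))
        else:  # service finish
            queue_len -= 1
            served += 1
            if queue_len > 0:   # start serving the next waiting packet
                heapq.heappush(events, (now + service, DEPARTURE))

    return area / now, idle / now, served

mean_len, idle_frac, served = simulate()
print(f"mean queue length {mean_len:.2f}, idle fraction {idle_frac:.2f}, served {served}")
```

Mean waiting time then follows from timestamping each packet on arrival, and the priority-packet extension just swaps the single packet queue for one queue per priority level (or a second heap keyed on priority). Running it with Lambda > 1/T should answer the "what happens" question empirically: the queue length grows without bound.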