Does anyone know how to approach this question?
The UNIX operating system uses round-robin time slicing with multilevel feedback.
Assume there are 10 processes waiting in a queue implemented as a linked list of
PCBs (process control blocks). Each PCB holds the process ID, the CPU burst time
required, and the amount of memory in use. Assume the time slice is 2 units.
Simulate round-robin time slicing until all jobs complete and find the average
waiting time. Then modify your program to include random arrival of jobs with a
fixed required burst time, and find the average waiting time of the jobs
completed over a simulation time of 100 units.
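Here is one way to sketch it in Python. The assignment doesn't specify burst times, memory sizes, or the arrival distribution, so the sample values below (bursts, 64 units of memory, uniform random arrivals over 0..99, fixed burst of 4) are made up for illustration. I'm taking waiting time = completion time − arrival time − burst time, which is the usual definition; a `deque` stands in for the linked list of PCBs. Note this simulates a single ready queue, not the full multilevel-feedback variant the first sentence mentions.

```python
import random
from collections import deque

class PCB:
    """Process control block: process ID, remaining CPU burst, memory used."""
    def __init__(self, pid, burst, memory, arrival=0):
        self.pid, self.burst, self.memory, self.arrival = pid, burst, memory, arrival

def simulate_rr(processes, quantum=2, horizon=None):
    """Round-robin simulation; returns average waiting time of completed jobs.

    Waiting time = completion - arrival - original burst. If horizon is given,
    no new quantum is started once the clock reaches it (part 2 of the question).
    """
    pending = sorted(processes, key=lambda p: p.arrival)  # not yet arrived
    orig_burst = {p.pid: p.burst for p in processes}
    ready, clock, waits, i = deque(), 0, [], 0
    while i < len(pending) or ready:
        # admit everything that has arrived by the current time
        while i < len(pending) and pending[i].arrival <= clock:
            ready.append(pending[i]); i += 1
        if not ready:
            clock = pending[i].arrival  # CPU idles until the next arrival
            continue
        p = ready.popleft()
        run = min(quantum, p.burst)
        clock += run
        p.burst -= run
        # admit jobs that arrived during this quantum before requeueing p,
        # so newcomers go ahead of the preempted process
        while i < len(pending) and pending[i].arrival <= clock:
            ready.append(pending[i]); i += 1
        if p.burst == 0:
            waits.append(clock - p.arrival - orig_burst[p.pid])
        else:
            ready.append(p)
        if horizon is not None and clock >= horizon:
            break
    return sum(waits) / len(waits) if waits else 0.0

# Part 1: 10 processes, all in the queue at time 0 (bursts are example values)
procs = [PCB(pid=i, burst=b, memory=64)
         for i, b in enumerate([3, 5, 2, 7, 4, 6, 1, 8, 2, 5])]
print("average wait:", simulate_rr(procs, quantum=2))

# Part 2: random arrivals, fixed burst of 4, simulated for 100 time units
random.seed(1)
arrivals = sorted(random.randint(0, 99) for _ in range(10))
jobs = [PCB(pid=i, burst=4, memory=64, arrival=t) for i, t in enumerate(arrivals)]
print("average wait (random arrivals):", simulate_rr(jobs, quantum=2, horizon=100))
```

You can sanity-check it by hand on a tiny case: bursts [3, 5, 2] with quantum 2 finish at times 7, 10, 6, giving waits 4, 5, 4 and an average of 13/3.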