Round Robin time slicing

Discussion in 'C' started by chs_rachel86, Apr 10, 2008.

  1. chs_rachel86

    chs_rachel86 New Member

    Hi,
    Does anyone know how to do this question?

    The UNIX operating system uses Round Robin time slicing with multilevel feedback.
    Assume there are 10 processes waiting in a queue implemented as a linked list of
    PCBs (process control blocks). Assume each PCB holds the process ID, the CPU burst
    time required, and the amount of memory in use. Assume the time slice is 2 units.
    Simulate Round Robin time slicing until all the jobs complete and find the average
    waiting time. Then modify your program to include random arrival of jobs with a
    fixed required burst time, and find the average waiting time of the jobs completed
    over a simulation time of 100 units.
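
    A minimal sketch of the plain Round Robin part in C, assuming all ten processes
    arrive at time 0 and using made-up burst times (the PCB fields, sample numbers,
    and memory values below are placeholders, not from the assignment); the multilevel
    feedback and random-arrival extension are left out:

    #include <stdio.h>
    #include <stdlib.h>

    #define TIME_SLICE 2
    #define NUM_PROCS  10

    typedef struct pcb {
        int pid;           /* process ID                       */
        int burst;         /* CPU burst time still required    */
        int memory;        /* amount of memory in use (unused) */
        int waiting;       /* accumulated waiting time         */
        struct pcb *next;  /* next PCB in the ready queue      */
    } pcb;

    int main(void)
    {
        int bursts[NUM_PROCS] = { 5, 3, 8, 2, 7, 4, 6, 1, 9, 3 }; /* sample data */
        pcb *head = NULL, *tail = NULL;
        int now = 0;
        long total_wait = 0;

        /* build the ready queue as a singly linked list of PCBs */
        for (int i = 0; i < NUM_PROCS; i++) {
            pcb *p = malloc(sizeof *p);
            p->pid = i + 1;
            p->burst = bursts[i];
            p->memory = 1024 * (i + 1);  /* placeholder value */
            p->waiting = 0;
            p->next = NULL;
            if (tail) tail->next = p; else head = p;
            tail = p;
        }

        /* run until every PCB has completed its burst */
        while (head != NULL) {
            pcb *p = head;               /* dequeue the front PCB */
            head = head->next;
            if (head == NULL) tail = NULL;

            int run = p->burst < TIME_SLICE ? p->burst : TIME_SLICE;
            p->burst -= run;
            now += run;

            /* every PCB still queued waits while this one runs */
            for (pcb *q = head; q != NULL; q = q->next)
                q->waiting += run;

            if (p->burst > 0) {          /* slice expired: requeue at the tail */
                p->next = NULL;
                if (tail) tail->next = p; else head = p;
                tail = p;
            } else {                     /* finished: record its waiting time */
                printf("PID %d finished at t=%d after waiting %d units\n",
                       p->pid, now, p->waiting);
                total_wait += p->waiting;
                free(p);
            }
        }

        printf("Average waiting time = %.2f units\n",
               (double)total_wait / NUM_PROCS);
        return 0;
    }

    For the second part, one way would be to seed rand() with srand(), append new
    PCBs with a fixed burst time to the queue at random points as the clock advances
    to 100 units, and average the waiting time over only the jobs that actually finish.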
     
  2. technosavvy

    technosavvy New Member

    Start working on the question and let us know if you face any issues. No one will help if you don't give finding the solution a shot yourself. :)
     

