Suppose that the number of instructions executed between page faults is directly proportional to the number of page frames allocated to a program. If the available memory is doubled, the mean interval between page faults is also doubled. Further, consider that a normal instruction takes one microsecond, but if a page fault occurs, it takes 2001 microseconds. If a program takes 60 sec to run, during which time it gets 15,000 page faults, how long would it take to run if twice as much memory were available?

  1. 60 sec
  2. 30 sec
  3. 45 sec
  4. 10 sec
in Operating System · 6.0k views

2 Answers

5 votes
The answer should be option 3: 45 sec.

A normal instruction takes 1 microsecond (10^-6 sec).

An instruction that causes a page fault takes 2001 microseconds, so the page fault itself costs 2000 microseconds.

The program takes 60 sec to run and incurs 15,000 page faults,

so the time spent on page faults = 15,000 × 2000 microseconds = 30 sec.

The remaining 30 sec are consumed by normal instruction execution.

If memory is doubled, the mean interval between page faults is also doubled (hence the page-fault rate is halved). Earlier, 30 sec went to instruction execution and 30 sec to page faults;

now instruction execution still takes 30 sec, and page faults take (15,000 × 2000 × 10^-6)/2 = 15 sec.

So the total time = 30 + 15 = 45 sec.
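
For anyone who wants to double-check the arithmetic, here is a minimal Python sketch of the same calculation (the variable names are my own; the timing values come straight from the question):

```python
# Timing model from the question: a normal instruction takes 1 us,
# and a page fault adds 2000 us on top of that (2001 us total).
INSTR_US = 1
FAULT_OVERHEAD_US = 2000

total_us = 60 * 10**6      # original run time: 60 sec in microseconds
faults = 15_000            # page faults during that run

fault_us = faults * FAULT_OVERHEAD_US     # 30,000,000 us = 30 sec
compute_us = total_us - fault_us          # 30,000,000 us = 30 sec

# Doubling memory doubles the interval between faults, so fault time halves.
new_total_us = compute_us + fault_us / 2
print(new_total_us / 10**6)               # 45.0 sec
```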
3 votes
60 sec = n × 1 µs + 15,000 × 2000 µs

So n = (60 − 30) sec / 1 µs = 30 × 10^6 instructions.

Now memory is doubled, so the number of page faults is halved: 15,000/2 = 7500.

So the new running time = 30 × 10^6 × 1 µs + 7500 × 2000 µs

= 30 sec + 15 sec

= 45 sec
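
The same check written to mirror this derivation, solving for the instruction count n first and then recomputing with half the faults (again just a Python sketch with assumed names):

```python
INSTR_US = 1               # time per normal instruction (microseconds)
FAULT_EXTRA_US = 2000      # extra time per page fault

total_us = 60 * 10**6
faults = 15_000

# 60 sec = n * 1 us + 15,000 * 2000 us  =>  n = 30 * 10^6 instructions
n = (total_us - faults * FAULT_EXTRA_US) // INSTR_US

# Doubled memory halves the fault count: 15,000 / 2 = 7,500
new_total_us = n * INSTR_US + (faults // 2) * FAULT_EXTRA_US
print(n, new_total_us / 10**6)   # 30000000 45.0
```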