0 Replies - 378 Views - Last Post: 18 October 2013 - 03:40 PM

#1 allstarpro

  • New D.I.C Head

Reputation: 0
  • Posts: 1
  • Joined: 18-October 13

Python Buffer simulation

Posted 18 October 2013 - 03:40 PM

Assume the following:
a. Disk spooling is NOT being used.
b. The printer does NOT have a hardware buffer to hold the output while the printer is printing.
(The theme music from Mission Impossible is playing faintly in the background.)
SIMULATE the following scenario:
A hypothetical program computes for three seconds, then outputs a variable-length record to be printed. The printer takes from 0.75 to 4.75 seconds (average 2.75 seconds) to print each output record. (Use a random number generator.)
The hypothetical program loops 500 times for each case of software buffers (0, 1, 2, 3, 4, 5, 10, 25, and 100 software output buffers). Calculate the AVERAGE time for the program to "virtually compute and print" a record over the 500 records, for EACH of the 9 buffer counts. Plot the results: the Y axis runs from zero to 8 seconds, and the X axis is nonlinear, displaying all nine buffer cases.

Yes, this is a homework assignment; no, I do not want the answer.

I'm looking for a general idea of what is supposed to go on in this problem. That is all.
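To show where I am so far: I'm reading this as a single producer (the program) and a single consumer (the printer) sharing a bounded queue of software buffers. Here is my rough sketch of the event loop; the blocking rule (the program may run at most `n_buffers` records ahead of the printer, and with 0 buffers it waits for its own record to finish printing) is my own guess at the intent, not something the assignment spells out:

```python
import random

def simulate(n_buffers, n_records=500, compute_time=3.0, seed=42):
    """Average time to 'virtually compute and print' one record,
    given n_buffers software output buffers."""
    rng = random.Random(seed)
    printer_free = 0.0   # when the printer finishes its current record
    print_done = []      # finish time of each printed record
    t = 0.0              # the program's clock
    for i in range(n_records):
        t += compute_time                     # compute record i
        start = max(t, printer_free)          # printer takes it when free
        printer_free = start + rng.uniform(0.75, 4.75)
        print_done.append(printer_free)
        # Guess: with n_buffers slots the program may be at most n_buffers
        # records ahead of the printer; with 0 buffers it blocks until its
        # own record has printed.
        if i - n_buffers >= 0:
            t = max(t, print_done[i - n_buffers])
    return print_done[-1] / n_records

for b in (0, 1, 2, 3, 4, 5, 10, 25, 100):
    print(b, round(simulate(b), 3))
```

With 0 buffers I'd expect roughly 3 + 2.75 ≈ 5.75 s per record, and with many buffers the printing overlaps the computing, so the average should fall toward the 3-second compute time. Does that sound like the right general idea, or am I off base?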

Hope you can help.

Thank you.
