
Write a function to simulate the overlap of two computing jobs and output an estimated cost.

1 Answer

More context. Every night between 7 p.m. and midnight, two computing jobs from two different sources start at random times, and each job lasts one hour. If the jobs overlap at any point during their runs, they cause a failure in some of the company’s other nightly jobs, resulting in downtime that costs the company $1,000.

The CEO needs a single number representing the annual (365 days) cost of this problem.

Hint. We can model this scenario by drawing two random start times uniformly from the range 0 to 300, representing minutes after 7 p.m. Since each job lasts 60 minutes, the two jobs overlap exactly when their start times are less than 60 minutes apart.
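Following that hint, here is one possible sketch of the simulation. It assumes overlap means the two start times differ by less than 60 minutes, one $1,000 failure per night with an overlap, and 365 nights per year; the function name and parameters are illustrative, not from the original question.

```python
import random

def estimated_annual_cost(days=365, cost_per_failure=1000, trials=100_000):
    """Monte Carlo estimate of the annual cost of overlapping jobs.

    Start times are drawn uniformly from [0, 300] minutes (7 p.m. to
    midnight). Two one-hour jobs overlap when their start times are
    less than 60 minutes apart.
    """
    overlaps = 0
    for _ in range(trials):
        start1 = random.uniform(0, 300)  # first job's start, minutes after 7 p.m.
        start2 = random.uniform(0, 300)  # second job's start
        if abs(start1 - start2) < 60:    # the one-hour runs overlap
            overlaps += 1
    p_overlap = overlaps / trials        # estimated nightly overlap probability
    return p_overlap * days * cost_per_failure

print(f"${estimated_annual_cost():,.0f}")
```

As a sanity check, the exact geometric probability of overlap for two uniform starts on [0, 300] is 1 − (240/300)² = 0.36, so the estimate should land near 0.36 × 365 × $1,000 ≈ $131,400 per year.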
