One millisecond is 0.001 of a second. The costs for a single run of a computer program are $1.07 for operating system overhead, $0.023 per millisecond of computer time, and $4.35 for the mounting of a data tape. What is the total of these three costs for 1 run of a program that requires 1.5 seconds of computer time?
(A) $7.15
(B) $8.87
(C) $28.96
(D) $35.57
(E) $39.92
Millisecond computer program
- Legendary Member
(Replying to neeti2711's question above.)

Immediately convert the $0.023-per-millisecond computer-time cost into a cost per second: $0.023 × 1,000 = $23 per second. Hence 1.5 × 23 + 1.07 + 4.35 = 39.92.

The answer is (E).
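As a quick sanity check (not part of the original solution), the unit conversion and the total can be verified in a few lines of Python:

```python
# Convert the per-millisecond rate to a per-second rate, then total the costs.
cost_per_ms = 0.023               # dollars per millisecond
cost_per_second = cost_per_ms / 0.001   # 1 millisecond = 0.001 seconds
overhead = 1.07                   # operating system overhead
tape_mount = 4.35                 # mounting the data tape

total = 1.5 * cost_per_second + overhead + tape_mount
print(round(cost_per_second, 2))  # 23.0
print(round(total, 2))            # 39.92
```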
Success doesn't come overnight!
GMAT/MBA Expert
- Brent@GMATPrepNow
- GMAT Instructor
(Replying to neeti2711's question above.)

Let's first determine the cost of 1.5 seconds of computer time. To do so, we can use equivalent ratios.
Here, we are comparing cost and time. So, the ratio is (cost in dollars)/(time in seconds)
We're told that computer time costs $0.023 per millisecond. Since a millisecond = 0.001 seconds, we can write the ratio:
(0.023)/(0.001)
The decimals here create a problem, so let's create an equivalent ratio by multiplying top and bottom by 1000 to get: (23)/(1)
This tells us that it costs $23 for 1 second of computer time.
From here, if we multiply top and bottom by 1.5, we get: (34.5)/(1.5)
This tells us that it costs $34.50 for 1.5 seconds of computer time. Great!
So, the total cost = $34.50 + $1.07 + $4.35 = $39.92 = E
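The equivalent-ratio steps above can be sketched directly (a verification sketch, not part of the original solution; the tuples pair each dollar amount with its time in seconds):

```python
# Each tuple is (cost in dollars, time in seconds) — an equivalent ratio.
ratio = (0.023, 0.001)                               # $0.023 per 0.001 s
per_second = (ratio[0] * 1000, ratio[1] * 1000)      # scale by 1000 -> $23 per 1 s
for_1_5_sec = (per_second[0] * 1.5, per_second[1] * 1.5)  # scale by 1.5 -> $34.50 per 1.5 s

total = for_1_5_sec[0] + 1.07 + 4.35  # add overhead and tape-mount costs
print(round(for_1_5_sec[0], 2))  # 34.5
print(round(total, 2))           # 39.92
```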
Cheers,
Brent