A team of software engineers is testing the time taken for a particular type of modern computer to execute a complicated algorithm for factoring large numbers. They would like to estimate the mean time taken for a computer to execute the algorithm. A random sample of 41 times is collected. The mean time in this sample is 535.0 seconds and the sample standard deviation is 71.2 seconds. Calculate the 95% confidence interval for the mean time taken to execute the algorithm. Give your answers to 2 decimal places.
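
A minimal sketch of the calculation in Python, assuming SciPy is available and using the t-distribution with n - 1 = 40 degrees of freedom (appropriate because only the sample standard deviation is known); a normal-approximation version would substitute z = 1.96 for the critical value and give a slightly narrower interval.

    # Sketch: 95% confidence interval for the mean using the t-distribution.
    from math import sqrt
    from scipy import stats

    n = 41          # sample size
    x_bar = 535.0   # sample mean (seconds)
    s = 71.2        # sample standard deviation (seconds)

    # Two-sided 95% critical value from the t-distribution with n - 1 df.
    t_crit = stats.t.ppf(0.975, df=n - 1)

    # Margin of error: t * s / sqrt(n).
    margin = t_crit * s / sqrt(n)

    lower, upper = x_bar - margin, x_bar + margin
    print(f"95% CI: ({lower:.2f}, {upper:.2f})")
    # Under these assumptions this prints approximately (512.53, 557.47).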