To find the square root of x, make a guess g. If g**2 is close enough to x, then report g as the square root. Otherwise, make a new guess which is the average of g and x/g. Check the new guess and keep repeating until the square of the guess is close enough to x. Suppose x is 99 and the first guess is 5. Using this algorithm, how many guesses will it take until the guess squared is within 1 of x? Use Python to answer this question.
Python code:
# initialize x as 99
x = 99
# initialize the guess g as 5
g = 5
# initialize the guess count as 0
count = 0
# loop until the absolute value of g**2 - x is less than 1
while abs(g**2 - x) >= 1:
    # compute the new guess as the average of g and x/g
    g = (g + x / g) / 2
    # increment the guess count
    count += 1
# print the number of guesses
print("Number of guesses:", count)
Output:
Number of guesses: 3
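To see why the loop stops after 3 guesses, here is a minimal variant of the same code (a sketch; it only adds a print inside the loop to trace each intermediate guess):

x = 99
g = 5
count = 0
while abs(g**2 - x) >= 1:
    # new guess is the average of g and x/g
    g = (g + x / g) / 2
    count += 1
    # trace the current guess and its square
    print(f"guess {count}: g = {g:.6f}, g**2 = {g**2:.6f}")
print("Number of guesses:", count)

With x = 99 and a first guess of 5, the successive guesses are roughly 12.4, 10.1919, and 9.9527; since 9.9527**2 is about 99.06, which is within 1 of 99, the loop terminates after 3 guesses.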