How many times can you divide 24**36 by 2 until it becomes imprecise? I had this question on one of my quizzes in a python course, and I’m curious as to what the correct answer is. (It’s already submitted so no cheating here)
I wrote this small bit of code in python to calculate how many times you can evenly divide the number by two until it becomes a decimal number.
count = 0
remainder = 0
num = 24**36
while remainder == 0:
    remainder = num % 2
    if remainder == 0:
        num /= 2
        count += 1
print(count)
I assumed the answer had something to do with floating-point imprecision, but I’m not sure what counts as “imprecise” here. The code above prints 113, since on the 113th division the result becomes a decimal number ending in .5, and I claimed that makes it imprecise. Is this approach correct?
Never ever use floating-point arithmetic for integer values, especially in Python! In other languages you may need a multiprecision library for large values, but Python integers are already arbitrary-precision…
Now 24 ** 36 = (3 * 2**3) ** 36 = (3 ** 36) * (2 ** 108)
So you can divide it 108 times by 2, and you will get integer values.
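A minimal sketch of that claim, using only integer arithmetic (Python ints are exact at any size, so `//` and `%` never lose precision):

```python
# Count how many times 24**36 divides evenly by 2, using integers only.
num = 24**36

# Sanity check of the factorisation: 24**36 == 3**36 * 2**108
assert num == 3**36 * 2**108

count = 0
while num % 2 == 0:
    num //= 2          # floor division keeps the value an exact int
    count += 1

print(count)        # 108
print(num == 3**36) # True: the odd part that remains
```

Note the use of `//=` instead of `/=`: the original code's `num /= 2` converts `num` to a float on the very first division, which is where the imprecision sneaks in.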
At step 109, you will get a floating-point value. If you ask for math.log2(3**36) in Python, you get 57.058…, meaning 3**36 needs more bits than the maximum precision of an IEEE 754 double-precision float (which is what Python floats are), because its significand is only 53 bits.
So the answer is 108.
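You can confirm with the standard library that 3**36 (the odd part left after the 108 divisions) is too wide to survive conversion to a float:

```python
import math

# 3**36 needs 58 bits, but an IEEE 754 double's significand holds only 53.
odd_part = 3**36

print(math.log2(odd_part))          # about 57.06
print(odd_part.bit_length())        # 58
print(float(odd_part) == odd_part)  # False: the float is already rounded
```

Since 3**36 is odd, its lowest bit is 1, and rounding to 53 significand bits necessarily changes the value, so division number 109 is the first one whose result cannot be represented exactly.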