
Why is the following code giving an answer that is very far from the analytic answer?

I am trying to learn Python and I came across the following question:

“A ball is dropped straight down from a cliff with height h. The position of the ball after a time t can be expressed as:

y(t) = v0*t − (a*t^2)/2 + h

where a is the acceleration (in m/s^2) and v0 is the initial velocity of the ball (measured in m/s). We wish to find how long a time t1 it takes the ball to pass a certain height h1. In other words, we wish to find the value of t1 such that y(t1) = h1. The position of the ball is measured every delta_t seconds. Write a program which finds out how long a time t1 it takes before the ball reaches height h1 by using a while loop. Here, we let h = 10 m, h1 = 5 m, delta_t = 0.01 s, v0 = 0 m/s and a = 9.81 m/s^2.”

I wrote the following code in Python. The problem is that I get a different answer from the one I expect when I solve the problem on paper (for h1 = 5 m, t1 = 1.01 s, and for h1 = 3.6 m, t1 = 1.14 s). I am not sure exactly where the issue is. Here is my code:

[Python code block not preserved]
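For reference, the paper values follow from setting y(t1) = h1 with v0 = 0, which gives t1 = sqrt(2*(h − h1)/a). A quick check of those numbers (this snippet is just the check, not the assignment code):

```python
from math import sqrt

h = 10.0  # cliff height (m)
a = 9.81  # acceleration (m/s^2)

# y(t1) = h1 with v0 = 0  =>  h - a*t1**2/2 = h1  =>  t1 = sqrt(2*(h - h1)/a)
for h1 in (5.0, 3.6):
    t1 = sqrt(2 * (h - h1) / a)
    print(f"h1 = {h1} m -> t1 = {t1:.2f} s")  # 1.01 s and 1.14 s
```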


Answer

You seem to want to derive the time t at which the ball reaches the height the user requires.

To do this, you attempted to write a loop that guesses the time t.

Mistakes were made.

It's simpler than you think:

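Step t forward by delta_t and recompute y until it drops to h1. A minimal sketch of that idea (a reconstruction of the answer's snippet, so the variable names are illustrative):

```python
h = 10.0        # cliff height (m)
h1 = 5.0        # target height (m)
v0 = 0.0        # initial velocity (m/s)
a = 9.81        # acceleration (m/s^2)
delta_t = 0.01  # sampling interval (s)

t = 0.0
y = h
while y > h1:   # stop at the first sample at or below h1
    t += delta_t
    y = v0 * t - a * t**2 / 2 + h

print(f"t1 = {t:.2f} s")  # 1.01 s for h1 = 5 m
```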

Note that your v_i can be taken as 0 at all times here: the formula uses the initial velocity v0 = 0, not the instantaneous velocity, so the velocity term drops out entirely.
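One caveat about the sketch above: the loop reports the first sampled time at which y(t) ≤ h1, so it can overshoot the true crossing by up to delta_t. For h1 = 5 m it prints 1.01 s, matching the analytic value, but for h1 = 3.6 m it prints 1.15 s against the analytic 1.14 s; shrinking delta_t tightens the result.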
