I am trying to learn Python and I came across the following question:
“A ball is dropped straight down from a cliff with height h. The position of the ball after a time t can be expressed as:
y(t) = v0*t − (at^2)/2 + h
where a is the acceleration (in m/s^2) and v0 is the initial velocity of the ball (in m/s). We wish to find how long a time t1 it takes the ball to pass a certain height h1. In other words, we wish to find the value of t1 such that y(t1) = h1. The position of the ball is sampled every delta_t seconds. Write a program that uses a while loop to find how long a time t1 it takes before the ball reaches height h1. Here, we let h = 10 m, y1 = 5 m, delta_t = 0.01 s, v0 = 0 m/s and a = 9.81 m/s^2.”
I wrote the following code in Python. The problem is that I get a different answer from the one I expect when I solve the problem on paper (for y1 = 5 m, t = 1.01 s, and for y1 = 3.6 m, t = 1.14 s; my on-paper working is shown after the code). I am not sure where exactly the issue is. Here is my code:
    import math
    import numpy as np

    h = 10
    y1 = float(input("Enter the height you want:"))
    delta = 0.01
    t = v_i = 0
    a = 9.81
    y = v_i * t - (a * pow(t, 2)/2) + h
    while True:
        if y <= y1:
            print("The object will be at height", format(y, "0.3"), y1, "around the time", format(t, ".3"), "s")
            break
        else:
            t += delta
            h = y
            v_i = v_i - a*t
            y = v_i * t - (a * pow(t, 2)/2) + h
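For reference, here is how I get those on-paper values: with v0 = 0, setting y(t1) = y1 gives h − a*t1^2/2 = y1, so t1 = sqrt(2*(h − y1)/a). A quick check of that:

    import math

    h = 10   # cliff height in m
    a = 9.81 # acceleration in m/s^2
    for y1 in (5.0, 3.6):
        t1 = math.sqrt(2 * (h - y1) / a)  # from h - a*t1**2/2 = y1 with v0 = 0
        print(y1, round(t1, 2))           # gives 1.01 for 5.0 and 1.14 for 3.6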
Answer
You seem to want to derive the time t at which the ball is at the height the user asks for. For this you attempted to make a loop that guesses the time t.
Mistakes were made: each iteration re-bases h to the current y and updates v_i with the total elapsed t, but then plugs that same total t back into the formula, so the fall gets counted more than once. It's simpler than you think:
    h = 10
    y1 = float(input("Enter the height you want:"))
    delta = 0.01
    t = 0
    a = 9.81
    y = h  # y starts at the top of the cliff
    while y > y1:
        t += delta  # time t is still guessed here
        y = h - (a * pow(t, 2)/2)  # Just recalculate y for the new time t
    print(f"The object will be at height {y1} around the time {t}")
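With y1 = 5 this stops at t = 1.01 s, matching your on-paper value. Two small caveats: the loop reports the first sampled time at which the ball is at or below y1, so with a 0.01 s step the result can land one step after the exact crossing time, and because t is built up by repeatedly adding delta, the printed value may carry a tiny floating-point tail (formatting it with format(t, ".3"), as in your original print, tidies that up).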
Note that your v_i can simply be left out: the ball is dropped with v0 = 0 m/s, and since y is always recomputed from the full formula at the total time t, the a * t**2 / 2 term already accounts for the speed the ball has picked up.
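If you ever want to handle a ball that is thrown rather than dropped, the same loop works with the full formula from the problem statement; here is a minimal sketch along those lines, where v0 is the only new variable:

    h = 10
    y1 = float(input("Enter the height you want:"))
    delta = 0.01
    t = 0
    a = 9.81
    v0 = 0  # set to a nonzero value for a thrown ball, e.g. 2 for 2 m/s upward
    y = h
    while y > y1:  # assumes the ball does eventually fall below y1
        t += delta
        y = v0 * t - (a * pow(t, 2) / 2) + h  # full formula, always evaluated from t = 0
    print(f"The object will be at height {y1} around the time {t:.2f} s")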