A baseball player throws a baseball from a height of 1 m above the ground, and its height is given by the equation h = -3.2t^2 + 12.8t + 1, where h is the height in meters above the ground and t is the time in seconds the ball is in the air. When, to the nearest tenth of a second, will the ball hit the ground? (You must solve by factoring; it's important that you do.)
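A minimal worked sketch, assuming the intended equation is h(t) = -3.2t^2 + 12.8t + 1 (the t in the squared term appears to have been dropped from the original post). Setting h = 0, this quadratic has no rational roots, so it does not factor over the integers; the quadratic formula is used instead:

```latex
\begin{aligned}
-3.2t^2 + 12.8t + 1 &= 0 \\
t &= \frac{-12.8 \pm \sqrt{(12.8)^2 - 4(-3.2)(1)}}{2(-3.2)}
   = \frac{-12.8 \pm \sqrt{176.64}}{-6.4} \\
t &\approx \frac{-12.8 - 13.29}{-6.4} \approx 4.1 \text{ s}
  \quad \text{(taking the positive root)}
\end{aligned}
```

So under that assumption the ball hits the ground about 4.1 seconds after it is thrown; the other root is negative and is discarded, since time in the air cannot be negative.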
