This paper proposes novel techniques for analyzing athletic feats (such as the vertical jump) by processing video frame by frame. The most common methods for assessing an athletic movement such as a jump are either observation by a human expert/coach, or values captured by measurement devices in a suit or wearables attached to the athlete's body. The former requires access to a human expert, while the latter requires specialized hardware/sensors capable of extracting body-movement statistics with respect to time and space. Both methods are quite accurate, but because of their dependence on a third-party system or person, and the cost such methods entail, they are often inaccessible when one is simply practicing at home or trying something out in a backyard or personal gym.
Our goal is to reduce such dependencies by creating heuristics-based algorithms that help an individual athlete assess feats such as the jump, run, and leap without any third-party system, approximating the feats and comparing them with existing ones using only the cell phone in their pocket.
This paper focuses only on the vertical jump. The system processes the video frame by frame, applies the HOG (Histogram of Oriented Gradients) technique to locate the human in each frame, and then tracks the human from the initial to the final position, which lets us compute the pixel distance covered during the jump. We then use values such as the athlete's known height (to convert pixel distance into physical distance), the video's frames per second (FPS), and markers placed on the mobile screen while recording, to ultimately estimate the height achieved in the jump. We ran various experiments with this technique and found our results quite close to assessments performed by a human expert.
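To make the pipeline concrete, the following is a minimal sketch in Python, assuming OpenCV's stock HOG people detector, a static camera, and a single athlete in frame. The function name estimate_jump_height and its parameters are illustrative rather than the exact implementation used in our experiments; the calibration step uses the athlete's known standing height to convert the pixel rise of the detected bounding box into metres.

```python
import cv2

def estimate_jump_height(video_path, athlete_height_m):
    """Estimate vertical jump height (in metres) from a fixed-camera video.

    Illustrative sketch only: assumes a static camera, a single athlete
    in frame, and that the HOG bounding box roughly spans the athlete's
    standing height.
    """
    # OpenCV's default HOG-based pedestrian detector.
    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

    cap = cv2.VideoCapture(video_path)
    top_ys, box_heights = [], []

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        rects, _ = hog.detectMultiScale(frame, winStride=(8, 8))
        if len(rects) == 0:
            continue  # detector missed the athlete in this frame
        # Keep the largest detection, assumed to be the athlete.
        x, y, w, h = max(rects, key=lambda r: r[2] * r[3])
        top_ys.append(y)        # top of the box; image y grows downward
        box_heights.append(h)   # box height in pixels, ~ standing height

    cap.release()
    if not top_ys:
        raise ValueError("no person detected in the video")

    # Pixel displacement between standing (largest y) and apex (smallest y).
    pixel_rise = max(top_ys) - min(top_ys)

    # Calibrate the metres-per-pixel scale from the athlete's known height.
    metres_per_pixel = athlete_height_m / max(box_heights)
    return pixel_rise * metres_per_pixel
```

For example, estimate_jump_height("jump.mp4", 1.80) would report the apex rise for a 1.80 m athlete. The video's FPS (readable via cv2.CAP_PROP_FPS) could additionally provide a flight-time cross-check, since a ballistic jump with total flight time t rises h = g t^2 / 8.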