Fitts’s Law is a principle of human movement in ergonomics and human-computer interaction that predicts the time required to rapidly move to a target area. It is used to model the act of pointing, both in the real world, such as with a hand or finger, and on a computer, such as with a mouse or a touchscreen.

Developed in 1954 by Paul Fitts, the law mathematically models the time it takes to move from a starting position to a final target area as a function of the ratio between the distance to the target and the width of the target. Fitts’s Law is often formulated, in its common Shannon form, as:

T = a + b · log₂(D/W + 1)

Where:

  • T is the average time taken to complete the movement.
  • a and b are constants that can be determined empirically by regression analysis.
  • D is the distance from the starting point to the center of the target.
  • W is the width of the target measured along the axis of motion.
  • log₂ is the logarithm to base 2.
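
To make the formula concrete, here is a minimal sketch in Python. It assumes the Shannon form shown above; the function name movement_time and the constants a = 0.1 s and b = 0.15 s/bit are illustrative placeholders, not empirically fitted values:

```python
import math

def movement_time(distance: float, width: float,
                  a: float = 0.1, b: float = 0.15) -> float:
    """Predicted movement time in seconds under Fitts's Law
    (Shannon form). a and b are placeholder constants; in
    practice they are fitted to observed movement data."""
    index_of_difficulty = math.log2(distance / width + 1)  # ID, in bits
    return a + b * index_of_difficulty

# Example: a target 512 px away and 64 px wide
print(round(movement_time(512, 64), 2))  # log2(9) ≈ 3.17 bits → 0.58 s
```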

The key takeaway from Fitts’s Law is that the time required to acquire a target is a function of both the distance to the target and its size; smaller and farther targets take longer to acquire. This principle has important implications for user interface design: objects that are clicked frequently should be larger and placed closer to the expected position of the cursor, reducing the time and effort required for the user to interact with them.
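
For instance, reusing the hypothetical movement_time helper sketched above (with the same illustrative constants), enlarging a button and moving it closer noticeably lowers the predicted acquisition time:

```python
# Small, distant button: 800 px away, 20 px wide
print(round(movement_time(800, 20), 2))  # log2(41) ≈ 5.36 bits → 0.90 s

# Larger, closer button: 400 px away, 40 px wide
print(round(movement_time(400, 40), 2))  # log2(11) ≈ 3.46 bits → 0.62 s
```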

Fitts’s Law is widely used across fields such as ergonomics, HCI, and user interface design for applications and websites.

