The new year is often a time for reflection. Recently, we’ve been reflecting on the fundamentals of human factors, marveling at the industry’s evolution, and noting the trends emerging in our daily work. Taking a moment to look back, the story of human factors is one of constant evolution, showing how people and technology have worked side by side throughout history. From its early beginnings to the complexities of today’s world, practitioners in this field have worked tirelessly to make our lives better by aligning systems and tools with human needs and strengths. Looking back at its milestones gives us a glimpse into where it might head in the future.
Early foundations: Anthropometry and ergonomics
The story starts around 1490, when Leonardo da Vinci created the Vitruvian Man—a timeless blend of human proportions and geometry. His work laid the groundwork for anthropometry, the study of measuring and analyzing the physical characteristics of the human body. Today, we use anthropometric data in a variety of industries, including aviation, consumer product design, and medical device development, to make sure that products are designed to fit people, rather than forcing people to fit poorly sized products.
Fast-forward to the early 1900s, when Frank and Lillian Gilbreth created the time-and-motion study methodology. This methodology changed how we analyze work tasks and make processes more efficient. Their research advanced how we think about the design of work and how to integrate humans and machines effectively, paving the way for the burgeoning field of ergonomics.
The role of early aviation and war
When the Wright brothers took their first flight in 1903, it became clear that understanding human factors in aviation was critical. This led to studies on pilot performance and instrumentation—for example, designing early iterations of cockpit controls and displays (e.g., the angle-of-attack sensor and stick pusher) to minimize pilots’ cognitive load.
The experiences of WWI nurses also significantly contributed to early understandings of human performance and the importance of designing systems with human limitations in mind, particularly regarding factors like fatigue, stress and workload management, and work environment design. This laid the groundwork for later human factors research. Additionally, the need to optimize workflow and minimize unnecessary movement within hospitals led to early considerations of ergonomics and space layout, which later became core principles in human factors design.
A turning point: World War II
World War II pushed the boundaries of technology, and there was a general shift toward focusing on people’s capabilities to minimize the negative consequences of their limitations. As a result, experimental psychologists were heavily involved in developing aviation and military technologies to help improve human performance and the fit between humans and machines. For example, several studies investigated why highly trained pilots crashed airplanes, and researchers determined that control configurations and dashboard displays were at fault. In one such study, researchers found that making the landing gear control visually and tactilely resemble a wheel helped pilots distinguish it from neighboring controls and reduced cognitive load. This design still endures today!
The era focused on improving interactions between humans and machines, formally establishing the discipline that we know as human factors and ergonomics.
Post-war advancements and establishment
The mid-20th century saw great progress; human factors and psychology research were advancing rapidly. For example, Paul Fitts developed a predictive model of human movement (Fitts’ Law) that describes how long it takes a person to move to a target area based on the distance to the target and the target’s width. Fitts later applied these concepts to the field of aviation to help better design dashboards, reduce errors, and improve pilots’ ability to manipulate the controls.
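Fitts’ original formulation can be sketched in a few lines of Python. Note that the constants `a` and `b` below are illustrative assumptions, not empirically fitted values; in practice they are estimated by regression from movement-time experiments for a given device and task.

```python
import math

def fitts_movement_time(a, b, distance, width):
    """Predict movement time using Fitts' original formulation:
    MT = a + b * log2(2D / W), where the log term is the
    'index of difficulty' (ID) in bits."""
    index_of_difficulty = math.log2(2 * distance / width)
    return a + b * index_of_difficulty

# Illustrative constants (assumed, not fitted): a = 0.1 s, b = 0.2 s/bit.
# A target 200 px away and 50 px wide gives ID = log2(8) = 3 bits,
# so MT = 0.1 + 0.2 * 3 = 0.7 s.
mt = fitts_movement_time(0.1, 0.2, distance=200, width=50)
```

Doubling the target’s width lowers the index of difficulty by one bit, which is one reason large, nearby controls are faster (and less error-prone) to acquire—an insight that directly informs dashboard and interface layout.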
In response to the growing field of human performance and ergonomics research, the Human Factors and Ergonomics Society was established in 1957, and the International Ergonomics Association followed shortly after in 1959. These societies cemented the field and spurred worldwide cooperation.
Broadening horizons: 1970s and 1980s
By the 1970s and 1980s, human factors extended beyond military and aviation into everyday products and workplaces. This era saw the rise of cognitive ergonomics, which focuses on understanding the mental processes crucial for navigating increasingly complex systems.
The explosion of computers and the pioneering graphical user interface (GUI) developed at Xerox in 1973 (inspired by the windowing concepts of Douglas Engelbart’s NLS) led to foundational user experience design concepts, such as mouse-driven interaction, overlapping windows, and the desktop metaphor. This work went on to influence Apple’s Macintosh line, which cemented the GUI for mainstream users. Researchers at Xerox and Apple, along with other figures like Don Norman, played a role in developing the branch known as “user experience,” which emphasizes the importance of designing technology with users in mind.
Modern changes: 1990s and onwards
Within the medical device field, interest in the ’90s and early 2000s created a need for more oversight to prevent errors caused by poor medical device design. The Institute of Medicine’s landmark 1999 report, “To Err Is Human,” estimated that 44,000 to 98,000 patients die annually due to medical errors. This report reinvigorated the push for patient safety and stimulated research funding to address these issues. In response to this growing need, in 2000 the FDA published its first human factors guidance, Medical Device Use-Safety: Incorporating Human Factors Engineering into Risk Management, to inform manufacturers about the importance of usability engineering and to explain how to incorporate human factors into the design process.
The 2000s also introduced neuroergonomics, which applies neuroscience to ergonomics. This growing subfield uses neurophysiological measures to study mental workload, engagement, and human performance. It has the potential to improve our understanding of the brain and how it works in real-world settings, with applications in wearable devices, interface design, and personalized workplace design.
Present trends and future outlook
Today, researchers in human factors delve into socio-technical systems, inclusive design, and the integration of AI and robotics. The goal is to ensure these technologies serve humanity. As we look ahead, the close relationship between people and new technologies requires us to adapt continuously, fueling ongoing research and innovation.
Conclusion
While we have highlighted a handful of significant advancements, this brief summary is by no means comprehensive. Still, one thing is clear – human factors principles have shown remarkable adaptability to technological shifts. As practitioners, we can work towards a future where designs focus on human needs by learning from historical strides and keeping up with current trends. Pursuing better, safer experiences will keep driving our essential field, leading to innovations that truly make a difference in everyone’s lives. Our mission is clear—to design systems that not only work well with human capabilities but also elevate them. By staying committed to this, we are set to create a future filled with creativity and inclusivity.
Curious about how historical insights in human factors can inform your next project and propel it into the future? We leverage decades of expertise to tailor our research approach to your specific needs. Let’s talk about how we can help! [email protected]