
Pilar Orero gave a keynote speech to close the Eyes4Access Workshop at ETRA’25 in Tokyo.
The Eyes4Access Workshop brought together interdisciplinary researchers to explore advances in eye-tracking for accessibility. The workshop focused on inclusive technologies, interaction techniques, and real-world applications in digital media, education, and cultural settings. Aimed at advancing gaze-based accessibility solutions, it investigated the potential of eye-tracking to create inclusive HCI experiences, particularly for users with sensory and cognitive impairments.
Eyes4Access fostered community collaboration by presenting novel research, methods, and applications of eye-tracking for accessibility.
In her keynote speech, Pilar Orero presented ALFIE and addressed issues surrounding the widespread popularity of generative AI and large language models (LLMs), along with the often superficial public debate that accompanies them. A critical yet frequently overlooked issue is the quality of code generated by these models. The outputs are often biased, unfair, and error-prone, drawing on human behavioral stereotypes such as "the shopper," "the gamer," or "the runner." These models tend to portray humans as winners, overachievers, and highly capable individuals, reinforcing unrealistic or exclusionary ideals.
The lack of oversight regarding dataset quality also helps propagate poor programming practices. Pilar's presentation advocated a kinder and fairer approach to generative AI, ideally built on, or complemented by, large eye-tracking data models. She also proposed normalizing eye-tracking as a standard interaction tool in future IT devices, such as PCs, smart glasses, and mobile phones, to reduce digital exclusion. This vision promotes accessibility and inclusivity, welcoming all citizens seamlessly into emerging virtual environments and the evolving XR internet.