In Gothenburg, Sweden, an algorithm used for school admissions in 2020 produced chaotic results, placing hundreds of children in schools miles from their homes. The system, intended to improve efficiency, calculated distances as the crow flies and failed to account for geographical barriers such as a major river, leading to impractical commutes. Although auditors confirmed that the instructions given to the system were flawed, and procedures were subsequently improved, the roughly 700 children initially affected by the errors would spend their entire junior high years in schools not of their choosing.
Summary written by gemini-2.5-flash-lite from 2 sources.
IMPACT Highlights the risks of deploying opaque algorithmic systems in public services and the challenges of seeking recourse when errors occur.
RANK_REASON This article discusses a specific instance of an algorithm causing harm in a public service, highlighting issues of accountability and systemic malfunction, rather than a new model release or major policy shift.