In its recently unveiled Equity Action Plan, the US General Services Administration, which procures and evaluates technology for things like government websites and online services, lays out a two-pronged push: government websites must be accessible beyond the bare minimum, and the federal government will avoid face recognition systems as much as possible because of bias in the technology.

The GSA “conducted equity evaluations and established a set of activities for three high-impact target areas,” one of which is “government technology design and delivery,” according to the Action Plan.

The memo’s introduction states, “Those who most need government services will frequently have the most trouble receiving them,” and continues, “We are committed to taking steps that emphasise equitable user experience as a basic design concept, minimise algorithmic bias, increase digital accessibility, and modernise government service delivery to the American people.”

To that end, the GSA has identified two key flaws in how such services are currently delivered.

The first is a lack of commitment to accessibility, or rather a commitment only to basic compliance instead of actually meeting the community’s needs.

The GSA evaluation states, “Frequently, government apps and websites have little language accessibility, unclear navigation, and poor design principles, resulting in user distrust and irritation.” In particular, it found that the workflows of visually impaired users who rely on screen readers diverge from the assumptions made when government websites are designed. Basic operations like logging in and checking an account may not accommodate those workflows, or may require input methods (such as a cursor) that are not available to every user.
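
The plan does not prescribe specific fixes, but the cursor-dependence problem the GSA describes is commonly addressed by making custom interactive controls operable from the keyboard as well as the mouse. Here is a minimal TypeScript sketch of that idea; the element ID and handler are hypothetical illustrations, not anything specified in the GSA report.

```typescript
// Minimal sketch: make a custom (non-<button>) login control usable
// without a cursor. The element ID and handler are hypothetical
// examples, not taken from the GSA plan.
const loginControl = document.getElementById("login-submit");

if (loginControl) {
  // Expose the element to screen readers and the keyboard tab order.
  loginControl.setAttribute("role", "button");
  loginControl.setAttribute("tabindex", "0");

  const submitLogin = (): void => {
    // Placeholder for the real submission logic.
    console.log("Login submitted");
  };

  // Mouse users can still click...
  loginControl.addEventListener("click", submitLogin);

  // ...but keyboard and screen-reader users can activate the control
  // with Enter or Space, so a cursor is never required.
  loginControl.addEventListener("keydown", (event: KeyboardEvent) => {
    if (event.key === "Enter" || event.key === " ") {
      event.preventDefault();
      submitLogin();
    }
  });
}
```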

To help with this, the GSA said it would expand usability testing to include underrepresented populations in the design process. (As accessibility advocates have often reminded me, these groups must be consulted from the beginning; otherwise, the result will not be what the community actually needs.)

It will also try to improve how websites perform on older computers and phones and over low-bandwidth connections.

The second issue is that facial recognition software is racially biased. This will probably come as no surprise to readers of this blog, but federal procurement and deployment processes are notoriously slow and idiosyncratic, so it is not unexpected that the feds are only now catching up with what the tech world has been warning about for years.

“Through our testing, GSA discovered that significant commercial implementations of face matching had disproportionately high ‘False Rejection Rates’ for African Americans,” the memo adds, noting that this is consistent with the broader body of research in the field.
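
For context, a false rejection rate is simply the share of genuine verification attempts that the system wrongly rejects, and the disparity the GSA describes appears when that rate is computed separately for each demographic group. The TypeScript sketch below illustrates the arithmetic; the group names and figures are invented for illustration, not GSA data.

```typescript
// Illustrative sketch: false rejection rate (FRR) per demographic group.
// FRR = false rejections / genuine (legitimate) verification attempts.
// All figures below are made up for illustration, not GSA results.

interface GroupResults {
  genuineAttempts: number;  // legitimate users trying to verify
  falseRejections: number;  // legitimate users wrongly rejected
}

const resultsByGroup: Record<string, GroupResults> = {
  groupA: { genuineAttempts: 10_000, falseRejections: 150 },
  groupB: { genuineAttempts: 10_000, falseRejections: 600 },
};

for (const [group, r] of Object.entries(resultsByGroup)) {
  const frr = r.falseRejections / r.genuineAttempts;
  console.log(`${group}: FRR = ${(frr * 100).toFixed(1)}%`);
}
// A system is "disproportionate" in the sense the GSA describes when
// these rates differ substantially across groups (here 1.5% vs 6.0%).
```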

Its strategy is to address data sovereignty and discrimination in emerging technologies. The GSA’s Login.gov team will research equity and bias in face matching services to deliver an equitable remote identity-verification experience for a diverse population. GSA will also apply an equity lens to the user guides it produces, which will influence governmentwide and industry best practices.

A response this broad could reflect either structural change or lip service, which is frustrating. Further study of bias in face recognition systems will almost certainly lag academic and industry research by years, but the GSA is likely hoping to position itself as a neutral party.

The “equity lens” may or may not prove useful, but one hopes that, as in so many other organisations and industries, there are people who have been pointing out problems along these lines for years without being able to persuade anybody to listen. Perhaps now is the time to give those voices the attention they so richly deserve.
