Google to suggest more ‘inclusive’ terms with new assisted writing tool
Google’s new ‘assisted writing’ tool will suggest more ‘inclusive’ terms, the company announced at its annual I/O developer conference.
In an example Google gave, a user who wrote ‘chairman’ was given the option of using ‘chairperson’ or ‘chair’ – gender-neutral alternatives.
Tuesday’s conference also revealed Google aims to shift away from ‘gendered’ language.
The new assisted writing tool will also suggest ways to avoid passive voice or offensive language.
Google boss Sundar Pichai has previously advocated for social justice as a priority across the company.
After a tumultuous year and a series of internal battles over the way in which Google handles diversity and inclusion, the search engine giant revealed a number of changes in an effort to make its technologies more inclusive.
The company has been under pressure to be more inclusive after it was branded ‘institutionally racist’ by former employee and AI academic Timnit Gebru.
Gebru claims she was fired from the company after she co-wrote a paper about discrimination in AI last year, though Google contests this version of events.
The new assisted writing tool was part of a raft of changes to Google’s office productivity software – Docs, Meet, Sheets, Tasks and Slides – uniting them under a new umbrella product called ‘Smart Canvas’.
Wrapped up in Smart Canvas is a new ‘assisted writing’ tool which, Google claims, will suggest changes where it deems appropriate, such as offering more ‘inclusive’ alternatives when a user writes ‘gendered terms’.
Alongside the more inclusive assisted writing tool, Google also announced a more inclusive camera, which it says more accurately depicts skin tone.
Google’s Sameer Samat said that, for people of colour, ‘photography has not always seen us as we want to be seen’, and that the company has made changes to auto white balance adjustments to ‘bring out natural brown tones’.
The technology, which will be featured on Google’s flagship Pixel phone, will be out later this year.
Google has previously apologised for racial and gender biases in its predictive algorithms.
It was forced to remove gender pronouns from Gmail’s predictive text feature in 2018, after Google staff found the technology appeared to be incorrectly assuming users’ genders.
According to Gmail product manager Paul Lambert, a company research scientist found the problem.
He wrote: ‘I am meeting an investor next week and Smart Compose suggested a possible follow-up question: “Do you want to meet him?” instead of “her”.’
Google has also banned expletives and racial slurs from its predictive technologies.
Other changes to its developer style guide include using ‘baffling’ instead of ‘crazy’, ‘placeholder variable’ instead of ‘dummy variable’ and ‘final check for completeness and clarity’ instead of ‘final sanity-check’.
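At its simplest, a term checker like the ones described above can be thought of as a lookup from flagged terms to suggested alternatives. The following is a minimal illustrative sketch in Python using the replacements mentioned in this article; it is not Google’s actual implementation, and the function and mapping names are hypothetical.

```python
import re

# Hypothetical mapping of flagged terms to suggested alternatives,
# drawn from the examples in the article above.
SUGGESTIONS = {
    "chairman": ["chairperson", "chair"],
    "crazy": ["baffling"],
    "dummy variable": ["placeholder variable"],
    "sanity-check": ["check for completeness and clarity"],
}

def suggest_terms(text):
    """Return (flagged term, alternatives) pairs found in the text."""
    found = []
    for term, alternatives in SUGGESTIONS.items():
        # Match whole words/phrases only, case-insensitively.
        if re.search(r"\b" + re.escape(term) + r"\b", text, re.IGNORECASE):
            found.append((term, alternatives))
    return found
```

A production tool would need far more than a dictionary lookup – context awareness, for instance, to avoid flagging proper nouns or quoted text – but the sketch shows the basic suggest-rather-than-replace pattern the article describes.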