IIT-M students, UK-based Indian researcher develop algorithm to address social bias

The algorithm tackles a blind spot in AI clustering technology, which fails to account for social discrimination in selection processes.

Picture a company so big that the applications it receives for a new job opening are far too many for anyone to go through individually. Typically, the company would use Artificial Intelligence technology to divide the job applications into groups based on, say, a skill set, so that resumes of a similar nature end up in the same group. A selector can then look at one resume in a group, decide whether that skill set fits the job, and either sift through the whole group for further selection or discard the bunch altogether.
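
To make that concrete, a grouping step of this kind might look like the following minimal Python sketch built on the open-source scikit-learn library. The resume texts and the number of clusters are hypothetical, and this is ordinary, fairness-unaware clustering of the sort the researchers critique.

```python
# A minimal sketch of skill-based resume clustering (hypothetical data).
# This is plain, fairness-unaware clustering: resumes are grouped purely
# by textual similarity of the skills they mention.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

resumes = [
    "python machine learning data analysis",
    "java spring backend microservices",
    "python pandas statistics modelling",
    "frontend javascript react css",
]

# Turn each resume into a weighted bag-of-words vector, then group
# similar vectors into clusters.
vectors = TfidfVectorizer().fit_transform(resumes)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for resume, label in zip(resumes, labels):
    print(f"cluster {label}: {resume}")
```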

On the surface, this categorisation may look fine, even fair. But it is not. The clustering technology does not look at attributes such as gender, race, sexual orientation or other societal attributes, and so it can quietly reproduce whatever social biases are already embedded in the data it sorts. To avoid such social biases, a team of three researchers, led by Northern Ireland-based Malayali scholar Deepak Padmanabhan, has developed a new algorithm and written a paper on it.

Apart from Deepak, the other two team members are PhD students at IIT-Madras: Savitha Sam Abraham from Kochi and Sowmya S Sundaram from Chennai. Their paper, ‘Fairness in Clustering with Multiple Sensitive Attributes’, will be presented at the 23rd International Conference on Extending Database Technology in Copenhagen in April 2020.

“The original clustering technology is designed to group people based only on technical skills,” Deepak explains. “But it could inadvertently result in social discrimination that hasn’t been accounted for. In this example, if it was designed so that every cluster has a representation of gender groups that reflects their distribution in the population, then gender disparity may be addressed.”

Existing fairness-aware technology allows only one sensitive attribute to be accounted for in the selection process. So, if gender is addressed, other factors such as caste, religion, marital status or sexuality cannot be addressed alongside it. Deepak and his team’s algorithm, however, allows users to consider multiple attributes while clustering, as the sketch below illustrates.
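
The paper’s own method is more involved, but the idea of respecting several sensitive attributes at once can be illustrated with a small, purely hypothetical check: for each attribute, compare each cluster’s group proportions against the proportions in the overall applicant pool.

```python
# A hedged illustration, not the team's published algorithm: measure how
# well each cluster's make-up mirrors the overall pool across several
# sensitive attributes at once. All records here are hypothetical.
from collections import Counter

applicants = [
    {"cluster": 0, "gender": "F", "religion": "A"},
    {"cluster": 0, "gender": "M", "religion": "B"},
    {"cluster": 1, "gender": "F", "religion": "B"},
    {"cluster": 1, "gender": "M", "religion": "A"},
]

def proportions(rows, attribute):
    """Share of each group (e.g. F/M) among the given rows."""
    counts = Counter(row[attribute] for row in rows)
    total = sum(counts.values())
    return {group: count / total for group, count in counts.items()}

for attribute in ("gender", "religion"):
    overall = proportions(applicants, attribute)
    for cluster in sorted({row["cluster"] for row in applicants}):
        members = [row for row in applicants if row["cluster"] == cluster]
        # A fair clustering keeps these per-cluster shares close to 'overall'.
        print(attribute, "cluster", cluster, proportions(members, attribute),
              "vs pool", overall)
```

A fairness-aware algorithm of the kind the team describes would steer the clustering itself so that these per-cluster proportions stay close to the pool’s, for every attribute simultaneously.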

It was his interest in the politics of the matter that made Deepak think about the possibility. He narrates other examples to make the case. Consider a toy shop that places pink toys together and blue toys together, encouraging the social stereotype of associating one colour with girls and the other with boys. The arrangement may improve sales, but ultimately it exploits those biases for business. “Suppose this is a state-run store. We can argue that the state should not promote such social stereotypes,” Deepak says.

The argument so far has been that suitability, or the skills for a particular job or task, is the factor companies should take into consideration. Deepak and his team, however, argue that representation is the more important consideration.
