Can technology cure the ills of the admissions process?









It would be ironic if a highly publicized lawsuit against Harvard eventually overturns race-conscious admissions at US colleges. Americans have been arguing over the issue for more than half a century, but the decisive legal blow may come just as the technology to make affirmative action more effective is finally at hand.

Technologies such as data mining and machine learning have made us all more visible. Computer algorithms can deduce intimate details about a person, from race to political orientation to personality traits, by examining a wealth of other details of that person's online behavior. This kind of technology is already changing the way colleges market to prospective students. Recruiters can target underrepresented niches (say, middle-class Latino athletes who enjoy classical music) with unprecedented accuracy.

Data science could soon help colleges sort their applicants, too. Schools could make decisions based not only on the information students provide on their applications, but also on rich data profiles that incorporate a student's socioeconomic background and a range of other variables.

"The technology is there if you sort the problem the right way," said John Carroll, a former communications professor at Boston University and a senior media and marketing analyst, who gives lectures on targeted marketing. "If Facebook can do it, then I think MIT can do it."


Like other selective schools, Harvard tries to admit a class of students more representative of the population at large than a class chosen on GPA and test scores alone. But the legal landscape of affirmative action is murky: the Supreme Court has ruled that admissions officers may use a student's race as one factor, but not as the deciding factor, in admission; and while quotas are banned, highly selective universities have been free to assess candidates more holistically.


When they are overwhelmed by tens of thousands of applications, admissions officers can have trouble seeing the whole picture. Harvard asked its application readers to rate candidates on a small number of characteristics, ranging from academic achievement to athletic potential. The plaintiffs accused the school of capping the number of Asian-American students, in part by systematically assigning them lower ratings on subjective personal qualities.

Critics of the current system may not be reassured if human admissions officers simply outsource their decisions to powerful computer algorithms. But computers are far more adept than humans at weighing dozens or hundreds of variables at once, and they could give universities a way to build diverse classes without explicitly considering race.

***

Like any other business, colleges have begun using tracking software to follow their customers around the Internet. Click a link for admissions information on a university's website, and you may start receiving emails promoting the school's financial aid programs.

Even signing up for the SAT can fill a student's physical and virtual inboxes with promotional material from universities that buy contact lists from testing companies. "In many ways, data mining is moving the admissions process from appeal dynamics to attraction dynamics," Carroll said.

Universities can also target students of color in their marketing, rather than waiting for them to apply. Facebook, for example, gives advertisers the ability to reach specific groups of users, whom the company has sorted into what it calls "racial affinity" clusters.

Dipayan Ghosh, a former public policy and privacy adviser at Facebook, said the company sorts users into these groups based on a mountain of data that includes their interests, online behavior, and social connections. "It's based on everything," he said.

The data Facebook collects isn't always accurate; Ghosh is not African-American, though Facebook believed he was. But it's "more accurate than you think," he said. And data brokers such as Experian and Oracle have access to even richer webs of data on individuals than Facebook does: they can build a detailed dossier on a person based on their online history, and all of that information is for sale. "You can infer race very confidently if you know all the factors that a data broker might know," Ghosh said.

This means colleges could not only tailor their ads to students of color, but also discover an applicant's race without ever asking them to disclose it, by turning to data brokers instead.


Of course, colleges already have a long history of trying to diversify their student bodies without asking any questions about race. In California, for example, where race-based affirmative action was banned in 1996, schools have spent years searching for alternatives. They have tried using other categories, such as a student's socioeconomic status, neighborhood, or parents' level of education, that are correlated with race, hoping that if they admitted more students who were disadvantaged along those dimensions, they would also end up with more students of color.

"It's a good idea, but it does not work," said Gary Orfield, co-director of the UCLA Civil Rights Project.

Over the last two decades, Orfield said, UCLA "has spent hundreds of millions of dollars trying to do everything imaginable" as an alternative to race-based affirmative action, including recruiting economically disadvantaged students and accepting the top 10 percent of students from underperforming high schools. "And nothing worked very well," Orfield said. "Race is not the same thing as anything else."

UCLA's efforts "made a difference," said Orfield, "but not enough."

Natasha Warikoo, an associate professor at Harvard's Graduate School of Education who studies affirmative action, agrees. "In the end," she said, "most research suggests that the best way to achieve racial diversity is, surprise, to consider race."

OK, so maybe looking at just a few factors that tend to correlate with race is too crude, like looking at a low-resolution image. What if the picture were more detailed? Mark C. Long, a professor of public policy at the University of Washington, examined 195 characteristics of a group of 10th-graders who were surveyed for a US Department of Education study. He found he could predict minority status with 82 percent accuracy (the most reliable indicator was the race of a student's three best friends).

"To avoid the detrimental effects of using an imperfect predictor of race, the university could look for additional information about students to help them predict their future. [racial] status, "he wrote. "The university may want. . . Follow the path of private companies trying to predict the characteristics of their customers. "

In addition, schools could combine such inferred race data with other indicators of socioeconomic status to identify the most disadvantaged students.

***

Still, some big problems remain. First, as Long notes, buying such data from brokers can be prohibitively expensive. But there are ethical and even legal problems, too.

Warikoo said students simply shouldn't have to assume that everything they do online will be factored into college admissions. "Young people are constantly on social media," she said. "If you tell me everything is going to be monitored, I think it's dangerous and harmful."

But much also depends on the scope of any ban on race-based affirmative action. Would Harvard be barred only from considering race when choosing among applicants, or also from making special efforts to encourage members of underrepresented minorities to apply? If schools are forbidden to ask applicants their race, will they also be forbidden to infer it from other data?

If colleges are merely barred from asking about race, using this kind of data as a "proxy" for race might still be kosher. But if universities are forbidden to take race into account in admissions at all, the law could also bar them from considering indirect information that correlates with race.

"You are at the mercy of the court system to know if he sees this as a legitimate alternative," Carroll said.

The irony is that this kind of proxy-information strategy has been used before in American history, by racist government officials seeking to circumvent laws prohibiting racial discrimination. From the 1890s to the 1960s, for example, Southern governments used literacy tests to keep African Americans from voting, knowing they were less likely to have had access to basic education: a sort of primitive algorithm that went unchallenged for far too long.

"I think it's often easier to maintain seemingly race-neutral policies that disadvantage previously disadvantaged groups than the other way around," Warikoo said. "It's because of who's in power and who makes the decisions. That's how power works. "

That, at least, is not likely to change – no matter how sophisticated our algorithms are.

S.I. Rosenbaum can be reached at [email protected]. Follow her on Twitter @sirosenbaum.
