New submitter Muckluck shares an excerpt from a report via Phys.Org that provides "an interesting look at how algorithms may be shaping your life": When you browse online for a new pair of shoes, pick a movie to stream on Netflix or apply for a car loan, an algorithm likely has a say in the outcome. These complex mathematical formulas are playing a growing role in all walks of life: from detecting skin cancers to suggesting new Facebook friends, deciding who gets a job, how police resources are deployed, who gets insurance at what cost, or who is on a "no fly" list. Algorithms are being used — experimentally — to write news articles from raw data, while Donald Trump's presidential campaign was helped by behavioral marketers who used an algorithm to locate the highest concentrations of "persuadable voters." But while such automated tools can inject a measure of objectivity into erstwhile subjective decisions, fears are rising over the lack of transparency algorithms can entail, with pressure growing to apply standards of ethics or "accountability." Data scientist Cathy O'Neil cautions against "blindly trusting" formulas to determine a fair outcome. "Algorithms are not inherently fair, because the person who builds the model defines success," she said.

Phys.Org cites O'Neil's 2016 book, "Weapons of Math Destruction," which provides some "troubling examples in the United States" of "nefarious" algorithms. "Her findings were echoed in a White House report last year warning that algorithmic systems 'are not infallible — they rely on the imperfect inputs, logic, probability, and people who design them,'" reports Phys.Org. "The report noted that data systems can ideally help weed out human bias but warned against algorithms 'systematically disadvantaging certain groups.'"