When Elinor Lobel was 16, a “smart” insulin (胰岛素) pump was attached to her body. Powered by AI, it tracks her glucose levels and administers the right dose of insulin at the right time to keep her healthy. It is one of the new ways that data and AI can help improve lives.
Books that criticize the dark side of data are plentiful. They generally suggest there is much more to fear than to fete in the algorithmic (算法的) age.
But the intellectual tide may be turning. One of the most persuasive supporters of a more balanced view is Elinor Lobel’s mother, Orly, a law professor. In The Equality Machine she acknowledges AI’s capacity to produce harmful results. But she shows how, in the right hands, it can also be used to fight inequality and discrimination.
A principle of privacy rules is “minimization”: collect and keep as little information as possible, especially in areas such as race and gender. Ms Lobel flips the script, showing how in hiring, pay and the legal system, knowing such characteristics leads to fairer outcomes.
Ms Lobel’s call to use more, not less, personal information challenges data-privacy orthodoxy (正统观念). But she insists that “tracking differences is key to detecting unfairness.” She advocates a loosening of privacy rules to provide more transparency (透明) over algorithmic decisions.
The problems with algorithmic formulae (公式) are tackled in depth in Escape from Model Land by Erica Thompson of the London School of Economics. These statistical models are the backbone of big data and AI. Yet a perfect model will always be beyond reach. “All models are wrong,” runs a wise saying. “Some are useful.”
Ms Thompson focuses on a challenge she calls the Hawkmoth Effect. In the better-known Butterfly Effect, a serviceable model, say in the prediction of climate change, becomes less reliable over time because of the complexity of what it is simulating (模拟), or because of inaccuracies in the original data. In the Hawkmoth Effect, by contrast, the model itself is flawed; it might fail to take full account of the interplay between humidity, wind and temperature.
The author calls on data geeks to improve their solutions to real-world issues, not merely refine their formulae—in other words, to escape from model land. “We do not need to have the best possible answer,” she writes, “only a reasonable one.”
Both these books exhibit a healthy realism about data, algorithms and their limitations. Both recognize that making progress involves accepting limitations, whether in law or coding. As Ms Lobel puts it: “It’s always better to light a candle than to curse the darkness.”
Ms Lobel intends to convey that________
A. minimisation is a good privacy rule to go by
B. algorithms are currently challenged by data privacy
C. employing more personal data should be encouraged
D. identifying algorithms’ problems leads to better outcomes
For children born into this modern life over the last twenty years, these technological marvels seem like elements of the periodic table: a given ingredient that is simply part of the universe. Younger generations don’t even try to imagine life without modern conveniences. They do not appreciate the unprecedented (史无前例的) technology that is in their possession; rather, they complain about the ways in which it fails to live up to ideal expectations. “My digital video recorder at home doesn’t allow me to program it from my computer at work.” “It’s taking too long for this interactive map to display on my portable GPS.”
What does the underlined statement in paragraph 6 probably mean?
A. Space exploration provides us with new technology.
B. Adults learn technology while they are doing household chores.
C. High expectation makes up for the limitation of technology designers.
D. Consumers regard many technological inventions as unremarkable.