
22 Kaoyan Foreign Journal Reading: Scientific American Bilingual Close-Reading Practice - Must Machine Learning Use…

The pandemic, with people working from home, meant even fewer newspaper purchases, which depressed demand for newsprint again and increased the pain for paper suppliers.


1. depress: to prevent an economy from being as active and successful as it usually is

In This Issue

Overview


Mention artificial intelligence or machine learning and most people think of big data, but in practical applications small data can also play a big role.

Reading

When people hear “artificial intelligence,” many envision “big data.” There’s a reason for that: some of the most prominent AI breakthroughs in the past decade have relied on enormous data sets. Image classification made enormous strides in the 2010s thanks to the development of ImageNet, a data set containing millions of images hand sorted into thousands of categories.


More recently GPT-3, a language model that uses deep learning to produce humanlike text, benefited from training on hundreds of billions of words of online text. So it is not surprising to see AI being tightly connected with “big data” in the popular imagination. But AI is not only about large data sets, and research in “small data” approaches has grown extensively over the past decade—with so-called transfer learning as an especially promising example.


Also known as “fine-tuning,” transfer learning is helpful in settings where you have little data on the task of interest but abundant data on a related problem. The way it works is that you first train a model using a big data set and then retrain slightly using a smaller data set related to your specific problem. For example, a research team working on German-language speech recognition showed that they could improve their results by starting with an English-language speech model trained on a larger data set before using transfer learning to adjust that model for a smaller data set of German-language audio.

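The fine-tuning recipe described above can be made concrete with a short sketch. The article names no particular tools, so everything below is an illustrative assumption: PyTorch and torchvision, an ImageNet-pretrained ResNet-18 standing in for the model trained on a big data set, a hypothetical five-class task, and a random tensor standing in for the small task-specific data set.

```python
# Minimal transfer-learning ("fine-tuning") sketch.
# Assumptions for illustration only: PyTorch + torchvision, an ImageNet-pretrained
# ResNet-18 as the model trained on the "big data set", a hypothetical 5-class
# task, and random tensors standing in for the small task-specific data set.
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 5  # hypothetical small task

# Step 1: start from a model already trained on a large data set (ImageNet).
# (pretrained=True downloads ImageNet weights; newer torchvision versions
# use the weights=... argument instead.)
model = models.resnet18(pretrained=True)

# Step 2: freeze the pretrained backbone so its weights are not updated.
for param in model.parameters():
    param.requires_grad = False

# Step 3: replace the final classification layer to fit the new, smaller task.
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

# Step 4: "retrain slightly" -- a few epochs, updating only the new head.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# Placeholder batch standing in for the small task-specific data set.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, NUM_CLASSES, (8,))

model.train()
for epoch in range(3):
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```

Freezing the whole pretrained backbone, as in this sketch, is the most conservative form of fine-tuning; another common variant unfreezes some or all pretrained layers and retrains them with a small learning rate. Which works better generally depends on how much task-specific data is available.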

Research in transfer learning approaches has grown impressively over the past 10 years. In a new report for Georgetown University’s Center for Security and Emerging Technology (CSET), we examined current and projected progress in scientific research across “small data” approaches, broken down in terms of five rough categories: transfer learning, data labeling, artificial data generation, Bayesian methods and reinforcement learning.


Our analysis found that transfer learning stands out as a category that has experienced the most consistent and highest research growth on average since 2010. This growth has even outpaced the larger and more established field of reinforcement learning, which in recent years has attracted widespread attention.


Excerpted from: Scientific American

Published: October 19, 2021

Vocabulary

1. fine-tuning

v. to make fine adjustments; to adjust

2. outpace

UK [ˌaʊtˈpeɪs]  US [ˌaʊtˈpeɪs]

vt. to go or develop faster than; to overtake in speed

3. established

UK [ɪˈstæblɪʃt]  US [ɪˈstæblɪʃt]

adj. settled, definite; well-known, recognized; (of a church) made official; long established, long in use; experienced and respected

Phrases & Collocations

1. data sets: collections of data

2. stand out: to be prominent or noticeable; to hold out to the end; to keep up resistance

Sentence Pattern for Writing

When people hear “artificial intelligence,” many envision “big data.”

Pattern: When people hear sth.1, many envision sth.2

Meaning: when people hear sth.1, many think of sth.2

Example: When people hear “tea culture”, many envision China.

Practice Assignment

On scratch paper, translate the underlined sentence from the passage to complete the daily check-in practice. A reference translation will be published in the next issue.

Check-in format: “Kaoyan English check-in” + your translation




END

Layout / 外刊君

Images / sourced from the internet

中国高翻小组


