Subtitles for the TEDx Talk by the Author of 《演算法帝國》 (Automate This)

Posted by 毛倩倩 on 2014-03-04

The complete English captions appear in the second half of this page. If you can get around the firewall, we also recommend watching the YouTube video, where the audio, video, and subtitles stay in sync: http://www.youtube.com/watch?v=H_aLU-NOdHM

Christopher Steiner is the author of the 2009 New York Times bestseller $20 Per Gallon (《油價30元/升》), and his new book Automate This (《演算法帝國》) is due out in China in the first half of 2014. In 2011, Steiner co-founded the online grocery retailer Aisle50 (https://aisle50.com/), a graduate of the well-known startup incubator Y Combinator. Before that he spent seven years as a senior technology writer at Forbes, and his work has appeared regularly in The Wall Street Journal, the Chicago Tribune, Fast Company, MIT Technology Review, and Ski Magazine. He holds a bachelor's degree in engineering from the University of Illinois at Urbana-Champaign and a master's in journalism from Northwestern University. Steiner lives in Evanston, Illinois.

About TEDx: TEDx was created in the spirit of TED's mission, "ideas worth spreading." The program is designed to give communities, organizations and individuals the opportunity to stimulate dialogue through TED-like experiences at the local level. At TEDx events, a screening of TEDTalks videos -- or a combination of live presenters and TEDTalks videos -- sparks deep conversation and connections. TEDx events are fully planned and coordinated independently, on a community-by-community basis.

0:09
>>There are people

0:09
>>crazy people, who will tell you

0:13
>>people like authors, who tend to be prone to hyperbole

0:17
>>that algorithms are taking over the world

0:22
>>Perhaps we should take a few minutes here today to examine that claim

0:26
>>We already know, of course, that algorithms have taken over Wall Street

0:31
>>They make 70% of the trades that control your IRA (individual retirement account), your 401(k), your pension

0:35
>>Algorithms control the money you live on

0:39
>>Our stock markets

0:42
>>have become algorithms layered upon layer upon layer, to the point where

0:47
>>discerning any lasting order from the chaos

0:50
>>is a next-to-impossible task for a human

0:53
>>and sometimes it is impossible for the machines as well

0:56
>>That is what happened in the "Flash Crash" of May 6, 2010 (triggered when Waddell & Reed's automated high-frequency algorithm traded $4.1 billion in futures)

1:00
>>A trillion dollars disappeared in five minutes

1:03
>>Or this past August 1, when Knight Capital lost $440 million

1:09
>>because one of its algorithms went haywire

1:12
>>I have a friend who is a fund manager

1:15
>>Two years ago we were out at lunch, and I asked him

1:20
>>are algorithms really taking over finance? He said, of course

1:24
>>But the really interesting thing he said was that algorithms are taking over everything

1:32
>>As algorithmic methods proved themselves on Wall Street

1:40
>>people began studying them more deeply and carrying them into other fields

1:44
>>starting revolutions in many of them

1:47
>>medicine, sports, music

1:51
>>So we started paying attention to these things and taking an interest in algorithms

1:54
>>We talk about algorithms as if we knew them well

1:58
>>Someone asks you about that weather app, and you say, oh, it's just an algorithm

2:02
>>But what exactly is an algorithm?

2:08
>>Quite simply, an algorithm is a set of instructions written in a computer language

2:11
>>that tells the computer what to do with a piece of information

2:14
>>An algorithm takes a batch of input data and produces output

2:19
>>Freshman and sophomore engineering students

2:22
>>in their first computer science course

2:25
>>all have to spend an hour writing a tic-tac-toe program

2:29
>>The input is the human's move; the output is the machine's move
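
(Purely as an illustration, not something shown in the talk: a bare-bones version of that freshman exercise might look like the Python sketch below. The board encoding and the center-then-corners preference are assumptions made for this example.)

```python
# Toy "algorithm" in the sense described above: input is the board after the
# human's move, output is the machine's move.

def machine_move(board):
    """Pick a square for the machine on a 3x3 board given as a list of 9 cells
    ('X', 'O', or '' for empty). Preference: center, then corners, then edges."""
    for i in [4, 0, 2, 6, 8, 1, 3, 5, 7]:
        if board[i] == '':
            return i
    return None  # board is full

board = ['X', '', '', '', '', '', '', '', '']  # the human just took the top-left corner
print(machine_move(board))  # -> 4: the machine answers with the center square
```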

2:35
>>But algorithms have gone far past that now

2:39
>>They can learn, adapt, and evolve, and they have evolved to the point

2:42
>>where we are not only shaping the algorithms; the algorithms are also shaping us

2:50
>>Algorithms shape our culture and influence what we say

2:52
>>they control what we see and hear, and they change how we live

2:55
>>I think we all assume algorithms have their limits

2:58
>>lines they cannot cross

3:01
>>namely the most human of tasks

3:06
>>Perhaps, but in fact those areas too have already become the province of the machines

3:10
>>By the most human of tasks I mean things like grading students' written essays

3:17
>>creating original art, making crucial national-security decisions

3:21
>>reviewing legal documents

3:24
>>The music industry already uses algorithms to discover artists

3:27
>>Algorithms are very good at finding pop songs

3:31
>>because they know the math behind the biggest pop hooks, the ones that snag you every time

3:34
>>They are deciding more and more of what plays on our radios

3:40
>>But we have to ask ourselves

3:41
>>would an algorithm have found Nirvana? Would it have found the Beatles?

3:47
>>One day your doctor may be an algorithm

3:50
>>We already have a fully robotic pharmacy

3:53
>>running on algorithms at the University of California, San Francisco

3:57
>>It has filled two million prescriptions without making a single mistake

4:00
>>Average human pharmacists filling the same prescriptions might have made twenty thousand mistakes

4:09
>>Some day an algorithm will show up in the emergency room too

4:13
>>The question, of course, is whether that is a bad thing

4:17
>>The answer has yet to be determined

4:19
>>The next 20 years will be the era of big data and algorithms

4:24
>>We are at a giant fork in the arc of humanity

4:29
>>Just how much will we allow algorithms to take over?

4:33
>>But the better question for today, the better question for right now

4:37
>>is how much they have already taken over

4:41
>>One of my favorite examples of how far algorithms can go, and how far they have already gone

4:47
>>is algorithms that perform accurate psychological evaluations of people

4:53
>>That is something we normally reserve for humans

4:56
>>for people who have spent ten years in medical school

4:59
>>How can an algorithm that does not even know itself possibly know us?

5:04
>>The story starts fifty years ago, in the late 1960s

5:07
>>when NASA decided to start sending scientists into outer space

5:15
>>Until then, astronauts had been picked from the Air Force

5:17
>>they were test pilots

5:19
>>the men who flew supersonic planes across oceans to spy on the Soviet Union

5:27
>>These men were calm and unflappable under pressure

5:29
>>which is why they made such good astronauts

5:31
>>NASA knew they would not crack under pressure

5:34
>>But there is no test-pilot program for scientists

5:37
>>How could NASA know which scientists would stand up to the pressure

5:41
>>and which would buckle?

5:43
>>How could it know which personalities would clash after 72 hours together in a space capsule?

5:50
>>NASA knew the Soviets had had more than one mission compromised by conflict among the crew

5:55
>>So NASA set out to build a personality-classification system

6:00
>>a predictive system that could foresee conflicts between people in advance

6:05
>>a system that could predict who would excel in outer space and who would crumble

6:11
>>Over the next 20 years NASA built the most advanced psychoanalytic system in the world

6:20
>>From a ten-minute conversation they could tell what you were thinking

6:22
>>what your personality was, and how much stress you could take

6:29
>>It can take humans years to master these methods

6:33
>>But what matters about these methods, and the reason we are talking about them today, is that they are quantifiable

6:40
>>So how do they make their predictions?

6:42
>>The words we choose, the way we structure our sentences, the way we habitually use pronouns and verbs

6:48
>>all these small details offer clues to our inner world and to how we work with other people
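
(The talk only says these signals are quantifiable; the actual NASA-derived features are not described. As a rough, invented illustration of what "quantifying" word choice might mean, the sketch below counts a few simple text features; the feature names and pronoun list are assumptions, not anything from the talk.)

```python
# Hypothetical feature extraction: turn an utterance into a few numbers that a
# personality classifier could, in principle, be trained on.
import re

def personality_features(utterance):
    words = re.findall(r"[a-z']+", utterance.lower())
    pronouns = {"i", "me", "my", "we", "you", "they"}
    return {
        "word_count": len(words),
        "pronoun_rate": sum(w in pronouns for w in words) / max(len(words), 1),
        "avg_word_length": sum(len(w) for w in words) / max(len(words), 1),
    }

print(personality_features("That's interesting. I don't know what I think about this yet."))
```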

6:54
>>And words, of course, just like the momentum behind the stock market or the math behind music

7:00
>>words are data

7:02
>>data that can be organized, categorized, stored, and analyzed

7:05
>>So it figures that somebody would come along and marry NASA's science with algorithmic technology that can be applied anywhere

7:16
>>So where do we run into these machines?

7:18
>>They are almost everywhere in our lives

7:21
>>So how do they know what we are thinking?

7:23
>>They know your personality, and they know the meaning behind your words

7:29
>>For instance, my personality type is what is called thoughts-based

7:34
>>When a thoughts-based person is listening to somebody explain something

7:39
>>he might say

7:42
>>that's interesting

7:44
>>that's really interesting

7:46
>>But when I say that, what I really mean, unfortunately, is that I don't find it that interesting

7:50
>>I don't know what I make of this; I need more information before I can judge

7:55
>>And the programs know exactly what a thoughts-based person's real attitude is when he says "interesting"

8:03
>>So where do we run into machines that know us this well?

8:06
>>I think we have all heard that customer-service refrain

8:11
>>"This call may be recorded to ensure quality of service"

8:15
>>We assume that means some supervisor listens in on the calls now and then

8:19
>>Maybe so

8:20
>>But more often what happens is that you have just invited six million algorithms in for a listen

8:28
>>NASA's science has been handed over to the algorithms

8:31
>>So how does it work?

8:34
>>You call customer service, and the algorithms note your phone number

8:37
>>assign you a random service agent, and then listen in on the conversation

8:41
>>They listen very carefully

8:43
>>Within two minutes of you starting to speak, they slap a personality label on you, and it is startlingly accurate

8:49
>>The next time you call in, they connect you to an agent with the same personality as yours

8:56
>>They can do this

8:58
>>because most of these call centers have fifteen to twenty thousand agents taking calls at any moment

9:02
>>When you are connected to an agent whose temperament matches yours, your calls are half as long

9:07
>>and you reach a satisfying resolution 90% of the time

9:11
>>instead of 47%
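
(As a toy model only of the routing logic Steiner describes: remember a personality tag per phone number, and on a repeat call prefer an agent with a matching tag. The tag names, agent IDs, and phone number below are invented for illustration.)

```python
# Minimal sketch of personality-based call routing, under the assumptions above.
import random

caller_tags = {}  # phone number -> personality tag learned on an earlier call
agents = {"thoughts": ["A17", "B02"], "feelings": ["C44"], "actions": ["D09"]}

def route_call(phone_number):
    tag = caller_tags.get(phone_number)
    if tag and agents.get(tag):
        return random.choice(agents[tag])  # repeat caller: match personality
    # first-time caller: any available agent, tagging happens during the call
    return random.choice([a for pool in agents.values() for a in pool])

caller_tags["312-555-0100"] = "thoughts"   # tag assigned after ~2 minutes of speech
print(route_call("312-555-0100"))          # routed to a thoughts-based agent
```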

9:14
>>The company behind these call centers has listened to a billion calls

9:20
>>It spent $60 million over ten years to build this library of six million algorithms

9:25
>>categorizing human language to create this mind-reading machine

9:31
>>In Minnesota there is a warehouse the size of a football field

9:35
>>where all of these calls, all of this data, are stored

9:38
>>waiting to find the one profile that perfectly matches your mind, your thinking, and the way you talk

9:44
>>If we can handle data and apply algorithms this way

9:46
>>if we can know who you are, why you called, and what you are thinking

9:51
>>then none of this is out of reach

9:57
>>The question is no longer whether we will classify people this way

10:00
>>the question is where we will do it, and how often

10:02
>>Will we screen job applicants this way?

10:06
>>Will we screen college applicants this way?

10:09
>>Will we even screen potential spouses this way?

10:13
>>Personally, I hope that never happens

10:17
>>Will we sort our children this way?

10:20
>>Every tool has limits to its usefulness

10:24
>>We have already seen the limits of automation applied to Wall Street

10:29
>>Wall Street has become a place where humans have little insight into, and little control over, what is happening

10:32
>>The people who run Wall Street have had a hard time telling where utility ends and menace begins

10:40
>>On the road ahead, data scientists, programmers, and quants in every field

10:46
>>will face the same dilemma

10:48
>>where to draw the line between utility and menace

10:51
>>The story that plays out over the next 20 years will be the story of big data and algorithms

10:57
>>Where those lines get drawn, and who gets to draw them, will decide how that story goes

11:05
>>I hope you found this interesting (recall what it means when a thoughts-based person says "interesting")

11:08
>>Very interesting

11:09
>>Thank you, everyone


0:09there are people
0:09who will tell you crazy people
0:13people like authors who tend to be prone to hyperbole
0:17that algorithms are taking over the world perhaps
0:22we should take the start and examine it is for just a few minutes here
0:26we already know of course that algorithms have taken over Wall Street
0:31they make seventy percent but the trades that control your IRAs
0:35your 401 K's your pensions the control your money
0:39all our stock markets
0:42have become one layer upon layer upon layer Balbir them's to the point
0:47where discerning order from the chaos for human
0:50is next to impossible sometimes that's impossible for the machines as well
0:56that's what happened on May 6th 2010 what we now call the flash crash
1:00when a trillion dollars disappeared in five minutes
1:03or in August 1st this year when I capital loss for did forty million
1:08dollars
1:09in 45 minutes 11 a bit algorithms went buzzer
1:12I have a friend who's a fund manager
1:15in two years ago we're out to lunch and I said on is it true
1:20algorithms are taking over he said of course
1:24but the interesting thing he said
1:27was there taking over everything now interestingly enough
1:33as algorithmic science advanced on wall street.
1:36people began taking these methods
1:41and peeling off and taking them to other places
1:44and starting many revolutions all sorts feels
1:47medicine sports music
1:50we talk about these things these algorithms we often
1:54talk about them like we know though but there are bodies some ill ask you
1:58now that the weather app or can you go just now for them
2:02what exactly is an algorithm another them
2:08quite simply is a set instructions written computer language
2:11that informs machine what to do with the piece of information
2:14over the steak input and they produce output
2:18often freshman and sophomore
2:22engineering students in the first computer science courses have to write
2:25an hour than the play tic-tac-toe that Albertans
2:29import are the moves that human the output other moves
2:33the computer but after that have gone
2:36far past this now they can learn
2:39they can adapt they can evolve they've evolved to the point in fact
2:42where we are not only shipping the algorithms they are shaping
2:46us the shaper culture
2:50shape but we see they say but we hear shape how we live
2:54I think we all match in there some wine
2:58wherever that can't get past the can't do
3:01those most human of tasks right but actually
3:06these places are now to the province the box
3:09well I mean by the most human of tasks I mean things like
3:14grading students written essays creating
3:17original art making crucial national security decisions
3:21reading legal documents music industry already employs algorithms to find you
3:27artists
3:27the very good at finding pop songs because
3:31they know the math behind the best pop hooks
3:34already deciding more more what plays
3:37on our radios we have to ask ourselves
3:41with the algorithms finer vana what they find the Beatles
3:45your doctor Sunday be an algorithm
3:49we already have a fully robotic pharmacy
3:53running and algorithms at the University of California San Francisco
3:56its doled out two million prescriptions without making a single mistake
4:00average human pharmacists would have made twenty thousand mistakes
4:06filling out the same prescriptions
4:09you will meets now for them in the emergency room Sunday
4:13the question of course is this Pat
4:17that answer has yet to be determined the story the next 20 years is a story a big
4:21data
4:22and algorithms we are at a giant
4:25fork in the archive humanity just how much
4:29we allow algorithms to take over
4:33but the better question for today the better question for right now
4:37is how much have the already taken over
4:41with my favorite examples of how far algorithms can go
4:44and how far they've already gone
4:47it's about algorithms that perform accurate psychological
4:50the valuations people this is something we normally reserved
4:54for humans who have been in medical school for 10 years
4:59how can a bot that doesn't know of itself no us
5:04the story starts fifty years ago in the late 1960s
5:07when nasa made the decision to begin sending
5:11scientists in outer space up until this point astronauts had been
5:15plucked from the Air Force they were test pilots
5:19these are the men who flew the planes that broke the speed of sound
5:22that road next to the stratosphere that spied
5:25on the Soviet Union these men were unflappable
5:29which is why they made such good astronauts nasa new
5:32would not crack under pressure but there's a test pilot program for
5:35scientists
5:37Howard nasa no which scientists would stand up to the pressure
5:40in which scientists would fall I wouldn't know which personalities would
5:44clash
5:45after being locked in a space capsule for 72 hours
5:50the russians nasa new had more than one mission compromised because a coup crew
5:54conflict
5:55so nasa set out to create a system a personality classification
6:00a predictive system that would foresee complex between people
6:04in Knowle who would perform well and who would crumble
6:09outer space the next 20 years
6:12next 20 years nasa created this system
6:15the most advanced psychoanalytic methods
6:18in the world they knew what you were thinking
6:22what your personality was what your capacity for stress was
6:25after a 10-minute conversation
6:29you could take humans years to master these methods
6:33but the important thing about these methods the reason we're talking about
6:37them today
6:38the quantifiable how they work
6:41works of the words that we speak the way in which we structure sentences
6:46the way in which you use pronouns and burt's all these little things
6:49offer clues to our inner personality how we work
6:52with other people in words of course
6:56just like the momentum behind our stock market for the math behind her music
7:00were to date did I can be organized stored
7:04parsed so it figures that somebody would come along to take mass is incredible
7:08science
7:09in Marion with the new algorithmic technologies
7:13that could employ everywhere so where do we run into the spot
7:18we're going to the nearly every day how do they know what we're thinking well
7:23when the pots know your personality
7:26they know the meaning behind your words for instance
7:30my personality something called thoughts based
7:34when I thought space person is being explained what somebody explaining
7:37something to a thought space person
7:39they might sometimes say that's interesting
7:43that's really interesting what I really mean when I say that
7:48unfortunately is that's not that interesting I don't
7:51I don't know what I think about this any more information
7:55now the pots no thats exactly
7:58what I thought space person means when he says that's interesting
8:02so we run into these pots that Noah so well
8:06well I think we're all we've all heard that refrain a customer service refrain
8:11this call may be recorded or monitored for quality assurance purposes
8:14we assume that means once in a while a boss is listening to the call
8:19and maybe yeah maybe that's what happens but often what happens
8:22is you just invited six million algorithms in for a listen
8:27nasa scientist was transferred to the set about us
8:30so how does it work well you call up customer service
8:34the over them pick no to your phone number than the route you to a random
8:38customer service agent then they settle in for a listen they listen very
8:41carefully
8:42within two minutes have you started speaking they sign you
8:45personality their credit the actor
8:49the next time you call the route you
8:52to an agent with the exact same personality is yourself and they can do
8:55this because a lot of these call centers have
8:57fifteen twenty thousand people on deck at any moment
9:01what happens when you get people with the same personnel your calls are half
9:04as long
9:07Indicom happy resolutions ninety percent of the time
9:11instead of 47 percent of the time
9:14the company behind all this is listen to one billion conversations
9:19it spent $60 million dollars over 10 years to create this library of six
9:23million algorithms
9:24to categorize the human language to build this mind reading
9:28by there's a warehouse
9:32Minnesota the size of a football field world these conversations
9:35all these data store waiting
9:39to recall that one file that matches perfectly
9:42your brain and how you talk now we can do this with data
9:45and algorithms if we can know who you are why you called
9:49and what you thinking doesn't seem to be much
9:53outside a bar reach
9:57the question isn't whether we will soar people this way the question is
10:00where and how often well we sort job applicants with these methods
10:06will we sort college applicants with these methods we even saw a potential
10:09spouse is
10:11with this of person
10:17we sort children just as with any tool there's limits
10:21utility in all of this we've seen the end utility
10:26automation on wall street it's become a place where humans have little insight
10:30as to what's going on
10:32in little control the people in charge Wall Street
10:35have had a difficult time drawing a line between utility
10:38and menace as we go forward
10:41data scientists programmers Inc wants all sorts of fields
10:46face the same dilemma where to draw the line between utility
10:50and menace the story in the next 20 years is
10:53the story big data algorithms
10:57that story will be determined by where these lines get drawn
11:01and who gets to draw them
11:05hope you found that interesting very interesting
11:08thank you
