“Transformers” and 8 Google employees who changed the history of AI | WIRED.jp


Vaswani recalls collapsing on the couch in his office one night while writing the paper with his team. As he stared at the curtain that separated the sofa from the rest of the room, he was struck by patterns in the fabric that looked like synapses and neurons. He told Gomez, who was there, that what they were researching would go far beyond machine translation. “Ultimately, the modalities of speech, audio, and visual information all need to be integrated in a single architecture, much like the human brain,” Vaswani says. “I had a strong intuition that what our team was developing was something holistic.”

“Attention Is All You Need”

But company executives saw this research as just another interesting AI project. I asked several members of the Transformer team whether their bosses had ever asked them to report on the project's progress, and they said not often. Still, “the team knew this project could be pretty big,” says Uszkoreit. “That's why we were very deliberate about the sentence on future research at the end of the paper.”

That sentence anticipated what might happen next, a future in which every mode of human expression is applied to transformer models: “We are excited about the future of attention-based models. We plan to extend transformers to other input and output modalities, and to investigate images, audio, and video.”

A few nights before the deadline, Uszkoreit realized the paper still needed a title. According to Jones, the team had agreed to fundamentally reject existing best practices such as LSTMs in favor of a single technique: attention. Jones remembered that the Beatles had written the song “All You Need Is Love.” So how about the title “Attention Is All You Need”?

Why the Beatles?

“Because I'm British,” Jones says. “I literally thought of it in five seconds. I never thought it would actually be used.”

The team continued to collect experimental results right up to the deadline. “The numbers for English-to-French translation came in five minutes before we submitted the paper,” Parmar says. “I sat in the micro-kitchen of Building 1965 and punched in that last number.” The paper was submitted with less than two minutes to spare.

Like almost every tech company, Google immediately filed a provisional patent on the research. This was not to prevent others from using the idea, but to expand its patent portfolio as a defense against infringement claims.

When the conference reviews came back, the reactions were mixed. “One reviewer was positive, one was extremely positive, and one was like, ‘OK,'” Parmar says. The paper was accepted for an evening poster session.

By December, the paper had become a hot topic. The four-hour session on December 6 was packed with scientists eager to learn more about the research. The team talked until they were hoarse, and even after the session ended at 10:30 p.m., a crowd remained. “There were security people telling us to leave,” Uszkoreit says. Perhaps the most fulfilling moment was when computer scientist Sepp Hochreiter came by to praise the team's work. That was quite a compliment: Hochreiter is the co-inventor of long short-term memory (LSTM), the very model that transformers had just dethroned as a primary tool in AI.

“No one really understood what it meant.”

However, transformers didn't immediately conquer the world; they didn't even take over within Google right away. Kaiser recalls that around the time the paper was published, Shazeer suggested to Google executives that the company discard its entire search index and train a huge network with transformers instead, a proposal that would completely change how Google organizes information. At the time, even Kaiser thought it was far-fetched. Now everyone thinks it's only a matter of time.
