Meet the $4 Billion AI Superstars That Google Lost

It seems fitting that one of Google's most important inventions — one that would come back to haunt the company — was initially devised over lunch.

In 2017, researchers at Alphabet's Mountain View, California, headquarters were talking over their midday meal about how to make computers generate text more efficiently. Over the next five months they ran experiments and, not realizing the magnitude of what they'd discovered, wrote their findings up in a research paper called “Attention is All You Need.” The result was a leap forward in AI.

The paper's eight authors had created the Transformer, a system that made it possible for machines to generate humanlike text, images, DNA sequences and many other kinds of data more efficiently than ever before. Their paper would eventually be cited more than 80,000 times by other researchers, and the AI architecture they designed would underpin OpenAI's ChatGPT (the “T” stands for Transformer), image-generating tools like Midjourney and more.

There was nothing unusual about Google sharing this discovery with the world. Tech companies often open source new techniques to get feedback, attract talent and build a community of supporters. But Google itself didn't use the new technology straight away. The system stayed in relative hibernation for years as the company grappled more broadly with turning its cutting-edge research into usable services. Meanwhile, OpenAI exploited Google's own invention to launch the most serious threat to the search giant in years. For all the talent and innovation Google had cultivated, competing firms were the ones to capitalize on its big discovery.

The researchers who co-authored the 2017 paper didn't see a long-term future at Google either. In fact, all of them have since left the company. They've gone on to launch startups including Cohere, which makes enterprise software, and Character.ai, founded by Noam Shazeer, the longest-serving Googler in the group, who was seen as an AI legend at the company. Combined, their businesses are now worth about $4.1 billion (roughly Rs. 33,640 crore), based on a tally of valuations from research firm Pitchbook and price-tracking site CoinMarketCap. They are AI royalty in Silicon Valley.

The last of the eight authors to remain at Google, Llion Jones, confirmed this week that he was leaving to start his own company. Watching the technology he co-created snowball this past year had been surreal, he told me. “It's only recently that I've felt … famous?” Jones says. “No one knows my face or my name, but it takes five seconds to explain: ‘I was on the team that created the ‘T' in ChatGPT.'”

It seems strange that Jones became a celebrity thanks to actions outside Google. Where did the company go wrong?

One obvious issue is scale. Google has an army of 7,133 people working on AI, out of a workforce of about 140,000, according to an estimate from Glass.ai, an AI firm that scanned LinkedIn profiles to identify AI employees at Big Tech firms earlier this year for Bloomberg Opinion. Compare that to OpenAI, which sparked an AI arms race with a much smaller workforce — about 150 AI researchers out of approximately 375 staff in 2023.

Google's sheer size meant that scientists and engineers had to go through multiple layers of management to sign off on ideas back when the Transformer was being created, several former scientists and engineers have told me. Researchers at Google Brain, one of the company's main AI divisions, also lacked a clear strategic direction, leaving many to obsess over career advancement and their visibility on research papers.

The bar for turning ideas into new products was also exceptionally high. “Google doesn't move unless [an idea is] a billion-dollar business,” says Illia Polosukhin, who was 25 when he first sat down with fellow researchers Ashish Vaswani and Jakob Uszkoreit at the Google canteen. But building a billion-dollar business takes constant iterating and plenty of dead ends, something Google didn't always tolerate.

Google did not respond to requests for comment.

In a way, the company became a victim of its own success. It had storied AI scientists like Geoffrey Hinton in its ranks, and in 2017 was already using cutting-edge AI techniques to process text. The mindset among many researchers was “If it ain't broke, don't fix it.”

But that's where the Transformer authors had an advantage: Polosukhin was preparing to leave Google and more willing than most to take risks (he's since started a blockchain company). Vaswani, who would become their paper's lead author, was eager to jump into a big project (he and Niki Parmar went off to start enterprise software firm Essential.ai). And Uszkoreit generally liked to challenge the status quo in AI research — his view was, if it ain't broke, break it (he's since co-founded a biotechnology company called Inceptive Nucleics).

In 2016, Uszkoreit had explored the concept of “attention” in AI, where a computer distinguishes the most important information in a dataset. A year later, over lunch, the trio discussed using that idea to translate words more efficiently. Google Translate back then was clunky, especially with non-Latin languages. “Chinese to Russian was terrible,” Polosukhin remembers.

The problem was that recurrent neural networks processed words in sequence, one after another. That was slow, and it didn't take full advantage of chips that can handle lots of tasks at the same time. The CPU in your computer at home probably has four “cores,” which process and execute instructions, but the graphics chips used in servers to run AI systems have thousands of cores. With that much parallelism, an AI model can “read” every word in a sentence at once. No one had been taking full advantage of that.
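To see the difference in miniature, here is a small illustrative sketch (not code from the 2017 paper; the tiny dimensions and random weights are made up purely for illustration). A recurrent network has to loop over the words of a sentence one step at a time, because each step depends on the previous one, while the Transformer's core operation, scaled dot-product attention, updates every word with a few matrix multiplications that parallel hardware can run all at once.

```python
import numpy as np

rng = np.random.default_rng(0)
seq_len, d_model = 6, 8                    # toy sizes: 6 tokens, 8-dim vectors
x = rng.normal(size=(seq_len, d_model))    # embeddings for one short sentence

# Recurrent style: each hidden state depends on the previous one,
# so the tokens have to be processed strictly in order.
W_h = rng.normal(size=(d_model, d_model))
W_x = rng.normal(size=(d_model, d_model))
h = np.zeros(d_model)
for t in range(seq_len):                   # inherently sequential loop
    h = np.tanh(h @ W_h + x[t] @ W_x)

# Attention style: queries, keys and values for ALL tokens are computed
# together, and every token attends to every other token in one shot.
W_q, W_k, W_v = (rng.normal(size=(d_model, d_model)) for _ in range(3))
Q, K, V = x @ W_q, x @ W_k, x @ W_v
scores = Q @ K.T / np.sqrt(d_model)        # (seq_len, seq_len) similarities
scores -= scores.max(axis=-1, keepdims=True)           # numerical stability
weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)  # softmax
attended = weights @ V                     # updated representation per token
print(attended.shape)                      # (6, 8): all tokens handled at once
```

On real hardware, the second half of the sketch is what maps so well onto thousands of GPU cores: the work is a handful of large matrix products rather than a long chain of small, dependent steps.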

Uszkoreit would walk around the Google office scribbling diagrams of the new architecture on whiteboards, and was often met with incredulity. His team wanted to remove the “recurrent” part of the recurrent neural networks being used at the time, which “sounded mad,” says Jones. But as a few other researchers like Parmar, Aidan Gomez and Lukasz Kaiser joined the group, they started seeing improvements.

Here's an example. In the sentence, “The animal didn't cross the street because it was too tired,” the word “it” refers to the animal. But an AI system would struggle if the sentence changed to, “because it was too wide,” since “it” would be more ambiguous. Except now the system didn't. Jones remembers watching it work this out. “I thought, ‘This is special,'” he says.
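As a rough, hypothetical way to see that behaviour today (this is not the authors' original experiment, and the model, layer and head inspected here are arbitrary choices), one can load a pretrained Transformer such as BERT and print the words that the ambiguous “it” attends to most strongly.

```python
# Hypothetical illustration: inspect which words a pretrained Transformer
# attends to when it reads the ambiguous pronoun "it".
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_attentions=True)

sentence = "The animal didn't cross the street because it was too tired."
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
it_pos = tokens.index("it")

# Average the last layer's attention from "it" across all heads,
# then show the five tokens it attends to most strongly.
attn_from_it = outputs.attentions[-1][0].mean(dim=0)[it_pos]
top = sorted(zip(tokens, attn_from_it.tolist()), key=lambda p: -p[1])[:5]
for token, weight in top:
    print(f"{token:>10s}  {weight:.3f}")
```

Which word tops the list varies by layer and head; the point is simply that the attention weights give the model, and anyone inspecting it, a direct view of what “it” is being linked to.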

Uszkoreit, who is fluent in German, also noticed the new technique could translate English into German far more accurately than Google Translate ever had.

But it took a long time for Google itself to apply the technique to its free translation tool, or to its language model BERT, and the company never deployed it in a chatbot that anyone could test out. That is, until the launch of ChatGPT in late 2022 forced Google to quickly release a rival called Bard in March 2023.

Over the years, the authors watched their ideas get applied to an array of tasks by others, from OpenAI's early iterations of ChatGPT to DALL-E, and from Midjourney's image tool to DeepMind's protein folding system AlphaFold. It was hard not to notice that the most exciting innovations were happening outside Mountain View.

You could argue that Google has simply been careful about deploying AI services. But slow doesn't always mean cautious. It can also just be inertia and bloat. Today some of the most interesting AI advancements are coming from small, nimble startups. It is a shame that many of them will get swallowed by big tech players, who are poised to reap the biggest financial benefits in the AI race even as they play catch-up.

Google may have the last laugh in the end, but in many ways it will have been an unimpressive journey.

© 2023 Bloomberg LP




