WWW.THEVERGE.COM
The AI talent wars are just getting started
For my last issue of the year, I'm focusing on the AI talent war, a theme I've been covering since this newsletter launched almost two years ago. And keep reading for the latest from inside Google and Meta this week.

But first, I need your questions for a mailbag issue I'm planning for my first issue of 2025. You can submit questions via this form or leave them in the comments.

"It's like looking for LeBron James"

This week, Databricks announced the largest known funding round for any private tech company in history. The AI enterprise firm is in the final stretch of raising $10 billion, almost all of which is going to go to buying back vested employee stock.

How companies approach compensation is often undercovered in the tech industry, even though these strategies play a crucial role in determining which company gets ahead faster. Nowhere is this dynamic as intense as in the war for AI talent, as I've covered before.

To better understand what's driving the state of play going into 2025, this week I spoke with Naveen Rao, VP of AI at Databricks. Rao is one of my favorite people to talk to about the AI industry. He's deeply technical but also business-minded, having successfully sold multiple startups. His last company, MosaicML, sold to Databricks for $1.3 billion in 2023. Now, he oversees the AI products for Databricks and is closely involved with its recruiting efforts for top talent.

Our conversation below touches on the logic behind Databricks's massive funding round, what specific AI talent remains scarce, why he thinks AGI is not imminent, and more.

The following conversation has been edited for length and clarity:

Why is this round mostly to help employees sell stock? Because $10 billion is a lot. You can do a lot with that.

The company is a little over 11 years old. There have been employees who have been here for a long time. This is a way to get them liquidity. Most people don't understand that this is not going into the balance sheet of Databricks.
This is largely going to provide liquidity for past employees, [and] liquidity going forward for current and new employees. It ends up being neutral on dilution because they're shares that already exist. They've been allocated to employees, and this allows them to sell those to cover the tax associated with those shares.

How much of the rapid increases in AI company valuations have to do with the talent war?

It's real. The key thing here is that it's not just pure AI talent, the people who come up with the next big thing, the next big paper. We are definitely trying to hire those people. But there is an entire infrastructure of software and cloud that needs to be built to support those things. When you build a model and you want to scale it, that actually is not AI talent, per se. It's infrastructure talent.

The perceived bubble that we're in around AI has created an environment where all of those talents are getting recruited heavily. We need to stay competitive.

Who is being the most aggressive with setting market rates for AI talent?

OpenAI is certainly there. Anthropic. Amazon. Google. Meta. xAI. Microsoft. We're in constant competition with all of these companies.

Would you put the number of researchers who can build a new frontier model under 1,000?

Yeah. That's why the talent war is so hot. The leverage that a researcher has in an organization is unprecedented. One researcher's ideas can completely change the product. That's kind of new. In semiconductors, people who came up with a new transistor architecture had that kind of leverage.

That's why these researchers are so sought after. Somebody who comes up with the next big idea and the next big unlock can have a massive influence on the ability of a company to win.

Do you see that talent pool expanding in the near future, or is it going to stay constrained?

I see some aspects of the pool expanding. Being able to build the appropriate infrastructure and manage it: those roles are expanding. The top-tier researcher side is the hard part.
It's like looking for LeBron James. There are just not very many humans who are capable of that.

I would say the Inflection-style acquisitions were largely driven by this kind of mentality. You have these concentrations of top-tier talent in these startups, and it sounds ridiculous how much people pay. But it's not ridiculous. I think that's why you see Google hiring back Noam Shazeer. It's very hard to find another Noam Shazeer.

A guy we had at my previous company that I started, Nervana, is arguably the best GPU programmer in the world. He's at OpenAI now. Every inference that happens on an OpenAI model is running through his code. You start computing the downstream cost and it's like, "Holy shit, this one guy saved us $4 billion."

What's the edge you have when you're trying to hire a researcher to Databricks?

You start to see some selection bias among different candidates. Some are "AGI or bust," and that's okay. It's a great motivation for some of the smartest people out there. We think we're going to get to AGI through building products. When people use technology, it gets better. That's part of our pitch.

AI is in a massive growth phase, but it's also hit peak hype and is on the way down the Gartner hype curve. I think we're on that downward slope right now, whereas Databricks has established a very strong business. That's very attractive to some because I don't think we're so susceptible to the hype.

Do the researchers you talk to really believe that AGI is right around the corner? Is there any consensus on when it's coming?

Honestly, there's not a great consensus. I've been in this field for a very long time, and I've been pretty vocal in saying that it's not right around the corner. The large language model is a great piece of technology. It has massive amounts of economic uplift and efficiencies that can be gained by building great products around it.
But it's not the spirit of what we used to call AGI, which was human or even animal-like intelligence.

These things are not creating magical intelligence. They're able to slice up the space of what we're calling facts and patterns more easily. It's not the same as building a causal learner. They don't really understand how the world works.

You may have seen Ilya Sutskever's talk. We're all kind of groping in the dark. Scaling was a big unlock. It was natural for a lot of people to feel enthusiastic about that. It turns out that we weren't solving the right problem.

Is the new idea that's going to get to AGI the test-time compute or reasoning approach?

No. I think it's going to be an important thing for performance. We can improve the quality of answers, probably reduce the probability of hallucinations, and increase the probability of having responses that are grounded in fact. It's definitely a positive for the field. But is it going to solve the fundamental problem of the spirit of AGI? I don't believe so. I'm happy to be wrong, too.

Do you agree with the sentiment that there's a lot of room to build more good products with existing models, since they are so capable but still constrained by compute and access?

Yeah. Meta started years later than OpenAI and Anthropic and they basically caught up, and xAI caught up extremely fast. I think it's because the rate of improvement has essentially stopped.

Nilay Patel compares the AI model race to early Bluetooth. Everyone keeps saying there's a fancier Bluetooth, but my phone still won't connect.

You see this with every product cycle. The first few versions of the iPhone were drastically better than the previous versions. Now, I can't tell the difference between a three-year-old phone and a new phone.

I think that's what we see here. How we utilize these LLMs, and the distribution that has been built into them, to solve business problems is the next frontier.

Elsewhere

Google gets flatter.
CEO Sundar Pichai told employees this week that the company's drip-drip series of layoffs has reduced the number of managers, directors, and VPs by 10 percent, according to Business Insider and multiple employees I spoke with who also heard the remarks. Relatedly, Pichai took the opportunity to add being "scrappy" as a character trait to the internal definition of "Googleyness." (Yes, that's a real thing.) He demurred on the most upvoted employee question about whether layoffs will continue, though I'm told he did note that there will be overall headcount growth next year.

Meta cuts a perk. File this one under sad violin: I'm told that, starting in early January, Meta will stop offering free EV charging at its Bay Area campuses. Keep your heads held high, Metamates.

What else you should know about

OpenAI teased its next o3 reasoning model (yes, o2 was skipped) with impressive evals.

TikTok convinced the Supreme Court to hear its case just before its US ban is set to take effect. Meanwhile, CEO Shou Chew met with Donald Trump at Mar-a-Lago to (I'm assuming) get a sense of what his other options are should TikTok lose its case.

More tech-meets-Mar-a-Lago news: Elon Musk inserted himself into the meeting between Jeff Bezos and Trump. Robinhood donated $2 million to Trump's inauguration. And SoftBank CEO Masayoshi Son pledged to invest $100 billion into AI tech in the US, which happens to be the same number he has floated for a chip venture to compete with Nvidia.

Apple complained about Meta pressuring the EU to make iOS more compatible with third-party hardware.
Anyone who has synced photos from the Ray-Ban Meta glasses to an iPhone will understand why this is a battle that is very important for Meta to win, especially as it gears up to release its own pair of AR glasses with a controller wristband next year.

Amazon is delaying its return-to-office mandate in some cities because it doesn't have enough office space.

Perplexity, which is projected to make $127 million in revenue next year, recently raised $500 million at a valuation of $9 billion. It also acquired another AI startup called Carbon to help it hook into other services, like Notion and Google Docs.

Job board

A few notable moves this week:

Meta promoted John Hegeman to chief revenue officer, reporting to COO Javier Olivan. Another one of Olivan's reports, Justin Osofsky, was also promoted to head of partnerships for the whole company, including the company's go-to-market strategy for Llama.

Alec Radford, an influential, veteran OpenAI researcher who authored its original GPT research paper, is leaving but will apparently continue working with the company in some capacity.
And Shivakumar Venkataraman, who was recently brought in from Google to lead OpenAI's search efforts, has also left.

Coda co-founder and CEO Shishir Mehrotra will also run Grammarly now that the two companies are merging, with Grammarly CEO Rahul Roy-Chowdhury staying on as a board member.

Tencent removed two directors, David Wallerstein and Ben Feder, from the board of Epic Games after the Justice Department said their involvement violated antitrust law.

Former Twitter CFO Ned Segal has been tapped to be chief of housing and economic development for the city of San Francisco.

More links

My full Decoder interview with Arm CEO Rene Haas about the AI chip race, Intel, and more.

Waymo's new report shows that its AV system is far safer than human drivers.

The US AI task force's recommendations and policy proposals.

Apple's most downloaded app of the year was Temu, followed by Threads, TikTok, and ChatGPT.

Global spending on mobile apps increased 15.7 percent this year, while overall downloads decreased 2.3 percent.

If you aren't already getting new issues of Command Line, don't forget to subscribe to The Verge, which includes unlimited access to all of our stories and an improved ad experience on the web. You'll also get access to the full archive of past issues.

As always, I want to hear from you, especially if you have a tip or feedback. Respond here, and I'll get back to you, or ping me securely on Signal.

Thanks for subscribing.