🎙 Listen to the full conversation

小宇宙 / Apple Podcasts / Spotify → 「离线时间」

This is the seventh episode of Stolen Chat. Yang Kai studied math at the University of Toronto, took Geoffrey Hinton's class, and witnessed the moment AlexNet launched the deep learning era. Harvard stats master's, Wall Street hedge-fund quant trader, serial entrepreneur. Now in Singapore building Arros AI — AI infrastructure for the recruiting industry, with ARR over $1M — and recently appointed Chief AI Scientist by Nasdaq-listed YY Group.

We talked about two things: why finding people is fundamentally different from finding information, and why "lobsters" (agentic AI) represent a genuine paradigm shift. His answer was more modest than I expected — not because the technology is amazing, but because people are getting lazier.

What Hinton taught was curiosity

Yang Kai took Hinton's class as an undergrad at U of T. Deep learning wasn't mainstream yet. AlexNet had just appeared, and the accuracy of convolutional networks on classification tasks stunned everyone.

But what Hinton taught him that mattered most wasn't any specific technique. It was curiosity.

"North American education was a shock to me. Chinese traditional education is teacher-directed — master the knowledge points. North America values the spirit of exploration. The professor opens a door, shows you something interesting, and you think: this is fascinating, I want to dig into it too."

That curiosity sustained over a decade of deep work in AI, including entrepreneurship. "Startups are actually a lot like doing research in a lab — you're constantly studying new problems, trying new methods."

LLMs are just undergrad math done very deep

Yang Kai once said: LLMs are just an autoregressive model you learn as an undergrad — junior year material — executed at massive scale with enormous data.

He doesn't deny the value of large models, but emphasizes that the fundamentals haven't changed. The underlying principles are the same — how to capture relationships between words. The difference is the engineering scale: from pre-training to RLHF to data labeling, the sheer volume is unimaginable from a student's perspective. Training on clusters of tens of thousands of GPUs — that's not something a lab can do.
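The "undergrad autoregressive model" he means is the chain-rule factorization p(w1…wn) = ∏ p(wi | w<i). A bigram count table is the simplest classroom version; an LLM keeps the same factorization but replaces the count table with a transformer. A minimal sketch (toy corpus, illustrative only):

```python
from collections import Counter, defaultdict

# Toy version of the autoregressive idea: model a sequence's probability
# as a product of next-token probabilities. Here "next-token" is estimated
# from bigram counts; an LLM swaps in a neural network for this table.

corpus = "the cat sat on the mat the cat ate".split()

# Count next-word frequencies conditioned on the previous word.
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_prob(prev: str, nxt: str) -> float:
    """p(nxt | prev), read off the count table."""
    total = sum(counts[prev].values())
    return counts[prev][nxt] / total if total else 0.0

def sequence_prob(words: list) -> float:
    """Chain rule: p(w1..wn) = prod over i of p(wi | w(i-1))."""
    p = 1.0
    for prev, nxt in zip(words, words[1:]):
        p *= next_prob(prev, nxt)
    return p

print(next_prob("the", "cat"))                  # 2/3: "the" is followed by "cat" twice out of three
print(sequence_prob("the cat sat".split()))     # (2/3) * (1/2) = 1/3
```

Scaling this same factorization from a count table to tens of thousands of GPUs is exactly the "how deep you go" point.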

"Without the professors' earlier contributions, without AlexNet showing the world deep learning's potential, there wouldn't have been more investment from Google, no Transformer, no OpenAI. The whole thing is a chain reaction."

"It's never about the theory. It's about how deep you go."

From quant trading to AI recruiting

After Harvard stats, he went to Wall Street for quantitative trading. He left for a straightforward reason: the problem's scope wasn't big enough. He wanted greater impact.

Back in China, he started his first company: Huixiaor — an online meeting venue booking platform. The core problem was massive labor costs: leads from Baidu needed phone calls, matching venues required extensive training, and an operations specialist needed two to three months to learn all the venues, prices, layouts, and vibes.

He built an automated matching system: incoming leads get classified, routed to the right specialist, then automatically matched to venues based on structured requirements. Fewer people handled more demand, lead conversion improved, and customer acquisition cost dropped from over 2,000 RMB to 800.
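The classify-route-match flow can be sketched as below. All field names, routing rules, and thresholds here are hypothetical stand-ins, not the real Huixiaor system:

```python
from dataclasses import dataclass

@dataclass
class Lead:
    city: str
    headcount: int
    budget_per_person: int  # RMB

@dataclass
class Venue:
    name: str
    city: str
    capacity: int
    price_per_person: int  # RMB

def classify(lead: Lead) -> str:
    """Route large-budget leads to a key-account queue, the rest to standard ops."""
    return "key-account" if lead.headcount * lead.budget_per_person > 100_000 else "standard"

def match(lead: Lead, venues: list) -> list:
    """Keep venues that satisfy the lead's hard constraints, cheapest first."""
    ok = [v for v in venues
          if v.city == lead.city
          and v.capacity >= lead.headcount
          and v.price_per_person <= lead.budget_per_person]
    return sorted(ok, key=lambda v: v.price_per_person)

venues = [
    Venue("Lakeside Hall", "Beijing", 200, 300),
    Venue("Tower Loft", "Beijing", 80, 150),
    Venue("Harbor Room", "Shanghai", 150, 200),
]
lead = Lead("Beijing", 120, 350)
print(classify(lead), [v.name for v in match(lead, venues)])
```

The point of encoding requirements as structured fields is that the matching step no longer needs a specialist's two to three months of venue knowledge.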

What connects quant trading and AI recruiting? Both are hard problems with a signal-to-noise challenge — the data is extremely dirty, noise is massive, and the core task is capturing the true signal.

China's SaaS graveyard

His second startup was Moyin Intelligence — voice technology. Demand exploded during COVID: finance professionals working from home needed automatic transcription and summaries for their Tencent Meeting calls. Within months they had over a million users and raised $3M.

Then Feishu went all-in on free. Tencent Meeting added free features too. Moyin had launched nearly a year before Feishu, but big tech's free tier crushed them.

They pivoted to enterprise intelligent customer service, landing major clients like Yonyou and Kingdee. But enterprise had its own problems: companies partnered with startups because they were cheap and hungry. A single contract might be 1-2 million RMB but take a year to deliver. Year two, they'd push for lower prices — or just build a 60-70% replacement internally.

"China's consumer products have successful business models — advertising, gaming, e-commerce. But enterprise has never found a scalable business model."

Two years ago, he returned to overseas markets.

From AI recruiting player to AI recruiting infrastructure

Arros AI originally did AI recruiting directly. But they discovered a bigger opportunity: every recruiting company was anxious about AI transformation and urgently wanted AI capabilities.

So they made an elegant pivot — from industry player to industry infrastructure. Instead of competing in the market, they supply technology as ammunition to all recruiting platforms.

"You can think of it this way: every recruiting company is an AI recruiting company. So selling shovels became the biggest opportunity."

YY Group was one such company — they saw the technology, invested in Arros AI, and appointed Yang Kai as Chief AI Scientist. But beyond YY, many similar platforms have proactively sought partnerships.

Indexing information with keywords is easy. Indexing people with intent is hard.

I threw out an analogy during our conversation: Google used intent-based search to reorganize information, and LLMs can also search for information. But what Arros does is fundamentally different: using algorithms to reconstruct people rather than information.

Yang Kai said people are extremely complex objects. A resume mentions "LLM" — does that mean the person works on LLMs? Not necessarily. They might be a recruiter hiring for LLM positions. Keywords have very limited utility — only weak correlation.

Finding people requires strong correlation and deep understanding. For example, a client says "I want someone with a Chinese background." How do you search with keywords? By surname? Zhao, Qian, Sun, Li? It doesn't work.

They built a technology called profile consolidation: aggregating a person's traces across LinkedIn, social media, news, and forums, using LLMs for source comparison and cross-validation — confirming this is the same Ellen Cheng, not a different one — then assembling a complete portrait.
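The consolidation step can be pictured as a simple entity-resolution loop: decide whether records from different sources refer to the same person by cross-checking overlapping fields, then merge the corroborated ones into a single profile. Arros's real pipeline uses LLMs for the comparison; the field-agreement score below is a deliberately crude stand-in, and all record contents are invented:

```python
def same_person(a: dict, b: dict, threshold: int = 2) -> bool:
    """Count fields the two records agree on; require `threshold` corroborations."""
    score = sum(1 for k in a.keys() & b.keys()
                if k != "source" and a[k] == b[k])
    return score >= threshold

def consolidate(records: list) -> dict:
    """Merge every record that corroborates the growing profile; skip conflicts."""
    profile: dict = {}
    for rec in records:
        if not profile or same_person(profile, rec):
            for k, v in rec.items():
                if k != "source":
                    profile.setdefault(k, v)
    return profile

records = [
    {"source": "linkedin", "name": "Ellen Cheng", "employer": "Acme", "city": "Singapore"},
    {"source": "news",     "name": "Ellen Cheng", "employer": "Acme", "title": "VP Eng"},
    {"source": "forum",    "name": "Ellen Cheng", "employer": "Other Co", "hobby": "chess"},
]
print(consolidate(records))  # the forum record fails cross-validation: same name, different employer
```

The forum record is rejected on exactly the grounds Yang Kai describes: a shared name alone is weak correlation, so it is a different Ellen Cheng until other fields corroborate.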

"AI as a communication tool is becoming increasingly valuable. LinkedIn is the world's largest address book. AI already has enormous potential to become the Connector of the future."

"Lobsters" are the next AlexNet

Yang Kai's take on "lobsters" (Lovable/Bolt-style agentic AI platforms) is definitive: this is a genuine product paradigm shift, not buzz.

The reason isn't that the technology is impressive. It's that people are getting lazier.

"Before, I had to log into software, learn how to use it, click 100 buttons. Now I just talk and get the same result. Once people get used to this, there's no going back — just like once you're used to ChatGPT, you don't want to search Google. Once you're used to lobsters, going back to SaaS feels impossible."

Their recruiters use agentic AI for massive amounts of work: organizational chart mapping, sourcing, auto-sending WhatsApp and LinkedIn messages, generating PPTs, meeting notes, checking what salary a candidate discussed previously. Implementing this workflow used to require seven or eight SaaS tools. Now one conversational assistant handles it all.

He compared lobsters to the next AlexNet: "We've already seen through agentic AI the enormous potential — just like when I saw AlexNet at U of T. Once people see the potential, more investment follows, and it gradually creates larger and larger societal value."

The scarcest thing in the AI era

Final question: what's most important for building products in the AI era?

Yang Kai's answer: understanding demand.

"Engineering barriers are dropping everywhere — vibe coding, execution-layer tools, everything is getting simpler and cheaper. The core question remains unchanged: where is the demand? What does the customer actually need? How do I serve them better?"

The trend now is people becoming more composite — engineers becoming PMs, PMs becoming engineers, everyone needing to understand marketing and distribution. AI technology further expands the boundary of individual capability.

"This question is as old as time."

This is an episode of「离线时间」Stolen Chat. If you're building in AI and thinking about going global, I'd love to hear from you. Arros AI →