Tianda Li (李天达)

Bringing AI automation into traditional industries. Founder of DeepTensor.

Available for collaborations · Toronto → Singapore

Bio

I'm an applied research scientist focused on bringing AI into traditional industries — building automation systems that take recent advances in LLMs and agentic systems out of papers and into real business workflows. In parallel, I'm developing an AI-driven trading system that combines my research background with quantitative finance.

I founded DeepTensor Pte. Ltd. in Singapore as the vehicle for this work — an independent AI lab building products that bring automation into traditional industries.

My research background spans large language models, document intelligence (KIE, visual document understanding), knowledge distillation, and agentic systems. Outside of company work, I've trained LLMs at scale (up to 18B parameters) on distributed infrastructure I built and operated end-to-end.

I received my M.A.Sc. from Queen's University (Canada), advised by Prof. Xiaodan Zhu. Before that I studied at Nankai University (B.Eng. in Information Security + B.A. in Law, dual degree) and spent a year on exchange at École Centrale de Nantes in France.

What I'm building

AI for Traditional Industries · Workflow Automation · Agentic Systems · LLM Applications · AI Trading Systems
Large Language Models · Document Intelligence · Knowledge Distillation · Visual Document Understanding · Key Information Extraction · Distributed Training · Sparse Pretraining · Conversational AI

Where I've been

2023 — Now
Founder · DeepTensor Pte. Ltd.
Independent AI lab building products for traditional industries · Singapore
2023 — 2026
Senior Applied Research Scientist · ServiceNow Research (ATG)
State-of-the-art applications of computer vision and key information extraction in enterprise products · Llama Award, 2024
2022 — 2023
ML Engineer · Cerebras Systems
Sparse pretraining of GPT-scale models on the Wafer-Scale Engine
2020 — 2022
ML Researcher · Huawei Noah's Ark Lab
Knowledge distillation research; first-author papers at EMNLP and NeurIPS workshops
2018 — 2020
M.A.Sc. & Research Assistant · Queen's University
TAML Lab, advised by Prof. Xiaodan Zhu · NLP, distillation, dialogue systems
2018
Research Intern · iFlytek
Natural language inference research on the SNLI dataset
2014 — 2018
B.Eng. (Information Security) + B.A. (Law) · Nankai University
Dual degree · Exchange at École Centrale de Nantes (France), 2017 — 2018

Selected work

ACL 2022 · DialDoc Workshop Best Paper
Document-grounded Dialogue Research
Best Paper Award at the DialDoc Workshop, ACL 2022.
UAI 2023
Sparse Pretraining of GPT-scale Language Models
Sparse pretraining methods on the Cerebras Wafer-Scale Engine.
EMNLP · First Author
Knowledge Distillation for Language Models
First-author work on model compression and efficient transfer of knowledge in pretrained LMs (Huawei Noah's Ark Lab era).
NeurIPS Workshop · First Author
Efficient Transfer in Pretrained Models
Workshop paper extending the distillation line of work on large pretrained language models.
More on Google Scholar
Includes additional papers at EMNLP, ACL, ICASSP, and UAI.

Recent

2026
Relocating from Toronto to Singapore.
2025
Trained an 18B-parameter LLM independently end-to-end through DeepTensor.
2024
Received the ServiceNow Llama Award for contributions in LLM application research.
2023
Founded DeepTensor Pte. Ltd. in Singapore. Joined ServiceNow Research as Senior Applied Research Scientist.
2022
Best Paper at ACL 2022 DialDoc Workshop.