The principal economist and his team address unique challenges using techniques at the intersection of microeconomics, statistics, and machine learning.
Tokenizing time series data and treating it like a language enables a model whose zero-shot performance matches or exceeds that of purpose-built models.
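The tokenization step can be illustrated with a minimal sketch: scale a real-valued series by its mean absolute value, then quantize the scaled values into a fixed vocabulary of discrete bins, yielding token IDs a language model can consume. The scaling scheme, bin range, and vocabulary size here are illustrative assumptions, not the specific recipe used by the model described above.

```python
import numpy as np

def tokenize_series(series, num_bins=64, low=-3.0, high=3.0):
    """Map a real-valued series to discrete token IDs.

    Illustrative scheme: mean-absolute scaling followed by uniform
    binning over [low, high]. Parameters are assumptions for the sketch.
    """
    x = np.asarray(series, dtype=float)
    scale = np.mean(np.abs(x)) or 1.0  # avoid division by zero
    scaled = x / scale
    edges = np.linspace(low, high, num_bins + 1)      # bin boundaries
    tokens = np.digitize(scaled, edges[1:-1])         # IDs in [0, num_bins - 1]
    return tokens, scale

def detokenize(tokens, scale, num_bins=64, low=-3.0, high=3.0):
    """Approximately invert tokenization: map each token to its bin center."""
    edges = np.linspace(low, high, num_bins + 1)
    centers = (edges[:-1] + edges[1:]) / 2
    return centers[tokens] * scale
```

Once a series is in token form, standard language-model machinery (embedding lookup, next-token prediction) applies unchanged; detokenizing the predicted IDs recovers approximate real values.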