Tabular data breaks the assumptions that make scaling work for language and vision. There's no natural sequence, no spatial structure, no shared vocabulary across datasets. The architectures and scaling laws that power LLMs don't transfer. We've made the first breakthrough with TabPFN, but the hardest problems are still ahead.
At Prior Labs, Research Scientists drive the core model agenda. You'll define research directions, design novel architectures, and publish work that advances the field, while ensuring your ideas translate into models that actually ship. We build cutting-edge models because the same people do both the research and the shipping. As an early team member, you'll have significant technical ownership and room to grow as we scale.
The problems we're solving:
What we're looking for:
Nice to have: