He Automates Marketing Data Pipelines at Scale

Raj Jain shares how he built Amazon's Email Prioritization Service and LLM-powered translation pipelines at Tarro, plus the future of AI-native CRMs and GTM stacks.


Show Notes

In this inaugural episode, Raj Jain — a GTM systems engineer who has worked at Amazon, WeWork, MongoDB, Dynamic Yield, and Tarro — shares how he approaches building scalable go-to-market systems. Raj discusses his time on Amazon's central email marketing team, where he helped manage communication limits and email prioritization at massive scale, even before LLMs existed. He explains how those foundational processes laid the groundwork for AI adoption. Raj then dives into his work at Tarro, where he built LLM-powered translation pipelines to serve non-English-speaking restaurant owners, and discusses how AI-native CRMs are emerging as an alternative to legacy platforms like Salesforce and HubSpot. The conversation also covers the importance of human-in-the-loop workflows, behavioral psychology in user adoption, intent signals, and what the future of AI-native GTM stacks looks like.

Key Takeaways

  • Amazon's Email Prioritization Service reduced unsubscribes at scale
  • Human-in-the-loop is critical for AI system adoption
  • LLM pipelines can augment tools like Gong for multilingual sales intelligence
  • AI-native CRMs like Attio and Clay are reshaping the GTM stack
  • Enterprise process rigor and startup speed each come with trade-offs

Guest

Raj Jain, GTM Systems Engineer (ex-Amazon, MongoDB, WeWork, Dynamic Yield, Tarro)

LinkedIn profile
