Job Details

Data Solutions Architect

2025-06-04     Grocery Outlet     Emeryville, CA
Description:

Our Mission:

Touching lives for the better

Our Vision:

Touching lives by being the first choice for bargain-minded consumers in the U.S.

About the Team:

Our IT team's mission is to push the boundaries of technology, going above and beyond to support stores and customers and deliver timely solutions that benefit everyone at Grocery Outlet. The team is made up of problem solvers and go-getters who are service-oriented and dedicated to solving important problems.

About the Role:

The Data Solutions Architect is responsible for designing, developing, and supporting the Data Platform, primarily across the Sales, Inventory, Planning, and Ecommerce data domains. You will ensure the highest standards of data integrity, governance, and service for internal and external customers. The role involves building highly scalable data pipelines and data marts with Python, Spark, and SQL; strengthening data governance; and increasing the value of data through user-friendly processes and system tools, all while simplifying the data architecture.

Responsibilities Include:

  1. Collaborate closely with product managers and other engineers to understand business priorities and workflows, and architect data solutions.
  2. Model enterprise business lines with robust documentation, target gold tables, and pipeline flows, going beyond bespoke tables built for one-off use cases.
  3. Develop, deploy, and manage scalable pipelines on Databricks, and build data models/data marts.
  4. Investigate and leverage Databricks' capabilities for real-time data processing and streaming, using Spark Streaming, Delta Live Tables, etc.
  5. Maintain high code quality with data observability, metadata standards, and best practices.
  6. Build data pipelines using Python, PySpark, and SQL.
  7. Partner with data science and reporting teams to translate data requirements into models.
  8. Mentor junior data engineers as the team grows.
  9. Share knowledge through brown bags and tech talks, and evangelize best practices.
  10. Communicate architecture, gold tables, execution plans, releases, and training internally.
  11. Build data warehouses and lakehouses in cloud environments.
  12. Implement large-scale solutions involving ETL, data ingestion, BI, reports, dashboards, and analytics.
  13. Define and implement data management frameworks like data quality, metadata, master data, governance, and security using tools like Azure Purview and Databricks Unity Catalog.
  14. Develop on Databricks in GCP, AWS, or Azure, utilizing Lakehouse architecture and tools such as Delta Lake, Auto Loader, and Delta Live Tables.
  15. Understand on-premises and cloud workloads and suggest optimal solutions.
  16. Work with stakeholders to establish architecture, design, and implementation artifacts.
  17. Effectively communicate and present architecture and design to IT leaders and stakeholders.

About the Pay:

  • Base Salary Range: $125,000 - $145,000 annually
  • Annual Bonus Program
  • Equity
  • 401(k) Profit Sharing
  • Final compensation depends on experience, skills, and location.

About You:

  • Bachelor's or master's degree in computer science
  • 7+ years of industry experience in Data Engineering
  • 2+ years of experience at the data architect level, leading data modeling
  • 5+ years of experience in SQL, Spark, or other programming languages
  • 2+ years of experience with Databricks
  • Nice to have: familiarity with real-time ML systems in Databricks
  • Excellent communication skills for effective stakeholder engagement

To learn how we collect, use, and secure your personal information, please see our privacy policy.


