Job Details

Staff Data Science - Trust and Safety

  2025-10-02     Databricks     San Francisco, CA  
Description:

RDQ426R282

Databricks is building the world's best and most secure platform for data and AI. We innovate and deploy industry-leading solutions in security, compliance, and governance.

As a member of the Trust and Safety Data Science team, you will work on projects critical to ensuring the security and compliance of the Databricks Platform. Our customers depend on Databricks to keep their data safe, all while orchestrating millions of virtual machines across three clouds in dozens of regions around the globe.

Our engineering teams build highly technical products that fulfill real, important needs in the world. We always push the boundaries of data and AI technology, while simultaneously operating with the security and scale that is critical to making customers successful on our platform. We serve many companies with varying security and compliance needs. To efficiently serve these markets, we need to understand how customers use our existing features. This requires data-driven analysis of all aspects of security programs at Databricks.

Customers trust Databricks with their most valuable data, and our mission is to build the most trusted data analytics and ML platform in the world. We're looking to expand our Trust and Safety Data Science team. You will join a group of "full stack" data scientists who partner with engineering and security teams on strategic plans that keep Databricks secure and safe for our customers. The team uses state-of-the-art statistical and machine learning techniques for fraud and abuse detection on our platforms. You can read more about some of our efforts in this blog post. Fraud and abuse detection is dynamic and essential work, offering an opportunity to make a substantial impact on the security and efficiency of business operations.

The impact you will have:

* You will develop and implement Machine Learning models to detect anomalous activity in products that we offer.

* You will analyze the performance and pricing of security-related features and work with product and engineering teams to identify important opportunities.

* You will collaborate with security engineers, trust and safety experts, and machine learning engineers to build a variety of systems and tools that protect Databricks and our customers from threats.

* You will create solutions and frameworks to meet compliance requirements at Databricks.

* You will gather requirements, define project OKRs and milestones, and communicate progress to both technical and non-technical audiences.

* You will guide junior data scientists and interns on the team by helping with project planning, technical decisions, and code and document review.

* You will represent the data science discipline throughout the organization, advocating for more data-driven decision-making.

* You will represent Databricks at academic and industrial conferences and events.

What we look for:

* 7+ years of data science, machine learning, and advanced analytics experience in high-velocity, high-growth companies

* Understanding of good software engineering practices around testing, code reviews, and deployment.

* Experience working in highly cross-functional settings and communicating results to non-technical partners.

* Experience deploying data science / ML solutions in production to deliver results.

* Coding skills in SQL and a software development language (preferably Python)

* Experience with distributed data processing systems like Spark and familiarity with software engineering principles.

* Prior experience applying machine learning and data analytics to identify SaaS product misuse and strengthen compliance is preferred but not required.

* Master's degree or higher in a quantitative field, or equivalent industry experience


Apply for this Job

Please use the APPLY HERE link below to view additional details and application instructions.

Apply Here
