Job Details

Job Information

Data & Analytics Engineer, AiDP
AWM-3111-Data & Analytics Engineer, AiDP
5/2/2026
5/7/2026
Negotiable
Permanent

Other Information

www.apple.com
Austin, TX, 78703, USA

Job Description


Weekly Hours: 40

Role Number: 200660032-0240

Summary

Imagine what you could do here. At Apple, new ideas have a way of becoming extraordinary products, services, and customer experiences very quickly. Bring passion and dedication to your work, and there’s no telling what you could accomplish.

AI & Data Platforms (AiDP) is IS&T's engine for AI-powered innovation. The team brings together data, application development, and machine learning — including generative AI — along with data services and customer success functions, to help IS&T build solutions more efficiently and streamline the adoption and embedding of generative AI across Apple.

Description

The Developer Experience Platform team is building the next generation of AI-powered tools that accelerate how applications are developed across Apple. We are looking for a Data & Analytics Engineer to help design, build, and scale the data foundation that powers this platform.
In this role, you will develop robust data pipelines and analytics systems that enable AI agents, autonomous workflows, and data-driven insights—directly impacting how software is built at scale.

Minimum Qualifications

  • 3+ years of hands-on experience in data engineering, analytics engineering, or a related role in a production environment

  • Proficiency in Python and SQL, including pipeline development, automation, and performance optimization

  • Hands-on experience with cloud data warehouses (e.g., Snowflake, BigQuery, or Databricks)

  • Experience implementing monitoring, logging, and observability for data pipelines

  • Experience with data modeling

  • B.S. in Computer Science, a related field, or equivalent industry experience

Preferred Qualifications

  • Experience building AI/LLM-powered data pipelines, including RAG systems and integrations with APIs such as OpenAI or Anthropic

  • Experience with real-time/streaming data systems such as Apache Kafka, Flink, or Spark Structured Streaming

  • Experience with workflow orchestration tools such as Airflow, Prefect, or Dagster

  • Knowledge of MLOps workflows, including feature engineering, model deployment, and monitoring (e.g., MLflow, Vertex AI)

  • Experience with data quality, governance, and lineage tools (e.g., Great Expectations, Monte Carlo)

  • Experience building and maintaining ELT pipelines using dbt

  • Experience building dashboards and analytics using tools like Tableau, Looker, or Power BI

  • Working knowledge of cloud platforms (AWS, GCP, or Azure) and associated data services (e.g., S3, Glue, Dataflow)

