## Job Description
### About Mitre Media
Mitre Media is redefining FinTech with AI-driven tools that empower millions of investors. Our portfolio, including Dividend.com and MutualFunds.com, leverages LLMs to deliver novel data insights and visually rich user experiences. For over a decade, we’ve served individual investors, financial advisors, and top asset managers like BlackRock and Vanguard through our premium data, tools, and advertising solutions.
### About the Role
As Tech Lead Data Engineer, you’ll architect and maintain the data backbone powering every feature across our product suite. Reporting to the CTO, you will design Databricks-based ETL pipelines, model complex investment data, and surface low-latency, high-quality datasets for both user-facing features and internal AI/analytics workloads. You’ll work in a remote-first culture that values in-person “bursts” of collaboration, follow Shape Up for project planning, and ship pragmatic solutions that favor de-scoping over delays.
### Responsibilities
- Design, implement, and optimize large-scale ETL workflows in Databricks (Apache Spark, Delta Lake, DBT).
- Develop algorithms that transform raw market data into actionable insights.
- Own data quality and lineage, instituting tests, monitoring, and alerting for mission-critical pipelines.
- Evolve our cloud data platform (AWS & GCP) for scale, performance, and cost efficiency.
- Mentor engineers, championing best practices in code reviews, documentation, and DevOps for data.
### Required Technical Skills
- **Programming**: Expert-level Python, plus working knowledge of Scala or Java.
- **Databricks & Spark**: Hands-on with cluster tuning, job orchestration, and Delta Tables.
- **SQL & DBT**: Strong analytical SQL, modular data-model design, and CI/CD for transformations.
- **Cloud**: Production experience on AWS or GCP data services (e.g., S3/GCS, EMR/Dataproc, Glue/Dataflow).
- **ETL & Orchestration**: Solid grasp of ETL/ELT patterns, workflow scheduling, and incremental processing.
- **Data Modeling**: Dimensional and schema-on-read designs for analytics and AI.
- **AI & Analytics Enablement**: Building feature stores or inference-ready tables for ML/LLM workflows.
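For candidates unfamiliar with the term, the incremental-processing pattern above boils down to upsert (merge) semantics: each batch updates existing rows by key and inserts new ones. A minimal sketch in plain Python, with hypothetical names and data; in a real Databricks pipeline this would be a Delta Lake `MERGE INTO` on a business key rather than in-memory dicts:

```python
# Illustrative only: upsert/merge semantics behind an incremental load,
# modeled with plain Python dicts instead of Delta tables.

def merge_upsert(target, updates, key="ticker"):
    """Apply an incremental batch of rows to a target table keyed on `key`."""
    merged = {row[key]: row for row in target}  # index existing state by key
    for row in updates:
        # update the matching row if the key exists, otherwise insert it
        merged[row[key]] = {**merged.get(row[key], {}), **row}
    return list(merged.values())

# Hypothetical sample data
target = [{"ticker": "AAPL", "yield": 0.5}, {"ticker": "MSFT", "yield": 0.8}]
updates = [{"ticker": "AAPL", "yield": 0.6}, {"ticker": "VTI", "yield": 1.3}]

result = merge_upsert(target, updates)
# AAPL is updated, MSFT is untouched, VTI is inserted
```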
### Nice-to-Have Technical Skills
- Apache Airflow, Luigi, or Dagster for DAG orchestration.
- Experience with Looker, Tableau, or similar BI/visualization stacks.
- Financial-markets domain knowledge (equities, ETFs, mutual funds).
- Machine-learning engineering or statistical-analysis background.
### Required Soft Skills
- Entrepreneurial mindset with a passion for AI and FinTech innovation.
- Self-starter who diagnoses problems, proposes trade-offs, and delivers.
- Clear communicator who thrives in distributed, cross-functional teams.
- Mentorship attitude—uplifts peers through code reviews and knowledge-sharing.
- Detail-oriented and driven.