Direct Hire

Analytics / Data Engineer

Remote

We are seeking analytics/data engineer candidates for several well-known companies and marquee clients. All backgrounds and experience levels are considered, but you must be an A-player.

Summary

As an analytics / data engineer, your primary responsibility will be to help create and maintain a data infrastructure that enables the wider team to deliver timely, accurate and meaningful reports, analyses and insights that stakeholders can use to measure performance and make better business decisions.

Duties include supporting the design, development, testing, maintenance and optimization of data pipelines and data marts; updating data from multiple sources; batch and real-time processing of data and preparing it for consumption; and helping to ensure high data quality standards are met through good documentation and governance.

This role offers a unique opportunity to gain knowledge and experience building cloud-based solutions with cutting-edge technology on data science and engineering projects that are strongly supported by senior leadership.

Responsibilities

  • Partner with data science, engineering, marketing and product teams on schema design, event structure and data architecture to enable scalability.
  • Design and build batch and real-time data pipelines for ingestion and consumption, using Python, SQL and, where needed, Spark, between disparate data sources and AWS, Azure or GCP data warehouses.
  • Develop, test and automate ETL processes to incorporate new data into the data platform.
  • Work with cross-functional teams to ensure the data they need is structured so that it is easy to access and understand.

About You

  • You are naturally curious.
  • You are a self-starter and detail-oriented.
  • You care about and take pride in the quality of your work.
  • You are serious about following up and following through.
  • You quickly absorb new information and love to learn new skills.
  • You work well independently and can handle some level of ambiguity.

Requirements

  • 3+ years of experience with a cloud data platform (e.g., GCP, AWS, Azure)
  • 2+ years of experience with modern data-engineering and DevOps tools (e.g., dbt, GitHub, Fivetran, Airflow, Prefect)
  • 1+ years of experience with cloud data warehouses (e.g., Snowflake, Redshift, Databricks)
  • Fluent in SQL and Python (including Jupyter notebooks, pandas and Spark)
  • Familiarity with the GCP toolset (BigQuery, Cloud Composer, Dataflow, Pub/Sub)

Benefits

  • Work remotely
  • Medical, dental and vision
  • Bonus + equity potential
  • 401(k) with matching
  • Flex time off
  • Paid company holidays
  • Paid parental leave