Project Summary
Immediate requirement for a current project; we need to get started ASAP
40 hrs/week for a 2-month period; will likely extend to 40 hrs/week ongoing
Open to contract or contract-to-hire
Responsibilities
Data Infrastructure and Data Modeling - Help design and implement end-to-end data platforms, assist backend teams with modeling their upstream transactional data, and perform physical data modeling for modern data warehouses
Data Pipelines - Build end-to-end ETL/ELT pipelines to populate the Data Warehouse and empower downstream analytical data consumers
Qualifications
BS degree in Computer Science or related technical field, or equivalent practical experience.
5+ years of proven work experience as a Data Engineer, working with at least one programming language (Python, Scala, or Go) plus SQL expertise.
5+ years of experience developing data platform infrastructure and ETL/ELT design and implementation, including development best practices in testing, logging, and monitoring.
Extensive experience implementing CI/CD practices and infrastructure as code (Terraform) as a foundation for developing data platforms from the ground up on cloud ecosystems.
Background working with distributed big data technologies such as Spark, Presto, and Hive, as well as data warehouse technologies including Snowflake, BigQuery, or Redshift.
Experience implementing microservice architectures, event-based processing, and streaming pipelines, including expertise with Kafka and Spark Streaming.
Extensive data modeling knowledge spanning NoSQL, transactional databases, and columnar and row-based distributed analytical data environments, as well as prior experience designing and implementing best-practice Data Lakes.
Knowledge of agile software development and continuous integration/deployment principles.
Benefits
Date Posted: 12/07/2024
Job ID: 84424757