Looking for a Data Engineer for a full-time (FTE) position with a Chicago-based company.

On-Site: Chicago, IL
Job Description

The Data Engineer will collaborate with various other IT groups, business partners, and external service providers, and will play a key role in the design, development, and operations of our client's new analytics platform.


Duties and Responsibilities: 

·   Participate in Requirements Gathering: work with key business partner groups (e.g., Product Management) and other Data Engineering personnel to understand department-level data requirements for the analytics platform

·   Design Data Pipelines: work with other Data Engineering personnel on an overall design for flowing data from various internal and external sources into the analytics platform

·   Build Data Pipelines: leverage standard toolset and develop ETL/ELT code to move data from various internal and external sources into the analytics platform

·   Support Data Quality Program: work with Data QA Engineer to identify automated QA checks and associated monitoring & alerting to ensure analytics platform maintains consistently high quality data

·   Support Operations: triage alerts channeled to you and remediate as necessary

·   Technical Documentation: leverage templates provided and create clear, simple and comprehensive documentation for your development

·   Key contributor to defining, implementing, and supporting: Data Services, Data Dictionary, Tool Standards, Best Practices, Data Lineage, User Training



Required Skills and Experience:

·       Strong ETL/ELT designer/developer

·       Strong SQL

·       Strong Python

·       Structured & unstructured data expertise

·       Cloud environment development & operations experience (e.g. AWS, GCP)

·       Preference for candidates experienced with Google Cloud Platform (GCP) and associated services, e.g. BigQuery, GCS, Cloud Composer, Dataproc, Dataflow, Dataprep, Cloud Pub/Sub, Metadata DB, Data Studio, Datalab, and others

·       Other important tools: Apache Airflow (scheduling), Bitbucket and Git (version control), Stackdriver (ops monitoring), Opsgenie (alert notification), Docker

·       Real-time data replication/streaming tools

·       Data Modeling

·       Excellent verbal and written communication skills

·       Strong team player


Success Criteria:

·       Strong analytical thinking and problem solving skills

·       Superior communication and business-technical interaction skills

·       Positive, “get it done” attitude

·       Ability to multi-task and manage multiple activities with varying timelines



Education and Experience:

·       Bachelor’s degree in computer science, management information systems, or a related discipline

·       5+ years of hands-on ETL/ELT design/development experience

·       Key resource on team(s) that have delivered successful enterprise-level analytics platforms


