🧠 Data Engineer
📍 Location: Ridgefield, Connecticut (Hybrid – 2–3 days onsite per week)
💼 Openings: 2
🏢 Industry: Information Technology / Life Sciences
🎓 Education: Bachelor’s degree in Computer Science, MIS, or related field (Master’s preferred)
🚫 Visa Sponsorship: Not available
🚚 Relocation: Available for the ideal candidate
💰 Compensation: $140,000 – $185,000 base salary + full benefits
🕓 Employment Type: Full-Time | Permanent
🌟 The Opportunity
Step into the future with a global leader in healthcare innovation, where data and AI drive transformation and impact millions of lives.
As part of the Enterprise Data, AI & Platforms (EDP) team, you’ll join a high-performing group that’s building scalable, cloud-based data ecosystems and shaping the company’s data-driven future.
This role is ideal for a hands-on Data Engineer who thrives on designing, optimizing, and maintaining robust cloud data pipelines while collaborating closely with architects, data scientists, and business stakeholders across the enterprise.
🧭 Key Responsibilities
Design, develop, and maintain scalable ETL/ELT data pipelines and integration frameworks to enable advanced analytics and AI use cases.
Collaborate with data architects, modelers, and data scientists to evolve the company’s cloud-based data architecture strategy (data lakes, warehouses, streaming analytics).
Optimize and manage data storage solutions (e.g., S3, Snowflake, Redshift), ensuring data quality, integrity, and security.
Implement data validation, monitoring, and troubleshooting processes to ensure high system reliability.
Work cross-functionally with IT and business teams to understand data requirements and translate them into scalable solutions.
Document architecture, workflows, and best practices to support transparency and continuous improvement.
Stay current with emerging data engineering technologies, tools, and methodologies, contributing to innovation across the organization.
🧠 Core Requirements
Technical Skills
✅ Hands-on experience with AWS data services such as Glue, Lambda, Athena, Step Functions, and Lake Formation.
✅ Strong proficiency in Python and SQL for data manipulation and pipeline development.
✅ Experience in data warehousing and modeling (dimensional modeling, Kimball methodology).
✅ Familiarity with DevOps and CI/CD practices for data solutions.
✅ Experience integrating data between applications, data warehouses, and data lakes.
✅ Understanding of data governance, metadata management, and data quality principles.
Cloud & Platform Experience
Expertise in AWS, Azure, or Google Cloud Platform (GCP) – AWS preferred.
Knowledge of ETL/ELT tools such as Apache Airflow, dbt, Azure Data Factory, or AWS Glue.
Experience with Snowflake, PostgreSQL, MongoDB, or other modern database systems.
Education & Experience
🎓 Bachelor’s degree in Computer Science, MIS, or related field
💼 5–7 years of professional experience in data engineering or data platform development
⭐ AWS Solutions Architect certification is a plus
🚀 Preferred Skills & Attributes
Deep knowledge of big data technologies such as Spark, Hadoop, or Flink.
Proven experience troubleshooting and optimizing complex data pipelines.
Strong problem-solving skills and analytical mindset.
Excellent communication skills for collaboration across technical and non-technical teams.
Passion for continuous learning and data innovation.
💰 Compensation & Benefits
💵 Base Salary: $140,000 – $185,000 (commensurate with experience)
🎯 Bonus: Role-based variable incentive
💎 Benefits Include:
Comprehensive health, dental, and vision coverage
Paid vacation and holidays
401(k) retirement plan
Wellness and family support programs
Flexible hybrid work environment
🧩 Candidate Snapshot
Experience: 5–7 years in data engineering or related field
Key Skills: AWS Glue | Python | SQL | ETL | CI/CD | Snowflake | Data Modeling | Cloud Architecture
Seniority Level: Mid–Senior
Work Arrangement: 2–3 days onsite in Ridgefield, CT
Travel: Occasional
🚀 Ready to power the future of data-driven healthcare?
Join a global data and AI team committed to harnessing the power of cloud and analytics to drive discovery, innovation, and meaningful impact worldwide.
