Associate Data Movement Engineer (Snowflake) - Contractor
New Haven, CT, US, 06511
Feel Good About Doing Good
The Knights of Columbus is a tax-exempt Catholic fraternal benefit society that provides financial security to members and their families through our life insurance, long-term care insurance, disability income insurance, and investment and annuity products. Charity is at the core of our mission: our profits are donated to help those in need and to support our faith - $1.73B over the past ten years.
While we have many employees who are not Catholic, we follow the Church’s teachings in our investment strategies and our employee benefits. As part of our religious mission, we support the pro-life cause by contributing to the March for Life and pregnancy resource centers, we oppose assisted suicide and euthanasia, we are evangelists for the Catholic faith, and we help Christians who are facing religious persecution in the Middle East. We all work together to support our two million members as they volunteer to help others in their parishes and communities around the world.
Share Your Talent. Live Your Purpose.
We are a growing and purpose-driven community of professionals. Join us to discover how you can meet your goals and ours!
#LI-Hybrid
Overview
The Knights of Columbus is embarking on the modernization of its core data platforms. We are currently seeking an Associate Data Movement Engineer to lead the design, development, and optimization of our cloud-based data warehouse. This role involves building scalable data models, managing ETL/ELT workflows, and ensuring the delivery of high-quality, secure, and efficient data. The ideal candidate will have strong hands-on experience with Snowflake, Data Vault modeling, and modern orchestration frameworks, along with a passion for data architecture and governance.
Core Responsibilities
- Complete the full life cycle of ETL/ELT development to address business needs or resolve issues, including design, mappings, data transformations, scheduling, and testing.
- Translate data movement requirements into technical designs.
- Develop data extraction and transmission processes for external vendors.
- Develop test plans and perform unit testing.
- Create supporting documentation for new processes.
- Work closely with data analysts to gain an understanding of business processes and corporate data.
- Determine impacts to data warehouse structures and information flows due to changes in business or technical requirements.
- Contribute to architectural decisions to support business processes.
- Provide production support for data solutions.
- Complete root cause analysis and contribute to remediation planning and implementation.
- Perform data quality analysis, report data quality issues and propose solutions for data quality management.
- Learn and expand upon internal controls and participate in customer support.
- Prepare effort estimates, including researching and estimating the costs of software development and unit testing. May provide estimates for vendor package upgrades and integration with existing systems.
- Monitor and troubleshoot data warehouse jobs and Snowflake pipelines in production.
- Investigate and resolve data quality issues, performance bottlenecks, and system failures.
- Perform root cause analysis and implement long-term solutions to recurring problems.
- Design and maintain scalable data models using Data Vault 2.0 methodology.
- Ensure alignment of data models with business requirements and enterprise architecture standards.
- Collaborate with data architects and business analysts to evolve the data warehouse design.
- Develop and optimize SQL scripts, stored procedures, and data pipelines in Snowflake.
- Implement performance tuning strategies for queries and warehouse resources.
- Manage Snowflake roles, permissions, and data security policies.
- Implement data validation, reconciliation, and lineage tracking mechanisms.
- Support data governance initiatives including metadata management and audit compliance.
- Ensure consistent and reliable data delivery to downstream systems and reporting platforms.
- Manage and enhance job scheduling and orchestration using Apache Airflow.
- Automate routine support tasks and improve monitoring capabilities.
- Work closely with business users, data engineers, and application teams to resolve issues and deliver enhancements.
- Maintain detailed documentation of data models, workflows, and production support procedures.
- Contribute to knowledge base and support documentation for recurring issues.
- On-call and/or after-hours work required.
- Other related duties as directed.
Skills and Qualifications
- Strong hands-on experience with Snowflake and Data Vault modeling; Snowflake certification preferred.
- Proficiency in SQL and data architecture.
- Proficiency in scripting languages (e.g., Python, Shell).
- Familiarity with ETL/ELT tools and orchestration frameworks (e.g., Airflow, Talend).
- Experience with incident tracking systems (e.g., ServiceNow, Jira) and CI/CD practices.
- Some experience with cloud-based database technologies.
- Working knowledge of data warehousing concepts, structures, and ETL best practices.
- Ability to solve problems using analytical thinking skills.
- Must work well independently and be inquisitive.
- Strong organizational and time management skills.
- Strong verbal and written communication skills.
- Strong project management skills to ensure timely delivery.
Education
- Bachelor’s degree in Computer Science, Information Systems, or related field
- 1+ years of experience in data warehousing, ETL development, and production support