Job Summary:
We are seeking an experienced Snowflake + Matillion ETL Developer / Data Engineer to design, develop, and optimize cloud-based data pipelines and analytics solutions. The ideal candidate will have deep expertise in Snowflake Data Warehouse, Matillion ETL, AWS data services, and data modeling (Facts, Dimensions, SCD Type 2). This role involves building reliable, scalable, and high-performance data integration solutions that power business insights and reporting.
Key Responsibilities:
Design, develop, and deploy ETL/ELT pipelines using Matillion ETL and Snowflake for data ingestion, transformation, and analytics.
Build and maintain Matillion architecture including orchestration jobs, transformation workflows, and variable management.
Create and manage Snowflake objects: databases, schemas, tables, views, streams, tasks, and stored procedures.
Implement data validation, cleaning, and transformation to ensure data accuracy, completeness, and consistency.
Design and maintain data warehouse models (Star/Snowflake schema), including Slowly Changing Dimensions (SCD Type 2).
Integrate data from multiple sources such as AWS S3, RDS, Redshift, and APIs using Matillion and Python connectors.
Optimize SQL queries, transformations, and Snowflake workloads for performance and cost efficiency.
Collaborate with data architects, BI developers, and DevOps teams to implement CI/CD pipelines (Git, Jenkins, GitHub Actions).
Monitor ETL processes, troubleshoot issues, and ensure on-time delivery of reliable data pipelines.
Apply data governance, quality, and security best practices across all environments.
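To illustrate the SCD Type 2 handling named above: in production this would typically run as a Snowflake MERGE orchestrated by a Matillion job, but the core versioning logic can be sketched in plain Python. All names here (`apply_scd2`, `cust_id`, the column names) are hypothetical, chosen for illustration only.

```python
from datetime import date

def apply_scd2(dim_rows, incoming, business_key, tracked_cols, load_date):
    """Sketch of SCD Type 2: expire changed rows, append new versions.

    Each dimension row carries effective_from / effective_to dates and
    an is_current flag, mirroring a typical Type 2 dimension table.
    """
    result = list(dim_rows)
    # Index the current version of each business key.
    current = {r[business_key]: r for r in result if r["is_current"]}
    for rec in incoming:
        old = current.get(rec[business_key])
        changed = old is None or any(old[c] != rec[c] for c in tracked_cols)
        if not changed:
            continue  # no tracked attribute changed; keep existing row
        if old is not None:
            old["is_current"] = False       # close out the old version
            old["effective_to"] = load_date
        new_row = dict(rec)                 # open a new current version
        new_row.update(is_current=True,
                       effective_from=load_date,
                       effective_to=None)
        result.append(new_row)
    return result

# Example: a customer moves city, producing a second versioned row.
dim = [{"cust_id": 1, "city": "Austin", "is_current": True,
        "effective_from": date(2023, 1, 1), "effective_to": None}]
dim = apply_scd2(dim, [{"cust_id": 1, "city": "Dallas"}],
                 "cust_id", ["city"], date(2024, 1, 1))
```

After the call, the Austin row is expired (`effective_to` set, `is_current` false) and a new Dallas row becomes current, preserving full change history.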
Required Skills & Experience:
8-10 years of overall IT experience, with 4+ years in Snowflake data engineering and 3+ years in Matillion ETL.
Strong proficiency in SQL and Snowflake scripting (Streams, Tasks, Procedures, Performance Tuning).
Expertise in Matillion ETL job design, orchestration, and optimization.
In-depth understanding of data modeling concepts: Facts, Dimensions, Hierarchies, and SCD Type 2.
Experience with AWS data services such as S3, Glue, Lambda, and Redshift.
Proficiency in Python for data transformations and automation.
Hands-on experience with CI/CD pipelines and Git-based version control systems.
Strong analytical, problem-solving, and communication skills.
Preferred Qualifications:
SnowPro Core Certification or Matillion Associate Certification.
Experience with BI tools (Tableau, Power BI, Looker) for data validation and visualization.
Familiarity with Airflow or AWS Step Functions for orchestration.
Understanding of data lake architecture, real-time streaming, and data governance frameworks.