Who You'll Work With
Slalom Emerge is a team of trailblazers helping ensure we achieve our strategic goals of investing for the future and pursuing innovation as a competitive advantage. We empower our local markets by identifying emerging capabilities, building multi-disciplinary teams, and providing access to niche and hyper-specialized expertise.
What You'll Do
• Lead initiatives to design and develop data architecture, data models, and standards for a variety of data warehouse and/or data integration projects, with a specific focus on Snowflake
• Develop and maintain documentation of the data architecture, data flow, data models, data dictionaries, and source to target mapping
• Design and develop scalable, reusable data ingestion frameworks to transform a variety of datasets
• Apply best practices for Snowflake architecture, ELT, and data models
• Serve as a subject matter expert in cloud data architecture for the larger Slalom practice and contribute back to the community
• Deliver on the technical scope of projects and demonstrate thought leadership both at clients and internally at Slalom
• Research, analyze, recommend, and select technical approaches for solving exciting and complex development and integration problems
• Develop multi-phased cloud data implementation roadmaps
• Travel up to 30-40% (regional)
What You'll Bring
• A passion for all things data: understanding how to work with it at scale and, more importantly, knowing how to get the most out of it
• Deep data engineering and/or data warehousing experience
• Strong understanding of native Snowflake capabilities such as data ingestion, data sharing, zero-copy cloning, tasks, and Snowpipe
• Demonstrated experience architecting, designing, and overseeing the implementation of a Snowflake Data Warehouse on AWS, GCP, or Azure Cloud
• Deep experience building cloud data solutions with Azure, AWS, GCP, or Snowflake, and migrating from on-premises to the cloud
• Experience leading, managing, and delivering complex cloud-based architecture engagements end-to-end
• Expertise in data modeling, with a good understanding of modeling approaches like Star schema and/or Data Vault
• Proficiency in a relevant programming or scripting language for cloud platforms, e.g., Python, Java, C#, Unix shell, or SQL
• Certifications are a plus; we love people who love to learn