Data Engineer – Snowflake

Full Time

Rotational Shift

Job Location: Nagpur / Pune
No. of Positions: 3
Experience Required: 5 to 7 years

Job Purpose:

  • We are looking for a savvy Data Engineer to join our growing team of analytics experts. The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts and data scientists on data initiatives, and will ensure that an optimal data delivery architecture is consistent across ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company's data architecture to support our next generation of products and data initiatives.

Snowflake Data Engineer

  • Must Have: SQL, Python, Snowflake, Data Modelling, ETL, Snowpark
  • Good to Have: DBT (Data Build Tool), API Integration (AWS Lambda), Git, AWS S3 Integration

Knowledge, Skills and Experience:

  • Proficiency in crafting and optimizing complex SQL queries and stored procedures for data transformation, aggregation, and analysis within the Snowflake platform.
  • Experience with Snowflake cloud data warehousing service, including data loading, querying, and administration.
  • In-depth understanding of ETL processes and methodologies, leveraging Snowflake's capabilities.
  • Familiarity with DBT (Data Build Tool) for data transformation and modeling within Snowflake.
  • Expertise in integrating Snowflake with AWS S3 storage for data storage and retrieval.
  • Proficiency in Snowpark, Snowflake's framework for data processing with languages such as Python, Java, and Scala.
  • Skill in API integration, specifically integrating Snowflake with AWS Lambda for data workflows.
  • Adeptness in version control using Git and GitHub for collaborative code management.
  • Adeptness in troubleshooting data-related issues within the Snowflake ecosystem, ensuring data quality and consistency.
  • Skill in creating clear and concise technical documentation, facilitating communication and knowledge sharing.
  • Designing efficient and well-structured data schemas within Snowflake.
  • Utilizing Snowflake's features for data warehousing, scalability, and performance optimization.
  • Leveraging Python programming for data processing, manipulation, and analysis in a Snowflake environment.
  • Implementing data integration and transformation workflows using DBT.
  • Writing and maintaining scripts for data movement and processing, integrating with cloud services such as AWS S3 and Lambda.
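The transformation and data-quality work listed above can be illustrated with a minimal, self-contained Python sketch. In practice this logic would run inside a Snowpark DataFrame pipeline or a DBT model against Snowflake; the `Order` schema and `dedupe_and_total` function here are hypothetical examples, not part of any role requirement:

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass(frozen=True)
class Order:
    order_id: str
    customer: str
    amount: float

def dedupe_and_total(rows):
    """Drop duplicate order_ids (keeping the first seen) and
    aggregate order amounts per customer."""
    seen = set()
    totals = defaultdict(float)
    for row in rows:
        if row.order_id in seen:
            continue  # data-quality step: skip duplicate records
        seen.add(row.order_id)
        totals[row.customer] += row.amount
    return dict(totals)

rows = [
    Order("o1", "acme", 100.0),
    Order("o2", "acme", 50.0),
    Order("o2", "acme", 50.0),   # duplicate, ignored
    Order("o3", "globex", 75.0),
]
print(dedupe_and_total(rows))  # {'acme': 150.0, 'globex': 75.0}
```

In a Snowflake environment the same deduplicate-then-aggregate pattern would typically be expressed as a SQL `QUALIFY ROW_NUMBER()` filter followed by `GROUP BY`, or as the equivalent Snowpark DataFrame operations.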

We are on the lookout for dynamic individuals who bring energy and passion to their work, just like us. As
an innovation-driven organization, we offer high-impact careers and growth opportunities across global
locations. Our collaborative work environment at NICE is designed to help you thrive, learn, and grow through
targeted learning and development programs as well as generous benefits and perks.