Ataccama DQ Tech Lead for a software firm in Bangalore, Chennai, Gurgaon, Hyderabad, Jaipur, Mumbai, Pune

Client of Forstaffing
Published: November 21, 2024
Location
Bangalore, Chennai, Gurgaon, Hyderabad, Jaipur, Mumbai, Pune, India
Description

About the Company:

Location: Bangalore, Chennai, Gurgaon, Hyderabad, Jaipur, Mumbai, Pune

Team Strength: 2,500+ people

It is a business transformation accelerator that partners with Fortune 500 companies to deliver fast, meaningful outcomes. As practitioners of end-to-end business and technology transformation, the company delivers measurable results for clients across financial services, payments, retail, automotive, healthcare, manufacturing, and other industries. Founded in 2012, it has offices across the globe and over 2,500 energized employees.

About the Vacancy:
Designation: Ataccama DQ Tech Lead

Experience required: 9-14 years

Reporting To: Manager

Vacancies: 20

Interview Process: 2 technical rounds, 1 engineering leadership round (possibly with a client)

Mode of work: Hybrid (3 days WFO)

Joining time / Notice Period: Immediate to 30 days

 

ABOUT THE ROLE:

Responsibilities:

Work closely with the business owner, product owner, and cross-functional teams to gather technical requirements

Build and maintain reliable and scalable ETL pipelines on big data and/or cloud platforms through the collection, storage, processing, and transformation of large datasets

Collaborate with other teams to design, develop, and deploy data tools that support both operations and product use cases

Work with the team to solve problems in big data technologies and prototype solutions to improve our data processing architecture

Must-Have:

Experience in Ataccama is mandatory

Ensuring proper execution of tasks and alignment with business vision and objectives

Oversee activities of the junior data engineering teams

Experience in building and maintaining reliable and scalable ETL pipelines on big data/cloud platforms through the collection, storage, processing, and transformation of large datasets

Experience working with varied forms of data infrastructure, including relational databases (SQL), Hadoop, and Spark

Proficiency in scripting languages such as PySpark and Python

Experience in AWS cloud & DevOps

Experience in Databricks

Experience in Database design/data modelling

Must have strong experience in data warehouse concepts and a good understanding of Data Lake, Lakehouse concepts, and any new big data ecosystem

Experience in testing and validation to ensure the accuracy of data transformations and data verification

Should be able to independently drive requirements and bring a solutioning mindset when facing business stakeholders

Soft Skills:

Excellent verbal and written communication skills needed

Should be an excellent team player

Good knowledge of Agile principles and experience working in Scrum teams using Jira

Should be comfortable mentoring junior engineers in the team (for Senior Data Engineers)

Should be able to operate in an ambiguous environment with minimal guidance.

 


Apply