Description
Must-Haves for this role:
- Minimum 1 year of Microsoft Fabric experience required
- Mandatory skills: experience building and supporting an EDW using Microsoft technologies (ADF, Synapse, Power BI, Microsoft Fabric), including monitoring and optimizing pipelines, data quality, and CI/CD processes within Fabric
- Programming: Python/Spark
Nice to Haves:
- Microsoft Fabric certification
About the company:
Head office: Santa Clara, California
Job locations in India: Bangalore, Hyderabad, Kolkata
Team strength: 3,000+ people
Tavant is a digital products and platforms company that delivers impactful results to its customers across North America, Europe, and Asia-Pacific. Founded in 2000, the company employs over 3,000 people and is a recognized top employer. Tavant is creating an AI-powered intelligent enterprise by reimagining customer experiences, driving operational efficiencies, and improving collaboration. A leader in the IDC MarketScape for warranty and service contract management, Tavant Warranty is a next-generation warranty management software solution. It uses AI and machine learning to transform the warranty and service process, constantly adapting through an analytics-first decision-making engine. The solution enables manufacturers to make intelligent warranty decisions, allowing businesses to concentrate on more complex, higher-value problem statements.
About the position:
Designation: Sr. Data Engineer – TL
Experience required: 5–8 years
Reporting to: Manager
Vacancies: 12
Interview process: 2 rounds of technical interviews + 1 managerial round
ABOUT THE ROLE:
Responsibilities:
- Designing, implementing, and maintaining scalable data pipelines in the cloud (a minimal sketch follows this list)
- Ensuring data quality, security, and optimal performance by managing data ingestion, transformation, and distribution
- Developing data models and optimizing cloud storage solutions
- Monitoring data pipeline performance, identifying bottlenecks, optimizing queries and data structures to improve query response times, and scaling data processing infrastructure to handle peak data loads
- Implementing data governance policies
- Ensuring the solution adheres to security and compliance regulations within the cloud environment
- Working closely with data analysts, data scientists, the infrastructure team, and business stakeholders to understand data needs and requirements
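Illustrative example: a minimal sketch of the Python/Spark pipeline work described above, as it might look in a Spark runtime such as a Fabric notebook. The file path, column names, and table name used here are hypothetical placeholders, not part of this posting.

# Minimal ingest-transform-load sketch (hypothetical names throughout).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Ingest raw CSV files from a (hypothetical) landing zone
raw = spark.read.option("header", True).csv("Files/landing/orders/")

# Basic transformation and data-quality filtering
clean = (
    raw.withColumn("order_date", F.to_date("order_date"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
       .filter(F.col("amount").isNotNull())
)

# Write the curated result as a Delta table for downstream reporting (e.g., Power BI)
clean.write.format("delta").mode("overwrite").saveAsTable("curated_orders")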
Qualifications needed:
Bachelor's degree in Computer Science or a related field
Good to have:
- Master's degree in Computer Science or a related field
- Certification in database administration or data engineering
Skills needed:
- Proven experience in data engineering
- Experience building and supporting an EDW using Microsoft technologies (ADF, Synapse, Power BI, Microsoft Fabric), including monitoring and optimizing pipelines, data quality, and CI/CD processes within Fabric
- Programming: Python/Spark
- Visualization: built or supported the building of reports and dashboards
- Strong proficiency in SQL.
- Experience with relational databases (MySQL, Oracle, MS SQL Server, managed cloud databases)
- Experience with data warehousing, data warehouse modeling, and ETL processes
- Familiarity with cloud-based data management platforms such as Snowflake, AWS Redshift, or Databricks
- Excellent problem-solving and analytical skills
- Ability to work effectively in a collaborative team environment
- Strong communication and interpersonal skills
- Experience with big data technologies such as Hadoop or Spark
- Knowledge of DevOps practices and tools