Job Summary
Experience:
3 - 5 Years
Industrial Type:
IT-Software/Software Services
Location:
Mumbai
Functional Area:
IT Software - Other
Designation:
Azure Data Engineer
Key Skills:
Databricks, Python, PySpark, SQL
Educational Level:
Graduate/Bachelors
Job Post Date:
2025-07-04 17:17:11
Stream of Study:
Degree:
BCA, BCS, BE-Comp/IT, BE-Other, BSc-Comp/IT, BSc-Other, BTech-Comp/IT, BTech-Other
Company Description
It is a global technology enterprise focused on delivering excellence in complex digital environments, primarily across the banking, insurance, capital markets, and online examination ecosystems. Our three distinct business units – Digital Transformation, DEX, and Aujas Cyber Security & Cloudxchange.io – are built on the three base principles of “Passion for Excellence”, “Customer Centricity”, and “Ownership”, and are powered by a 1,500+ highly specialized workforce of certified technology and business domain experts. With over 20 years of expertise in large-scale, complex digital implementations for industries such as banking, insurance, and capital markets, we have been recognized as the ‘Trusted Technology & Knowledge Partner’ for 150+ customers across India, the US, and the Middle East. We have been assessed at Maturity Level 5 in the Capability Maturity Model Integration for Development (CMMI®-DEV) and are certified for ISO 9001:2015 and ISO 27001:2013 for our information security management systems.
Job Description
Job Title: Junior Azure Data Engineer
Location: Mumbai (Thane) / Ahmedabad
Department: Data Analytics
Job Overview:
We are seeking a Junior Azure Data Engineer to join our data engineering team responsible for building and optimizing scalable cloud-based data platforms. The successful candidate will work closely with cross-functional teams to design and implement data solutions using Azure, Databricks, Python, PySpark, and SQL.
The role demands a highly motivated professional with a strong foundation in real-time data processing and cloud-based data lakehouse architecture. You will have the opportunity to work on modern data stack technologies, gain in-depth exposure to enterprise data workflows, and contribute to strategic data initiatives.
Key Responsibilities:
Data Engineering & Development:
• Develop and maintain cloud-native data pipelines and analytics solutions on Azure and Databricks.
• Implement ETL/ELT processes using the Databricks Delta Lakehouse architecture.
• Build real-time Structured Streaming pipelines using PySpark (a brief illustrative sketch follows this list).
Database & Modeling:
• Perform data modeling and schema design for relational databases, preferably PostgreSQL.
• Write optimized and reusable SQL queries for data extraction, transformation, and analysis.
Automation & CI/CD Integration:
• Design and deploy CI/CD pipelines to support automated builds, testing, and deployment of data pipelines.
Security & Compliance:
• Ensure all solutions adhere to security-first development principles and organizational data governance policies.
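For context on the kind of work involved, below is a minimal, illustrative sketch of a PySpark Structured Streaming pipeline that lands cleaned events in a Delta table. The storage paths, event schema, and application name are hypothetical placeholders, not details of our actual platform, and the sketch assumes a Databricks or Delta Lake-enabled Spark environment.

# Minimal sketch only: paths, schema, and names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_timestamp
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("example-streaming-etl").getOrCreate()

# Hypothetical schema for JSON events arriving in a raw landing zone.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_time", StringType()),
    StructField("amount", DoubleType()),
])

# Read a stream of JSON files from a (placeholder) raw landing path.
raw_events = (
    spark.readStream
    .schema(event_schema)
    .json("/mnt/raw/events/")  # hypothetical path
)

# Light transformation: cast the timestamp and drop incomplete rows.
clean_events = (
    raw_events
    .withColumn("event_time", to_timestamp(col("event_time")))
    .dropna(subset=["event_id", "event_time"])
)

# Append the stream to a Delta table, with checkpointing for fault tolerance.
query = (
    clean_events.writeStream
    .format("delta")
    .outputMode("append")
    .option("checkpointLocation", "/mnt/checkpoints/events/")  # hypothetical path
    .start("/mnt/delta/events/")  # hypothetical Delta table path
)

In practice, a pipeline like this would be parameterized and promoted through the automated build, test, and deployment process described under Automation & CI/CD Integration.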
Qualifications:
Education:
• Bachelor’s degree in Computer Science, Information Technology, or a related field.
Experience:
• 3 to 5 years of experience in data engineering, with a focus on Azure and Databricks.
Skills:
Mandatory Skills:
• Databricks
• Python
• PySpark
• SQL
Preferred Skills:
• Experience in data modeling with PostgreSQL
• Understanding of Delta Lake architecture
• Exposure to real-time processing with Structured Streaming
• Familiarity with secure coding practices and compliance in data environments
• Strong problem-solving and analytical abilities
• Ability to work in a fast-paced, collaborative environment
