Big Data / BI / ETL / Informatica Developers (Job Code : J44831)  

 Job Summary
Experience: 5.00 - 8.00 Years
Position: Big Data / BI / ETL / Informatica Developers
Educational Level: BA, BBA/BMS, BCA, BCom, BCS, BE-Comp/IT, BEd, BE-Other, BIS, BIT, BSc-Comp/IT, BSc-Other, BTech-Comp/IT, BTech-Other, CA, CS, DE-Comp/IT, DE-Other, Diploma, ICWA, LLB, MA, MBA, MCA, MCM, MCom, MCS, ME-Comp/IT, ME-Other, MIS, MIT, MSc-Comp/IT, MS-Comp/IT, MSc-Other, MS-Other, MTech-Comp/IT, MTech-Other, PGDM, PG-Other
Stream of Study: Computer Science/IT
Industrial Type: IT-Software/Software Services
Functional Area: IT Software - System Programming
Key Skills: bigdata or ETL or BI or Informatica
Job Post Date: 2021-12-14 17:50:18

 Company Description
Our client is a global digital services company that helps organizations build the vision and the tools to run the future. With expertise in strategy, technology and operations, our client delivers design-engineered solutions that accelerate growth for the world’s largest companies. Its team of more than 3,500 experts has been doing the right thing since 2004.

We are passionate about building a better society and actively contribute to the social and economic development of people, communities, and the environment. From promoting technology reuse and empowering children through education and sports, to helping rebuild communities after natural disasters and focusing on health and healthy workplaces, we leverage our global footprint to make the greatest possible impact.

 Job Description
5+ years of experience in Big Data / BI / ETL / Informatica projects
Strong knowledge of and experience with Object-Oriented Programming, Analysis and Design concepts and related frameworks (e.g., Python, PySpark)
Expertise in Microsoft Azure is mandatory, including components such as Azure Data Factory, Azure Data Lake, Delta Lake, ADLS Gen2, Azure Databricks, Azure Event Hubs and Kafka
Good understanding of ETL architecture
Experience with big data components such as Hive, Sqoop, HDFS and Spark
Solid understanding of building data ingestion and transformation pipelines using Hadoop technologies
Sound knowledge of and experience with at least one programming or scripting language (Python, shell scripting)
Strong programming skills in any flavour of SQL
Hands-on development experience with Impala and NoSQL databases such as MongoDB, Cassandra or HBase
Extensive experience in database design, Continuous Integration and Continuous Deployment
Strong knowledge of and experience with handling non-functional requirements such as performance, security, load, scaling and usability of applications
Basic knowledge of workflow schedulers such as Oozie
Strong verbal and written communication skills
Ability to learn, contribute and grow in a fast-paced environment
Ability to multitask and work independently