We are looking for a Big Data Java Developer for our client in the financial sector.
Main duties and responsibilities:
- Utilize Agile software development life cycle methodologies, processes, and procedures
- Work independently and take responsibility for ETL Development
- Collaborate with business and other technology teams across the globe to establish processes and best practices
- Work closely with Core Services engineering team to design scalable solutions
- Provide mentoring and support to less experienced developers
- Provide batch execution support during testing and QA
- Foster teamwork and cooperation, promote sharing of information, establish and maintain effective working relationships
- Effectively manage time and complete tasks within budgeted estimates
- Communicate clearly in written and verbal form
- Carry out responsibilities with strong customer service orientation
- Adapt to changing requirements and technologies
- Consistently demonstrate commitment to quality
- Listen effectively and follow through appropriately
- Analyze data or business processes and develop logical solutions to problems encountered
Requirements:
- Excellent oral and written English
- Ability to collaborate effectively in a large global team
- Excellent grasp of object-oriented design
- Excellent grasp of concurrent programming concepts
- Familiarity with Agile software development processes
- 7+ years of experience and proficiency with data warehousing concepts and solutions, including ETL, data warehouse models, data marts, reporting, and testing
- 5+ years of experience with the Ab Initio ETL tool, including ACE/BRE and Operational Console/Control Center
- Hands-on experience in the design, development, testing, and deployment of data staging methodologies
- Good understanding of relational database concepts and working experience with Oracle databases
- Good understanding of data formats such as JSON, XML, Parquet, and Avro
- Configure and maintain scheduled ETL jobs
- Define, prepare, and execute migration packages for new ETL objects using Bitbucket
- Strong experience and proficiency working with large data streams in the Hadoop ecosystem, using tools such as MapReduce, Spark, Hive, and Kafka with Java, Python, and Scala
- Strong debugging and problem-solving skills
- Work experience with AWS is a plus
- Experience working on banking/financial applications and big data warehouses
- Experience leading a team of software engineers/analyst developers
- Demonstrated capacity to build tooling for development-team use
- Ability to implement industry best practices