Hadoop Scala Developer
[Employer hidden — view at passion-project.co.uk] is seeking a Hadoop Scala Developer with 4+ years of experience in Hadoop with Scala development. The candidate should have handled more than two projects in this framework using Scala and have experience in end-to-end Big Data technology. Responsibilities include designing and coding Hadoop applications, creating data processing frameworks, troubleshooting application bugs, and maintaining data security.
Job ID: 151801
Type: Contract / Permanent
Experience: Freshers and 0–2 years
No. of Positions: 3
Travel Required: Yes
Location: UK
Who we are
Founded in 2016 and headquartered in the UK, with branch offices in India and France, [Employer hidden] is renowned as one of the leading IT consulting services companies, serving clients across the globe. With a team of strategic minds and experienced professionals in various technologies, we provide IT consulting services across domains such as Big Data, Machine Learning, Artificial Intelligence, Cloud Computing, and Data Science. In addition, we have a proven track record of serving industries such as Banking & Finance, Healthcare, Public Services, Insurance, Charity, and Retail. Through our innovative solutions and IT consulting services, we help businesses make the right decisions and thrive.
Furthermore, [Employer hidden] is recognized for training and empowering individuals in various technologies, encouraging them to build their careers.
What will your job look like?
- 4+ years of relevant experience in Hadoop with Scala development.
- The candidate must have handled more than two projects in this framework using Scala.
- 4+ years of relevant experience handling end-to-end Big Data technology.
- Meeting with the development team to assess the company’s big data infrastructure.
- Designing and coding Hadoop applications to analyze data collections.
- Creating data processing frameworks.
- Extracting data and isolating data clusters.
- Testing scripts and analyzing results.
- Troubleshooting application bugs.
- Maintaining the security of company data.
- Creating data tracking programs.
- Producing Hadoop development documentation.
- Training staff on application use.
- Good project management and communication skills.
- Designing, creating, and maintaining Scala-based applications.
- Participating in all architectural development tasks related to the application.
- Writing code in accordance with the application requirements.
- Performing software analysis.
- Working as a member of a software development team to ensure that the program meets standards.
- Application testing and debugging.
- Making suggestions for enhancements to application procedures and infrastructure.
- Collaborating with cross-functional teams.
- 12+ years of hands-on experience in a variety of platform and data development roles.
- 5+ years of experience in big data technology, spanning platform architecture, data management, data architecture, and application architecture.
- High proficiency working with the Hadoop platform, including Spark/Scala, Kafka, SparkSQL, HBase, Impala, Hive, and HDFS in multi-tenant environments.
- Solid base in data technologies such as warehousing, ETL, MDM, DQ, and BI/analytical tools; extensive experience in metadata management and data quality processes and tools.
- Experience in full-lifecycle architecture guidance.
- Advanced analytical thinking and problem-solving skills.
- Advanced knowledge of application, data, and infrastructure architecture disciplines.
- Understanding of architecture and design across all systems.
- Demonstrated, strong experience in a software engineering role, including the design, development, and operation of distributed, fault-tolerant applications with attention to security, scalability, performance, availability, and optimization.
Requirements
- 4+ years of hands-on experience in designing, building, and supporting Hadoop applications using Spark, Scala, Sqoop, and Hive.
- Strong knowledge of working with large data sets and high-capacity big data processing platforms.
- Strong experience in Unix and shell scripting.
- Experience using source code and version control systems such as Git and Bitbucket.
- Experience using a job scheduler such as Autosys.
- Experience working in an agile environment.
- Strong verbal and written communication skills, along with relationship-building, collaboration, and organizational skills, and the ability to work as a member of a matrix-based, diverse, and geographically distributed project team.
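For candidates unfamiliar with the terminology above, the Hadoop/Spark work this role describes ultimately follows the classic map–shuffle–reduce pattern. As an illustration only (the object and method names below are hypothetical, and a real job would use Spark's RDD or DataFrame APIs on a cluster rather than plain collections), a word count can be sketched in standard Scala as:

```scala
// A minimal sketch of the map -> shuffle -> reduce pattern underlying
// Hadoop/Spark word-count style jobs, using only the Scala standard library.
object WordCountSketch {
  // "Map" phase: split each input line into (word, 1) pairs.
  def mapPhase(lines: Seq[String]): Seq[(String, Int)] =
    lines
      .flatMap(_.toLowerCase.split("\\s+"))
      .filter(_.nonEmpty)
      .map(word => (word, 1))

  // "Shuffle + reduce" phase: group pairs by key and sum the counts.
  def reducePhase(pairs: Seq[(String, Int)]): Map[String, Int] =
    pairs.groupBy(_._1).map { case (word, ps) => word -> ps.map(_._2).sum }

  def wordCount(lines: Seq[String]): Map[String, Int] =
    reducePhase(mapPhase(lines))
}
```

In Spark the same shape would appear as `rdd.flatMap(...).map(w => (w, 1)).reduceByKey(_ + _)`, with the framework handling the shuffle across the cluster.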