Big Data Engineer at Vodafone

Location: Smart Village

Job Description:

  • Designing and producing high-performing, stable end-to-end applications that perform complex processing of massive volumes of batch and streaming data in a multi-tenancy big data platform, both Hadoop on-premises and in the cloud, and output insights back to business systems according to their requirements.
  • Designing and implementing core platform capabilities, tools, processes, ways of working and conventions under agile development to support the integration of Local Market (LM) and tenant data sourcing and use-case implementation, promoting reusability, easing delivery and ensuring standardisation across Local Market deliverables in the platform.
  • Supporting the distributed data engineering teams, including technical support and training in the Big Data Programme frameworks and ways of working, review and integration of source code, release support and source code quality control.
  • Working with the Group architecture team to define the strategy for evolving the Big Data capability, including solution architecture decisions aligned with the platform architecture.
  • Defining the technologies to be used on the Big Data Platform and investigating new technologies to identify where they can bring benefits.

Core competencies, knowledge and experience:

• Expert-level experience in designing, building and managing applications to process large amounts of data in a Hadoop ecosystem or other big data frameworks
• Extensive experience with performance tuning applications on Hadoop or other big data frameworks, and with configuring Hadoop systems to maximise performance
• Experience building systems to perform real-time data processing using Spark Streaming, Flink, Storm or Heron data processing frameworks, and Kafka, Beam, Dataflow, Kinesis or similar data streaming frameworks (a minimal streaming sketch follows this list)
• Experience with the common SDLC, including SCM, build tools, unit, integration, functional and performance testing from an automation perspective, TDD/BDD, CI and continuous delivery, under agile practices
• Experience working in large-scale multi-tenancy big data environments
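
As an illustration of the kind of streaming work this role involves, the sketch below shows a minimal Spark Structured Streaming job in Scala that reads records from a Kafka topic and counts them per minute. It is only an outline under stated assumptions: the broker address and the topic name ("events") are placeholders, and the job would need the spark-sql-kafka connector on its classpath.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object KafkaStreamSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("kafka-stream-sketch")
      .getOrCreate()

    import spark.implicits._

    // Read a stream of raw records from a Kafka topic (broker and topic are placeholders).
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "events")
      .load()

    // Kafka values arrive as bytes; cast to string and count records per one-minute window.
    val counts = raw
      .selectExpr("CAST(value AS STRING) AS value", "timestamp")
      .withWatermark("timestamp", "10 minutes")
      .groupBy(window($"timestamp", "1 minute"))
      .count()

    // Write the aggregates to the console; a real job would publish them back to a
    // business system or another Kafka topic instead.
    val query = counts.writeStream
      .outputMode("update")
      .format("console")
      .start()

    query.awaitTermination()
  }
}
```

The same structure carries over to Flink or Beam pipelines; only the source/sink connectors and windowing API differ.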

Key performance indicators:

• Development of core frameworks to speed up and facilitate integration of Local Markets developments in the BDP
• Speed of on-boarding data sources and use cases for EU Hub markets and new tenants
• Delivery of integrated use cases from Local Markets and tenants that add value to the business using the Big Data Programme

Job Requirements:

Technical Knowledge/ Skills/ Training required:

  • Expert-level experience with the Hadoop ecosystem (Spark, Hive/Impala, HBase, YARN); experience with the Cloudera distribution desirable; experience with similar cloud provider solutions (AWS, GCP, Azure) also considered
  • Strong software development experience in the Scala and Python programming languages; Java and functional languages desirable
  • Experience with Unix-based systems, including bash scripting
  • Experience with columnar data formats (a minimal sketch follows this list)
  • Experience with other distributed technologies such as Cassandra, Splunk, Solr/Elasticsearch, Flink, Heron or Beam would also be desirable.
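
As a small illustration of working with columnar data formats, the sketch below writes a toy dataset to Parquet with Spark in Scala and reads back only the columns it needs, which is where a columnar layout pays off. The dataset, column names and HDFS path are purely hypothetical assumptions for the example, not part of the role description.

```scala
import org.apache.spark.sql.SparkSession

object ColumnarFormatSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("columnar-format-sketch")
      .getOrCreate()

    import spark.implicits._

    // A tiny in-memory dataset standing in for a real source table.
    val usage = Seq(
      ("subscriber-001", "2024-01-01", 120L),
      ("subscriber-002", "2024-01-01", 300L)
    ).toDF("subscriber", "day", "mb_used")

    // Write as Parquet, partitioned by day; the HDFS path is a placeholder.
    usage.write
      .mode("overwrite")
      .partitionBy("day")
      .parquet("hdfs:///data/usage_parquet")

    // Reading back only the needed columns benefits from Parquet's columnar layout
    // and from partition pruning on the "day" column.
    val totals = spark.read
      .parquet("hdfs:///data/usage_parquet")
      .groupBy("subscriber")
      .sum("mb_used")

    totals.show()
    spark.stop()
  }
}
```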

To Apply:
https://vodafone.taleo.net/careersection/2a/jobdetail.ftl?job=000000240871&tz=GMT%2B02%3A00&tzname=Africa%2FCairo

Tips for updating your Resume:
https://careeradvancers.org/resume-cv-tips/
