
Cloud Engineer

Bigger challenges. Bolder ideas. Global impact. Join us for a rewarding career.

Experience: 8-12 years

Key Skills: Java/Python, Big Data

Candidates from product-based companies or banking only

Qualification: B.Tech/B.E Only


  • Design, develop, and manage Big Data infrastructure. Solution-focused professional with experience in data engineering, data analysis, and workflow monitoring in distributed computing environments for implementing Big Data solutions.
  • Good understanding of the complex processing needs of big data, with experience developing code and modules to address those needs.
  • Participate in architectural discussions to ensure solutions are designed for successful deployment, security, and high availability in the cloud
  • Write and maintain code for automating the creation of scalable/resilient systems/infrastructure
  • Design, implement, and maintain server, storage, network, and security infrastructure
  • Work with application teams to move existing applications to AWS / Azure through lift-and-shift and refactoring migration strategies
  • Develop, implement, and test data backup and recovery, and disaster recovery procedures
  • Write and maintain clear, concise documentation, runbooks and operational standards including infrastructure diagrams
  • Ensure all solutions are properly monitored and instrumented
  • Troubleshoot and resolve complex issues in development, test and production environments
  • Educate/mentor product teams and junior engineers.

Requirements:

  • 8+ Years of software development experience (Java or Python) with good knowledge of computer science fundamentals
  • Able to integrate state-of-the-art Cloud/Big Data technologies into the overall architecture and lead a team of developers through construction, testing, and implementation
  • Experience with designing data pipelines
  • Experience with Data Quality Management (Semantic and Syntactic Checks)
  • Hands-on experience with SQL
  • Experience in cloud computing technologies like AWS (EC2, S3, ECS, Lambda, Redshift, EMR, SQS, SNS, IAM) and Pivotal Cloud Foundry.
  • Experience in Big Data technologies like Hadoop; knowledge of Spark and Scala.
  • Experience in Hadoop components like HDFS, MapReduce, Hive, Pig, HBase, Impala, Sqoop, Flume, and Oozie.
  • Experience in Python scripting, Git, Maven, JUnit, SonarQube, Nexus, Ansible, Docker, microservices, Jenkins
  • Broad range of technology interests across different disciplines, including computer and system architecture, web applications, scalability, performance analysis, distributed systems design, and testing.
  • Strong understanding of object-oriented programming and software architecture.
  • Strong understanding of the SDLC in an Agile/Scrum environment.
  • Proven ability to successfully design, build, and deliver globally scalable software applications.
  • Experience in successfully communicating with stakeholders, product management, and other key contributors.
  • Understanding of immutable infrastructure and infrastructure-as-code concepts
  • Familiar with Linux and/or Windows operating systems

Apply for this job