
Tech Lead - AWS Data Pipeline Job

Date:  Feb 12, 2024
Job Requisition Id:  53571
Location: 

Hyderabad, IN; Pune, MH, IN; Indore, IN

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation.

 

At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth – bringing real positive change in an increasingly virtual world – and it drives us beyond generational gaps and the disruptions of the future.

 

We are looking to hire AWS Data Pipeline Professionals in the following areas:

 

• 6 or more years’ experience in Cloud architecture (AWS specifically)
• 5+ years of experience in engineering, with experience in ETL-type work with databases and Hadoop platforms

Skills:

• Hadoop (general): Deep knowledge of distributed file system concepts, MapReduce principles, and distributed computing. Knowledge of Spark and the differences between Spark and MapReduce. Familiarity with encryption and security in a Hadoop cluster.
• Data management / data structures: Must be proficient in technical data management tasks, i.e. writing code to read, transform, and store data.
• XML/JSON knowledge
• Experience working with REST APIs
• Spark: Experience launching Spark jobs in client mode and cluster mode. Familiarity with Spark job property settings and their implications for performance.
• Application development: Familiarity with HTML, CSS, and JavaScript, and basic design/visual competency.
• SCC/Git: Must be experienced in the use of source code control systems such as Git.
• ETL: Experience developing ELT/ETL processes, including loading data from enterprise-sized RDBMS systems such as Oracle, DB2, and MySQL.
• Authorization: Basic understanding of user authorization (Apache Ranger preferred).
• Programming: Must be able to code in Python, or be expert in at least one high-level language such as Java, C, or Scala.
• SQL: Must be an expert in manipulating database data using SQL. Familiarity with views, functions, stored procedures, and exception handling.
• AWS: General knowledge of the AWS stack (EC2, S3, EBS, …).
• IT process compliance: SDLC experience and formalized change controls.
• Working in DevOps teams, based on Agile principles (e.g. Scrum).
• ITIL knowledge (especially incident, problem, and change management).
• Languages: Fluent English skills.
• Proficiency in SQL / Java / Python (Python required; all three not necessary)
• Proficiency in PySpark for distributed computation
• Familiarity with Postgres and Elasticsearch
• Familiarity with common databases (e.g. MySQL, Microsoft SQL Server) and JDBC connectivity. Not all types required.

 

At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence aided with technology for continuous learning, unlearning, and relearning at a rapid pace and scale.

 

Our Hyperlearning workplace is grounded upon four principles:

  • Flexible work arrangements, free spirit, and emotional positivity
  • Agile self-determination, trust, transparency, and open collaboration
  • All support needed for the realization of business goals
  • Stable employment with a great atmosphere and an ethical corporate culture
