Module Lead - AWS Job
Hyderabad, TG, IN
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation.
At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive change to an increasingly virtual world, and it drives us beyond generational gaps and the disruptions of the future.
We are looking to hire AWS professionals in the following areas:
Job Description:
Role Overview
We are seeking an experienced AWS Data Architect with strong hands-on expertise in designing, building, and optimizing scalable data lake and analytics platforms on AWS. The ideal candidate will combine deep technical proficiency in AWS services with a strong understanding of data architecture principles, governance, and best practices for analytical enablement.
Required Skills :
- Proven experience architecting and implementing AWS data lake and analytics ecosystems.
- Hands-on expertise in AWS Glue, S3, Athena, Lambda, Redshift, and Lake Formation.
- Strong programming and ETL skills using PySpark / Python.
- Deep understanding of data modeling, data ingestion patterns, and schema evolution.
- Knowledge of data governance, security, and compliance frameworks in cloud environments.
- Familiarity with CI/CD, IaC, and DevOps tools (CDK, Terraform, GitHub Actions, CodePipeline).
- Experience with data quality frameworks (Deequ, Great Expectations).
- Strong problem-solving, analytical, and communication skills.
Preferred Qualifications
- 5-7 years of experience in data architecture and data modeling.
- AWS Certified Data Analytics – Specialty or AWS Certified Solutions Architect – Professional.
- Proven experience with database management systems and data warehousing.
- Strong understanding of data governance and data quality principles.
- Excellent problem-solving and analytical skills.
- Experience with data modeling tools and techniques.
Responsibilities
- Architect and Design: Define and implement scalable, secure, and cost-efficient AWS-based data lake architectures using S3, Glue, Athena, and Lake Formation.
- Data Engineering: Develop robust ETL/ELT pipelines using PySpark, Glue, and Lambda for batch and near-real-time data ingestion and transformation.
- Data Modeling: Design logical and physical data models, implement schema evolution, partitioning strategies, and efficient metadata management.
- Data Lifecycle: Manage the complete data lifecycle from raw ingestion to curated and analytical layers, ensuring data quality, lineage, and traceability.
- Integration: Enable seamless data consumption by downstream systems such as Redshift Spectrum, QuickSight, SageMaker, and other BI platforms.
- Governance & Security: Enforce robust data governance, access control, and compliance through IAM, KMS, CloudTrail, and Lake Formation.
- Data Quality: Implement and automate data validation frameworks using tools such as Deequ or Great Expectations.
- Automation & DevOps: Build infrastructure as code (IaC) using AWS CDK or Terraform, and orchestrate CI/CD pipelines with GitHub Actions or AWS CodePipeline.
- Monitoring & Optimization: Set up observability, performance tuning, and cost optimization through CloudWatch, Glue job optimization, and Athena query tuning.
At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence aided with technology for continuous learning, unlearning, and relearning at a rapid pace and scale.
Our Hyperlearning workplace is grounded on four principles:
- Flexible work arrangements, Free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and ethical corporate culture