Sr. Software Engineer - AWS
Bangalore, KA, IN
Key Responsibilities:
- Design, build, and manage scalable data pipelines and data processing workflows using AWS services such as AWS Glue, Amazon Redshift, and Amazon S3.
- Collaborate with data scientists, analysts, and business stakeholders to define data requirements and ensure seamless data integration.
- Implement data models, data lakes, and databases within AWS, ensuring efficient storage, processing, and retrieval of data.
- Optimize and troubleshoot performance issues related to data processing and storage in AWS.
- Design and implement ETL (Extract, Transform, Load) processes to transform raw data into structured formats for analytical purposes.
- Maintain data security and ensure compliance with data privacy regulations by implementing access controls, encryption, and audit logging.
- Work with large datasets and ensure data integrity, accuracy, and consistency across various sources.
- Automate data workflows and scheduling processes for efficient execution.
- Perform regular monitoring, tuning, and maintenance of AWS-based data systems.
- Troubleshoot and debug data issues across systems and pipelines.
- Collaborate with other engineering teams to integrate data solutions with business applications and services.
Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related field.
- Proven experience as a Data Engineer with hands-on expertise in AWS services such as Amazon S3, AWS Glue, Amazon Redshift, Amazon EMR, AWS Lambda, and AWS Data Pipeline.
- Strong experience with SQL and NoSQL databases (e.g., MySQL, PostgreSQL, DynamoDB).
- Proficiency in data modeling, data warehousing, and designing data architectures.
- Experience with data processing and streaming frameworks such as Apache Spark, Apache Kafka, and Apache Hive.
- Proficient in programming languages such as Python, Java, or Scala.
- Knowledge of cloud security best practices, including data encryption and user authentication/authorization mechanisms.
- Strong understanding of data governance, data quality, and data privacy.
- Ability to work effectively in an agile development environment and collaborate with cross-functional teams.
- Excellent problem-solving skills, with the ability to analyze complex data challenges and design innovative solutions.
Preferred Skills:
- Experience with machine learning tools and frameworks (e.g., TensorFlow, SageMaker).
- Certification in AWS (e.g., AWS Certified Solutions Architect or AWS Certified Big Data – Specialty).
- Familiarity with CI/CD pipelines and DevOps practices for automating data processes.