Amazon S3 Jobs in Chennai

8+ Amazon S3 Jobs in Chennai | Amazon S3 Job openings in Chennai

Apply to 8+ Amazon S3 Jobs in Chennai on CutShort.io. Explore the latest Amazon S3 Job opportunities across top companies like Google, Amazon & Adobe.

Amwhiz
Chennai
5 - 6 yrs
₹18L - ₹21L / yr
NodeJS (Node.js)
Python
ETL
Serverless
Relational Database (RDBMS)
+18 more

Job Overview:


We are seeking a Senior Backend Developer with 5+ years of hands-on experience in Node.js and Python, strong TypeScript knowledge, and advanced skills in AWS cloud services. You will lead backend engineering efforts and provide architectural and client-facing solutions, especially for international clients in the USA, UK, and Australia.

The ideal candidate must be deeply skilled in data structures, object-oriented programming, system design, and authentication standards (OAuth, SAML, etc.). This role also includes team leadership, cloud-native solution architecture, and direct client interaction for translating business requirements into technical deliverables.


Responsibilities:

  • Design, build, and maintain scalable backend systems using Node.js (TypeScript) and Python
  • Architect and implement cloud-based solutions on AWS and optionally on other cloud providers (GCP/Azure)
  • Develop, secure, and integrate APIs (REST, GraphQL, SOAP) and WebSocket services
  • Lead the backend development team: code reviews, mentoring, and enforcing engineering best practices
  • Work directly with clients from USA, UK, and Australia to gather requirements and present solutions
  • Implement authentication and authorization mechanisms (OAuth 2.0, SAML, JWT, custom auth flows)
  • Follow design principles and OOP patterns to ensure code scalability and maintainability
  • Apply strong understanding of data structures and algorithms to optimize backend performance
  • Create and manage infrastructure components such as:
      • IAM, EC2, S3, RDS, Lambda, CloudWatch
      • SQS, SNS, ElastiCache, Route53, API Gateway
      • VPCs, NAT Gateways, Internet Gateways, ALB/NLB
  • Use Docker and Kubernetes (EKS preferred) for containerization and orchestration
  • Integrate with relational and non-relational databases including MySQL, PostgreSQL, SQL Server, MongoDB, DynamoDB
  • Implement search capabilities using Elasticsearch
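As a minimal illustration of the token-based mechanisms listed above, the sketch below signs and verifies an HS256 JWT-style token using only Python's standard library. The secret, claim names, and helper functions are hypothetical, and a production system should rely on a vetted JWT library rather than this sketch:

```python
import base64
import hashlib
import hmac
import json

SECRET = b"demo-secret"  # hypothetical key; real systems load this from a secrets manager


def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as the JWT format requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def sign_token(payload: dict) -> str:
    """Produce a compact HS256 token: header.payload.signature."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = b64url(hmac.new(SECRET, signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"


def verify_token(token: str):
    """Recompute the signature, compare in constant time, and return the payload if valid."""
    header, body, sig = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = b64url(hmac.new(SECRET, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        return None  # tampered or wrongly signed token
    padded = body + "=" * (-len(body) % 4)  # restore base64 padding before decoding
    return json.loads(base64.urlsafe_b64decode(padded))
```

The same shape extends to OAuth 2.0 access tokens; the difference is mainly who issues the token and how the key is distributed.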



Required Skills:

Programming & Architecture:

  • Strong in Node.js (TypeScript) and Python
  • Deep knowledge of data structures, algorithms, and system design
  • Expert in object-oriented programming (OOP) and design patterns
  • Experience with software architecture and microservices

Authentication & Security:

  • Deep understanding of OAuth 2.0, SAML, JWT, API key, and custom authentication mechanisms
  • Experience implementing secure, scalable identity & access controls

Cloud Infrastructure (AWS):

  • Hands-on with full stack of AWS: IAM, EC2, S3, RDS, Lambda, CloudWatch, SQS, SNS, ElastiCache, VPC, NAT Gateway, ALB/NLB, Route53, API Gateway
  • Proficiency with Docker, Kubernetes, and cloud-native CI/CD pipelines

Databases & Search:

  • SQL: MySQL, PostgreSQL, SQL Server
  • NoSQL: MongoDB, DynamoDB
  • Search: Elasticsearch

APIs & Integration:

  • REST, GraphQL, SOAP, WebSockets

Soft Skills:

  • Excellent English communication (verbal, written, presentation)
  • Experience working with international clients (USA, UK, Australia)
  • Strong problem-solving and solution architecture skills
  • Able to lead a team and deliver client-ready solutions independently

Preferred Qualifications:

  • AWS Certified (e.g., Solutions Architect or DevOps Engineer)
  • Experience with hybrid or multi-cloud environments
  • Exposure to CI/CD tools, monitoring, logging, and performance tuning

What We Offer:

  • Global exposure with direct client interaction
  • Strong engineering culture with mentorship and learning opportunities
  • High-impact projects with modern cloud-native architecture


NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Bengaluru (Bangalore), Mumbai, Gurugram, Pune, Hyderabad, Chennai
5 - 8 yrs
₹10L - ₹24L / yr
Drupal
PHP
JavaScript
Custom Module & Theming Development
Amazon Web Services (AWS)
+5 more

Job Title : Full Stack Drupal Developer

Experience : Minimum 5 Years

Location : Hyderabad / Bangalore / Mumbai / Pune / Chennai / Gurgaon (Hybrid or On-site)

Notice Period : Immediate to 15 Days Preferred


Job Summary :

We are seeking a skilled and experienced Full Stack Drupal Developer with a strong background in Drupal (version 8 and above) for both front-end and back-end development. The ideal candidate will have hands-on experience in AWS deployments, Drupal theming and module development, and a solid understanding of JavaScript, PHP, and core Drupal architecture. Acquia certifications and contributions to the Drupal community are highly desirable.


Mandatory Skills :

Drupal 8+, PHP, JavaScript, Custom Module & Theming Development, AWS (EC2, Lightsail, S3, CloudFront), Acquia Certified, Drupal Community Contributions.


Key Responsibilities :

  • Develop and maintain full-stack Drupal applications, including both front-end (theming) and back-end (custom module) development.
  • Deploy and manage Drupal applications on AWS using services like EC2, Lightsail, S3, and CloudFront.
  • Work with the Drupal theming layer and module layer to build custom and reusable components.
  • Write efficient and scalable PHP code integrated with JavaScript and core JS concepts.
  • Collaborate with UI/UX teams to ensure high-quality user experiences.
  • Optimize performance and ensure high availability of applications in cloud environments.
  • Contribute to the Drupal community and utilize contributed modules effectively.
  • Follow best practices for code versioning, documentation, and CI/CD deployment processes.


Required Skills & Qualifications :

  • Minimum 5 Years of hands-on experience in Drupal development (Drupal 8 onwards).
  • Strong experience in front-end (theming, JavaScript, HTML, CSS) and back-end (custom module development, PHP).
  • Experience with Drupal deployment on AWS, including services such as EC2, Lightsail, S3, and CloudFront.
  • Proficiency in JavaScript, core JS concepts, and PHP coding.
  • Acquia certifications such as:
      • Drupal Developer Certification
      • Site Management Certification
      • Acquia Certified Developer (preferred)
  • Experience with contributed modules and active participation in the Drupal community is a plus.
  • Familiarity with version control (Git), Agile methodologies, and modern DevOps tools.


Preferred Certifications :

  • Acquia Certified Developer.
  • Acquia Site Management Certification.
  • Any relevant AWS certifications are a bonus.
Amwhiz
Boomika S
Posted by Boomika S
Chennai
3 - 6 yrs
₹4L - ₹9L / yr
Fullstack Developer
NodeJS (Node.js)
TypeScript
NestJS
Angular (10+)
+18 more

Full Stack Developer (3+ Years Experience)

Location: Chennai - Work From Office

Job Type: Full-time


About the Role:


We are looking for a highly skilled Full Stack Developer with 3+ years of experience in designing, developing, and maintaining scalable web applications. The ideal candidate should have expertise in Node.js, TypeScript, NestJS, Angular, and databases like MongoDB, DynamoDB, and RDBMS. You will be working on building robust REST APIs, implementing various authentication methods (Cookies, JWT, OAuth, etc.), and deploying applications in AWS & GCP cloud environments using serverless technologies.


Key Responsibilities:


● Develop and maintain scalable backend services using Node.js, TypeScript, and NestJS.

● Design and implement frontend applications using Angular.

● Build and optimize RESTful APIs for high-performance web applications.

● Work with databases (MongoDB, DynamoDB, and RDBMS) to store and retrieve application data efficiently.

● Implement authentication and authorization mechanisms such as JWT, cookies, and OAuth.

● Deploy and manage applications on AWS (Lambda, API Gateway, S3, DynamoDB, IAM, Cognito, etc.) and GCP Cloud Functions.

● Ensure code quality with unit testing, integration testing, and CI/CD pipelines.

● Work on serverless architectures to optimize performance and scalability.

● Collaborate with cross-functional teams to define and develop new features.

● Troubleshoot and debug application issues in both development and production environments.


Required Skills & Qualifications:


● 3+ years of experience as a Full Stack Developer.

● Strong knowledge of Node.js, TypeScript, and NestJS.

● Experience with Angular (Angular 10+ preferred).

● Hands-on experience with MongoDB, DynamoDB, and RDBMS (e.g., PostgreSQL, MySQL).

● Expertise in REST API development and API security best practices.

● Experience with authentication methods (JWT, OAuth, Session Cookies).

● Proficiency in AWS (Lambda, API Gateway, S3, DynamoDB, IAM, Cognito, etc.) and/or GCP Cloud Functions.

● Familiarity with serverless development and microservices architecture.

● Strong knowledge of CI/CD pipelines and automated deployment strategies.

● Understanding of software development best practices, version control (Git), and Agile methodologies.


Nice to Have:


● Experience with Kubernetes and containerized applications (Docker).

● Understanding of WebSockets and real-time applications.


Benefits:


● Competitive salary based on experience.

● Flexible working hours.

● Learning and development opportunities.

● Collaborative and growth-oriented team environment.

Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Pune, Hyderabad, Ahmedabad, Chennai
3 - 7 yrs
₹8L - ₹15L / yr
AWS Lambda
Amazon S3
Amazon VPC
Amazon EC2
Amazon Redshift
+3 more

Technical Skills:


  • Ability to understand and translate business requirements into design.
  • Proficient in AWS infrastructure components such as S3, IAM, VPC, EC2, and Redshift.
  • Experience in creating ETL jobs using Python/PySpark.
  • Proficiency in creating AWS Lambda functions for event-based jobs.
  • Knowledge of automating ETL processes using AWS Step Functions.
  • Competence in building data warehouses and loading data into them.


Responsibilities:


  • Understand business requirements and translate them into design.
  • Assess AWS infrastructure needs for development work.
  • Develop ETL jobs using Python/PySpark to meet requirements.
  • Implement AWS Lambda for event-based tasks.
  • Automate ETL processes using AWS Step Functions.
  • Build data warehouses and manage data loading.
  • Engage with customers and stakeholders to articulate the benefits of proposed solutions and frameworks.
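The event-based Lambda pattern in the responsibilities above can be sketched as follows. The field names and transform are illustrative; a real handler would read its input object from S3 via boto3 and load results into the warehouse, but here the rows are carried in the event so the sketch stays self-contained:

```python
import json


def transform(record: dict) -> dict:
    """Example ETL transform: normalize field names and types for loading."""
    return {
        "customer_id": str(record["id"]),
        "amount": round(float(record["amount"]), 2),
        "country": record.get("country", "unknown").upper(),
    }


def lambda_handler(event: dict, context=None) -> dict:
    """Entry point AWS Lambda would invoke for an event-based ETL job.

    In a deployed function this would be triggered by an S3 put event or a
    Step Functions state; the return shape mirrors a typical Lambda response.
    """
    rows = [transform(r) for r in event["rows"]]
    return {
        "statusCode": 200,
        "body": json.dumps({"processed": len(rows), "rows": rows}),
    }
```

Chaining several such handlers with AWS Step Functions is what turns individual event jobs into an automated ETL process.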
Genesys

at Genesys

5 recruiters
Manojkumar Ganesh
Posted by Manojkumar Ganesh
Chennai, Hyderabad
4 - 10 yrs
₹10L - ₹40L / yr
ETL
Data Warehousing
Business Intelligence (BI)
Big Data
PySpark
+6 more

Join our team

 

We're looking for an experienced and passionate Data Engineer to join our team. Our vision is to empower Genesys to leverage data to drive better customer and business outcomes. Our batch and streaming solutions turn vast amounts of data into useful insights. If you’re interested in working with the latest big data technologies, using industry-leading BI analytics and visualization tools, and bringing the power of data to our customers’ fingertips, then this position is for you!

 

Our ideal candidate thrives in a fast-paced environment, enjoys the challenge of highly complex business contexts (that are typically being defined in real time), and, above all, is passionate about data and analytics.

 

 

What you'll get to do

 

  • Work in an agile development environment, constantly shipping and iterating.
  • Develop high quality batch and streaming big data pipelines.
  • Interface with our Data Consumers, gathering requirements, and delivering complete data solutions.
  • Own the design, development, and maintenance of datasets that drive key business decisions.
  • Support, monitor and maintain the data models
  • Adopt and define the standards and best practices in data engineering including data integrity, performance optimization, validation, reliability, and documentation.
  • Keep up-to-date with advances in big data technologies and run pilots to design the data architecture to scale with the increased data volume using cloud services.
  • Triage many possible courses of action in a high-ambiguity environment, making use of both quantitative analysis and business judgment.

 

Your experience should include

 

  • Bachelor’s degree in CS or related technical field.
  • 5+ years of experience in data modelling, data development, and data warehousing.
  • Experience working with Big Data technologies (Hadoop, Hive, Spark, Kafka, Kinesis).
  • Experience with large scale data processing systems for both batch and streaming technologies (Hadoop, Spark, Kinesis, Flink).
  • Experience in programming using Python, Java or Scala.
  • Experience with data orchestration tools (Airflow, Oozie, Step Functions).
  • Solid understanding of database technologies including NoSQL and SQL.
  • Strong SQL query skills (experience with the Snowflake cloud data warehouse is a plus)
  • Work experience in Talend is a plus
  • Track record of delivering reliable data pipelines with solid test infrastructure, CICD, data quality checks, monitoring, and alerting.
  • Strong organizational and multitasking skills with ability to balance competing priorities.
  • Excellent communication (verbal and written) and interpersonal skills and an ability to effectively communicate with both business and technical teams.
  • An ability to work in a fast-paced environment where continuous innovation is occurring, and ambiguity is the norm.
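Data quality checks of the kind called for above are often a small, explicit step in a batch pipeline, run before loading and with rejects routed to monitoring. A minimal sketch, with hypothetical field names, that splits a batch into clean rows and rejects with reasons:

```python
def check_quality(rows, required=("order_id", "order_date", "amount")):
    """Split a batch into clean rows and rejects, recording a reason per reject.

    `required` lists fields that must be present and non-empty; a negative
    amount is treated as a second, illustrative validation rule.
    """
    clean, rejects = [], []
    for row in rows:
        missing = [f for f in required if row.get(f) in (None, "")]
        if missing:
            rejects.append({"row": row, "reason": f"missing: {missing}"})
        elif row["amount"] < 0:
            rejects.append({"row": row, "reason": "negative amount"})
        else:
            clean.append(row)
    return clean, rejects
```

In an orchestrated pipeline (Airflow, Step Functions, etc.) the reject count would feed an alerting threshold rather than silently dropping rows.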

 

Good to have

  • Experience with AWS big data technologies - S3, EMR, Kinesis, Redshift, Glue
netmedscom

at netmedscom

3 recruiters
Vijay Hemnath
Posted by Vijay Hemnath
Chennai
5 - 10 yrs
₹10L - ₹30L / yr
Machine Learning (ML)
Software deployment
CI/CD
Cloud Computing
Snowflake schema
+19 more

We are looking for an outstanding ML Architect (Deployments) with expertise in deploying Machine Learning solutions/models into production and scaling them to serve millions of customers, and with an adaptable, productive working style that fits a fast-moving environment.

 

Skills:

- 5+ years deploying Machine Learning pipelines in large enterprise production systems.

- Experience developing end to end ML solutions from business hypothesis to deployment / understanding the entirety of the ML development life cycle.
- Expert in modern software development practices; solid experience with source control management and CI/CD.
- Proficient in designing relevant architecture / microservices to fulfil application integration, model monitoring, training / re-training, model management, model deployment, model experimentation/development, alert mechanisms.
- Experience with public cloud platforms (Azure, AWS, GCP).
- Serverless services like AWS Lambda, Azure Functions, and/or Google Cloud Functions.
- Orchestration services like Data Factory, Data Pipeline, and/or Dataflow.
- Data science workbench / managed services like Azure Machine Learning, SageMaker, and/or AI Platform.
- Data warehouse services like Snowflake, Redshift, BigQuery, and Azure SQL DW.
- Distributed computing services like PySpark, EMR, and Databricks.
- Data storage services like Cloud Storage, S3, Blob Storage, and S3 Glacier.
- Data visualization tools like Power BI, Tableau, QuickSight, and/or Qlik.
- Proven experience serving up predictive algorithms and analytics through batch and real-time APIs.
- Solid working experience with software engineers, data scientists, product owners, business analysts, project managers, and business stakeholders to design the holistic solution.
- Strong technical acumen around automated testing.
- Extensive background in statistical analysis and modeling (distributions, hypothesis testing, probability theory, etc.)
- Strong hands-on experience with statistical packages and ML libraries (e.g., Python scikit-learn, Spark MLlib, etc.)
- Experience in effective data exploration and visualization (e.g., Excel, Power BI, Tableau, Qlik, etc.)
- Experience in developing and debugging in one or more of the languages Java, Python.
- Ability to work in cross functional teams.
- Apply Machine Learning techniques in production including, but not limited to, neural nets, regression, decision trees, random forests, ensembles, SVM, Bayesian models, K-Means, etc.
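Serving predictive algorithms through a batch API, as mentioned above, can be reduced to a small sketch. The weights, feature names, and threshold below are hypothetical stand-ins for a trained model that would normally be loaded from a model registry or artifact store:

```python
import math

# Hypothetical trained logistic-regression weights; in production these would
# come from a model registry, not be hard-coded.
WEIGHTS = {"age": 0.04, "visits": 0.3}
BIAS = -2.0


def predict_proba(features: dict) -> float:
    """Score one record with a logistic model: sigmoid(w·x + b)."""
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))


def batch_predict(records, threshold: float = 0.5):
    """The batch API surface: score every record, attach probability and label."""
    out = []
    for r in records:
        p = predict_proba(r)
        out.append({**r, "score": round(p, 4), "label": int(p >= threshold)})
    return out
```

A real-time variant would wrap `predict_proba` behind an HTTP endpoint; the scoring logic itself is identical, which is why the batch and real-time paths are usually built on the same model artifact.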

 

Roles and Responsibilities:

Deploying ML models into production, and scaling them to serve millions of customers.

Technical solutioning skills with a deep understanding of technical API integrations, AI / Data Science, Big Data, and public cloud architectures / deployments in a SaaS environment.

Strong stakeholder relationship management skills - able to influence and manage the expectations of senior executives.
Strong networking skills with the ability to build and maintain strong relationships with business, operations, and technology teams, both internally and externally.

Provide software design and programming support to projects.

 

 Qualifications & Experience:

Engineering graduates or postgraduates, preferably in Computer Science from premier institutions, with 5-7 years of proven experience as a Machine Learning Architect (Deployments) or in a similar role.

 

VAYUZ Technologies

at VAYUZ Technologies

1 video
4 recruiters
Pooja Chauhan
Posted by Pooja Chauhan
Remote, Bengaluru (Bangalore), Mumbai, Hyderabad, Chennai, Kolkata, Lucknow, Chandigarh
4 - 7 yrs
₹5L - ₹8L / yr
MEAN stack
NodeJS (Node.js)
Express
Angular
Angular (2+)
+9 more

Roles and Responsibilities
1. Ability to work on a diverse backend stack such as Node JS, Java, Express JS
2. Ability to work on a diverse frontend stack such as React JS, Angular 6/7/8/9, HTML5, CSS3
3. Ability to deliver quick POCs using cutting-edge technologies
4. Preparing reports, manuals and other documentation on the status, operation and maintenance of software
5. Design, develop, and unit test applications in accordance with established standards
6. Developing, refining, and tuning integrations between applications; analysing and resolving technical and application problems
7. Ability to debug applications
8. Complete knowledge of developing RESTful services
9. Able to work in an agile development methodology
10. Work with the designated JavaScript framework to design, develop, and debug web applications
11. Work on Angular and integrate backend services
12. Work with the team to manage, optimize, and customize multiple web applications
13. Manage end-to-end module lifecycle management of the product
14. Push and pull code via the Git repository


Competency Requirements

  1. Experience in NodeJS, Java and development using AngularJS / ReactJS
  2. Experience in front-end frameworks such as Angular.js, React.js, Bootstrap, Foundation, etc.
  3. Experience in client/server application development
  4. Knowledge of agile development methodologies
  5. Knowledge of unit testing theory
  6. Knowledge of AWS cloud
  7. Experience in Java, Python and Go will be added advantage
15-year-old US-based Product Company


Agency job
Chennai, Bengaluru (Bangalore), Hyderabad
4 - 10 yrs
₹9L - ₹20L / yr
Informatica
informatica developer
Informatica MDM
Data integration
Informatica Data Quality
+7 more
  • Should have good hands-on experience in Informatica MDM Customer 360, Data Integration (ETL) using PowerCenter, and Data Quality.
  • Must have strong skills in Data Analysis, Data Mapping for ETL processes, and Data Modeling.
  • Experience with the SIF framework, including real-time integration.
  • Should have experience in building C360 Insights using Informatica.
  • Should have good experience in creating performant designs using Mapplets, Mappings, and Workflows for Data Quality (cleansing) and ETL.
  • Should have experience in building different data warehouse architectures such as Enterprise, Federated, and Multi-Tier.
  • Should have experience in configuring Informatica Data Director for the Data Governance of users, IT Managers, and Data Stewards.
  • Should have good knowledge of developing complex PL/SQL queries.
  • Should have working experience on UNIX and shell scripting to run Informatica workflows and control the ETL flow.
  • Should know about Informatica Server installation and have knowledge of the Administration console.
  • Working experience with the Developer tool alongside Administration is an added advantage.
  • Working experience with Amazon Web Services (AWS) is an added advantage, particularly AWS S3, Data Pipeline, Lambda, Kinesis, DynamoDB, and EMR.
  • Should be responsible for the creation of automated BI solutions, including requirements, design, development, testing, and deployment.