Amazon S3 Jobs in Hyderabad


Apply to 9+ Amazon S3 jobs in Hyderabad on CutShort.io. Explore the latest Amazon S3 job opportunities across top companies like Google, Amazon & Adobe.

Hyderabad
7 - 12 yrs
₹12L - ₹24L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark

Skills

• Minimum 7 years of hands-on experience with Hadoop.
• Minimum 2 years of hands-on experience with AWS - EMR, S3, and other AWS services and dashboards.
• Minimum 2 years of solid experience with the Spark framework.
• Good understanding of the Hadoop ecosystem, including Hive, MapReduce, Spark, and Zeppelin.
• Responsible for troubleshooting Spark and MapReduce jobs and recommending fixes; should be able to debug issues from existing logs.
• Responsible for the implementation and ongoing administration of Hadoop infrastructure, including monitoring, tuning, and troubleshooting.
• Triage production issues as they occur, together with other operational teams.
• Hands-on ability to troubleshoot incidents, formulate and test hypotheses, and narrow down possibilities to find the root cause.
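Given the EMR/S3 and Spark duties above, here is a minimal, hedged PySpark sketch of the kind of job being administered and debugged. The bucket, path, and column names are placeholders, and the EMR cluster's instance profile is assumed to grant S3 read access.

    # Minimal PySpark sketch: read S3 data on EMR and run a small aggregation.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("emr-s3-example").getOrCreate()

    # EMR resolves s3:// paths via EMRFS; no explicit credentials are needed
    # when the cluster role grants read access. Path is a placeholder.
    df = spark.read.json("s3://example-bucket/logs/2024/")

    # The kind of quick aggregation a troubleshooting session might start from
    # ("status" is a hypothetical column).
    df.groupBy("status").count().orderBy("count", ascending=False).show(20)

    spark.stop()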
cirruslabs
Posted by shivani sagar
Hyderabad
3 - 6 yrs
₹8L - ₹16L / yr
Python
Django
Flask
AWS Lambda
Amazon Web Services (AWS)
Role Specific Responsibilities
• Interfaces with other processes and/or business functions to ensure they can leverage the benefits provided by the AWS platform process
• Responsible for managing the configuration of all IaaS assets across the platforms
• Hands-on Python experience
• Manages the entire AWS platform (Python, Flask, REST API, Serverless Framework) and recommends the options that best meet the organization's requirements
• Good understanding of the various AWS services, particularly S3, Athena, Glue, Lambda, CloudFormation, and other AWS serverless resources, along with the Python code that ties them together (a minimal sketch follows this list)
• AWS certification is a plus
• Knowledge of best practices for IT operations in an always-on, always-available service model
• Responsible for the execution of process controls, ensuring that staff comply with process and data standards
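As a hedged illustration of the serverless pattern named above, here is a minimal Python Lambda handler that reads an object from S3 with boto3. The bucket, key, and event shape are hypothetical; a real deployment would wire these up via a Serverless Framework or CloudFormation template.

    # Minimal AWS Lambda sketch: fetch an S3 object and return its body.
    import boto3

    s3 = boto3.client("s3")

    def handler(event, context):
        # Assume the caller (e.g. API Gateway) passes bucket/key in the event;
        # the defaults here are placeholders.
        bucket = event.get("bucket", "example-bucket")
        key = event.get("key", "config/settings.json")
        obj = s3.get_object(Bucket=bucket, Key=key)
        body = obj["Body"].read().decode("utf-8")
        return {"statusCode": 200, "body": body}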
Qualifications
• Bachelor's degree in Computer Science or Business Information Systems, or relevant experience and accomplishments
• 3 to 6 years of experience in the IT field
• AWS Python developer
• AWS, Serverless/Lambda, middleware

• Strong AWS skills, including Data Pipeline, S3, RDS, and Redshift, with familiarity with other components such as Lambda, Glue, Step Functions, and CloudWatch (an S3-to-Redshift sketch follows this list)
• Must have created REST APIs with AWS Lambda
• 3 years of relevant Python experience
• Good to have: experience working on projects and problem solving with large-scale, multi-vendor teams
• Good to have: knowledge of Agile development
• Good knowledge of the SDLC
• Hands-on experience with AWS databases (RDS, etc.)
• Good to have: unit-testing experience
• Good to have: working knowledge of CI/CD
• Good communication skills, as there will be client interaction and documentation
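The S3/Redshift combination above is commonly exercised as a COPY from S3. Here is a hedged sketch using boto3's Redshift Data API; the cluster, database, IAM role, table, and S3 path are all placeholders.

    # Sketch: load S3 data into Redshift via the Redshift Data API.
    import boto3

    client = boto3.client("redshift-data")

    copy_sql = """
        COPY analytics.events
        FROM 's3://example-bucket/events/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-role'
        FORMAT AS JSON 'auto';
    """

    response = client.execute_statement(
        ClusterIdentifier="example-cluster",  # placeholder cluster
        Database="dev",
        DbUser="awsuser",
        Sql=copy_sql,
    )
    print(response["Id"])  # statement id, usable with describe_statement()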

Education (degree): Bachelor’s degree in Computer Science, Business Information Systems or relevant
experience and accomplishments
Years of Experience: 3-6 years
Technical Skills
• Linux/Unix system administration
• Continuous Integration/Continuous Delivery tools like Jenkins
• Cloud provisioning and management – Azure, AWS, GCP
• Ansible, Chef, or Puppet
• Python, PowerShell & Bash
Job Details
• JOB TITLE/JOB CODE: AWS Python Developer, III-Sr. Analyst
• RC: TBD
• PREFERRED LOCATION: HYDERABAD, IND
• POSITION REPORTS TO: Manager, USI T&I Cloud Managed Platform
• CAREER LEVEL: 3
Work Location:
Hyderabad
Neo Technology Solutions
Remote only
7 - 9 yrs
₹20L - ₹25L / yr
Java
Amazon Web Services (AWS)
Amazon SQS
Amazon S3
Spring Boot

We are looking for a focused, energetic, and motivated Senior Java Developer. You must have solid experience in Java development, Spring, Spring Boot, AWS, SQS, and S3. You will be challenged with complex and robust product development.

As a Senior Java Developer, you will be the most senior developer within the company, working alongside other Java Developers, the Engineering Manager, and the CTO.

 

Senior Java Developer requirements

Solid development experience using Java, Spring Boot, SQS, S3, and microservices (an SQS/S3 sketch follows this list)

Experience working with the Java Swing framework

Revise, update, refactor, and debug code

Write clear, concise, and efficient code and ensure it is fully tested

Service and develop features using Java with AWS, JUnit, SQL, and the Spring framework

Good knowledge of cloud technologies (ideally AWS)

Participate in peer code reviews and work in a collaborative, agile environment

Demonstrate technical knowledge and expertise, with a clear understanding of the product's technical composition
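The SQS-plus-S3 integration named above, sketched briefly in Python with boto3 (the role itself is Java/Spring Boot, where the AWS SDK calls are analogous). The queue URL and bucket are placeholders.

    # Sketch: long-poll SQS, archive each message body to S3, then delete it.
    import boto3

    sqs = boto3.client("sqs")
    s3 = boto3.client("s3")
    QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/example-queue"

    resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=1,
                               WaitTimeSeconds=10)
    for msg in resp.get("Messages", []):
        s3.put_object(Bucket="example-archive-bucket",
                      Key=f"messages/{msg['MessageId']}.json",
                      Body=msg["Body"].encode("utf-8"))
        sqs.delete_message(QueueUrl=QUEUE_URL,
                           ReceiptHandle=msg["ReceiptHandle"])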

 

Senior Java Developer responsibilities

7-9 years' minimum work experience

Excellent analytical, troubleshooting, and communication skills

Excellent attention to detail and time-management skills

Serves as an expert on the applications developed

Ability to adapt and learn new products/technologies is key

Collaborates with internal teams to produce software design and architecture

Knowledge of Agile frameworks (e.g. Scrum) and experience working with Jira or Spira

Attend daily stand-up meetings alongside the Scrum Master

A self-starter: a hardworking and committed individual

Work UK hours, ideally 9.00am - 6.00pm, with the expectation to work out of hours when needed

Genesys

Posted by Manojkumar Ganesh
Chennai, Hyderabad
4 - 10 yrs
₹10L - ₹40L / yr
ETL
Data Warehousing
Business Intelligence (BI)
Big Data
PySpark

Join our team

 

We're looking for an experienced and passionate Data Engineer to join our team. Our vision is to empower Genesys to leverage data to drive better customer and business outcomes. Our batch and streaming solutions turn vast amounts of data into useful insights. If you're interested in working with the latest big data technologies, using industry-leading BI analytics and visualization tools, and bringing the power of data to our customers' fingertips, then this position is for you!

 

Our ideal candidate thrives in a fast-paced environment, enjoys the challenge of highly complex business contexts (typically being defined in real time), and is, above all, passionate about data and analytics.

 

 

What you'll get to do

 

  • Work in an agile development environment, constantly shipping and iterating.
  • Develop high quality batch and streaming big data pipelines.
  • Interface with our Data Consumers, gathering requirements, and delivering complete data solutions.
  • Own the design, development, and maintenance of datasets that drive key business decisions.
  • Support, monitor and maintain the data models
  • Adopt and define the standards and best practices in data engineering including data integrity, performance optimization, validation, reliability, and documentation.
  • Keep up-to-date with advances in big data technologies and run pilots to design the data architecture to scale with the increased data volume using cloud services.
  • Triage many possible courses of action in a high-ambiguity environment, making use of both quantitative analysis and business judgment.

 

Your experience should include

 

  • Bachelor’s degree in CS or related technical field.
  • 5+ years of experience in data modelling, data development, and data warehousing.
  • Experience working with Big Data technologies (Hadoop, Hive, Spark, Kafka, Kinesis).
  • Experience with large scale data processing systems for both batch and streaming technologies (Hadoop, Spark, Kinesis, Flink).
  • Experience in programming using Python, Java or Scala.
  • Experience with data orchestration tools (Airflow, Oozie, Step Functions).
  • Solid understanding of database technologies including NoSQL and SQL.
  • Strong SQL skills (experience with the Snowflake Cloud Data Warehouse is a plus)
  • Work experience with Talend is a plus
  • Track record of delivering reliable data pipelines with solid test infrastructure, CICD, data quality checks, monitoring, and alerting.
  • Strong organizational and multitasking skills with ability to balance competing priorities.
  • Excellent communication (verbal and written) and interpersonal skills and an ability to effectively communicate with both business and technical teams.
  • An ability to work in a fast-paced environment where continuous innovation is occurring, and ambiguity is the norm.
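As a hedged illustration of the orchestration experience listed above (Airflow, Oozie, Step Functions), here is a minimal Airflow DAG with one extract task feeding one load task. The DAG id, schedule, and task bodies are placeholders.

    # Minimal Airflow sketch: a daily two-step batch pipeline.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull raw events from the source system")  # placeholder

    def load():
        print("write curated output to the warehouse")  # placeholder

    with DAG(dag_id="example_daily_pipeline",
             start_date=datetime(2024, 1, 1),
             schedule_interval="@daily",
             catchup=False) as dag:
        extract_task = PythonOperator(task_id="extract",
                                      python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task  # extract runs before load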

 

Good to have

  • Experience with AWS big data technologies - S3, EMR, Kinesis, Redshift, Glue
Hyderabad
8 - 12 yrs
₹10L - ₹34L / yr
Ansible
DevOps
Jenkins
Docker
Kubernetes
  • Collaborate with Dev, QA and Data Science teams on environment maintenance, monitoring (ELK, Prometheus or equivalent), deployments and diagnostics
  • Administer a hybrid datacenter, including AWS and EC2 cloud assets
  • Administer, automate, and troubleshoot container-based solutions deployed on AWS ECS
  • Be able to troubleshoot problems and provide feedback to engineering on issues
  • Automate deployment (Ansible, Python), build (Git, Maven, Make, or equivalent), and integration (Jenkins, Nexus) processes
  • Learn and administer technologies such as ELK, Hadoop, etc.
  • Be a self-starter with enthusiasm to learn and pick up new technologies in a fast-paced environment.

Need to have

  • Hands-on Experience in Cloud based DevOps
  • Experience working in AWS (EC2, S3, CloudFront, ECR, ECS, etc.) (an ECS redeploy sketch follows this list)
  • Experience with any programming language.
  • Experience using Ansible, Docker, Jenkins, Kubernetes
  • Experience in Python.
  • Should be very comfortable working in a Linux/Unix environment.
  • Exposure to shell scripting.
  • Solid troubleshooting skills
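One routine ECS task from the duties above, sketched with boto3: forcing a new deployment of a service (e.g. after pushing a fresh image to ECR under the same tag) and checking the rollout. Cluster and service names are placeholders.

    # Sketch: redeploy an ECS service and inspect its rollout state.
    import boto3

    ecs = boto3.client("ecs")
    CLUSTER, SERVICE = "example-cluster", "example-service"

    # Redeploy with the current task definition.
    ecs.update_service(cluster=CLUSTER, service=SERVICE,
                       forceNewDeployment=True)

    # Print status and task counts for each active deployment.
    desc = ecs.describe_services(cluster=CLUSTER, services=[SERVICE])
    for deployment in desc["services"][0]["deployments"]:
        print(deployment["status"], deployment["runningCount"], "/",
              deployment["desiredCount"])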
Dremio

Posted by Kiran B
Hyderabad, Bengaluru (Bangalore)
15 - 20 yrs
Best in industry
Java
Data Structures
Algorithms
Multithreading
Problem solving

About the Role

The Dremio India team owns the DataLake Engine along with the cloud infrastructure and services that power it. With a focus on next-generation data analytics, supporting modern table formats like Iceberg and Delta Lake, open-source initiatives such as Apache Arrow and Project Nessie, and hybrid-cloud infrastructure, this team provides many opportunities to learn, deliver, and grow in your career. We are looking for technical leaders with passion and experience in architecting and delivering high-quality distributed systems at massive scale.

Responsibilities & ownership

  • Lead end-to-end delivery and customer success of next-generation features related to scalability, reliability, robustness, usability, security, and performance of the product
  • Lead and mentor others on concurrency and parallelization to deliver scalability, performance, and resource optimization in multithreaded and distributed environments
  • Propose and promote strategic company-wide tech investments that account for business goals, customer requirements, and industry standards
  • Lead the team to solve complex, unknown and ambiguous problems, and customer issues cutting across team and module boundaries with technical expertise, and influence others
  • Review and influence designs of other team members 
  • Design and deliver architectures that run optimally on public clouds like GCP, AWS, and Azure
  • Partner with other leaders to nurture innovation and engineering excellence in the team
  • Drive priorities with others to facilitate timely accomplishments of business objectives
  • Perform RCA of customer issues and drive investments to avoid similar issues in the future
  • Collaborate with Product Management, Support, and field teams to ensure that customers are successful with Dremio
  • Proactively suggest learning opportunities about new technology and skills, and be a role model for constant learning and growth

Requirements

  • B.S./M.S./equivalent in Computer Science or a related technical field, or equivalent experience
  • Fluency in Java/C++ with 15+ years of experience developing production-level software
  • Strong foundation in data structures, algorithms, multi-threaded and asynchronous programming models and their use in developing distributed and scalable systems
  • 8+ years of experience developing complex and scalable distributed systems and delivering, deploying, and managing microservices successfully
  • Subject-matter expert in one or more of: query processing or optimization, distributed systems, concurrency, microservice-based architectures, data replication, networking, storage systems
  • Experience in taking company-wide initiatives, convincing stakeholders, and delivering them
  • Expert in solving complex, unknown and ambiguous problems spanning across teams and taking initiative in planning and delivering them with high quality
  • Ability to anticipate and propose plan/design changes based on changing requirements 
  • Passion for quality, zero downtime upgrades, availability, resiliency, and uptime of the platform
  • Passion for learning and delivering using latest technologies
  • Hands-on experience working on projects on AWS, Azure, and GCP
  • Experience with containers and Kubernetes for orchestration and container management in private and public clouds (AWS, Azure, and GCP)
  • Understanding of distributed file systems such as S3, ADLS, or HDFS (a PyArrow sketch follows this list)
  • Excellent communication skills and affinity for collaboration and teamwork
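A small sketch touching two items above, Apache Arrow and S3 as a distributed file system: reading a Parquet dataset straight from S3 with PyArrow. The bucket, prefix, and region are placeholders, and credentials are assumed to come from the environment.

    # Sketch: read a Parquet dataset from S3 via PyArrow's S3 filesystem.
    import pyarrow.parquet as pq
    from pyarrow import fs

    s3 = fs.S3FileSystem(region="us-east-1")  # placeholder region
    table = pq.read_table("example-bucket/warehouse/orders/", filesystem=s3)
    print(table.num_rows, table.schema)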

 

VAYUZ Technologies

Posted by Pooja Chauhan
Remote, Bengaluru (Bangalore), Mumbai, Hyderabad, Chennai, Kolkata, Lucknow, Chandigarh
4 - 7 yrs
₹5L - ₹8L / yr
MEAN stack
NodeJS (Node.js)
Express
Angular
Angular (2+)

Roles and Responsibilities
1. Ability to work on a diverse backend stack such as Node.js, Java, and Express.js
2. Ability to work on a diverse frontend stack such as React.js, Angular 6/7/8/9, HTML5, and CSS3
3. Ability to deliver quick POCs using cutting-edge technologies.
4. Preparing reports, manuals, and other documentation on the status, operation, and maintenance of software.
5. Design, develop, and unit test applications in accordance with established standards.
6. Developing, refining, and tuning integrations between applications; analysing and resolving technical and application problems.
7. Ability to debug applications.
8. Should have complete knowledge of developing RESTful services (a minimal sketch follows this list).
9. Should also be able to work in an agile development methodology.
10. Work with the designated JavaScript framework to design, develop, and debug web applications.
11. Can work on Angular and integrate backend services.
12. Work with the team to manage, optimize, and customize multiple web applications.
13. Manage the end-to-end module lifecycle of the product.
14. Push and pull code via Git repositories.
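A minimal RESTful-service sketch for point 8 above. The posting centres on Node/Express, but Python is listed as an advantage in the competencies below, so the shape is shown here with Flask; an Express handler would mirror it. The route and in-memory store are hypothetical.

    # Sketch: a tiny REST resource with GET and PUT handlers.
    from flask import Flask, jsonify, request

    app = Flask(__name__)
    items = {}  # in-memory store, for illustration only

    @app.route("/api/items/<item_id>", methods=["GET"])
    def get_item(item_id):
        if item_id not in items:
            return jsonify({"error": "not found"}), 404
        return jsonify(items[item_id])

    @app.route("/api/items/<item_id>", methods=["PUT"])
    def put_item(item_id):
        items[item_id] = request.get_json()
        return jsonify(items[item_id]), 201

    if __name__ == "__main__":
        app.run(port=5000)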


Competency Requirements

  1. Experience in NodeJS, Java, and development using AngularJS/ReactJS
  2. Experience in frontend frameworks such as Angular.js, React.js, Bootstrap, Foundation, etc.
  3. Experience in client/server application development
  4. Knowledge of agile development methodologies
  5. Knowledge of unit testing theory
  6. Knowledge of the AWS cloud
  7. Experience in Java, Python, and Go is an added advantage
Hyderabad
8 - 14 yrs
₹15L - ₹35L / yr
Microservices
Technical Architecture
Technical architect
Java
Amazon Web Services (AWS)
Looking for a great open-source architect with a strong understanding of how best to leverage and exploit a language's unique paradigms, idioms, and syntax. The primary focus will be on providing architecture and design that are scalable, secure, and maintainable. A commitment to collaborative problem solving, sophisticated design, and a quality product is essential.

Responsibilities:
  • Working directly with clients to understand requirements for a greenfield development
  • Designing the technology and cloud infrastructure architecture to achieve the functional and non-functional requirements
  • The product you will be working on needs to scale to support millions of users on an efficient, distributed microservice architecture
  • The solution needs to be easy to deploy and manage on multiple cloud providers (AWS, GCP, or Azure)
  • Mentoring, guiding, and training the team on various technology, quality, and security aspects
  • Guiding the team to implement automated CI/CD processes
  • Strong analytical, problem-solving, and data-analysis skills
  • Excellent communication, presentation, and interpersonal skills are a must
  • Microservice frameworks such as Java Spring Boot
  • Design and implement automated unit and integration tests
  • Writing scalable, robust, testable, efficient, and easily maintainable code
  • Familiarity with most AWS services - EC2, ECS, RDS, ECR, S3, SNS, SQS, and more
  • Experience with Docker and Kubernetes
  • Deploying and scaling microservices on AWS
  • Hands-on experience with the AWS cloud platform for evaluation and cost estimation
Chennai, Bengaluru (Bangalore), Hyderabad
4 - 10 yrs
₹9L - ₹20L / yr
Informatica
Informatica developer
Informatica MDM
Data integration
Informatica Data Quality
  • Should have good hands-on experience in Informatica MDM Customer 360, Data Integration (ETL) using PowerCenter, and Data Quality.
  • Must have strong skills in data analysis, data mapping for ETL processes, and data modeling.
  • Experience with the SIF framework, including real-time integration
  • Should have experience in building C360 Insights using Informatica
  • Should have good experience in creating performant designs using Mapplets, Mappings, and Workflows for Data Quality (cleansing) and ETL.
  • Should have experience in building different data warehouse architectures such as Enterprise, Federated, and Multi-Tier.
  • Should have experience in configuring Informatica Data Director for the data governance of users, IT managers, and data stewards.
  • Should have good knowledge of developing complex PL/SQL queries.
  • Should have working experience on UNIX and shell scripting to run Informatica workflows and control the ETL flow (a pmcmd sketch follows this list).
  • Should know about Informatica server installation and have knowledge of the Administration Console.
  • Developer experience combined with Administration experience is added knowledge.
  • Working experience in Amazon Web Services (AWS) is an added advantage, particularly AWS S3, Data Pipeline, Lambda, Kinesis, DynamoDB, and EMR.
  • Should be responsible for the creation of automated BI solutions, including requirements, design, development, testing, and deployment
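A hedged sketch of the shell-driven workflow control mentioned above: kicking off an Informatica workflow with the pmcmd command-line tool from a Python script. pmcmd is assumed to be on the PATH; the service, domain, folder, workflow names, and credentials are all placeholders (credentials would normally come from a secured source, not literals).

    # Sketch: start an Informatica workflow via pmcmd and report the result.
    import subprocess

    cmd = [
        "pmcmd", "startworkflow",
        "-sv", "IS_EXAMPLE",        # integration service (placeholder)
        "-d", "Domain_Example",     # domain (placeholder)
        "-u", "etl_user", "-p", "REPLACE_ME",
        "-f", "EXAMPLE_FOLDER",     # repository folder (placeholder)
        "-wait",                    # block until the workflow finishes
        "wf_example_load",          # workflow name (placeholder)
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.returncode, result.stdout)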