Amazon Redshift Jobs in Pune

6+ Amazon Redshift Jobs in Pune | Amazon Redshift Job openings in Pune

Apply to 6+ Amazon Redshift Jobs in Pune on CutShort.io. Explore the latest Amazon Redshift Job opportunities across top companies like Google, Amazon & Adobe.

Wissen Technology
Posted by Vijayalakshmi Selvaraj
Pune
5 - 10 yrs
Best in industry
Object Oriented Programming (OOPs)
Amazon Redshift
DSA
Big Data
Hadoop
+3 more

Job Summary:

We are seeking a skilled Senior Data Engineer with expertise in application programming, big data technologies, and cloud services. This role involves solving complex problems, designing scalable systems, and working with advanced technologies to deliver innovative solutions.

Key Responsibilities:

  • Develop and maintain scalable applications using OOP principles, data structures, and problem-solving skills.
  • Build robust solutions using Java, Python, or Scala.
  • Work with big data technologies like Apache Spark for large-scale data processing.
  • Utilize AWS services, especially Amazon Redshift, for cloud-based solutions.
  • Manage databases including SQL, NoSQL (e.g., MongoDB, Cassandra), with Snowflake as a plus.

Qualifications:

  • 5+ years of experience in software development.
  • Strong skills in OOP, data structures, and problem-solving.
  • Proficiency in Java, Python, or Scala.
  • Experience with Spark, AWS (Redshift mandatory), and databases (SQL/NoSQL).
  • Snowflake experience is good to have.
Rigel Networks Pvt Ltd
Posted by Minakshi Soni
Bengaluru (Bangalore), Pune, Mumbai, Chennai
8 - 12 yrs
₹8L - ₹10L / yr
Amazon Web Services (AWS)
Terraform
Amazon Redshift
Redshift
Snowflake
+16 more

Dear Candidate,


We are urgently hiring an AWS Cloud Engineer for our Bangalore location.

Position: AWS Cloud Engineer

Location: Bangalore

Experience: 8-11 yrs

Skills: AWS Cloud

Salary: Best in industry (20-25% hike on current CTC)

Note:

Only candidates who can join immediately or within 15 days will be preferred.

Only candidates from Tier 1 companies will be shortlisted and selected.

Candidates with a notice period longer than 30 days will be rejected during screening.

Offer shoppers will be rejected.


Job description:

Title: AWS Cloud Engineer

Location: Bangalore or Hyderabad preferred; any other location is acceptable

Work Mode: Hybrid, based on HR rules (currently 1 day per month)

Shift Timings: 24x7 (work in shifts on a rotational basis)

Total Experience: 8+ years, with at least 5 years of relevant experience

Must have: AWS platform, Terraform, Redshift/Snowflake, Python/Shell scripting



Experience and Skills Requirements:


Experience:

8+ years of experience in a technical role working with AWS


Mandatory

Technical troubleshooting and problem solving

Management of large-scale AWS IaaS/PaaS solutions

Cloud networking and security fundamentals

Experience using containerization in AWS

Working data warehouse knowledge (Redshift and Snowflake preferred)

Working with IaC tools: Terraform and CloudFormation

Working understanding of scripting languages, including Python and Shell

Collaboration and communication skills

Highly adaptable to changes in a technical environment

 

Optional

Experience using monitoring and observability toolsets, including Splunk and Datadog

Experience using GitHub Actions

Experience using AWS RDS/SQL-based solutions

Experience working with streaming technologies, including Kafka and Apache Flink

Experience working with ETL environments

Experience working with the Confluent Cloud platform


Certifications:


Minimum

AWS Certified SysOps Administrator – Associate

AWS Certified DevOps Engineer - Professional



Preferred


AWS Certified Solutions Architect – Associate


Responsibilities:


Responsible for technical delivery of managed services across the NTT Data customer account base, working as part of a team providing a shared managed service.


The following is a list of expected responsibilities:


To manage and support a customer's AWS platform

To be technically hands-on

Provide incident and problem management on the AWS IaaS and PaaS platform

Involvement in the resolution of high-priority incidents and problems in an efficient and timely manner

Actively monitor the AWS platform for technical issues

To be involved in the resolution of technical incident tickets

Assist in the root cause analysis of incidents

Assist with improving efficiency and processes within the team

Examine traces and logs

Work with third-party suppliers and AWS to jointly resolve incidents


Good to have:


Confluent Cloud

Snowflake




Best Regards,

Minakshi Soni

Executive - Talent Acquisition (L2)

Rigel Networks

Worldwide Locations: USA | HK | IN 

Vijay Sales
Posted by Tech Recruiter
Pune
3 - 8 yrs
₹5L - ₹15L / yr
NodeJS (Node.js)
MongoDB
Express
Redis
Redux/Flux
+1 more

We are seeking a highly skilled Backend Developer to join our dynamic team. As a backend developer, you will play a critical role in designing, developing, and maintaining the core infrastructure and APIs that power our applications. You will work on creating scalable, efficient, and secure backend solutions leveraging cutting-edge technologies.

Key Responsibilities

  • Develop and maintain server-side logic, APIs, and services using Node.js.
  • Design and implement data storage solutions with MongoDB for scalability and reliability.
  • Integrate and manage caching mechanisms using Redis to enhance application performance.
  • Deploy, monitor, and maintain backend services on AWS using services like EC2, Lambda, S3, RDS, and more.
  • Ensure system security and data protection by following industry best practices.
  • Debug and optimize code for performance, scalability, and reliability.
  • Collaborate with frontend developers, product managers, and other stakeholders to ensure seamless integration and delivery.
  • Write clean, well-documented, and testable code.
  • Monitor and manage the health and performance of backend systems, implementing alerting and monitoring solutions.
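
The Redis caching responsibility above typically follows the cache-aside pattern: read from the cache first, fall back to the database on a miss, then populate the cache. A minimal sketch, with an in-process dict standing in for Redis so it stays self-contained (in production the `get`/`set` calls would go through a Redis client, and the key format and TTL here are illustrative assumptions):

```python
import time

class TTLCache:
    """Tiny in-process stand-in for Redis: values expire after a TTL."""

    def __init__(self):
        self._store = {}

    def get(self, key):
        value, expires_at = self._store.get(key, (None, 0))
        return value if time.time() < expires_at else None

    def set(self, key, value, ttl_seconds):
        self._store[key] = (value, time.time() + ttl_seconds)

def get_user(user_id, cache, fetch_from_db):
    """Cache-aside read: try the cache, fall back to the DB, then cache it."""
    cached = cache.get(f"user:{user_id}")
    if cached is not None:
        return cached
    user = fetch_from_db(user_id)  # the expensive call we want to avoid
    cache.set(f"user:{user_id}", user, ttl_seconds=60)
    return user
```

A second read of the same user within the TTL is served from the cache without touching the database.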

Required Skills and Qualifications

  • Strong proficiency in Node.js, with at least [Insert Minimum Years, e.g., 3+ years] of hands-on experience.
  • Hands-on experience with MongoDB and a strong understanding of NoSQL database design principles.
  • Proficiency in Redis, including configuring, maintaining, and optimizing its usage.
  • Solid experience with AWS services, including deploying, managing, and scaling cloud-based solutions.
  • Familiarity with RESTful API design and best practices.
  • Understanding of asynchronous programming and event-driven architecture.
  • Experience with version control systems like Git.
  • Knowledge of best practices in software development, including CI/CD and automated testing.
  • Problem-solving mindset with strong debugging and troubleshooting skills.

Preferred Skills

  • Familiarity with containerization and orchestration tools like Docker and Kubernetes.
  • Experience with serverless architecture and services such as AWS Lambda.
  • Knowledge of authentication and authorization mechanisms (OAuth, JWT).
  • Exposure to message queues like RabbitMQ or Amazon SQS.


Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Pune, Hyderabad, Ahmedabad, Chennai
3 - 7 yrs
₹8L - ₹15L / yr
AWS Lambda
Amazon S3
Amazon VPC
Amazon EC2
Amazon Redshift
+3 more

Technical Skills:


  • Ability to understand and translate business requirements into design.
  • Proficient in AWS infrastructure components such as S3, IAM, VPC, EC2, and Redshift.
  • Experience in creating ETL jobs using Python/PySpark.
  • Proficiency in creating AWS Lambda functions for event-based jobs.
  • Knowledge of automating ETL processes using AWS Step Functions.
  • Competence in building data warehouses and loading data into them.


Responsibilities:


  • Understand business requirements and translate them into design.
  • Assess AWS infrastructure needs for development work.
  • Develop ETL jobs using Python/PySpark to meet requirements.
  • Implement AWS Lambda for event-based tasks.
  • Automate ETL processes using AWS Step Functions.
  • Build data warehouses and manage data loading.
  • Engage with customers and stakeholders to articulate the benefits of proposed solutions and frameworks.
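
The Lambda-for-event-based-jobs item above usually means a handler that reacts to an S3 "object created" notification and hands the new file off to a downstream ETL step. A minimal sketch (the bucket and key values are illustrative, not from any specific deployment):

```python
import json
import urllib.parse

def handler(event, context):
    """Minimal AWS Lambda handler for an S3 object-created event.

    Extracts the bucket and key from each event record so a downstream
    ETL step (e.g. a PySpark job or Step Functions state machine) can
    pick up the newly arrived file.
    """
    records = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        # S3 delivers object keys URL-encoded (spaces become '+').
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        records.append({"bucket": bucket, "key": key})
    return {"statusCode": 200, "body": json.dumps(records)}
```

In a Step Functions setup, this handler's output would feed the next state rather than being returned to a caller.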
EASEBUZZ
Posted by Amala Baby
Pune
2 - 4 yrs
₹2L - ₹20L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+12 more

Company Profile:

 

Easebuzz is a payment-solutions fintech company that enables online merchants to accept, process, and disburse payments through developer-friendly APIs. We focus on building plug-and-play products, including the payment infrastructure, to solve complete business problems. It is a place where payments, lending, subscriptions, and eKYC all happen at the same time.

 

We have been consistently profitable and are constantly developing innovative new products; as a result, we have grown 4x over the past year alone. We are well capitalised and recently closed a $4M fundraise in March 2021 from prominent VC firms and angel investors. The company is based in Pune and has a total strength of 180 employees. Easebuzz's corporate culture is tied to the vision of building a workplace that breeds open communication and minimal bureaucracy. An equal-opportunity employer, we welcome and encourage diversity in the workplace. One thing you can be sure of is that you will be surrounded by colleagues who are committed to helping each other grow.

 

Easebuzz Pvt. Ltd. has its presence in Pune, Bangalore, Gurugram.

 


Salary: As per company standards.

 

Designation: Data Engineer

 

Location: Pune

 

  • Experience with ETL, data modeling, and data architecture.
  • Design, build, and operationalize large-scale enterprise data solutions and applications using one or more AWS data and analytics services in combination with third parties: Spark, EMR, DynamoDB, Redshift, Kinesis, Lambda, Glue.
  • Experience with AWS cloud data lakes for development of real-time or near real-time use cases.
  • Experience with messaging systems such as Kafka/Kinesis for real-time data ingestion and processing.
  • Build data pipeline frameworks to automate high-volume and real-time data delivery.
  • Create prototypes and proofs of concept for iterative development.
  • Experience with NoSQL databases such as DynamoDB and MongoDB.
  • Create and maintain optimal data pipeline architecture.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS 'big data' technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
  • Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
  • Create data tools for analytics and data science team members that assist them in building and optimizing our product into an innovative industry leader.
  • Evangelize a very high standard of quality, reliability, and performance for data models and algorithms that can be streamlined into the engineering and science workflow.
  • Build and enhance data pipeline architecture by designing and implementing data ingestion solutions.
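
The pipeline-building responsibilities above boil down to chaining extract, transform, and load stages so records stream through without the full dataset ever being held in memory. A minimal sketch in plain Python; the CSV format, field names, and list sink are illustrative assumptions, not the actual stack (in a real deployment the sink might be a Redshift COPY or a Kinesis put):

```python
def extract(rows):
    """Extract stage: yield raw CSV-like rows one at a time (streaming)."""
    for row in rows:
        yield row.strip().split(",")

def transform(records):
    """Transform stage: drop malformed rows and normalize the amount field."""
    for name, amount in records:
        if amount.isdigit():
            yield {"name": name, "amount": int(amount)}

def load(records, sink):
    """Load stage: append each record to the sink."""
    for record in records:
        sink.append(record)
    return sink

def run_pipeline(rows, sink):
    """Compose the stages, mirroring how a pipeline framework chains steps."""
    return load(transform(extract(rows)), sink)
```

Because each stage is a generator, adding a new step (deduplication, enrichment) means inserting one more function into the chain rather than rewriting the pipeline.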

 

Employment Type

Full-time

 

A marketing technology and services company

Agency job
via Karsh Solutions by Manjula D
Pune
2 - 9 yrs
₹4L - ₹13L / yr
MS SQL Server
PostgreSQL
MySQL DBA
C#
MySQL
+1 more
SUMMARY

The Database Developer will perform day-to-day database management, maintenance, and troubleshooting, providing Tier 1 and Tier 2 support for diverse platforms including, but not limited to, MS SQL, Azure SQL, MySQL, PostgreSQL, and Amazon Redshift.

They are responsible for maintaining functional/technical support documentation and operational documentation, as well as reporting on performance metrics associated with job activity and platform stability.

They must adhere to SLAs pertaining to data movement and provide evidence and supporting documentation for incidents that violate those SLAs.

Other responsibilities include API development and integrations via Azure Functions, C#, or Python.


Essential Duties and Responsibilities

• Advanced problem-solving skills
• Excellent communication skills
• Advanced T-SQL scripting skills
• Query optimization and performance tuning, with familiarity with traces, execution plans, and server logs
• SSIS package development and support
• PowerShell scripting
• Report visualization via SSRS, Power BI, and/or Jupyter Notebook
• Maintain functional/technical support documentation
• Maintain operational documentation specific to automated jobs and job steps
• Develop, implement, and support user-defined stored procedures, functions, and (indexed) views
• Monitor database activities and provide Tier 1 and Tier 2 production support
• Provide functional and technical support to ensure performance, operation, and stability of database systems
• Manage data ingress and egress
• Track issue and/or project deliverables in Jira
• Assist in RDBMS patching, upgrades, and enhancements
• Prepare database reports for managers as needed
• API integrations and development
Background/Experience

• Bachelor's or advanced degree in computer science
• Microsoft SQL Server 2016 or higher
• Working knowledge of MySQL, PostgreSQL, and/or Amazon Redshift
• C# and/or Python


Supervisory/Budget Responsibility

• No Supervisory Responsibility/No Budget Responsibility


Level of Authority to Make Decisions

The Database Developer expedites issue resolution pursuant to the available functional/technical documentation. Issue escalation is at their discretion and should result in additional functional/technical documentation for future reference. However, individual problem solving, decision making, and performance tuning will constitute 75% of their time.