AWS RDS Jobs in Chennai


Apply to 3+ AWS RDS Jobs in Chennai on CutShort.io. Explore the latest AWS RDS Job opportunities across top companies like Google, Amazon & Adobe.

Moative

Posted by Eman Khan
Chennai
3 - 5 yrs
₹10L - ₹25L / yr
Python
PySpark
Scala
Data engineering
ETL
+12 more

About Moative

Moative, an Applied AI company, designs and builds transformative AI solutions for traditional industries in energy, utilities, healthcare & life sciences, and more. Through Moative Labs, we build AI micro-products and launch AI startups with partners in vertical markets that align with our theses.


Our Past: We have built and sold two companies, one of which was an AI company. Our founders and leaders are Math PhDs, Ivy League University Alumni, Ex-Googlers, and successful entrepreneurs.


Our Team: Our team of 20+ employees consists of data scientists, AI/ML engineers, and mathematicians, including PhDs, from top engineering and research institutes such as the IITs, CERN, IISc, and UZH. Our team includes academicians, IBM Research Fellows, and former founders.


Work you’ll do

As a Data Engineer, you will work on data architecture, large-scale processing systems, and data flow management. You will build and maintain optimal data architecture and data pipelines, assemble large, complex data sets, and ensure that data is readily available to data scientists, analysts, and other users. In close collaboration with ML engineers, data scientists, and domain experts, you’ll deliver robust, production-grade solutions that directly impact business outcomes. Ultimately, you will be responsible for developing and implementing systems that optimize the organization’s data use and data quality.


Responsibilities

  • Create and maintain optimal data architecture and data pipelines on cloud infrastructure (such as AWS, Azure, or GCP)
  • Assemble large, complex data sets that meet functional and non-functional business requirements
  • Identify, design, and implement internal process improvements
  • Build the pipeline infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources (a sketch of such an ETL step follows this list)
  • Support development of analytics that utilize the data pipeline to provide actionable insights into key business metrics
  • Work with stakeholders to assist with data-related technical issues and support their data infrastructure needs
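As referenced above, much of the role is building extraction, transformation, and loading steps on cloud infrastructure. The following is a minimal PySpark sketch of such a step (PySpark is one of the listed skills); the bucket paths, column names, and aggregation are hypothetical placeholders, not details from the job posting.

    # Minimal PySpark ETL sketch: read raw events, clean and aggregate, write curated output.
    # All paths and column names below are hypothetical placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily-usage-etl").getOrCreate()

    # Extract: read raw JSON events from object storage (path is illustrative).
    raw = spark.read.json("s3a://example-raw-bucket/events/2024-01-01/")

    # Transform: drop malformed rows and compute a daily count per customer.
    daily = (
        raw.filter(F.col("event_type").isNotNull())
           .withColumn("event_date", F.to_date("event_timestamp"))
           .groupBy("customer_id", "event_date")
           .agg(F.count("*").alias("event_count"))
    )

    # Load: write partitioned Parquet for analysts and data scientists to query.
    daily.write.mode("overwrite").partitionBy("event_date").parquet(
        "s3a://example-curated-bucket/daily_usage/"
    )

    spark.stop()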


Who you are

You are a passionate and results-oriented engineer who understands the importance of data architecture and data quality to impact solution development, enhance products, and ultimately improve business applications. You thrive in dynamic environments and are comfortable navigating ambiguity. You possess a strong sense of ownership and are eager to take initiative, advocating for your technical decisions while remaining open to feedback and collaboration. 


You have experience in developing and deploying data pipelines to support real-world applications. You have a good understanding of data structures and are excellent at writing clean, efficient code to extract, create and manage large data sets for analytical uses. You have the ability to conduct regular testing and debugging to ensure optimal data pipeline performance. You are excited at the possibility of contributing to intelligent applications that can directly impact business services and make a positive difference to users.


Skills & Requirements

  • 3+ years of hands-on experience as a data engineer, data architect, or in a similar role, with a good understanding of data structures and data engineering.
  • Solid knowledge of cloud infrastructure and data-related services on AWS (EC2, EMR, RDS, Redshift) and/or Azure.
  • Advanced knowledge of SQL, including writing complex queries, stored procedures, views, etc.
  • Strong experience with data pipeline and workflow management tools such as Luigi or Airflow (see the DAG sketch after this list).
  • Experience with common relational SQL, NoSQL, and graph databases.
  • Strong experience with scripting languages: Python, PySpark, Scala, etc.
  • Practical experience with basic DevOps concepts: CI/CD, containerization (Docker, Kubernetes), etc.
  • Experience with big data tools (Spark, Kafka, etc.) and stream processing.
  • Excellent communication skills to collaborate with colleagues from both technical and business backgrounds and to discuss and convey ideas and findings effectively.
  • Ability to analyze complex problems, think critically when troubleshooting, and develop robust data solutions.
  • Ability to identify and tackle issues efficiently and proactively, conduct thorough research, and collaborate to find long-term, scalable solutions.
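Since Airflow is one of the workflow management tools named above, here is a minimal sketch of how a daily pipeline might be orchestrated as an Airflow DAG (assuming Airflow 2.4 or later). The DAG id, schedule, and task callables are hypothetical placeholders.

    # Minimal Airflow DAG sketch: a daily extract -> load pipeline with two tasks.
    # DAG id, schedule, and callables are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract(**context):
        # Placeholder: pull a day's worth of records from a source system.
        print("extracting source data")


    def load(**context):
        # Placeholder: write transformed records to the warehouse.
        print("loading into the warehouse")


    with DAG(
        dag_id="example_daily_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)

        # Run extract before load.
        extract_task >> load_task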


Working at Moative

Moative is a young company, but we believe strongly in thinking long-term while acting with urgency. Our ethos is rooted in innovation, efficiency, and high-quality outcomes. We believe the future of work is AI-augmented and boundaryless. Here are some of our guiding principles:

  • Think in decades. Act in hours. As an independent company, our moat is time. While our decisions are for the long-term horizon, our execution will be fast – measured in hours and days, not weeks and months.
  • Own the canvas. Throw yourself in to build, fix or improve – anything that isn’t done right, irrespective of who did it. Be selfish about improving across the organization – because once the rot sets in, we waste years in surgery and recovery.
  • Use data or don’t use data. Use data where you ought to but not as a ‘cover-my-back’ political tool. Be capable of making decisions with partial or limited data. Get better at intuition and pattern-matching. Whichever way you go, be mostly right about it.
  • Avoid work about work. Process creeps in unless we constantly question it. We are deliberate about the rituals we commit to that take time away from the actual work. We truly believe that a meeting that could be an email should be an email, and you don’t need the person with the highest title to say that out loud.
  • High revenue per person. We work backwards from this metric. Our default is to automate instead of hiring. We multi-skill our people to own more outcomes rather than hiring someone who has less to do. We don’t like the squatting and hoarding that comes in the form of hiring for growth. High revenue per person comes from high-quality work from everyone. We demand it.


If this role and our work are of interest to you, please apply. We encourage you to apply even if you believe you do not meet all the requirements listed above.


That said, you should demonstrate that you are in the 90th percentile or above. This may mean that you have studied at top-notch institutions, won intellectually demanding competitions, built something of your own, or been rated an outstanding performer by your current or previous employers.


The position is based out of Chennai. Our work currently involves significant in-person collaboration and we expect you to work out of our offices in Chennai.

Product Development Company


Agency job
via CIEL HR Services by Sivakumar S
Remote, Chennai
5 - 10 yrs
₹20L - ₹30L / yr
Java
Hibernate (Java)
Spring Boot
Gradle
PostgreSQL
+6 more

Responsibilities

  • Build an enterprise application using Java, Spring Boot, Hibernate, and Gradle.
  • Work with a Postgres database on AWS RDS (see the connectivity sketch after this list).
  • Manage the application on the AWS cloud.
  • Maintain the necessary documentation for the project.
  • Fix issues reported by application users.
  • Review and optimize code.
  • Coordinate with the development team to manage fixes, code changes, and code merges.
  • Manage backend Java and database changes/bugs along with UI changes/bugs.
  • You should know what RESTful services are and have experience working with such APIs in the backend.
  • Exposure to Java-based technologies such as the Spring framework and an RDBMS such as PostgreSQL is preferred. You must be able to connect to a database and write simple SQL statements to verify end-to-end functionality.
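The responsibilities above include connecting to the Postgres database on AWS RDS and writing simple SQL statements to verify end-to-end functionality. Below is a minimal connectivity check, sketched in Python with psycopg2 for brevity (the Spring Boot datasource configuration would be the Java-side equivalent); the endpoint, database, credentials, and table name are hypothetical placeholders.

    # Minimal RDS PostgreSQL connectivity check, sketched with psycopg2.
    # Host, database, user, password, and table name are hypothetical; in practice
    # they would come from configuration or a secrets manager, never hard-coded.
    import psycopg2

    conn = psycopg2.connect(
        host="example-db.abc123xyz.ap-south-1.rds.amazonaws.com",  # hypothetical RDS endpoint
        port=5432,
        dbname="appdb",
        user="app_user",
        password="replace-me",
    )

    try:
        with conn.cursor() as cur:
            # Confirm the application can reach the database at all.
            cur.execute("SELECT version();")
            print(cur.fetchone()[0])

            # A simple verification query against a hypothetical application table.
            cur.execute("SELECT count(*) FROM orders;")
            print("orders:", cur.fetchone()[0])
    finally:
        conn.close()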

 

 

Minimum Qualifications:

 

  • Minimum 5 years of experience with the Java platform and technologies.
  • Minimum 5 years of experience with HTML5, CSS, and Angular.
  • Experience with enterprise-level application design and development is a must.
  • Expert-level knowledge of the Java Spring framework, JHipster, PostgreSQL, HTML5, Angular, and CSS.
  • Must have previously worked in an environment using version control systems such as Bitbucket.
  • Good knowledge of RDBMSs such as Postgres and MySQL. Should be proficient in SQL and in assessing query performance. Some ability to monitor and maintain the database is needed.
  • Experience with AWS RDS is a must.
  • Minimum 3 years of experience deploying Java applications on the AWS cloud.
  • Ability to handle front-end development (AngularJS) is a must.
  • Experience working with Monday, Atlassian project management tools (JIRA/Bitbucket/Confluence), or similar is a must.
Omega IT Resources

Posted by Ashok Chinta
Chennai
5 - 8 yrs
₹6L - ₹12L / yr
Amazon Web Services (AWS)
DevOps
AWS Lambda
AWS CloudFormation
AWS RDS
+1 more

Your skills and experience should cover:

  • 5+ years of experience developing, deploying, and debugging solutions on the AWS platform using services such as S3, IAM, Lambda, API Gateway, RDS, Cognito, CloudTrail, CodePipeline, CloudFormation, CloudWatch, and WAF (Web Application Firewall).
  • AWS Certified Developer - Associate certification is required; AWS Certified DevOps Engineer - Professional is preferred.
  • 5+ years of experience using one or more modern programming languages (Python, Node.js).
  • Hands-on experience migrating data to the AWS cloud platform.
  • Experience with Scrum/Agile methodology.
  • Good understanding of core AWS services, their uses, and basic AWS architecture best practices (including security and scalability).
  • Experience with AWS data storage tools.
  • Experience configuring and implementing AWS monitoring tools such as CloudWatch and CloudTrail, and directing system logs to them (see the sketch after this list).
  • Experience working with Git or similar tools.
  • Ability to communicate and represent AWS recommendations and standards.
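The list above combines Lambda, API Gateway, and CloudWatch. The following is a minimal sketch of a Python Lambda handler for an API Gateway proxy integration that publishes a custom CloudWatch metric via boto3; the namespace, metric name, and response body are hypothetical placeholders, not requirements from the posting.

    # Minimal Lambda handler sketch for an API Gateway proxy integration.
    # Publishes a custom CloudWatch metric, then returns a JSON response.
    # Namespace, metric name, and response body are hypothetical placeholders.
    import json

    import boto3

    cloudwatch = boto3.client("cloudwatch")


    def lambda_handler(event, context):
        # Record one invocation as a custom metric (illustrative namespace/name).
        cloudwatch.put_metric_data(
            Namespace="Example/Api",
            MetricData=[
                {"MetricName": "InvocationCount", "Value": 1, "Unit": "Count"}
            ],
        )

        # API Gateway's Lambda proxy integration expects this response shape.
        return {
            "statusCode": 200,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"message": "ok", "path": event.get("path")}),
        }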

 

The following areas are highly advantageous:

  • Experience with Docker
  • Experience with PostgreSQL database
