26+ AWS Lambda Jobs in Hyderabad | AWS Lambda Job openings in Hyderabad
Apply to 26+ AWS Lambda Jobs in Hyderabad on CutShort.io. Explore the latest AWS Lambda Job opportunities across top companies like Google, Amazon & Adobe.
We are looking for a skilled Senior MEAN/MERN Stack Developer with 2.5-5 years of experience to join our talented development team. If you are passionate about cutting-edge technologies, have a proven track record of delivering high-quality solutions, and thrive in a collaborative, fast-paced environment, we want to hear from you.
Roles and Responsibilities:
● Develop and maintain robust and scalable web applications using MEAN/MERN stack.
● Collaborate with cross-functional teams to define, design, and ship new features.
● Lead the architecture and design of complex software solutions.
● Ensure the technical feasibility of UI/UX design and implement responsive and user-friendly interfaces.
● Implement best practices in software development, code reviews, and documentation.
● Work closely with product management and stakeholders to understand requirements and translate them into technical solutions.
● Optimise applications for maximum speed and scalability.
● Troubleshoot, debug, and resolve software defects and issues.
● Stay updated on emerging technologies and trends in MEAN/MERN stack development and AWS Cloud services.
Main objectives of the Senior Developer role:
- Delivery of working software according to specifications
- Providing clarity on the progress of development work
- Assisting team members by sharing knowledge and coaching
- Suggesting process improvements with regard to team collaboration
The key requirements include the following:
- Bachelor's degree in software engineering or a related field
- 7+ years of development experience with Python programming
- Experience in setting up CI/CD workflows
- Experience in drafting solution architecture
- Excellent written and verbal communication skills in English
Key expertise and experience:
- Understanding of application architecture built on the AWS/Python/React/MongoDB technology stack
- Proven Python skills and experience with modules such as Flask and FastAPI (see the sketch after this list)
- Experience in building high-load applications in AWS Cloud, preferably in a microservice architecture
- Experience in developing commercial software, preferably in the financial domain
- Proficiency in working with legacy code and acquiring domain knowledge
- Experience with MongoDB and/or other NoSQL databases
- Experience in creating and automating tests (pytest) and using containers (Docker)
- Proficiency in using Linux-based development environments with GitHub and CI/CD
- Familiarity with the Atlassian stack (JIRA/Confluence)
- Nice to have – experience in integration with ERP, CRM, and SAP
- Nice to have – experience in building financial systems/knowledge of enterprise economics
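Since the role leans on Flask/FastAPI and pytest (listed above), here is a minimal, illustrative sketch of a FastAPI service with pytest checks. The /health and /items routes and the Item model are assumptions for illustration, not part of the posting.

```python
# Minimal FastAPI service plus pytest checks -- illustrative only; the route
# names and model below are assumptions, not taken from the job posting.
from fastapi import FastAPI
from fastapi.testclient import TestClient
from pydantic import BaseModel

app = FastAPI()


class Item(BaseModel):
    name: str
    price: float


@app.get("/health")
def health() -> dict:
    # Simple liveness probe used by load balancers / container orchestrators.
    return {"status": "ok"}


@app.post("/items")
def create_item(item: Item) -> dict:
    # Echo the validated payload back; a real service would persist it.
    return {"name": item.name, "price": item.price}


# pytest discovers functions named test_* automatically (run with `pytest`).
client = TestClient(app)


def test_health() -> None:
    response = client.get("/health")
    assert response.status_code == 200
    assert response.json() == {"status": "ok"}


def test_create_item() -> None:
    response = client.post("/items", json={"name": "book", "price": 9.5})
    assert response.status_code == 200
    assert response.json()["name"] == "book"
```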
Knowledge of key processes:
- Scrum / Agile way of working
- TDD/BDD
- CI/CD
Technical Skills:
- Ability to understand and translate business requirements into design.
- Proficient in AWS infrastructure components such as S3, IAM, VPC, EC2, and Redshift.
- Experience in creating ETL jobs using Python/PySpark.
- Proficiency in creating AWS Lambda functions for event-based jobs.
- Knowledge of automating ETL processes using AWS Step Functions.
- Competence in building data warehouses and loading data into them.
Responsibilities:
- Understand business requirements and translate them into design.
- Assess AWS infrastructure needs for development work.
- Develop ETL jobs using Python/PySpark to meet requirements.
- Implement AWS Lambda for event-based tasks.
- Automate ETL processes using AWS Step Functions (see the sketch after this list).
- Build data warehouses and manage data loading.
- Engage with customers and stakeholders to articulate the benefits of proposed solutions and frameworks.
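As a concrete illustration of the Lambda and Step Functions items above, here is a minimal sketch of a Lambda function that reacts to an S3 upload and starts a Step Functions execution for the downstream ETL. The state machine ARN environment variable and bucket layout are assumptions for illustration.

```python
# Hypothetical AWS Lambda handler: an S3 "ObjectCreated" event triggers a
# Step Functions execution that runs the downstream ETL. The state machine
# ARN and payload shape are assumptions for illustration.
import json
import os

import boto3

sfn = boto3.client("stepfunctions")
STATE_MACHINE_ARN = os.environ["ETL_STATE_MACHINE_ARN"]  # assumed env var


def lambda_handler(event, context):
    """Start one ETL execution per newly created S3 object."""
    started = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        response = sfn.start_execution(
            stateMachineArn=STATE_MACHINE_ARN,
            input=json.dumps({"bucket": bucket, "key": key}),
        )
        started.append(response["executionArn"])
    return {"statusCode": 200, "body": json.dumps({"executions": started})}
```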
Job Description:
Design and develop IoT/cloud-based TypeScript/JavaScript/Node.js applications using Amazon Web Services (AWS).
Work closely with onsite, offshore, and cross-functional teams, Product Management, UI/UX developers, web and mobile developers, and SQA teams to use technologies effectively and deliver high-quality IoT applications on time.
Resolve bugs and issues.
Proactively identify risks and failure modes early in the development lifecycle and develop POCs to mitigate them early in the program.
Assertive communication and team skills.
Primary Skills:
Hands-on experience (3+ years) in an AWS cloud-native environment, with work experience in AWS Lambda, Kinesis, and DynamoDB (see the sketch after this list)
3+ years' experience working with Node.js, Python, unit testing, and Git
3+ years of work experience with document, relational, or time-series databases
2+ years of work experience with TypeScript
1+ years with IaC frameworks such as Serverless Framework or AWS CDK, with CloudFormation knowledge
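To illustrate the Lambda/Kinesis/DynamoDB combination above, here is a minimal Python sketch (Python is listed alongside Node.js in the primary skills). The table name, environment variable, and record shape are assumptions for illustration.

```python
# Hypothetical event-driven Lambda: decode Kinesis records and persist them
# to DynamoDB. Table name, env var, and payload shape are assumptions.
import base64
import json
import os
from decimal import Decimal

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(os.environ.get("TABLE_NAME", "device-telemetry"))


def lambda_handler(event, context):
    """Write each decoded Kinesis record as one DynamoDB item."""
    for record in event["Records"]:
        payload = base64.b64decode(record["kinesis"]["data"])
        # DynamoDB does not accept Python floats, so parse numbers as Decimal.
        item = json.loads(payload, parse_float=Decimal)
        # Assumes the stream payload already carries the table's key attributes.
        table.put_item(Item=item)
    return {"records_processed": len(event["Records"])}
```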
Company: SecuvyAI, Backed by Dell Technology Ventures
Location: Hyderabad
About Secuvy:
At Secuvy, we believe Privacy & Data Security will be a necessity for every global brand. Our mission is to provide best in class Contextual Data Intelligence tools to monitor, automate & simplify data governance for security, compliance & legal teams. We hire out of the box thinkers with the passion, creativity, and perseverance to handle constantly expanding data sprawls and deliver impactful results to our customers. Secuvy’s team comprises experts with Deep AI & Security background who have launched products for Fortune 100. We are backed by Dell Technology Ventures & Top Security VC firms in Silicon Valley. Learn more at www.secuvy.ai
About the Role:
Our enthusiasm for leadership driven by purpose is unwavering. We believe that every individual holds latent abilities waiting to be discovered and that a bold, unconventional approach is necessary to unleash them. We have grand aspirations and never settle for mediocrity. We are meticulous in our attention to detail. Our desire to tackle challenging issues and achieve exceptional outcomes for our customers is what drives us to always strive for excellence
Responsibilities:
- Design and develop high-quality, scalable, and performant software solutions using NodeJS and AWS services.
- Collaborate with cross-functional teams, including product managers, designers, and other engineers, to identify and solve complex business problems.
- Design and develop large-scale distributed systems that are reliable, resilient, and fault-tolerant.
- Write clean, well-designed, and maintainable code that is easy to understand and debug.
- Participate in code reviews and ensure that all code is of high quality and adheres to best practices.
- Troubleshoot and debug production issues and work with the team to develop and implement solutions.
- Stay up-to-date with new technologies and best practices in software engineering and cloud computing.
- Experience with Data Security or Cybersecurity Products is a big asset
Requirements:
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
- At least 2-4 years of professional experience in building web applications using NodeJS and AWS services.
- Strong understanding of NodeJS, and experience with server-side frameworks such as Express and NestJS.
- Strong experience in designing and building large-scale distributed systems, with a solid understanding of distributed computing concepts.
- Hands-on Experience with AWS services, including EC2, S3, Lambda, API Gateway, and RDS.
- Experience with containerization and orchestration technologies such as Docker and Kubernetes.
- Strong understanding of software engineering best practices, including agile development, TDD, and continuous integration and deployment.
- Hands-on Experience with Cloud technologies including Kubernetes and Docker.
- Experience with NoSQL technologies like MongoDB or Azure Cosmos DB
- Experience with a distributed publish-subscribe messaging system like Kafka or Redis Pub/Sub
- Experience developing, configuring & deploying applications on Hapi.js/Express/Fastify.
- Comfortable writing tests in Jest
- Excellent problem-solving and analytical skills, with the ability to identify and solve complex technical problems.
- Strong communication and collaboration skills, with the ability to work effectively in a team environment.
Why Work With Us?
- Join a rapidly growing startup with a passionate team of experts at the forefront of data privacy & security.
- Opportunity to play a key role in shaping the technical vision and strategy for Fortune 1000 Customers.
- Work in a dynamic, fast-paced environment where you can make an impact and drive change.
- Enjoy a competitive salary, benefits, and opportunities for growth and advancement.
- If you are a passionate, self-motivated, and experienced Senior NodeJS Software Engineer with expertise in AWS and large-scale system design, we would love to hear from you!
Position Overview: We are looking for a senior developer capable of building services using a modern microservices architecture with Node.js and serverless frameworks. The candidate should have strong knowledge of object-oriented concepts, Node.js frameworks like LoopBack, RDBMS, and microservice patterns; be comfortable coding applications from designs and specifications; and be comfortable working with JSON and RESTful services. The candidate must be a top-notch developer committed to continuously learning new things and to becoming an integral part of, and fostering growth within, our development team.
Roles & Responsibilities:
- Developing and maintaining the backend systems that support the application
- Designing and implementing APIs and Microservices architecture to handle the application workload
- Developing and optimizing queries for RDBMS (MySQL) to manage data and ensure performance and scalability
- Managing and deploying the application on AWS using Lambda, ECS and other related services
- Collaborating with front-end developers to ensure smooth communication between front-end and back-end systems
- Writing clean, neat and reusable code that follows coding standards and best practices.
- Participating in code reviews and ensuring that coding standards are maintained
- Troubleshooting and debugging issues that arise in production environments
- Optimizing the application for performance, scalability, and security
- Continuously learning and staying up-to-date with the latest technologies and best practices in software development
- Documenting technical specifications, processes, and system design
- Proven experience as a Node.js Developer with a strong understanding of JavaScript and asynchronous programming.
- Hands-on experience/knowledge with AWS Lambda and serverless computing.
- Proficiency in developing and maintaining RESTful APIs using Node.js frameworks like Express.js or Hapi.js.
- Sound knowledge of AWS services, particularly API Gateway, DynamoDB, S3, and CloudFormation.
- Familiarity with AWS Lambda event sources, such as API Gateway, S3, DynamoDB Streams, and CloudWatch Events.
- Experience in integrating external services, APIs, and data sources into applications.
- Solid understanding of software development principles, design patterns, and best practices.
- Strong problem-solving and analytical skills, with the ability to quickly diagnose and resolve issues.
- Experience with version control systems, preferably Git.
- Excellent collaboration and communication skills, with the ability to work effectively in a team environment.
Confidential
Strategic Vendor Alliance – AWS Practice Lead and overall P&L owner for the India business, responsible for driving profitability.
A) Manage the AWS practice for clients and execute the strategic business plan for the company and the channel-partner ecosystem across the different AWS services.
B) Build key relationships with various segment leaders in AWS across the Commercial and Public Sector, and create client-led AWS solutions for partners to simplify the cloud approach for customers.
C) Build a predictable pipeline of joint opportunities through a differentiated proposition to customers and partners, working with AWS on offerings unique to the client.
D) Drive an SMB-focused approach for AWS, which contributes 50% of the client's overall business, using ready-to-use cloud bundles for various workloads.
E) Own the relationship map; drive and monitor cadence meetings with internal sellers and channel partners, measured on incremental growth, customer acquisition, partner onboarding, a managed-services-led approach, migration and deployment, and the success of each GTM.
F) Lead a team of Product Managers who will drive specific GTMs such as SAP on AWS, SMB scale drive, education focus, CloudFront (CDN bundle), and strategic workloads like Microsoft workloads, DC migration, and DR on cloud.
G) Manage partner profitability metrics by creating different avenues for recurring resale consumption and services-led engagement for partners.
H) Shape the long-term direction of the company business plan to drive incremental growth.
I) Collaborate with internal peers within the company to build the cloud business model and a framework for attaching managed services to the hyperscaler's various services.
at Softobiz Technologies Private limited
Greetings from Softobiz Technologies!
Softobiz Technologies is a leading IT organization offering products and services in product development and web-application development.
We are looking for people with experience in Python and React.js.
Location - Hyderabad
Experience – 3+ years
Primary Purpose:
The Senior Software Engineer role is responsible for ensuring continuous innovation and developing high-quality software aligned with user needs and business goals. This role will work as an integral part of creative and motivated teams to ensure high-quality benchmarks are met for deliveries of innovative products for local and international markets. Working closely with other stakeholders requires the ability to design, develop, and implement creative solutions with modern architecture and design patterns. Thinking outside the box, with the capability to learn upcoming technologies in the integration space, the role requires contributing proactively to business-aligned digital initiatives. A detailed understanding of modern SDLC frameworks is required, as is the ability to work in an Agile environment.
Job Responsibilities:
- Lead projects in your respective squad, guide and mentor the team as required.
- Provide thought leadership in the architecture and design of Digital products.
- Provide technical leadership, mentoring, and coaching of developers.
- Contribute to the implementation and delivery of Digital products.
- Promote and experiment with new technologies that will help improve the quality and delivery of Digital Products.
- Perform software engineering tasks to ensure the highest levels of product quality are met.
- Analyze system and production defects, including troubleshooting and root cause analysis.
- Learn from industry specialists and apply best practices to better meet our customer needs.
- Keep up to date with technical trends, tools, and methodologies, and research as required.
Experience and Skills:
- 3+ years of relevant experience;
- Solid experience working in complex and innovative R&D development environments;
- Experience working in Agile process and result-oriented culture;
- Solid experience with multiple programming languages, including Python, JavaScript, TypeScript, and React.js/Angular.js/Vue.js.
- Demonstrable, robust architecture, design, development, and collaboration skills gained in developing software product solutions;
- Hands-on experience with developing and implementing integrations;
- Experience with developing enterprise-grade, complex software applications.
Technical Skills
- Python frameworks: Django/Flask/FastAPI
- Solid OOP skills in PHP, with S.O.L.I.D. principles in mind.
- Experience in React.js
- Experience with test-driven development.
- Comfortable with the Git version control system;
- Normalized SQL database design, strong experience in MySQL
- Knowledge of building and consuming REST APIs
- Good to have: experience with the AWS stack (Lambda, SQS, SNS, API Gateway, etc.) – see the sketch after this list
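To make the AWS stack item above concrete, here is a hedged Python sketch of an SQS-triggered Lambda that publishes a summary to an SNS topic. The topic ARN, environment variable, and message shape are assumptions, not part of the posting.

```python
# A hedged sketch tying together Lambda, SQS, and SNS: a Lambda consumes an
# SQS batch and publishes a summary notification. The topic ARN env var and
# message shape are assumptions for illustration.
import json
import os

import boto3

sns = boto3.client("sns")
TOPIC_ARN = os.environ["SUMMARY_TOPIC_ARN"]  # assumed environment variable


def lambda_handler(event, context):
    """Consume an SQS batch and notify downstream subscribers via SNS."""
    # SQS delivers each message body as a string on the record.
    orders = [json.loads(record["body"]) for record in event["Records"]]
    sns.publish(
        TopicArn=TOPIC_ARN,
        Subject="Order batch processed",
        Message=json.dumps({"count": len(orders)}),
    )
    return {"processed": len(orders)}
```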
Bonus Technical Skills:
- Some knowledge or experience in Kubernetes is a bonus
- Experience with Microservices architecture
About Softobiz
Innovation begins with like-minded people aiming to transform the world together.
We invite you to join an organization that has been helping clients transform their businesses by fusing insights, creativity, and technology. At Softobiz, we embrace a diverse mix of talented people who come here to stay and do their best work.
At Softobiz, we transform passionate individuals into proficient professionals dedicated to exploring new frontiers in Software development. Here you will get the opportunity to work with technical craftsmen that are pioneers in the latest technologies.
We promote a culture of equality, learning, collaboration, and creative freedom so that our employees grow.
For more information about our solutions and organization, visit www.softobiz.com,
Follow us on LinkedIn, Twitter, and Facebook for more updates.
at Altimetrik
Big Data Engineer: 5+ years
Immediate Joiner
- Expertise in building AWS data engineering pipelines with AWS Glue -> Athena -> QuickSight (see the sketch below)
- Experience in developing Lambda functions with AWS Lambda
- Expertise with Spark/PySpark – candidate should be hands-on with PySpark code and able to do transformations with Spark
- Should be able to code in Python and Scala.
- Snowflake experience will be a plus
- Hadoop and Hive requirements are good to have; an understanding of them is enough.
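To illustrate the Glue -> Athena -> QuickSight pipeline item above, here is a minimal AWS Glue job sketch: it reads a catalogued table, applies a PySpark transformation, and writes partitioned Parquet that Athena (and QuickSight on top of it) can query. The database, table, and S3 paths are placeholders, not from the posting.

```python
# A minimal AWS Glue job sketch for a Glue -> Athena -> QuickSight pipeline.
# Database, table, and bucket names below are placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the raw table registered in the Glue Data Catalog.
raw = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="orders"
).toDF()

# Example transformation: keep completed orders and derive a daily partition column.
curated = (
    raw.filter(F.col("status") == "COMPLETED")
       .withColumn("order_date", F.to_date("created_at"))
)

# Write Parquet partitioned by date; Athena tables (and QuickSight dashboards)
# can then be pointed at this prefix.
curated.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://curated-bucket/orders/"
)

job.commit()
```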
Interfaces with other processes and/or business functions to ensure they can leverage the benefits provided by the AWS platform process.
Responsible for managing the configuration of all IaaS assets across the platforms.
Hands-on Python experience.
Manages the entire AWS platform (Python, Flask, REST API, serverless framework) and recommends solutions that best meet the organization's requirements.
Has a good understanding of the various AWS services, particularly S3, Athena, Glue, Lambda, CloudFormation, and other AWS serverless resources, as well as Python code.
AWS certification is a plus.
Knowledge of best practices for IT operations in an always-on, always-available service model.
Responsible for the execution of the process controls, ensuring that staff comply with process and data standards.
Qualifications
Bachelor's degree in Computer Science, Business Information Systems, or relevant experience and accomplishments.
3 to 6 years of experience in the IT field.
AWS Python developer
AWS, Serverless/Lambda, Middleware.
Strong AWS skills, including Data Pipeline, S3, RDS, and Redshift, with familiarity with other components like Lambda, Glue, Step Functions, and CloudWatch.
Must have created REST APIs with AWS Lambda (see the sketch after this list).
3 years of relevant Python experience.
Good to have: experience working on projects and problem solving with large-scale, multi-vendor teams.
Good to have: knowledge of Agile development.
Good knowledge of the SDLC.
Hands-on with AWS databases (RDS, etc.).
Good to have: unit-testing experience.
Good to have: working knowledge of CI/CD.
Good communication skills, as there will be client interaction and documentation.
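As an illustration of the "REST APIs with AWS Lambda" requirement above, here is a minimal Python handler for an API Gateway Lambda proxy integration. The /users/{id} style route and response shape are assumptions for illustration only.

```python
# A minimal Python Lambda behind API Gateway (Lambda proxy integration).
# The path parameter and response body are illustrative assumptions.
import json


def lambda_handler(event, context):
    """Handle GET /users/{id} style requests from API Gateway."""
    path_params = event.get("pathParameters") or {}
    user_id = path_params.get("id")

    if not user_id:
        return {
            "statusCode": 400,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"error": "missing user id"}),
        }

    # A real handler would fetch from RDS/DynamoDB here; this echoes the id.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"id": user_id, "source": "lambda"}),
    }
```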
Education (degree): Bachelor's degree in Computer Science, Business Information Systems, or relevant experience and accomplishments.
Years of Experience: 3-6 years
Technical Skills
Linux/Unix system administration
Continuous Integration/Continuous Delivery tools like Jenkins
Cloud provisioning and management – Azure, AWS, GCP
Ansible, Chef, or Puppet
Python, PowerShell & BASH
Job Details
JOB TITLE/JOB CODE: AWS Python Developer, III-Sr. Analyst
RC: TBD
PREFERRED LOCATION: HYDERABAD, IND
POSITION REPORTS TO: Manager USI T&I Cloud Managed Platform
CAREER LEVEL: 3
Work Location:
Hyderabad
Urgent openings with one of our clients
Experience : 3 to 7 Years
Number of Positions : 20
Job Location : Hyderabad
Notice : 30 Days
1. Expertise in building AWS Data Engineering pipelines with AWS Glue -> Athena -> Quick sight
2. Experience in developing lambda functions with AWS Lambda
3. Expertise with Spark/PySpark – Candidate should be hands on with PySpark code and should be able to do transformations with Spark
4. Should be able to code in Python and Scala.
5. Snowflake experience will be a plus
Hadoop and Hive requirements are good to have; an understanding of them is enough.
Altimetrik
Big Data Developer
Experience: 3 to 7 years
Job Location: Hyderabad
Notice: Immediate / within 30 days
1. Expertise in building AWS Data Engineering pipelines with AWS Glue -> Athena -> Quick sight
2. Experience in developing lambda functions with AWS Lambda
3. Expertise with Spark/PySpark – candidate should be hands-on with PySpark code and able to do transformations with Spark
4. Should be able to code in Python and Scala.
5. Snowflake experience will be a plus
Hadoop and Hive requirements are good to have; an understanding of them is enough.
Consulting & implementation services in the areas of the Oil & Gas, Mining, and Manufacturing industries
- Data Engineer
Required skill set: AWS Glue, AWS Lambda, AWS SNS/SQS, AWS Athena, Spark, Snowflake, Python
Mandatory Requirements
- Experience in AWS Glue
- Experience in Apache Parquet
- Proficient in AWS S3 and data lake
- Knowledge of Snowflake
- Understanding of file-based ingestion best practices.
- Scripting languages – Python & PySpark
CORE RESPONSIBILITIES
- Create and manage cloud resources in AWS
- Data ingestion from different data sources that expose data using different technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time-series data based on various proprietary systems; implement data ingestion and processing with the help of Big Data technologies
- Data processing/transformation using various technologies such as Spark and cloud services; you will need to understand your part of the business logic and implement it using the language supported by the base data platform (see the PySpark sketch after this list)
- Develop automated data quality checks to make sure the right data enters the platform and to verify the results of the calculations
- Develop an infrastructure to collect, transform, combine and publish/distribute customer data.
- Define process improvement opportunities to optimize data collection, insights and displays.
- Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible
- Identify and interpret trends and patterns from complex data sets
- Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders.
- Key participant in regular Scrum ceremonies with the agile teams
- Proficient at developing queries, writing reports and presenting findings
- Mentor junior members and bring in industry best practices
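As a sketch of the ingest/transform/publish flow described in the responsibilities above, here is a hedged PySpark example that reads raw CSV from S3, applies a basic data quality check, and writes partitioned Parquet. Bucket paths and column names are placeholders, not from the posting.

```python
# A hedged PySpark sketch of an ingest -> quality check -> transform -> publish
# flow. S3 paths and column names below are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("customer-data-pipeline").getOrCreate()

# Ingest: flat files landed by an upstream process.
raw = spark.read.option("header", "true").csv("s3://raw-bucket/customers/")

# Quality check: drop rows missing mandatory keys and report the rejects.
required = ["customer_id", "event_ts"]
clean = raw.dropna(subset=required)
rejected_count = raw.count() - clean.count()
print(f"rejected {rejected_count} rows missing one of {required}")

# Transform: type the timestamp and derive a partition column.
curated = (
    clean.withColumn("event_ts", F.to_timestamp("event_ts"))
         .withColumn("event_date", F.to_date("event_ts"))
)

# Publish: partitioned Parquet that downstream warehouse/BI consumers can load.
curated.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://curated-bucket/customers/"
)
```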
QUALIFICATIONS
- 5-7+ years' experience as a data engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales)
- Strong background in math, statistics, computer science, data science, or a related discipline
- Advanced knowledge of one of the following languages: Java, Scala, Python, C#
- Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow, Amazon Web Services (AWS), Docker / Kubernetes, Snowflake
- Proficient with
- Data mining/programming tools (e.g. SAS, SQL, R, Python)
- Database technologies (e.g. PostgreSQL, Redshift, Snowflake, and Greenplum)
- Data visualization (e.g. Tableau, Looker, MicroStrategy)
- Comfortable learning about and deploying new technologies and tools.
- Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines.
- Good written and oral communication skills and ability to present results to non-technical audiences
- Knowledge of business intelligence and analytical tools, technologies and techniques.
Familiarity and experience in the following is a plus:
- AWS certification
- Spark Streaming
- Kafka Streaming / Kafka Connect
- ELK Stack
- Cassandra / MongoDB
- CI/CD: Jenkins, GitLab, Jira, Confluence, and other related tools
- 5+ years of software development experience in Java 8+ and Microservices.
- Experience in developing microservices, specifically high-cohesion and loosely coupled microservices.
- Experienced in requirements analysis, design, and development with Java, Spring Boot, microservices, REST APIs, AWS, Lambda, EC2, Jenkins, design patterns, Spring Security, Splunk, auth, Docker, and SOLID principles.
- Hands on experience on Microservices Architecture.
- Should have excellent acumen in data structures, algorithms, problem-solving, and logical/analytical skills. Thorough understanding of OOP concepts, design principles, and implementation of different types of design patterns.
- Experience with Multithreading, Concurrent Package and Concurrent APIs
- Basic understanding of Java Memory Management (JMM) including garbage collections concepts.
- Experience with RDBMS or NoSQL databases and writing SQL queries (joins, group by, aggregate functions, etc.)
- Hands-on experience with a message broker like Kafka/RabbitMQ or similar. Hands-on experience in creating RESTful web services and consuming web services. Hands-on experience with Spring Cloud/Spring Boot.
- Hands-on experience with any of the logging frameworks (SLF4J/Logback/Log4j)
- Experience writing JUnit test cases using Mockito/PowerMock frameworks. Should have practical experience with Maven/Gradle and knowledge of version control systems like Git/SVN.
- Hands on experience on Cloud deployment/development like AWS/Azure/GCP.
- Good communication skills and ability to work with global teams to define and deliver on projects. Sound understanding/experience in software development process, test-driven development.
Benefits of Working Here:
- Gender Neutral /Diversified Culture
- 51 Leaves annually
- Insurance covered for family
- Incentives, Bonus
- Permanent WFH Option
- Generous parental leave and new parent transition program
- Flexible work arrangements
Technical Experience :
- 2-6 years of Python working experience
- Expertise in at least one popular Python framework (Django/Flask)
- Knowledge of object-relational mapping (ORM)
- Familiarity with front-end technologies JavaScript and HTML5
Key Responsibilities :
- Write effective, scalable code
- Develop back-end components to improve responsiveness and overall performance
- Integrate user-facing elements into applications
- Test and debug programs
- Improve functionality
- 5-8 years of experience as a Java/J2EE developer.
- 1-3 years of experience with Angular / React is desirable.
- 1-3 years of experience in using Spring and Spring Boot frameworks.
- Thorough knowledge of server-side development.
- Proven experience as a Full Stack Developer or similar role.
- Good understanding of web services (WSDL, SOAP, RESTful).
- Hands-on experience in using Application Servers like WebSphere.
- Expertise in relational databases (Oracle, SQL Server).
- E-commerce domain knowledge is desirable.
- Prior experience in developing desktop and mobile applications.
- Familiarity with common stacks.
- Knowledge of multiple frontend languages and libraries, like HTML/ CSS, JavaScript, XML, jQuery.
- Experience in implementation of Microservices
- Experience with AWS (S3, SQS, SNS, ECS, EC2, ALB, API Gateway, Lambda, etc.) is highly desirable
- Good understanding of Docker & Kubernetes is highly desired.
- Familiarity with databases (MySQL, MongoDB, PostgreSQL), web servers (Apache), and UI/UX designs.
- Collaborate with Dev, QA and Data Science teams on environment maintenance, monitoring (ELK, Prometheus or equivalent), deployments and diagnostics
- Administer a hybrid datacenter, including AWS and EC2 cloud assets
- Administer, automate and troubleshoot container based solutions deployed on AWS ECS
- Be able to troubleshoot problems and provide feedback to engineering on issues
- Automate deployment (Ansible, Python), build (Git, Maven, Make, or equivalent), and integration (Jenkins, Nexus) processes
- Learn and administer technologies such as ELK, Hadoop etc.
- A self-starter and enthusiasm to learn and pick up new technologies in a fast-paced environment.
Need to have
- Hands-on Experience in Cloud based DevOps
- Experience working in AWS (EC2, S3, CloudFront, ECR, ECS etc)
- Experience with any programming language.
- Experience using Ansible, Docker, Jenkins, Kubernetes
- Experience in Python.
- Should be very comfortable working in Linux/Unix environment.
- Exposure to Shell Scripting.
- Solid troubleshooting skills
A 15-year-old, US-based product company
- Should have good hands-on experience in Informatica MDM Customer 360, Data Integration (ETL) using PowerCenter, and Data Quality.
- Must have strong skills in Data Analysis, Data Mapping for ETL processes, and Data Modeling.
- Experience with the SIF framework including real-time integration
- Should have experience in building C360 Insights using Informatica
- Should have good experience in creating performant designs using Mapplets, Mappings, and Workflows for Data Quality (cleansing) and ETL.
- Should have experience in building different data warehouse architectures like Enterprise, Federated, and Multi-Tier architecture.
- Should have experience in configuring Informatica Data Director with reference to the Data Governance of users, IT Managers, and Data Stewards.
- Should have good knowledge in developing complex PL/SQL queries.
- Should have working experience on UNIX and shell scripting to run the Informatica workflows and to control the ETL flow.
- Should know about Informatica Server installation and have knowledge of the Administration console.
- Working experience with the Developer tool along with Administration is added knowledge.
- Working experience in Amazon Web Services (AWS) is an added advantage. Particularly on AWS S3, Data pipeline, Lambda, Kinesis, DynamoDB, and EMR.
- Should be responsible for the creation of automated BI solutions, including requirements, design, development, testing, and deployment
one of the Big 4 IT companies in India
- API
- AWS
Need a strong Amazon Web Services developer with experience developing APIs using Lambda functions. The candidate must have very good familiarity with APIs, and knowledge of API deployment in AWS is mandatory.
● Responsible for design, development, and implementation of Cloud solutions.
● Responsible for achieving automation & orchestration of tools (Puppet/Chef)
● Monitoring the product's security & health (Datadog/New Relic)
● Managing and maintaining databases (MongoDB & Postgres)
● Automating infrastructure using AWS services like CloudFormation (see the sketch after this list)
● Participating in Infrastructure Security Audits
● Migrating to container technologies (Docker/Kubernetes)
● Should be able to work on serverless concepts (AWS Lambda)
● Should be able to work with AWS services like EC2, S3, CloudFormation, EKS, IAM, RDS, etc.
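To illustrate the CloudFormation automation item above, here is a hedged boto3 sketch that creates a stack if it does not exist and updates it otherwise. The stack name and template path are placeholders, not part of the posting.

```python
# A hedged sketch of infrastructure automation with CloudFormation via boto3.
# Stack name and template path below are placeholders for illustration.
import boto3
from botocore.exceptions import ClientError

cfn = boto3.client("cloudformation")


def deploy_stack(stack_name: str, template_path: str) -> None:
    """Create or update a CloudFormation stack from a local template file."""
    with open(template_path) as handle:
        template_body = handle.read()

    kwargs = {
        "StackName": stack_name,
        "TemplateBody": template_body,
        "Capabilities": ["CAPABILITY_NAMED_IAM"],
    }
    try:
        cfn.create_stack(**kwargs)
        waiter = cfn.get_waiter("stack_create_complete")
    except ClientError as error:
        # Stack already exists: fall back to an in-place update.
        if error.response["Error"]["Code"] != "AlreadyExistsException":
            raise
        cfn.update_stack(**kwargs)
        waiter = cfn.get_waiter("stack_update_complete")

    waiter.wait(StackName=stack_name)


if __name__ == "__main__":
    deploy_stack("app-network", "templates/network.yaml")
```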
What you bring:
● Problem-solving skills that enable you to identify the best solutions.
● Team collaboration and flexibility at work.
● Strong verbal and written communication skills that will help in presenting complex ideas in an accessible and engaging way.
● Ability to choose the tools and technologies that best fit the business needs.
Aviso offers:
● Dynamic, diverse, inclusive startup environment driven by transparency and velocity
● Bright, open, sunny working environment and collaborative office space
● Convenient office locations in Redwood City, Hyderabad and Bangalore tech hubs
● Competitive salaries and company equity, and a focus on developing world class talent operations
● Comprehensive health insurance available (medical) for you and your family
● Unlimited leaves with manager approval and a 3 month paid sabbatical after 3 years of service
● CEO moonshots projects with cash awards every quarter
● Upskilling and learning support including via paid conferences, online courses, and certifications
● Every month, Rs. 2,500 will be credited to your Sodexo meal card