
About You:
● Bachelor of Science degree (or higher) in computer science or a related engineering field.
● 12+ years of experience developing high-level APIs, abstraction layers, and application software.
● 5+ years of experience building scalable, serverless solutions in GCP or AWS.
● 4+ years of experience with Python and MongoDB.
● Experience with large-scale distributed systems and streaming data services.
● Experience building, developing, and maintaining cloud-native infrastructure, serverless architecture, micro-operations, and workflow automation.
● You are a hardworking problem-solver who thrives on finding solutions to difficult technical challenges.
● Experience with modern high-level languages and databases, including JavaScript, MongoDB, and Python.
● Experience with GitHub, GitLab, CI/CD, Jira, unit testing, integration testing, regression testing, and collaborative documentation.
● Expertise with GCP, Kubernetes, Docker, or containerization in general is a great plus.
● Ability to write and assess clean, functional, high-quality, testable code for each of our projects.
● Positive, proactive, solution-focused contributor who helps motivate the team.

Position: Lead Backend Engineer
Location: Remote
Experience: 10+ Years
Budget: 1.7 LPM
Employment Type: [Contract]
Required Skills & Qualifications:
- 10+ years of proven backend engineering experience.
- Strong proficiency in Python.
- Expertise in SQL (Postgres) and database optimization.
- Hands-on experience with OpenAI APIs.
- Strong command of FastAPI and microservices architecture (an illustrative sketch follows this list).
- Solid knowledge of debugging, troubleshooting, and performance tuning.
Nice to Have:
- Experience with Agentic Systems or ability to quickly adopt them.
- Exposure to modern CI/CD pipelines, containerization (Docker/Kubernetes), and cloud platforms (AWS, Azure, or GCP).
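To give a concrete flavor of this stack, here is a minimal, illustrative sketch of a FastAPI endpoint backed by Postgres. It is only a sketch under assumptions: the reports table, the DATABASE_URL environment variable, and the Report model are hypothetical and not part of this posting.

```python
# Minimal illustrative sketch only -- table, DSN variable, and model are hypothetical.
import os

import asyncpg
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Report(BaseModel):
    id: int
    title: str

@app.on_event("startup")
async def startup() -> None:
    # Connection pool against a hypothetical Postgres instance.
    app.state.pool = await asyncpg.create_pool(dsn=os.environ["DATABASE_URL"])

@app.get("/reports/{report_id}", response_model=Report)
async def get_report(report_id: int) -> Report:
    # Parameterized query keeps the endpoint safe from SQL injection.
    row = await app.state.pool.fetchrow(
        "SELECT id, title FROM reports WHERE id = $1", report_id
    )
    if row is None:
        raise HTTPException(status_code=404, detail="report not found")
    return Report(id=row["id"], title=row["title"])
```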
The Lead Software Developer is responsible for development of CFRA’s report generation framework using a modern technology stack: Python on AWS cloud infrastructure, SQL, and Web technologies. This is an opportunity to make an impact on both the team and the organization by being part of the design and development of a new customer-facing report generation framework that will serve as the foundation for all future report development at CFRA.
The ideal candidate has a passion for solving business problems with technology and can effectively communicate business and technical needs to stakeholders. We are looking for candidates who value collaboration with colleagues and want to have an immediate, tangible impact at a leading global independent financial insights and data company.
Key Responsibilities
- Analyst Workflows: Lead the design and development of CFRA’s integrated content publishing solution, built on a proprietary third-party editorial and digital publishing platform.
- Designing and Developing APIs: Lead the design and development of robust, scalable, and secure APIs on AWS, considering factors like performance, reliability, and cost-efficiency.
- Architecture Planning: Collaborate with architects and stakeholders to define architecture, including API gateway, microservices, and serverless components, ensuring alignment with business goals and AWS best practices.
- Technical Leadership: Provide technical guidance and leadership to the development team, ensuring adherence to coding standards, best practices, and AWS guidelines.
- AWS Service Integration: Integrate APIs with various AWS services such as AWS Lambda, Amazon API Gateway, Amazon SQS, Amazon SNS, AWS Glue, and others to build comprehensive and efficient solutions (a rough sketch follows this list).
- Performance Optimization: Identify and implement optimizations to improve performance, scalability, and efficiency, leveraging AWS services and tools.
- Security and Compliance: Ensure APIs are developed following best security practices, including authentication, authorization, encryption, and compliance with relevant standards and regulations.
- Monitoring and Logging: Implement monitoring and logging solutions for APIs using AWS CloudWatch, AWS X-Ray, or similar tools, to ensure availability, performance, and reliability.
- Continuous Integration and Deployment (CI/CD): Establish and maintain CI/CD pipelines for API development, automating testing, deployment, and monitoring processes on AWS.
- Documentation and Training: Create and maintain comprehensive documentation for internal and external users, and provide training and support to developers and stakeholders.
- Team Collaboration: Collaborate effectively with cross-functional teams, including product managers, designers, and other developers, to deliver high-quality solutions that meet business requirements.
- Problem Solving: Lead troubleshooting efforts, identifying root causes and implementing solutions to ensure system stability and performance.
- Stay Updated: Stay updated with the latest trends, tools, and technologies related to development on AWS, and continuously improve your skills and knowledge.
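As a rough illustration of the service-integration responsibility above (not CFRA's actual report-generation framework), the sketch below shows a Python Lambda handler behind an API Gateway proxy integration that enqueues a report request on SQS. The queue URL environment variable and the payload fields are hypothetical.

```python
# Illustrative only -- queue URL, event shape, and payload fields are hypothetical.
import json
import os

import boto3

sqs = boto3.client("sqs")
QUEUE_URL = os.environ["REPORT_QUEUE_URL"]  # hypothetical environment variable

def lambda_handler(event, context):
    """API Gateway (proxy integration) -> Lambda -> SQS."""
    body = json.loads(event.get("body") or "{}")
    message = {
        "report_type": body.get("report_type", "standard"),
        "requested_by": body.get("user_id"),
    }
    # Hand the job off to a queue so report generation runs asynchronously.
    sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps(message))
    return {
        "statusCode": 202,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"status": "queued"}),
    }
```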
Desired Skills and Experience
- Development: 10+ years of extensive experience in designing, developing, and deploying using modern technologies, with a focus on scalability, performance, and security.
- AWS Services: Strong proficiency in using AWS services such as AWS Lambda, Amazon API Gateway, Amazon SQS, Amazon SNS, Amazon SES, Amazon RDS, Amazon DynamoDB, and others, to build and deploy API solutions.
- Programming Languages: Proficiency in programming languages commonly used for development, such as Python or Node.js, as well as experience with serverless frameworks on AWS.
- Architecture Design: Ability to design scalable and resilient API architectures using microservices, serverless, or other modern architectural patterns, considering factors like performance, reliability, and cost-efficiency.
- Security: Strong understanding of security principles and best practices, including authentication, authorization, encryption, and compliance with standards like OAuth, OpenID Connect, and AWS IAM.
- DevOps Practices: Familiarity with DevOps practices and tools, including CI/CD pipelines, infrastructure as code (IaC), and automated testing, to ensure efficient and reliable deployment on AWS (an illustrative IaC sketch follows this list).
- Problem-solving Skills: Excellent problem-solving skills, with the ability to troubleshoot complex issues, identify root causes, and implement effective solutions to ensure system stability and performance.
- Team Leadership: Experience leading and mentoring a team of developers, providing technical guidance, code reviews, and fostering a collaborative and innovative environment.
- Communication Skills: Strong communication skills, with the ability to effectively communicate technical concepts to both technical and non-technical stakeholders, and collaborate with cross-functional teams.
- Agile Methodologies: Experience working in Agile development environments, following practices like Scrum or Kanban, and ability to adapt to changing requirements and priorities.
- Continuous Learning: A commitment to continuous learning and staying updated with the latest trends, tools, and technologies related to development and AWS services.
- Bachelor's Degree: A bachelor's degree in Computer Science, Software Engineering, or a related field is often preferred, although equivalent experience and certifications can also be valuable.
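For the IaC point in the list above, here is a small illustrative sketch using AWS CDK in Python, one common IaC option. The stack name, asset path, and handler module are hypothetical, not taken from this posting.

```python
# Illustrative AWS CDK (Python) sketch only -- stack name, asset path,
# and handler module are hypothetical.
from aws_cdk import App, Stack, aws_apigateway as apigw, aws_lambda as _lambda
from constructs import Construct

class ReportApiStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Lambda function whose code lives in the local ./lambda directory.
        handler = _lambda.Function(
            self,
            "ReportHandler",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="app.lambda_handler",
            code=_lambda.Code.from_asset("lambda"),
        )

        # REST API in front of the function via API Gateway.
        apigw.LambdaRestApi(self, "ReportApi", handler=handler)

app = App()
ReportApiStack(app, "ReportApiStack")
app.synth()
```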
Immediate Joiners Preferred. Notice Period - Immediate to 30 Days
Interested candidates are requested to email their resumes with the subject line "Application for [Job Title]".
Only applications received via email will be reviewed. Applications through other channels will not be considered.
About Us
adesso India is a dynamic and innovative IT Services and Consulting company based in Kochi. We are committed to delivering cutting-edge solutions that make a meaningful impact on our clients. As we continue to expand our development team, we are seeking a talented and motivated Backend Developer to join us in creating scalable and high-performance backend systems.
Job Description
We are looking for an experienced Backend and Data Developer with expertise in Java, SQL, and BigQuery development on public clouds, mainly GCP. As a Senior Data Developer, you will play a vital role in designing, building, and maintaining robust systems that support our data analytics. This position offers the opportunity to work on complex services, collaborating closely with cross-functional teams to drive successful project delivery.
Responsibilities
- Development and maintenance of data pipelines and automation scripts with Python
- Creation of data queries and optimization of database processes with SQL
- Use of bash scripts for system administration, automation and deployment processes
- Database and cloud technologies:
  - Managing, optimizing and querying large amounts of data in an Exasol database (prospectively Snowflake)
  - Google Cloud Platform (GCP): operation and scaling of cloud-based BI solutions, in particular:
    - Composer (Airflow): orchestration of data pipelines for ETL processes (an illustrative DAG sketch follows this list)
    - Cloud Functions: development of serverless functions for data processing and automation
    - Cloud Scheduler: planning and automation of recurring cloud jobs
    - Cloud Secret Manager: secure storage and management of sensitive access data and API keys
    - BigQuery: processing, analyzing and querying large amounts of data in the cloud
    - Cloud Storage: storage and management of structured and unstructured data
    - Cloud Monitoring: monitoring the performance and stability of cloud-based applications
- Data visualization and reporting:
  - Creation of interactive dashboards and reports for the analysis and visualization of business data with Power BI
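To make the Composer and BigQuery items above more concrete, here is a minimal, illustrative Airflow DAG sketch of the kind that might run on Cloud Composer. The DAG id, schedule, project, dataset, and table names are hypothetical and not part of this posting.

```python
# Illustrative Composer (Airflow) DAG sketch only -- DAG id, schedule,
# project, dataset, and table names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_revenue_rollup",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 4 * * *",  # run every day at 04:00
    catchup=False,
) as dag:
    # Aggregate raw orders into a daily revenue table in BigQuery.
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_daily_revenue",
        configuration={
            "query": {
                "query": """
                    SELECT order_date, SUM(revenue) AS daily_revenue
                    FROM `example-project.analytics.orders`
                    GROUP BY order_date
                """,
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "analytics",
                    "tableId": "daily_revenue",
                },
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )
```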
Requirements
- 4-6 years of experience in backend development, with strong expertise in BigQuery, Python, and MongoDB or SQL.
- Strong knowledge of database design, querying, and optimization with SQL and MongoDB, as well as designing ETL processes and orchestrating data pipelines.
- At least 2 years of experience with at least one hyperscaler, ideally GCP, combined with cloud storage technologies, cloud monitoring and cloud secret management.
- Excellent communication skills to effectively collaborate with team members and stakeholders.
Nice-to-Have:
- Knowledge of agile methodologies and working in cross-functional, collaborative teams.
- Build applications to solve business problems in healthcare using Machine Learning techniques and software technologies
- Designing and developing machine learning models
- Running machine learning tests and experiments
- Work with the product and operations team to understand the business requirements
- Create epics and user stories from requirements
- Prioritize the work based on the changing client requirements
Skills and Specifications:
Required Skills
- Experience in coding with Java
- Experience in working with Spring Framework
- Experience with basic API protocols
- Experience with developing, debugging and dev testing
- Decent communication skills
Nice to have
- Experience with distributed storage & database systems (SQL or NoSQL, e.g., MySQL or MongoDB)
- Experience with production quality management, deployment & monitoring.
- Experience with Tomcat, Nginx & Linux.
- Experience with cloud technologies (AWS, Azure, Oracle Cloud, etc.)
- Product understanding skills.
- 3+ years of experience in development in Java technology.
- Strong Java Basics
- SpringBoot or Spring MVC
- Experience in AWS.
- Hands-on experience with relational databases (SQL queries or Hibernate) + Mongo (JSON parsing)
- Proficient in REST API development
- Messaging Queue (RabbitMQ or Kafka)
- Microservices
- Any Caching Mechanism
- Good at problem solving
Good to Have Skills:
- 3+ years of experience in using Java/J2EE tech stacks
- Good understanding of data structures and algorithms.
- Excellent analytical and problem solving skills.
- Ability to work in a fast paced internet start-up environment.
- Experience in technical mentorship/coaching is highly desirable.
- Understanding of AI/ML algorithms is a plus.
DealShare is a social e-commerce platform focused on non-metro and rural markets. DealShare has raised Series C funding of USD 21 million from key investors including WestBridge Capital, Falcon Edge Capital, Matrix Partners India, Omidyar Network, Z3 Partners and Partners of DST Global, and has total funding of USD 34 million. We have 2 million customers across Rajasthan, Gujarat, Maharashtra, Karnataka and Delhi NCR, with 1.2 million monthly transactions and an annual GMV of USD 100 million. Our aim is to expand operations to 100 cities across India and reach an annual GMV of USD 500 million by the end of 2021.
We started in September 2018 and had 5,000 active customers in the first three months. Today we have 25K transactions per day, 1 lakh DAU and 10 lakh MAU, with a monthly GMV of INR 100 crores and 50% growth MoM. We aim to hit 2 lakh transactions per day with an annual GMV of USD 500 million by 2021.
We are hiring for various teams in discovery (search, recommendation, merchandising, intelligent notifications), pricing (automated pricing, competition price awareness, balancing revenue with profits, etc.), user growth and retention (bargains, gamification), monetisation (ads), order fulfillment (cart/checkout, warehousing, last mile, delivery promise, demand forecasting), customer support, data infrastructure (warehousing, analytics), and ML infrastructure (data versioning, model repository, model training, model hosting, feature store, etc.). We are looking for passionate problem solvers to join us, solve really challenging problems, and scale DealShare's systems.
You will:
● Implement the solution with minimal guidance once the approach has been finalized with senior engineers.
● Write code that has good low-level design and is easy to understand, maintain, extend and test.
● Take end-to-end ownership of products/features, from development to production and issue fixing.
● Ensure high unit, functional and integration automated test coverage. Ensure releases are stable.
● Communicate with various stakeholders (product, QA, senior engineers) as necessary to ensure quality deliverables, smooth execution and launch.
● Participate in code reviews and improve development and testing processes.
● Participate in hiring great engineers.
Required:
● Bachelor's degree (4 years) or higher in Computer Science or equivalent, and 1-3 years of experience in software development.
● Excellent at problem solving; an independent thinker.
● Good understanding of computer science fundamentals, data structures and algorithms, and object-oriented design.
● Good coding skills in any object-oriented language (C++, Java, Scala, etc.), preferably Java.
● Prior experience building one or more modules of a large-scale, highly available, low-latency, high-quality distributed system is preferred.
● Ability to multitask and thrive in a fast-paced, timeline-driven environment.
● Good team player with the ability to collaborate with others.
● Self-driven and motivated, with a very high sense of ownership.
Is a plus
● Prior experience working in Java.
● Prior experience using AWS offerings - EC2, S3, DynamoDB, Lambda, API Gateway, CloudFront, etc.
● Prior experience working on big data technologies - Spark, Hadoop, etc.
● Prior experience with asynchronous processing (queuing systems) and workflow systems.
If you are a great Java developer with experience building scalable SaaS web applications and are looking for an opportunity to build world-class products using cutting-edge technologies, please read on.
Nimesa is a Data Protection & Copy Data Management company creating an enterprise-class Backup & Recovery solution. Our product can cater to the needs of the enterprise AWS users who are looking for a 360 Data Protection solution that can do more than just Backup & Recovery.
As a Senior R&D Engineer, you will
- Design and build scalable complex systems with Java and Spring
- Contribute to the development of new features, debugging, and deliver timely fixes
- Perform peer code reviews in order to ensure quality standards
Requirements
- Experience with Java, Spring Boot, AWS.
- Good grasp of design patterns and algorithms.
- Experience with relational databases like PostgreSQL and MySQL
- Good understanding of web programming like REST and HTTP
- Strong Knowledge of Java Concurrency and Collection frameworks
- Knowledge of microservices architecture, messaging systems (RabbitMQ or Kafka), and Docker is good to have
- Insight into the inner workings of databases, queues, caches, and servers.
