
Immediate Joiners Preferred. Notice Period - Immediate to 30 Days
Interested candidates are requested to email their resumes with the subject line "Application for [Job Title]".
Only applications received via email will be reviewed. Applications through other channels will not be considered.
About Us
adesso India is a dynamic and innovative IT Services and Consulting company based in Kochi. We are committed to delivering cutting-edge solutions that make a meaningful impact on our clients. As we continue to expand our development team, we are seeking a talented and motivated Backend Developer to join us in creating scalable and high-performance backend systems.
Job Description
We are looking for an experienced Backend and Data Developer with expertise in Java, SQL, and BigQuery development on public clouds, primarily GCP. As a Senior Data Developer, you will play a vital role in designing, building, and maintaining robust systems to support our data analytics. This position offers the opportunity to work on complex services, collaborating closely with cross-functional teams to drive successful project delivery.
Responsibilities
- Development and maintenance of data pipelines and automation scripts with Python
- Creation of data queries and optimization of database processes with SQL
- Use of Bash scripts for system administration, automation, and deployment processes
- Database and cloud technologies:
  - Managing, optimizing, and querying large amounts of data in an Exasol database (prospectively Snowflake)
  - Google Cloud Platform (GCP): Operation and scaling of cloud-based BI solutions, in particular:
    - Composer (Airflow): Orchestration of data pipelines for ETL processes (a minimal sketch follows this list)
    - Cloud Functions: Development of serverless functions for data processing and automation
    - Cloud Scheduler: Planning and automation of recurring cloud jobs
    - Cloud Secret Manager: Secure storage and management of sensitive credentials and API keys
    - BigQuery: Processing, analyzing, and querying large amounts of data in the cloud
    - Cloud Storage: Storage and management of structured and unstructured data
    - Cloud Monitoring: Monitoring the performance and stability of cloud-based applications
- Data visualization and reporting:
  - Creation of interactive dashboards and reports for the analysis and visualization of business data with Power BI
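For illustration only, the snippet below sketches the kind of Composer (Airflow) orchestration described above: a daily DAG that runs a BigQuery aggregation job and writes the result to a reporting table. The project, dataset, and table names are hypothetical, and the exact operator arguments depend on the Airflow and Google provider versions in use.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Hypothetical identifiers, for illustration only.
PROJECT = "example-project"
DATASET = "analytics"

with DAG(
    dag_id="daily_sales_aggregation",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Run an aggregation query in BigQuery and overwrite a reporting table.
    aggregate_sales = BigQueryInsertJobOperator(
        task_id="aggregate_sales",
        configuration={
            "query": {
                "query": f"""
                    SELECT order_date, SUM(amount) AS total_amount
                    FROM `{PROJECT}.{DATASET}.orders`
                    GROUP BY order_date
                """,
                "destinationTable": {
                    "projectId": PROJECT,
                    "datasetId": DATASET,
                    "tableId": "daily_sales",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```

In a real deployment the same DAG would typically be paired with Cloud Monitoring alerts and Secret Manager lookups for any credentials the query jobs need.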
Requirements
- 4-6 years of experience in backend development, with strong expertise in BigQuery, Python, and MongoDB or SQL.
- Strong knowledge of database design, querying, and optimization with SQL and MongoDB, as well as designing ETL processes and orchestrating data pipelines.
- At least 2 years of experience with at least one hyperscaler, ideally GCP, combined with cloud storage technologies, cloud monitoring, and cloud secret management.
- Excellent communication skills to effectively collaborate with team members and stakeholders.
Nice-to-Have:
- Knowledge of agile methodologies and working in cross-functional, collaborative teams.

Similar jobs
Sr Software Engineer
Company Summary:
As the recognized global standard for project-based businesses, Deltek delivers software and information solutions to help organizations achieve their purpose. Our market leadership stems from the work of our diverse employees who are united by a passion for learning, growing and making a difference.
At Deltek, we take immense pride in creating a balanced, values-driven environment, where every employee feels included and empowered to do their best work. Our employees put our core values into action daily, creating a one-of-a-kind culture that has been recognized globally. Thanks to our incredible team, Deltek has been named one of America's Best Midsize Employers by Forbes, a Best Place to Work by Glassdoor, a Top Workplace by The Washington Post and a Best Place to Work in Asia by World HRD Congress. www.deltek.com
Business Summary:
The Deltek Engineering and Technology team builds best-in-class solutions to delight customers and meet their business needs. We are laser-focused on software design, development, innovation and quality. Our team of experts has the talent, skills and values to deliver products and services that are easy to use, reliable, sustainable and competitive. If you're looking for a safe environment where ideas are welcome, growth is supported and questions are encouraged – consider joining us as we explore the limitless opportunities of the software industry.
Position Responsibilities:
About the Role
We are looking for a skilled and motivated Senior Software Developer to join our team responsible for developing and maintaining a robust ERP solution used by approximately 400 customers and more than 30,000 users worldwide. The system is built using C# (.NET Core), leverages SQL Server for data management, and is hosted in the Microsoft Azure cloud.
This role offers the opportunity to work on a mission-critical product, contribute to architectural decisions, and help shape the future of our cloud-native ERP platform.
Key Responsibilities
- Design, develop, and maintain features and modules within the ERP system using C# (.NET Core)
- Optimize and manage SQL Server database interactions for performance and scalability
- Collaborate with cross-functional teams, including QA, DevOps, Product Management, and Support
- Participate in code reviews, architecture discussions, and technical planning
- Contribute to the adoption and improvement of CI/CD pipelines and cloud deployment practices
- Troubleshoot and resolve complex technical issues across the stack
- Ensure code quality, maintainability, and adherence to best practices
- Stay current with emerging technologies and recommend improvements where applicable
Qualifications
- Curiosity, passion, teamwork, and initiative
- Strong experience with C# and .NET Core in enterprise application development
- Solid understanding of SQL Server, including query optimization and schema design
- Experience with Azure cloud services (App Services, Azure SQL, Storage, etc.)
- Ability to use agentic AI as a development aid, with a critical-thinking attitude
- Familiarity with agile development methodologies and DevOps practices
- Ability to work independently and collaboratively in a fast-paced environment
- Excellent problem-solving and communication skills
- Master's degree in Computer Science or equivalent; 5+ years of relevant work experience
- Experience with ERP systems or other complex business applications is a plus
What We Offer
- A chance to work on a product that directly impacts thousands of users worldwide
- A collaborative and supportive engineering culture
- Opportunities for professional growth and technical leadership
- Competitive salary and benefits package
We are looking for a Senior Backend Engineer to build and operate the core AI/ML-backed systems that power large-scale, consumer-facing products. You will work on production-grade AI runtimes, retrieval systems, and ML-adjacent backend infrastructure, making pragmatic tradeoffs across quality, latency, reliability, and cost.
This role is not an entry point into AI/ML. You are expected to already have hands-on experience shipping ML-backed backend systems in production.
At Proximity, you won’t just build APIs - you’ll own critical backend systems end-to-end, collaborate closely with Applied ML and Product teams, and help define the foundations that power intelligent experiences at scale.
Responsibilities -
- Own and deliver end-to-end backend systems for AI product runtime, including orchestration, request lifecycle management, state/session handling, and policy enforcement.
- Design and implement retrieval and memory primitives end-to-end — document ingestion, chunking strategies, embeddings generation, indexing, vector/hybrid search, re-ranking, caching, freshness, and deletion semantics (a simplified sketch follows this list).
- Productionize ML workflows and interfaces, including feature and metadata services, online/offline parity, model integration contracts, and evaluation instrumentation.
- Drive performance, reliability, and cost optimization, owning P50/P95 latency, throughput, cache hit rates, token and inference costs, and infrastructure efficiency.
- Build observability by default, including structured logs, metrics, distributed tracing, guardrail signals, failure taxonomies, and reliable fallback paths.
- Collaborate closely with Applied ML teams on model routing, prompt and tool schemas, evaluation datasets, and release safety gates.
- Write clean, testable, and maintainable backend code, contributing to design reviews, code reviews, and operational best practices.
- Take systems from design → build → deploy → operate, including on-call ownership and incident response.
- Continuously identify bottlenecks and failure modes in AI-backed systems and proactively improve system robustness.
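For illustration only, here is a deliberately simplified, in-memory sketch of the ingest → chunk → embed → retrieve flow referenced above. The embed() stub stands in for a real embedding model, and the toy index stands in for a production vector or hybrid store (FAISS, Milvus, Elasticsearch, etc.) plus a re-ranking stage; all names are hypothetical.

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    """Stand-in for an embedding model call: returns a deterministic unit vector."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    vec = rng.standard_normal(384)
    return vec / np.linalg.norm(vec)

def chunk(document: str, size: int = 200, overlap: int = 50) -> list[str]:
    """Split a document into overlapping character windows."""
    step = size - overlap
    return [document[i:i + size] for i in range(0, max(len(document) - overlap, 1), step)]

class InMemoryIndex:
    """Toy vector index: stores chunk embeddings and ranks them by cosine similarity."""

    def __init__(self) -> None:
        self.chunks: list[str] = []
        self.vectors: list[np.ndarray] = []

    def ingest(self, document: str) -> None:
        for passage in chunk(document):
            self.chunks.append(passage)
            self.vectors.append(embed(passage))

    def search(self, query: str, k: int = 3) -> list[tuple[float, str]]:
        q = embed(query)
        scored = [(float(v @ q), c) for v, c in zip(self.vectors, self.chunks)]
        return sorted(scored, key=lambda pair: pair[0], reverse=True)[:k]

index = InMemoryIndex()
index.ingest("Example document text about order processing and refund policies ...")
for score, passage in index.search("How are refunds handled?"):
    print(f"{score:.3f}  {passage[:60]}")
```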
Requirements
- Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
- 6–10 years of experience building backend systems in production, with 2–3+ years working on ML/AI-backed products such as search, recommendations, ranking, RAG pipelines, or AI assistants.
- Strong practical understanding of ML system fundamentals, including embeddings, vector similarity, reranking, retrieval quality, and evaluation metrics (precision/recall, nDCG, MRR).
- Proven experience implementing or operating RAG pipelines, covering ingestion, chunking, indexing, query understanding, hybrid retrieval, and rerankers.
- Solid distributed systems fundamentals, including API design, idempotency, concurrency, retries, circuit breakers, rate limiting, and multi-tenant reliability (see the sketch after this list).
- Experience with common ML/AI platform components, such as feature stores, metadata systems, streaming or batch pipelines, offline evaluation jobs, and A/B measurement hooks.
- Strong proficiency in backend programming languages and frameworks (e.g., Go, Java, Python, or similar) and API development.
- Ability to work independently, take ownership of complex systems, and collaborate effectively with cross-functional teams.
- Strong problem-solving, communication, and system-design skills.
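For illustration only, the sketch below shows one common way the retry and circuit-breaker patterns listed above fit together; the thresholds, timeouts, and the call_downstream helper are arbitrary placeholders, not part of any specific stack.

```python
import random
import time

class CircuitBreaker:
    """Toy circuit breaker: after max_failures consecutive errors, reject calls
    for reset_timeout seconds before allowing a trial call again."""

    def __init__(self, max_failures: int = 3, reset_timeout: float = 30.0) -> None:
        self.max_failures = max_failures
        self.reset_timeout = reset_timeout
        self.failures = 0
        self.opened_at: float | None = None

    def call(self, fn, *args, **kwargs):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_timeout:
                raise RuntimeError("circuit open: downstream considered unhealthy")
            self.opened_at = None  # half-open: let one trial call through
        try:
            result = fn(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()
            raise
        self.failures = 0
        return result

def retry_with_backoff(fn, attempts: int = 3, base_delay: float = 0.2):
    """Retry a callable with exponential backoff plus jitter."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))

# Example wiring: retries absorb transient failures, while the breaker stops
# hammering a dependency that keeps failing. call_downstream is hypothetical.
breaker = CircuitBreaker()
def call_downstream() -> str:
    return "ok"
response = retry_with_backoff(lambda: breaker.call(call_downstream))
```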
Desired Skills -
- Experience with agentic runtimes, including tool-calling or function-calling patterns, structured outputs, and production guardrails.
- Hands-on exposure to vector and hybrid retrieval stacks such as FAISS, Milvus, Pinecone, or Elasticsearch.
- Experience running systems on Kubernetes, with strong knowledge of observability stacks like OpenTelemetry, Prometheus, Grafana, and distributed tracing.
- Familiarity with privacy, security, and data governance considerations for user and model data.
Benefits
- Best in class compensation: We hire only the best, and we pay accordingly.
- Proximity Talks: Meet engineers, designers, and product leaders — and learn from experts across domains.
- Keep on learning with a world-class team: Work on real, production AI systems at scale, challenge yourself daily, and grow alongside some of the best minds in the industry.
Company Overview
McKinley Rice is not just a company; it's a dynamic community, the next evolutionary step in professional development. Spiritually, we're a hub where individuals and companies converge to unleash their full potential. Organizationally, we are a conglomerate composed of various entities, each contributing to the larger narrative of global excellence.
Redrob by McKinley Rice: Redefining Prospecting in the Modern Sales Era
Backed by a $40 million Series A funding from leading Korean & US VCs, Redrob is building the next frontier in global outbound sales. We’re not just another database—we’re a platform designed to eliminate the chaos of traditional prospecting. In a world where sales leaders chase meetings and deals through outdated CRMs, fragmented tools, and costly lead-gen platforms, Redrob provides a unified solution that brings everything under one roof.
Inspired by the breakthroughs of Salesforce, LinkedIn, and HubSpot, we’re creating a future where anyone, not just enterprise giants, can access real-time, high-quality data on 700M+ decision-makers, all in just a few clicks.
At Redrob, we believe the way businesses find and engage prospects is broken. Sales teams deserve better than recycled data, clunky workflows, and opaque credit-based systems. That’s why we’ve built a seamless engine for:
- Precision prospecting
- Intent-based targeting
- Data enrichment from 16+ premium sources
- AI-driven workflows to book more meetings, faster
We’re not just streamlining outbound—we’re making it smarter, scalable, and accessible. Whether you’re an ambitious startup or a scaled SaaS company, Redrob is your growth copilot for unlocking warm conversations with the right people, globally.
EXPERIENCE
Duties you'll be entrusted with:
- Develop and execute scalable APIs and applications using the Node.js or Nest.js framework
- Writing efficient, reusable, testable, and scalable code.
- Understanding and analyzing business needs and feature modification requests, and converting them into software components
- Integration of user-facing elements and data storage solutions into different applications
- Developing backend components, server-side logic, platform services, statistical learning models, and highly responsive web applications, with a focus on performance and responsiveness
- Designing and implementing high-availability, low-latency applications with data protection and security features
- Performance tuning and automation of applications, and enhancing the functionality of existing software systems
- Keeping abreast with the latest technology and trends.
Expectations from you:
Basic Requirements
- Minimum qualification: Bachelor’s degree or more in Computer Science, Software Engineering, Artificial Intelligence, or a related field.
- Experience with Cloud platforms (AWS, Azure, GCP).
- Strong understanding of monitoring, logging, and observability practices.
- Experience with event-driven architectures (e.g., Kafka, RabbitMQ).
- Expertise in designing, implementing, and optimizing Elasticsearch.
- Work with modern tools including Jira, Slack, GitHub, Google Docs, etc.
- Expertise in event-driven architecture.
- Experience integrating Generative AI APIs.
- Working experience with high user concurrency.
- Experience with databases scaled to millions of records (indexing, retrieval, etc.).
Technical Skills
- Demonstrable experience in web application development with expertise in Node.js or Nest.js.
- Knowledge of database technologies and agile development methodologies.
- Experience working with databases, such as MySQL or MongoDB.
- Familiarity with web development frameworks, such as Express.js.
- Understanding of microservices architecture and DevOps principles.
- Well-versed with AWS and serverless architecture.
Soft Skills
- A quick and critical thinker who can generate a range of ideas on a topic and bring fresh, innovative ideas to the table to enhance the visual impact of our content.
- Potential to apply innovative and exciting ideas, concepts, and technologies.
- Stay up-to-date with the latest design trends, animation techniques, and software advancements.
- Multi-tasking and time-management skills, with the ability to prioritize tasks.
THRIVE
Some of the extensive benefits of being part of our team:
- We offer skill enhancement and educational reimbursement opportunities to help you further develop your expertise.
- The Member Reward Program provides an opportunity for you to earn up to INR 85,000 as an annual Performance Bonus.
- The McKinley Cares Program has a wide range of benefits:
  - The wellness program covers mental wellness and fitness sessions and offers health insurance.
  - In-house benefits include a referral bonus window and sponsored social functions.
  - An expanded leave basket including paid maternity, paternity, and rejuvenation leaves, in addition to the regular 20 leaves per annum.
  - Our family support benefits not only include maternity and paternity leaves but also extend to childcare benefits.
- In addition to the retention bonus, our McKinley Retention Benefits program also includes a Leave Travel Allowance program.
- We also offer an exclusive McKinley Loan Program designed to assist our employees during challenging times and alleviate financial burdens.
Position Overview:
We are looking for a talented and self-motivated Back-end Developer to join our development team. The ideal candidate will be responsible for writing clean, efficient, and maintainable code that enhances workflow organization and automates various internal processes within the organization. The role involves continuous improvement of our software solutions to save man-hours and ensure overall organizational efficiency. Key tasks include development, testing, debugging, troubleshooting, and maintenance of both new and existing programs.
Key responsibilities:
1) Software Development: Write clean, efficient, and maintainable code/flows that automate internal processes and improve workflow efficiency.
2) Testing and Maintenance: Perform testing, debugging, troubleshooting, and daily maintenance of created or integrated programs.
3) Adherence to Standards: Follow preferred development methodologies and adhere to organizational development standards.
4) Teamwork: Work closely with other team members to ensure the successful implementation of projects. Maintain clear and concise documentation of code, APIs, and software components to aid knowledge sharing and future development.
5) Stay Current: Keep up to date with the latest developments in the Python and RPA ecosystems and follow software engineering best practices.
- Build campaign generation services which can send app notifications at a speed of 10 million a minute
- Dashboards to show Real time key performance indicators to clients
- Develop complex user segmentation engines that create segments over terabytes of data within a few seconds
- Build highly available & horizontally scalable platform services for ever-growing data
- Use cloud-based services like AWS Lambda for blazing-fast throughput & auto-scalability
- Work on complex analytics over terabytes of data, such as building Cohorts, Funnels, User Path analysis, and Recency, Frequency & Monetary (RFM) analysis at blazing speed
- You will build backend services and APIs to create scalable engineering systems.
- As an individual contributor, you will tackle some of our broadest technical challenges, which require deep technical knowledge, hands-on software development, and seamless collaboration with all functions.
- You will envision and develop features that are highly reliable and fault tolerant to deliver a superior customer experience.
- Collaborate with various highly functional teams across the company to meet deliverables throughout the software development lifecycle.
- Identify areas for improvement through data insights and research, and act on them.
- 2-5 years of experience in backend development; must have worked with Java, shell, Perl, or Python scripting.
- Solid understanding of engineering best practices, continuous integration, and incremental delivery.
- Strong analytical skills, debugging and troubleshooting skills, product line analysis.
- Familiarity with agile methodology (sprint planning, working in JIRA, retrospectives, etc.).
- Proficiency with tools like Docker, Maven, and Jenkins, and knowledge of Java frameworks such as Spring, Spring Boot, Hibernate, and JPA.
- Ability to design application modules using concepts such as object orientation, multi-threading, synchronization, caching, fault tolerance, sockets, various IPC mechanisms, database interfaces, etc.
- Hands-on experience with Redis, MySQL, streaming technologies such as Kafka producers/consumers, and NoSQL databases such as MongoDB/Cassandra.
- Knowledge of version control (Git) and deployment processes (CI/CD).
Task:
- Design, implement and maintain Java application phases
- To take part in software and architectural development activities
- Conduct software analysis, programming, testing and debugging
- Identifying production and non-production application issues
- Transforming requirements into specifications
- Develop, test, implement and maintain application software
- Recommend changes to improve established Java application processes
- Develop technical designs for application development
- Develop application code for Java programs
What we expect:
- Minimum 7 years of experience with excellent Java development skills
- Demonstrable hands-on experience of delivering solutions in a Java EE 6 environment
- Open-source frameworks and standards: Hibernate, Spring JDBC
- Hands-on experience with tools such as Ant, Eclipse, Maven, SVN, SoapUI, JIRA, and Bamboo
- WS concepts and protocols: WSDL, REST, SOAP, JMS, XSLT, XML Schema
- Experience with one or more open-source ESB products (JBoss Fuse ESB, Apache Camel, Apache ServiceMix) and/or message-oriented middleware such as ActiveMQ and JMS
- Strong Java unit testing skills
- Good experience with design patterns & design principles
- Experience with web services, PL/SQL, and SQL (Oracle 10g or above)
- Experience in OOP development with Core Java
Desirable:
- Experience with Elasticsearch, Smooks, Lucene, and JasperReports
- Domain-driven design
- Experience with NoSQL technologies: Cassandra, MongoDB
- Experience with agile development practices (e.g. Scrum) and continuous integration environments
- J2EE application and web containers such as GlassFish, Jetty, Tomcat
- CMMI Level 5 process experience is an advantage
We Offer:
- Freedom to realize your own ideas, plus individual career and development opportunities.
- A motivating work environment with a flat hierarchical structure, memorable company events, and a fun, flexible workplace.
- Professional challenges and career development opportunities.
- Good knowledge of Python frameworks such as Django, CherryPy, etc.
- Good understanding of REST APIs
- Experience with JavaScript, jQuery, HTML, and CSS
- Build efficient back-end features in Python
- Integrate front-end and back-end components into the application
- Develop integrations with third-party applications (mostly web-based)
- Working knowledge of SQL and databases
- Bachelor’s degree in Computer Science Engineering or other related fields
- Understand the needs of the client and implement functional requirements accordingly
- Agile development methodology
- Good communication (verbal and written)
