50+ SQL Jobs in Hyderabad | SQL Job openings in Hyderabad
Apply to 50+ SQL Jobs in Hyderabad on CutShort.io. Explore the latest SQL Job opportunities across top companies like Google, Amazon & Adobe.
As a Data Quality Engineer at PalTech, you will be responsible for designing and executing comprehensive test strategies for end-to-end data validation. Your role will ensure data completeness, accuracy, and integrity across ETL processes, data warehouses, and reporting environments. You will automate data validation using Python, validate fact and dimension tables, large datasets, file ingestions, and data exports, while ensuring adherence to data security standards, including encryption and authorization. This role requires strong analytical abilities, proficiency in SQL and Python, and the capability to collaborate effectively with cross-functional teams to drive continuous improvements through automation and best practices.
Key Responsibilities
- Create test strategies, test plans, business scenarios, and data validation scripts for end-to-end data validation.
- Verify data completeness, accuracy, and integrity throughout ETL processes, data pipelines, and reports.
- Evaluate and monitor the performance of ETL jobs to ensure adherence to defined SLAs.
- Automate data testing processes using Python or other relevant technologies (a minimal sketch follows this list).
- Validate various types of fact and dimension tables within data warehouse environments.
- Apply strong data warehousing (DWH) skills to ensure accurate data modeling and validation.
- Validate large datasets and ensure accuracy across relational databases.
- Validate file ingestions and data exports across different data sources.
- Assess and validate implementation of data security standards (encryption, authorization, anonymization).
- Demonstrate proficiency in SQL, Python, and ETL/ELT validation techniques.
- Validate reports and dashboards built on Power BI, Tableau, or similar platforms.
- Write complex scripts to validate business logic and KPIs across datasets.
- Create test data as required based on business use cases and scenarios.
- Identify, validate, and test corner business cases and edge scenarios.
- Prepare comprehensive test documentation including test cases, test results, and test summary reports.
- Collaborate closely with developers, business analysts, data architects, and other stakeholders.
- Recommend enhancements and implement best practices to strengthen and streamline testing processes.
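To make the automation concrete, here is a minimal sketch of the kind of source-to-target reconciliation check such a role would script in Python. The table names (stg_orders, dw_fact_orders) and the amount column are hypothetical, and an in-memory SQLite database stands in for the real source and warehouse connections:

```python
# Hedged sketch of an automated source-to-target reconciliation check.
# Table names and columns are invented for illustration; SQLite is a
# stand-in for the real source-system and warehouse connections.
import sqlite3

def reconcile(conn, source_table: str, target_table: str) -> dict:
    """Compare row counts and a simple column checksum between two tables."""
    checks = {}
    for side, table in (("source", source_table), ("target", target_table)):
        rows, amount_sum = conn.execute(
            f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}"
        ).fetchone()
        checks[side] = {"rows": rows, "amount_sum": amount_sum}
    checks["match"] = checks["source"] == checks["target"]
    return checks

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE stg_orders (id INTEGER, amount REAL);
        CREATE TABLE dw_fact_orders (id INTEGER, amount REAL);
        INSERT INTO stg_orders VALUES (1, 10.0), (2, 25.5);
        INSERT INTO dw_fact_orders VALUES (1, 10.0), (2, 25.5);
    """)
    result = reconcile(conn, "stg_orders", "dw_fact_orders")
    assert result["match"], f"Source/target mismatch: {result}"
    print(result)
```

A real suite would layer null checks, duplicate detection, and fact-to-dimension referential checks on top of this pattern.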
Required Skills and Qualifications
- Education: Bachelor’s degree in Computer Science, Information Technology, or a related discipline.
- Technical Expertise: Strong understanding of ETL processes, data warehousing concepts, SQL, and Python.
- Experience: 4–6 years of experience in ETL testing, data validation, and report/dashboard validation; prior experience in automating data validation processes.
- Tools: Hands-on experience with ETL tools such as ADF, DBT, etc., defect tracking systems like JIRA, and reporting platforms such as Power BI or Tableau.
- Soft Skills: Excellent communication and teamwork abilities, with strong analytical and problem-solving skills.
Why Join PalTech?
- Great Place to Work Certified: We prioritize employee well-being and nurture an inclusive, collaborative environment where everyone can thrive.
- Competitive compensation, strong learning, and professional growth opportunities.
As a Data Engineer at PalTech, you will design, develop, and maintain scalable and reliable data pipelines to ensure seamless data flow across systems. You will leverage SQL and leading ETL tools (such as Informatica, ADF, etc.) to support data integration and transformation needs. This role involves building and optimizing data warehouse architectures, performing performance tuning, and ensuring high levels of data quality, accuracy, and consistency throughout the data lifecycle.
You will collaborate closely with cross-functional teams to understand business requirements and translate them into effective data solutions. The ideal candidate should possess strong problem-solving skills, sound knowledge of data architecture principles, and a passion for building clean and efficient data systems.
Key Responsibilities
- Design, develop, and maintain ETL/ELT pipelines using SQL and tools such as Informatica, ADF, etc.
- Build and optimize data warehouse and data lake solutions for reporting, analytics, and operational usage.
- Apply strong understanding of data warehousing concepts to architect scalable data solutions.
- Handle large datasets and design effective load/update strategies (see the sketch after this list).
- Collaborate with data analysts, business users, and data scientists to understand requirements and deliver scalable solutions.
- Implement data quality checks and validation frameworks to ensure data reliability and integrity.
- Perform SQL and ETL performance tuning and optimization.
- Work with structured and semi-structured data from various source systems.
- Monitor, troubleshoot, and resolve issues in data workflows.
- Maintain documentation for data pipelines, data flows, and data definitions.
- Follow best practices in data engineering including security, logging, and error handling.
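As one illustration of a load/update strategy, the hedged sketch below shows a watermark-driven incremental load with an upsert. Table and column names are assumptions, and SQLite's ON CONFLICT clause stands in for MERGE on warehouse platforms:

```python
# Hedged sketch of a watermark-driven incremental load with an upsert.
# src_customer/dim_customer and updated_at are invented names; on a real
# warehouse this would be a MERGE against the dimension table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_customer (id INTEGER PRIMARY KEY, name TEXT, updated_at TEXT);
    CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT, updated_at TEXT);
    INSERT INTO src_customer VALUES (1, 'Asha', '2024-01-02'), (2, 'Ravi', '2024-01-05');
""")

def incremental_load(last_watermark: str) -> str:
    """Upsert only rows changed since the last run; return the new watermark."""
    conn.execute(
        """
        INSERT INTO dim_customer (id, name, updated_at)
        SELECT id, name, updated_at FROM src_customer WHERE updated_at > ?
        ON CONFLICT(id) DO UPDATE SET
            name = excluded.name,
            updated_at = excluded.updated_at
        """,
        (last_watermark,),
    )
    new_watermark = conn.execute("SELECT MAX(updated_at) FROM src_customer").fetchone()[0]
    conn.commit()
    return new_watermark or last_watermark

print(incremental_load("2024-01-01"))
print(conn.execute("SELECT * FROM dim_customer").fetchall())
```

Storing the returned watermark between runs keeps the load idempotent and avoids full-table rescans.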
Required Skills & Qualifications
Education:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
Technical Skills:
- Strong proficiency in SQL and data manipulation.
- Hands-on experience with ETL tools (e.g., Informatica, Talend, ADF).
- Experience with cloud data warehouse platforms such as BigQuery, Redshift, or Snowflake.
- Strong understanding of data warehousing concepts and data modeling.
- Proficiency in Python or a similar programming language.
- Experience working with RDBMS platforms (e.g., SQL Server, Oracle).
- Familiarity with version control systems and job schedulers.
Experience:
- 4 to 8 years of relevant experience in data engineering and ETL development.
Soft Skills:
- Strong problem-solving skills.
- Excellent communication and collaboration abilities.
- Ability to work effectively in a cross-functional team environment.
We are seeking a highly skilled Senior Data Engineer with expertise in Databricks, Python, Scala, Azure Synapse, and Azure Data Factory to join our data engineering team. The team is responsible for ingesting data from multiple sources, making it accessible to internal stakeholders, and enabling seamless data exchange across internal and external systems.
You will play a key role in enhancing and scaling our Enterprise Data Platform (EDP) hosted on Azure and built using modern technologies such as Databricks, Synapse, Azure Data Factory (ADF), ADLS Gen2, Azure DevOps, and CI/CD pipelines.
Responsibilities
- Design, develop, optimize, and maintain scalable data architectures and pipelines aligned with ETL principles and business goals (a brief sketch follows this list).
- Collaborate across teams to build simple, functional, and scalable data solutions.
- Troubleshoot and resolve complex data issues to support business insights and organizational objectives.
- Build and maintain data products to support company-wide usage.
- Advise, mentor, and coach data and analytics professionals on standards and best practices.
- Promote reusability, scalability, operational efficiency, and knowledge-sharing within the team.
- Develop comprehensive documentation for data engineering standards, processes, and capabilities.
- Participate in design and code reviews.
- Partner with business analysts and solution architects on enterprise-level technical architectures.
- Write high-quality, efficient, and maintainable code.
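For flavor, here is a hedged PySpark sketch of one pipeline stage on a Databricks-style platform. The ADLS Gen2 path, schema, and table names are invented, and a running Spark session with Delta Lake support (as on Databricks) is assumed:

```python
# Illustrative PySpark stage: land raw files, apply a transformation,
# write a curated Delta table. Paths and names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("edp-orders-curation").getOrCreate()

raw = (
    spark.read.format("json")
    .load("abfss://raw@examplelake.dfs.core.windows.net/orders/")  # assumed ADLS Gen2 path
)

curated = (
    raw.dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_ts"))
    .filter(F.col("amount") > 0)  # basic quality gate before publishing
)

(
    curated.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("edp.curated_orders")  # hypothetical curated table
)
```

In practice each stage like this would be wrapped in ADF or Databricks Workflows orchestration and promoted through CI/CD.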
Technical Qualifications
- 5–8 years of progressive data engineering experience.
- Strong expertise in Databricks, Python, Scala, and Microsoft Azure services including Synapse & Azure Data Factory (ADF).
- Hands-on experience with data pipelines across multiple source & target systems (Databricks, Synapse, SQL Server, Data Lake, SQL/NoSQL sources, and file-based systems).
- Experience with design patterns, code refactoring, CI/CD, and building scalable data applications.
- Experience developing batch ETL pipelines; real-time streaming experience is a plus.
- Solid understanding of data warehousing, ETL, dimensional modeling, data governance, and handling both structured and unstructured data.
- Deep understanding of Synapse and SQL Server, including T-SQL and stored procedures.
- Proven experience working effectively with cross-functional teams in dynamic environments.
- Experience extracting, processing, and analyzing large / complex datasets.
- Strong background in root cause analysis for data and process issues.
- Advanced SQL proficiency and working knowledge of a variety of database technologies.
- Knowledge of Boomi is an added advantage.
Core Skills & Competencies
- Excellent analytical and problem-solving abilities.
- Strong communication and cross-team collaboration skills.
- Self-driven with the ability to make decisions independently.
- Innovative mindset and passion for building quality data solutions.
- Ability to understand operational systems, identify gaps, and propose improvements.
- Experience with large-scale data ingestion and engineering.
- Knowledge of CI/CD pipelines (preferred).
- Understanding of Python, Scala, and parallel processing frameworks (MapReduce, Spark).
- Familiarity with Agile development methodologies.
Education
- Bachelor’s degree in Computer Science, Information Technology, MIS, or an equivalent field.
As a Data Engineer, you will be an integral part of our team, working on data pipelines, data warehousing, and data integration for various analytics and AI use cases. You will collaborate closely with Delivery Managers, ML Engineers and other stakeholders to ensure seamless data flow and accessibility. Your expertise will be crucial in enabling data-driven decision-making for our clients. To thrive in this role, you need to be a quick learner, get excited about innovation and be on the constant lookout to master new technologies as they come up in the Data, AI & Cloud teams.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines and ETL processes to support downstream analytics and AI applications.
- Collaborate with ML Engineers to integrate data solutions into machine learning models and workflows.
- Work closely with clients to understand their data requirements and deliver tailored data solutions.
- Ensure data quality, integrity, and security across all projects.
- Optimize and manage data storage solutions in cloud environments (AWS, Azure, GCP).
- Utilize Databricks for data processing and analytics tasks, leveraging its capabilities to enhance data workflows.
- Monitor the performance of data pipelines, identify bottlenecks or failures, and implement improvements to enhance efficiency and reliability (see the sketch after this list).
- Implement best practices for data engineering, including documentation, testing, and version control.
- Troubleshoot and resolve data-related issues in a timely manner.
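The monitoring responsibility above can start as simply as timing each stage and flagging SLA breaches. A generic, illustrative sketch follows; the threshold and the logging hook are assumptions, and a production setup would publish to a monitoring system rather than log locally:

```python
# Minimal sketch of pipeline-stage monitoring: time each stage, log
# failures, and warn when a run exceeds its expected duration.
import logging
import time
from functools import wraps

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def monitored(max_seconds: float):
    """Decorator that records duration and failures for a pipeline stage."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.monotonic()
            try:
                return fn(*args, **kwargs)
            except Exception:
                log.exception("stage %s failed", fn.__name__)
                raise
            finally:
                elapsed = time.monotonic() - start
                log.info("stage %s took %.1fs", fn.__name__, elapsed)
                if elapsed > max_seconds:
                    log.warning("stage %s exceeded SLA of %ss", fn.__name__, max_seconds)
        return wrapper
    return decorator

@monitored(max_seconds=2.0)
def ingest_batch():
    time.sleep(0.1)  # placeholder for real ingestion work

ingest_batch()
```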
Qualifications
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
- 3 to 5 years of experience as a Data Engineer or in a similar role.
- Strong proficiency in SQL, Python, and other relevant programming languages.
- Hands-on experience with Databricks and its ecosystem.
- Familiarity with major cloud environments (AWS, Azure, GCP) and their data services.
- Experience with data warehousing solutions like Snowflake, Redshift, or BigQuery.
- Comfortable working with a variety of SQL, NoSQL, and graph databases (e.g., PostgreSQL, MongoDB).
- Knowledge of data integration tools.
- Understanding of data modelling, data architecture, and database design.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
Highly Desirable Skills
- Experience with real-time data processing frameworks (e.g., Apache Kafka, Spark Streaming).
- Knowledge of data visualisation tools (e.g., Tableau, Power BI).
- Familiarity with machine learning concepts and frameworks.
- Experience working in a client-facing role.
Responsibilities:
Build and optimize batch and streaming data pipelines using Apache Beam (Dataflow)
Design and maintain BigQuery datasets using best practices in partitioning, clustering, and materialized views
Develop and manage Airflow DAGs in Cloud Composer for workflow orchestration (see the sketch after this list)
Implement SQL-based transformations using Dataform (or dbt)
Leverage Pub/Sub for event-driven ingestion and Cloud Storage for raw/lake layer data architecture
Drive engineering best practices across CI/CD, testing, monitoring, and pipeline observability
Partner with solution architects and product teams to translate data requirements into technical designs
Mentor junior data engineers and support knowledge-sharing across the team
Contribute to documentation, code reviews, sprint planning, and agile ceremonies
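As a rough illustration of the orchestration work, here is a minimal Cloud Composer (Airflow 2.x) DAG skeleton. The DAG id, schedule, and placeholder callables are assumptions; real tasks would typically launch Dataflow jobs or run BigQuery/Dataform transformations:

```python
# Hedged sketch of a two-step Airflow DAG for Cloud Composer.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    print("trigger ingestion job")  # placeholder for a Beam/Dataflow launch

def transform():
    print("run partitioned-table transformation")  # placeholder for BigQuery SQL

with DAG(
    dag_id="daily_orders",           # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    ingest_task >> transform_task    # transform runs only after ingestion succeeds
```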
Requirements
2+ years of hands-on experience in data engineering, with at least 2 years on GCP
Proven expertise in BigQuery, Dataflow (Apache Beam), Cloud Composer (Airflow)
Strong programming skills in Python and/or Java
Experience with SQL optimization, data modeling, and pipeline orchestration
Familiarity with Git, CI/CD pipelines, and data quality monitoring frameworks
Exposure to Dataform, dbt, or similar tools for ELT workflows
Solid understanding of data architecture, schema design, and performance tuning
Excellent problem-solving and collaboration skills
Bonus Skills:
GCP Professional Data Engineer certification
Experience with Vertex AI, Cloud Functions, Dataproc, or real-time streaming architectures
Familiarity with data governance tools (e.g., Atlan, Collibra, Dataplex)
Exposure to Docker/Kubernetes, API integration, and infrastructure-as-code (Terraform)
Senior Software Engineer
Location: Hyderabad, India
Who We Are:
Since our inception back in 2006, Navitas has grown to be an industry leader in the digital transformation space, and we’ve served as trusted advisors supporting our client base within the commercial, federal, and state and local markets.
What We Do:
At our very core, we’re a group of problem solvers providing our award-winning technology solutions to drive digital acceleration for our customers! With proven solutions, award-winning technologies, and a team of expert problem solvers, Navitas has consistently empowered customers to use technology as a competitive advantage and deliver cutting-edge transformative solutions.
What You’ll Do:
Build, Innovate, and Own:
- Design, develop, and maintain high-performance microservices in a modern .NET/C# environment.
- Architect and optimize data pipelines and storage solutions that power our AI-driven products.
- Collaborate closely with AI and data teams to bring machine learning models into production systems.
- Build integrations with external services and APIs to enable scalable, interoperable solutions.
- Ensure robust security, scalability, and observability across distributed systems.
- Stay ahead of the curve — evaluating emerging technologies and contributing to architectural decisions for our next-gen platform.
Responsibilities will include but are not limited to:
- Provide technical guidance and code reviews that raise the bar for quality and performance.
- Help create a growth-minded engineering culture that encourages experimentation, learning, and accountability.
What You’ll Need:
- Bachelor’s degree in Computer Science or equivalent practical experience.
- 8+ years of professional experience, including 5+ years designing and maintaining scalable backend systems using C#/.NET and microservices architecture.
- Strong experience with SQL and NoSQL data stores.
- Solid hands-on knowledge of cloud platforms (AWS, GCP, or Azure).
- Proven ability to design for performance, reliability, and security in data-intensive systems.
- Excellent communication skills and ability to work effectively in a global, cross-functional environment.
Set Yourself Apart With:
- Startup experience, specifically in building a product from 0 to 1.
- Exposure to AI/ML-powered systems, data engineering, or large-scale data processing.
- Experience in healthcare or fintech domains.
- Familiarity with modern DevOps practices, CI/CD pipelines, and containerization (Docker/Kubernetes).
Equal Employer/Veterans/Disabled
Navitas Business Consulting is an affirmative action and equal opportunity employer. If reasonable accommodation is needed to participate in the job application or interview process, to perform essential job functions, and/or to receive other benefits and privileges of employment, please contact Navitas Human Resources.
Navitas is an equal opportunity employer. We provide employment and opportunities for advancement, compensation, training, and growth according to individual merit, without regard to race, color, religion, sex (including pregnancy), national origin, sexual orientation, gender identity or expression, marital status, age, genetic information, disability, veteran or military status, or any other characteristic protected under applicable Federal, state, or local law. Our goal is for each staff member to have the opportunity to grow to the limits of their abilities and to achieve personal and organizational objectives. We will support positive programs for equal treatment of all staff and full utilization of all qualified employees at all levels within Navitas.
Who We Are
At Sonatype, we help organizations build better, more secure software by enabling them to understand and control their software supply chains. Our products are trusted by thousands of engineering teams globally, providing critical insights into dependency health, license risk, and software security. We’re passionate about empowering developers—and we back it with data.
The Opportunity
We’re looking for a Data Engineer with full stack expertise to join our growing Data Platform team. This role blends data engineering, microservices, and full-stack development to deliver end-to-end services that power analytics, machine learning, and advanced search across Sonatype.
You will design and build data-driven microservices and workflows using Java, Python, and Spring Batch, implement frontends for data workflows, and deploy everything through CI/CD pipelines into AWS ECS/Fargate. You’ll also ensure services are monitorable, debuggable, and reliable at scale, while clearly documenting designs with Mermaid-based sequence and dataflow diagrams.
This is a hands-on engineering role for someone who thrives at the intersection of data systems, fullstack development, ML, and cloud-native platforms.
What You’ll Do
- Design, build, and maintain data pipelines, ETL/ELT workflows, and scalable microservices.
- Develop complex web-scraping workflows (Playwright) and real-time pipelines (Kafka/queues/Flink).
- Develop end-to-end microservices with a backend in Java (5+ years), Python (5+ years), and Spring Batch (2+ years), and a frontend in React or a comparable framework.
- Deploy, publish, and operate services in AWS ECS/Fargate using CI/CD pipelines (Jenkins, GitOps).
- Architect and optimize data storage models in SQL (MySQL, PostgreSQL) and NoSQL stores.
- Implement web scraping and external data ingestion pipelines.
- Enable Databricks and PySpark-based workflows for large-scale analytics.
- Build advanced data search capabilities (fuzzy matching, vector similarity search, semantic retrieval); a toy sketch follows this list.
- Apply ML techniques (scikit-learn, classification algorithms, predictive modeling) to data-driven solutions.
- Implement observability, debugging, monitoring, and alerting for deployed services.
- Create Mermaid sequence diagrams, flowcharts, and dataflow diagrams to document system architecture and workflows.
- Drive best practices in fullstack data service development, including architecture, testing, and documentation.
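To hint at the search work mentioned above, here is a toy fuzzy-matching sketch using only the standard library. The candidate list is invented, and production systems would more likely use trigram indexes, embeddings, or a vector store for similarity search:

```python
# Toy fuzzy-matching sketch. KNOWN_PACKAGES is a hypothetical candidate
# list; difflib's ratio() is a simple stand-in for real similarity search.
from difflib import SequenceMatcher

KNOWN_PACKAGES = [
    "org.apache.commons:commons-lang3",
    "com.fasterxml.jackson.core:jackson-databind",
]

def best_match(query: str, candidates: list[str], threshold: float = 0.6):
    """Return the closest candidate and its score, or None below threshold."""
    scored = [
        (SequenceMatcher(None, query.lower(), c.lower()).ratio(), c)
        for c in candidates
    ]
    score, candidate = max(scored)
    return (candidate, score) if score >= threshold else (None, score)

print(best_match("jackson databind", KNOWN_PACKAGES))
```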
What We’re Looking For
Minimum Qualifications
- 2+ years of experience as a Data Engineer or in a backend software engineering role
- Strong programming skills in Python, Scala, or Java
- Hands-on experience with HBase or similar NoSQL columnar stores
- Hands-on experience with distributed data systems like Spark, Kafka, or Flink
- Proficient in writing complex SQL and optimizing queries for performance
- Experience building and maintaining robust ETL/ELT pipelines in production
- Familiarity with workflow orchestration tools (Airflow, Dagster, or similar)
- Understanding of data modeling techniques (star schema, dimensional modeling, etc.)
- Familiarity with CI/CD pipelines (Jenkins or similar)
- Ability to visualize and communicate architectures using Mermaid diagrams
Bonus Points
- Experience working with Databricks, dbt, Terraform, or Kubernetes
- Familiarity with streaming data pipelines or real-time processing
- Exposure to data governance frameworks and tools
- Experience supporting data products or ML pipelines in production
- Strong understanding of data privacy, security, and compliance best practices
Why You’ll Love Working Here
- Data with purpose: Work on problems that directly impact how the world builds secure software
- Modern tooling: Leverage the best of open-source and cloud-native technologies
- Collaborative culture: Join a passionate team that values learning, autonomy, and impact
Job Summary:
We are in search of a proficient Java Lead with a minimum of 10 years' experience in designing and developing Java applications. The ideal candidate will demonstrate a deep understanding of Java technologies, including Java EE, Spring Framework, and Hibernate. Proficiency in database technologies such as MySQL, Oracle, or PostgreSQL is essential, along with a proven track record of delivering high-quality, scalable, and efficient Java solutions.
We are looking for you!
You are a team player and a get-it-done person: intellectually curious, customer-focused, self-motivated, and responsible, able to work under pressure with a positive attitude. You have the zeal to think differently, understand that a career is a journey, and make the right choices along the way. You must have experience in creating visually compelling designs that effectively communicate our message and engage our target audience. The ideal candidate is creative, proactive, a go-getter, and motivated to look for ways to add value to job accomplishments.
As an ideal candidate for the Java Lead position, you bring a wealth of experience and expertise in Java development, combined with strong leadership qualities. Your proven track record showcases your ability to lead and mentor teams to deliver high-quality, enterprise-grade applications. Your technical proficiency and commitment to excellence make you a valuable asset in driving innovation and success within our development projects. You possess a team oriented mindset and a "get-it-done" attitude, inspiring your team members to excel and collaborate effectively.
You have a proven ability to lead mid to large size teams, emphasizing a quality-first approach and ensuring that projects are delivered on time and within scope. As a Java Lead, you are responsible for overseeing project planning, implementing best practices, and driving technical solutions that align with business objectives. You collaborate closely with development managers, architects, and cross-functional teams to design scalable and robust Java applications.
What You Will Do:
- Design and development of RESTful Web Services.
- Hands on database experience (Oracle / PostgreSQL / MySQL /SQL Server).
- Hands on experience with developing web applications leveraging Spring Framework.
- Hands on experience with developing microservices leveraging Spring Boot.
- Experience with cloud platforms (e.g., AWS, Azure) and containerization technologies.
- Continuous Integration tools (Jenkins & Git Lab), CICD Tools.
- Strong believer and follower of agile methodologies with an emphasis on Quality & Standards based development.
- Architect, design, and implement complex software systems using relevant technologies (e.g., Java, Python, Node.js).
What we need?
- BTech computer science or equivalent
- Minimum 8 years of relevant experience in Java/J2EE technologies
- Experience in building back-end APIs using Spring Boot Framework, Spring DI, and Spring AOP
- Real time messaging integration using Kafka or similar framework
- Experience in at least one database: Oracle, SQL server or PostgreSQL
- Previous experience managing and leading high-performing software engineering teams
Why join us?
- Work with a passionate and innovative team in a fast-paced, growth-oriented environment.
- Gain hands-on experience in content marketing with exposure to real-world projects.
- Opportunity to learn from experienced professionals and enhance your marketing skills.
- Contribute to exciting initiatives and make an impact from day one.
- Competitive stipend and potential for growth within the company.
- Recognized for excellence in data and AI solutions with industry awards and accolades.
Employee Benefits:
1. Culture:
- Open Door Policy: Encourages open communication and accessibility to management.
- Open Office Floor Plan: Fosters a collaborative and interactive work environment.
- Flexible Working Hours: Allows employees to have flexibility in their work schedules.
- Employee Referral Bonus: Rewards employees for referring qualified candidates.
- Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.
2. Inclusivity and Diversity:
- Hiring practices that promote diversity: Ensures a diverse and inclusive workforce.
- Mandatory POSH training: Promotes a safe and respectful work environment.
3. Health Insurance and Wellness Benefits:
- GMC and Term Insurance: Offers medical coverage and financial protection.
- Health Insurance: Provides coverage for medical expenses.
- Disability Insurance: Offers financial support in case of disability.
4. Child Care & Parental Leave Benefits:
- Company-sponsored family events: Creates opportunities for employees and their families to bond.
- Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child.
- Family Medical Leave: Offers leave for employees to take care of family members' medical needs.
5. Perks and Time-Off Benefits:
- Company-sponsored outings: Organizes recreational activities for employees.
- Gratuity: Provides a monetary benefit as a token of appreciation.
- Provident Fund: Helps employees save for retirement.
- Generous PTO: Offers more than the industry standard for paid time off.
- Paid sick days: Allows employees to take paid time off when they are unwell.
- Paid holidays: Gives employees paid time off for designated holidays.
- Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.
6. Professional Development Benefits:
- L&D with FLEX- Enterprise Learning Repository: Provides access to a learning repository for professional development.
- Mentorship Program: Offers guidance and support from experienced professionals.
- Job Training: Provides training to enhance job-related skills.
- Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
- Promote from Within: Encourages internal growth and advancement opportunities.
We are looking for experienced Data Engineers who can independently build, optimize, and manage scalable data pipelines and data platforms. In this role, you will collaborate with clients and internal teams to deliver robust data solutions that support analytics, AI/ML, and operational systems. You will also mentor junior engineers and bring strong engineering discipline to our data engagements.
Key Responsibilities
- Design, build, and optimize large-scale, distributed batch and streaming data pipelines.
- Implement scalable data models, data warehouses/lakehouses, and data lakes to support analytics and decision-making.
- Work closely with cross-functional stakeholders to translate business requirements into technical data solutions.
- Drive performance tuning, monitoring, and reliability of data pipelines.
- Write clean, modular, production-ready code with proper documentation and testing.
- Contribute to architecture discussions, tool evaluations, and platform setup.
- Mentor junior engineers and participate in code/design reviews.
Must-Have Skills
- Strong programming skills in Python (experience with Java is good to have).
- Advanced SQL expertise with ability to work on complex queries and optimizations.
- Deep understanding of data engineering concepts such as ETL/ELT, data modeling (OLTP & OLAP), warehousing, and stream processing.
- Experience with distributed processing frameworks like Apache Spark, Flink, or similar.
- Experience with Snowflake (preferred).
- Hands-on experience building pipelines using orchestration tools such as Airflow or similar.
- Familiarity with CI/CD, version control (Git), and modern development practices.
- Ability to debug, optimize, and scale data pipelines in real-world environments.
Good to Have
- Experience with major cloud platforms (AWS preferred; GCP/Azure also welcome).
- Exposure to Databricks, dbt, or similar platforms.
- Understanding of data governance, data quality frameworks, and observability.
- Certifications in AWS (Data Analytics / Solutions Architect) or Databricks.
Other Expectations
- Comfortable working in fast-paced, client-facing environments.
- Strong analytical and problem-solving skills with excellent attention to detail.
- Ability to adapt across tools, stacks, and business domains.
- Willingness to travel within India for short/medium-term client engagements as needed.
API Developer (.NET Core 8/9)
Location: Hyderabad/Vijayawada- India
Navitas is seeking a Senior API Developer (.NET Core 8/9) to join our development team in building robust, high-performance microservices and APIs. You will play a key role in designing scalable, secure, and maintainable backend services that power our web and mobile applications. In this role, you will collaborate with product managers, front-end developers, and DevOps engineers to deliver seamless digital experiences and ensure smooth partner integration. This is a mission-critical position that contributes directly to our organization’s digital transformation initiatives.
Responsibilities will include but are not limited to:
- Microservices & API Development: Design, develop, and maintain RESTful APIs and microservices using .NET Core 8/9 and ASP.NET Core Web API.
- API Design & Documentation: Create secure, versioned, and well-documented endpoints for internal and external consumption.
- Asynchronous Processing: Build and manage background jobs and message-driven workflows using Azure Service Bus and Azure Storage Queues.
- Authentication & Security: Implement OAuth2.0, JWT, Azure AD for securing APIs; enforce best practices for secure coding.
- Caching Integration: Enhance performance through caching mechanisms (Redis, in-memory caching).
- Performance Optimization: Profile APIs and database queries to identify bottlenecks; tune services for speed, scalability, and resilience.
- Clean Code & Architecture: Follow SOLID principles, Clean Architecture, and domain-driven design to write modular, testable code.
- Technical Collaboration: Participate in Agile development processes; collaborate with cross-functional teams to plan and deliver solutions.
- Troubleshooting & Maintenance: Use debugging tools and logging strategies to maintain uptime and resolve production issues.
- Documentation: Maintain clear, accessible technical documentation for services, endpoints, and integration requirements.
What You’ll Need:
- Bachelor’s degree in Computer Science, Information Systems, or a related technical field.
- 8+ years of backend development experience using .NET Core (6+ preferred, experience with 8/9 strongly desired).
- Strong understanding of RESTful API design, versioning, and integration.
- Experience with Clean Architecture and Domain-Driven Design (DDD).
- Deep knowledge of SOLID principles, design patterns, and reusable code practices.
- Skilled in SQL Server, including schema design, query tuning, and optimization.
- Proficiency in Entity Framework Core and Dapper for data access.
- Familiarity with API security standards (OAuth2.0, JWT, API keys).
- Experience writing unit/integration tests using xUnit, Moq, or similar frameworks.
- Basic experience with Azure services, including message queues and storage.
- Proficiency with Git, Agile workflows, and collaboration tools.
- Strong communication and problem-solving skills.
Set Yourself Apart With:
- Hands-on experience with Azure components (e.g., Service Bus, Functions, App Services, AKS).
- Experience with Azure Application Insights, Datadog, or other observability tools.
- Familiarity with Docker, containerization, and CI/CD pipelines.
- Performance testing and load testing experience.
- Familiarity with Postman, Swagger/OpenAPI, and other dev/test tools.
- Exposure to Agile/Scrum methodologies and sprint planning processes.
Equal Employer/Veterans/Disabled
Navitas Business Consulting is an affirmative action and equal opportunity employer. If reasonable accommodation is needed to participate in the job application or interview process, to perform essential job functions, and/or to receive other benefits and privileges of employment, please contact Navitas Human Resources.
Navitas is an equal opportunity employer. We provide employment and opportunities for advancement, compensation, training, and growth according to individual merit, without regard to race, color, religion, sex (including pregnancy), national origin, sexual orientation, gender identity or expression, marital status, age, genetic information, disability, veteran or military status, or any other characteristic protected under applicable Federal, state, or local law. Our goal is for each staff member to have the opportunity to grow to the limits of their abilities and to achieve personal and organizational objectives. We will support positive programs for equal treatment of all staff and full utilization of all qualified employees at all levels within Navitas.
About the Role:
We are seeking a highly skilled Integration Specialist / Full Stack Developer with strong experience in .NET Core, API integrations, and modern front-end development. The ideal candidate will build and integrate scalable web and mobile applications, manage end-to-end delivery, and ensure smooth data exchange across platforms.
Key Responsibilities:
- Design, develop, and maintain backend APIs using .NET Core / C#.
- Build and integrate REST and SOAP-based services (JSON, XML, OAuth2, JWT, API Key).
- Implement file-based integrations (Flat file, CSV, Excel, XML, JSON) and manage FTP/SFTP transfers.
- Work with databases such as MSSQL, PostgreSQL, Oracle, and SQLite — including writing queries, stored procedures, and using ADO.NET.
- Handle data serialization/deserialization using Newtonsoft.Json or System.Text.Json.
- Implement robust error handling and logging with Serilog, NLog, or log4net.
- Automate and schedule processes using Quartz.NET, Hangfire, or Windows Task Scheduler.
- Manage version control and CI/CD pipelines via Git and Azure DevOps.
- Develop front-end interfaces with React and React Native ensuring responsive, modular UI.
- Implement offline-first functionality for mobile apps (sync logic, caching, etc.).
- Collaborate with cross-functional teams or independently handle full project ownership.
- Utilize AI-assisted development tools (e.g., Cursor, GitHub Copilot, Claude Code) to enhance productivity.
- Apply integration best practices including middleware, API gateways, and optionally message queues (MSMQ, RabbitMQ).
- Ensure scalability, security, and performance in all deliverables.
Key Skills & Technologies:
- Backend: .NET Core, C#, REST/SOAP APIs, WCF, ADO.NET
- Frontend: React, React Native
- Databases: MSSQL, PostgreSQL, Oracle, SQLite
- Tools: Git, Azure DevOps, Hangfire, Quartz.NET, Serilog/NLog
- Integration: JSON, XML, CSV, FTP/SFTP, OAuth2, JWT
- DevOps: CI/CD automation, deployment pipelines
- Optional: Middleware, API Gateway, Message Queues (MSMQ, RabbitMQ)
Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- Minimum 5 years of hands-on experience in software development and integration.
- Proven expertise in designing and implementing scalable applications.
- Strong analytical and problem-solving skills with a proactive approach.
Nice to Have:
- Experience with cloud services (Azure, AWS, GCP).
- Knowledge of containerization tools like Docker or Kubernetes.
- Familiarity with mobile deployment workflows and app store publishing.
About the Role
At Sonatype, we empower developers with best-in-class tools to build secure, high-quality software at scale. Our mission is to create a world where software is always secure and developers can innovate without fear. Trusted by thousands of organizations, including Fortune 500 companies, we are pioneers in software supply chain management, open-source security, and DevSecOps.
We're looking for a Senior Data Analyst to help us shape the future of secure software development. If you love solving complex problems, working with cutting-edge technologies, and mentoring engineering teams, we’d love to hear from you.
What You’ll Do
As a Senior Data Analyst with 5+ years of demonstrated experience, you will transform complex datasets into actionable insights, build and maintain analytics infrastructure, and partner with cross-functional teams to drive data-informed decision-making and product improvements.
You’ll own the end-to-end analytics lifecycle—from data modeling and dashboard creation to experimentation and KPI development—ensuring that our stakeholders have timely, accurate information to optimize operations and enhance customer experiences.
Key Responsibilities:
- Using the available data and data models, perform analyses that answer specific data questions and identify trends, patterns, and anomalies
- Build and maintain dashboards and reports using tools like Looker and Databricks; support monthly reporting requirements
- Collaborate with data engineers, data scientists, and product teams to support data initiatives for internal use as well as for end customers
- Present findings and insights to both technical and non-technical audiences – provide visual aids, dashboards, reports, and white papers that explain insights gained through multiple analyses
- Monitor select data and dashboards for usage anomalies and flag them for upsell and cross-sell opportunities (see the sketch after this list)
- Translate business requirements into technical specifications for data queries and models
- Assist in the development and maintenance of databases and data systems; collect, clean, and validate data from various sources to ensure accuracy and completeness
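As a small example of the anomaly-monitoring responsibility, the sketch below flags accounts whose latest usage deviates sharply from their own trailing average. The DataFrame and the two-sigma threshold are invented for illustration; real inputs would come from Databricks- or Looker-backed tables:

```python
# Toy usage-anomaly screen: compare each account's latest week against
# its own historical mean and standard deviation. Data is fabricated.
import pandas as pd

usage = pd.DataFrame({
    "account": ["a", "a", "a", "b", "b", "b"],
    "week":    [1, 2, 3, 1, 2, 3],
    "scans":   [100, 110, 400, 50, 55, 52],
})

latest_week = usage["week"].max()
history = usage[usage["week"] < latest_week]
stats = history.groupby("account")["scans"].agg(avg="mean", sd="std").reset_index()

latest = usage[usage["week"] == latest_week].merge(stats, on="account")
latest["anomalous"] = (latest["scans"] - latest["avg"]).abs() > 2 * latest["sd"]
print(latest[["account", "scans", "anomalous"]])  # account "a" is flagged
```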
What You Need
We’re seeking an experienced analyst who thrives in an agile, collaborative environment and enjoys tackling technical challenges.
Minimum Qualifications:
- Bachelor’s degree in a quantitative field (e.g., Mathematics, Statistics, Computer Science, Economics, Business Analytics)
- 4+ years of experience in a data analysis or business intelligence role
- Proficiency in SQL, Python, Scala, PySpark, and other languages and standards commonly used for data querying and manipulation
- Experience working in a collaborative coding environment (e.g., GitHub)
- Experience with data science, analysis, and visualization tools (e.g., Databricks, Looker, Spark, Power BI, Plotly)
- Strong analytical and problem-solving skills with attention to detail
- Ability to communicate insights clearly and concisely to a variety of stakeholders
- Understanding of data lakes and data warehousing concepts and experience with data pipelines
- Knowledge of business systems is a plus (e.g., CRMs, demand generation tools, etc.)
Job Summary:
We are looking for technically skilled and customer-oriented SME Voice – Technical Support Associates to provide voice-based support to enterprise clients. The role involves real-time troubleshooting of complex issues across servers, networks, cloud platforms (Azure), databases, and more. Strong communication and problem-solving skills are essential.
Key Responsibilities:
- Provide technical voice support to B2B (enterprise) customers.
- Troubleshoot and resolve issues related to:
- SQL, DNS, VPN, Server Support (Windows/Linux)
- Networking (TCP/IP, routing, firewalls)
- Cloud Services – especially Microsoft Azure
- Application and system-level issues
- Assist with technical configurations and product usage.
- Accurately document cases and escalate unresolved issues.
- Ensure timely resolution while meeting SLAs and quality standards.
Required Skills & Qualifications:
- 2.5 to 5 years in technical support (voice-based, B2B preferred)
Proficiency in:
- SQL, DNS, VPN, Server Support
- Networking, Microsoft Azure
- Basic understanding of coding/scripting
- Strong troubleshooting and communication skills
- Ability to work in a 24x7 rotational shift environment
Who we are:
Kanerika Inc. is a premier global software products and services firm that specializes in providing innovative solutions and services for data-driven enterprises. Our focus is to empower businesses to achieve their digital transformation goals and maximize their business impact through the effective use of data and AI.
We leverage cutting-edge technologies in data analytics, data governance, AI-ML, GenAI/ LLM and industry best practices to deliver custom solutions that help organizations optimize their operations, enhance customer experiences, and drive growth.
Awards and Recognitions:
Kanerika has won several awards over the years, including:
1. Best Place to Work 2023 by Great Place to Work®
2. Top 10 Most Recommended RPA Start-Ups in 2022 by RPA Today
3. NASSCOM Emerge 50 Award in 2014
4. Frost & Sullivan India 2021 Technology Innovation Award for its Kompass composable solution architecture
5. Kanerika has also been recognized for its commitment to customer privacy and data security, having achieved ISO 27701, SOC2, and GDPR compliances.
Working for us:
Kanerika is rated 4.6/5 on Glassdoor, for many good reasons. We truly value our employees' growth, well-being, and diversity, and people’s experiences bear this out. At Kanerika, we offer a host of enticing benefits that create an environment where you can thrive both personally and professionally. From our inclusive hiring practices and mandatory training on creating a safe work environment to our flexible working hours and generous parental leave, we prioritize the well-being and success of our employees.
Our commitment to professional development is evident through our mentorship programs, job training initiatives, and support for professional certifications. Additionally, our company-sponsored outings and various time-off benefits ensure a healthy work-life balance. Join us at Kanerika and become part of a vibrant and diverse community where your talents are recognized, your growth is nurtured, and your contributions make a real impact. See the benefits section below for the perks you’ll get while working for Kanerika.
About the Role:
We are looking for a highly skilled Full Stack .NET Developer with strong hands-on experience in C#, .NET Core, ASP.NET Core, Web API, and microservices architecture. The role calls for proficiency in developing scalable, high-performing applications using SQL Server, NoSQL databases, and Entity Framework (v6+), along with excellent troubleshooting, problem-solving, and communication skills and the ability to collaborate effectively with cross-functional and international teams, including US counterparts.
Technical Skills:
- Programming Languages: C#, TypeScript, JavaScript
- Frameworks & Technologies: .NET Core, ASP.NET Core, Web API, Angular (v10+), Entity Framework (v6+), Microservices Architecture
- Databases: SQL Server, NoSQL
- Cloud Platform: Microsoft Azure
- Design & Architecture: OOPs Concepts, Design Patterns, Reusable Libraries, Microservices Implementation
- Front-End Development: Angular Material, HTML5, CSS3, Responsive UI Development
- Additional Skills: Excellent troubleshooting abilities, strong communication (verbal & written), and effective collaboration with US counterparts
What You’ll Bring:
- Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent work experience.
- 6+ years of experience
- Proven experience delivering high-quality web applications.
Mandatory Skills:
- Strong hands-on experience with C#, SQL Server, OOP concepts, and microservices architecture.
- Solid experience with .NET Core, ASP.NET Core, Web API, SQL, NoSQL, Entity Framework 6 or above, Azure, and applying design patterns; strong proficiency in the Angular framework (v10+ preferred) and TypeScript, with a solid understanding of HTML5, CSS3, and JavaScript.
- Skill in writing reusable libraries and experience with Angular Material or other UI component libraries.
- Excellent communication skills, both oral and written.
- Excellent troubleshooting skills, with the ability to communicate clearly with US counterparts.
Preferred Skills (Nice to Have):
- Self-starter with solid analytical and problem-solving skills; willingness to work extra hours to meet deliverables.
- Understanding of Agile/Scrum Methodologies.
- Exposure to cloud platforms such as AWS or Azure.
Employee Benefits:
1. Culture:
- Open Door Policy: Encourages open communication and accessibility to management.
- Open Office Floor Plan: Fosters a collaborative and interactive work environment.
- Flexible Working Hours: Allows employees to have flexibility in their work schedules.
- Employee Referral Bonus: Rewards employees for referring qualified candidates.
- Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.
2. Inclusivity and Diversity:
- Hiring practices that promote diversity: Ensures a diverse and inclusive workforce.
- Mandatory POSH training: Promotes a safe and respectful work environment.
3. Health Insurance and Wellness Benefits:
- GMC and Term Insurance: Offers medical coverage and financial protection.
- Health Insurance: Provides coverage for medical expenses.
- Disability Insurance: Offers financial support in case of disability.
4. Child Care & Parental Leave Benefits:
- Company-sponsored family events: Creates opportunities for employees and their families to bond.
- Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child.
- Family Medical Leave: Offers leave for employees to take care of family members' medical needs.
5. Perks and Time-Off Benefits:
- Company-sponsored outings: Organizes recreational activities for employees.
- Gratuity: Provides a monetary benefit as a token of appreciation.
- Provident Fund: Helps employees save for retirement.
- Generous PTO: Offers more than the industry standard for paid time off.
- Paid sick days: Allows employees to take paid time off when they are unwell.
- Paid holidays: Gives employees paid time off for designated holidays.
- Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.
6. Professional Development Benefits:
- L&D with FLEX- Enterprise Learning Repository: Provides access to a learning repository for professional development.
- Mentorship Program: Offers guidance and support from experienced professionals.
- Job Training: Provides training to enhance job-related skills.
- Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
- Promote from Within: Encourages internal growth and advancement opportunities.
Job Title: Mid-Level .NET Developer (Agile/SCRUM)
Location: Mohali, Bangalore, Pune, Navi Mumbai, Chennai, Hyderabad, Panchkula, Gurugram (Delhi NCR), Dehradun
Shift: Night, 6:30 pm to 3:30 am IST
Experience: 5+ Years
Job Summary:
We are seeking a proactive and detail-oriented Mid-Level .NET Developer to join our dynamic team. The ideal candidate will be responsible for designing, developing, and maintaining high-quality applications using Microsoft technologies with a strong emphasis on .NET Core, C#, Web API, and modern front-end frameworks. You will collaborate with cross-functional teams in an Agile/SCRUM environment and participate in the full software development lifecycle—from requirements gathering to deployment—while ensuring adherence to best coding and delivery practices.
Key Responsibilities:
- Design, develop, and maintain applications using C#, .NET, .NET Core, MVC, and databases such as SQL Server, PostgreSQL, and MongoDB.
- Create responsive and interactive user interfaces using JavaScript, TypeScript, Angular, HTML, and CSS.
- Develop and integrate RESTful APIs for multi-tier, distributed systems.
- Participate actively in Agile/SCRUM ceremonies, including sprint planning, daily stand-ups, and retrospectives.
- Write clean, efficient, and maintainable code following industry best practices.
- Conduct code reviews to ensure high-quality and consistent deliverables.
- Assist in configuring and maintaining CI/CD pipelines (Jenkins or similar tools).
- Troubleshoot, debug, and resolve application issues effectively.
- Collaborate with QA and product teams to validate requirements and ensure smooth delivery.
- Support release planning and deployment activities.
Required Skills & Qualifications:
- 4–6 years of professional experience in .NET development.
- Strong proficiency in C#, .NET Core, MVC, and relational databases such as SQL Server.
- Working knowledge of NoSQL databases like MongoDB.
- Solid understanding of JavaScript/TypeScript and the Angular framework.
- Experience in developing and integrating RESTful APIs.
- Familiarity with Agile/SCRUM methodologies.
- Basic knowledge of CI/CD pipelines and Git version control.
- Hands-on experience with AWS cloud services.
- Strong analytical, problem-solving, and debugging skills.
- Excellent communication and collaboration skills.
Preferred / Nice-to-Have Skills:
- Advanced experience with AWS services.
- Knowledge of Kubernetes or other container orchestration platforms.
- Familiarity with IIS web server configuration and management.
- Experience in the healthcare domain.
- Exposure to AI-assisted code development tools (e.g., GitHub Copilot, ChatGPT).
- Experience with application security and code quality tools such as Snyk or SonarQube.
- Strong understanding of SOLID principles and clean architecture patterns.
Technical Proficiencies:
- ASP.NET Core, ASP.NET MVC
- C#, Entity Framework, Razor Pages
- SQL Server, MongoDB
- REST API, jQuery, AJAX
- HTML, CSS, JavaScript, TypeScript, Angular
- Azure Services, Azure Functions, AWS
- Visual Studio
- CI/CD, Git
We are looking for a highly skilled Sr. Big Data Engineer with 3-5 years of experience in building large-scale data pipelines, real-time streaming solutions, and batch/stream processing systems. The ideal candidate should be proficient in Spark, Kafka, Python, and AWS Big Data services, with hands-on experience in implementing CDC (Change Data Capture) pipelines and integrating multiple data sources and sinks.
Responsibilities
- Design, develop, and optimize batch and streaming data pipelines using Apache Spark and Python.
- Build and maintain real-time data ingestion pipelines leveraging Kafka and AWS Kinesis (see the sketch after this list).
- Implement CDC (Change Data Capture) pipelines using Kafka Connect, Debezium or similar frameworks.
- Integrate data from multiple sources and sinks (databases, APIs, message queues, file systems, cloud storage).
- Work with AWS Big Data ecosystem: Glue, EMR, Kinesis, Athena, S3, Lambda, Step Functions.
- Ensure pipeline scalability, reliability, and performance tuning of Spark jobs and EMR clusters.
- Develop data transformation and ETL workflows in AWS Glue and manage schema evolution.
- Collaborate with data scientists, analysts, and product teams to deliver reliable and high-quality data solutions.
- Implement monitoring, logging, and alerting for critical data pipelines.
- Follow best practices for data security, compliance, and cost optimization in cloud environments.
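By way of illustration, here is a hedged sketch of the streaming leg of such a pipeline: Spark Structured Streaming reading a Kafka topic (for example, one fed by a Debezium CDC connector) and landing micro-batches to S3 as Parquet. The broker address, topic, and paths are placeholders, and a Spark runtime (e.g., EMR) with the Kafka connector on the classpath is assumed:

```python
# Hedged sketch of Kafka-to-S3 streaming ingestion with Spark
# Structured Streaming. All endpoints and paths are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder broker
    .option("subscribe", "orders.cdc")                  # placeholder topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key/value as bytes; cast to strings before landing.
parsed = events.select(
    F.col("key").cast("string"),
    F.col("value").cast("string").alias("payload"),
    "timestamp",
)

query = (
    parsed.writeStream.format("parquet")
    .option("path", "s3://example-bucket/raw/orders/")           # placeholder path
    .option("checkpointLocation", "s3://example-bucket/chk/orders/")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```

The checkpoint location is what gives the job exactly-once landing semantics across restarts, which matters for CDC correctness.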
Required Skills & Experience
- Programming: Strong proficiency in Python (PySpark, data frameworks, automation).
- Big Data Processing: Hands-on experience with Apache Spark (batch & streaming).
- Messaging & Streaming: Proficient in Kafka (brokers, topics, partitions, consumer groups) and AWS Kinesis.
- CDC Pipelines: Experience with Debezium / Kafka Connect / custom CDC frameworks.
- AWS Services: AWS Glue, EMR, S3, Athena, Lambda, IAM, CloudWatch.
- ETL/ELT Workflows: Strong knowledge of data ingestion, transformation, partitioning, schema management.
- Databases: Experience with relational databases (MySQL, Postgres, Oracle) and NoSQL (MongoDB, DynamoDB, Cassandra).
- Data Formats: JSON, Parquet, Avro, ORC, Delta/Iceberg/Hudi.
- Version Control & CI/CD: Git, GitHub/GitLab, Jenkins, or CodePipeline.
- Monitoring/Logging: CloudWatch, Prometheus, ELK/Opensearch.
- Containers & Orchestration (nice-to-have): Docker, Kubernetes, Airflow/Step Functions for workflow orchestration.
Preferred Qualifications
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
- Experience in large-scale data lake / lake house architectures.
- Knowledge of data warehousing concepts and query optimisation.
- Familiarity with data governance, lineage, and cataloging tools (Glue Data Catalog, Apache Atlas).
- Exposure to ML/AI data pipelines is a plus.
Tools & Technologies (must-have exposure)
- Big Data & Processing: Apache Spark, PySpark, AWS EMR, AWS Glue
- Streaming & Messaging: Apache Kafka, Kafka Connect, Debezium, AWS Kinesis
- Cloud & Storage: AWS (S3, Athena, Lambda, IAM, CloudWatch)
- Programming & Scripting: Python, SQL, Bash
- Orchestration: Airflow / Step Functions
- Version Control & CI/CD: Git, Jenkins/CodePipeline
- Data Formats: Parquet, Avro, ORC, JSON, Delta, Iceberg, Hudi
Key Responsibilities
- Design, develop, and maintain scalable microservices and RESTful APIs using Python (Flask, FastAPI, or Django); a minimal sketch follows this list.
- Architect data models for SQL and NoSQL databases (PostgreSQL, ClickHouse, MongoDB, DynamoDB) to optimize performance and reliability.
- Implement efficient and secure data access layers, caching, and indexing strategies.
- Collaborate closely with product and frontend teams to deliver seamless user experiences.
- Build responsive UI components using HTML, CSS, JavaScript, and frameworks like React or Angular.
- Ensure system reliability, observability, and fault tolerance across services.
- Lead code reviews, mentor junior engineers, and promote engineering best practices.
- Contribute to DevOps and CI/CD workflows for smooth deployments and testing automation.
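For a sense of the service shape, a minimal FastAPI sketch follows. The endpoint names, model, and in-memory store are illustrative stand-ins for a real PostgreSQL/MongoDB data-access layer:

```python
# Minimal FastAPI sketch: typed request/response models and a stubbed
# data-access layer. Names and the in-memory dict are illustrative only.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="orders-service")  # hypothetical service name

class Order(BaseModel):
    id: int
    customer: str
    amount: float

_ORDERS: dict[int, Order] = {}  # stand-in for a real database layer

@app.post("/orders", status_code=201)
async def create_order(order: Order) -> Order:
    _ORDERS[order.id] = order
    return order

@app.get("/orders/{order_id}")
async def get_order(order_id: int) -> Order:
    order = _ORDERS.get(order_id)
    if order is None:
        raise HTTPException(status_code=404, detail="order not found")
    return order

# Run with: uvicorn app:app --reload  (assuming this file is saved as app.py)
```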
Required Skills & Experience
- 10+ years of professional software development experience.
- Strong proficiency in Python, with deep understanding of OOP, asynchronous programming, and performance optimization.
- Proven expertise in building FastAPI-based microservices architectures.
- Solid understanding of SQL and NoSQL data modeling, query optimization, and schema design.
- Excellent hands-on frontend proficiency with HTML, CSS, JavaScript, and a modern framework (React, Angular, or Vue).
- Experience working with cloud platforms (AWS, GCP, or Azure) and containerized deployments (Docker, Kubernetes).
- Familiarity with distributed systems, event-driven architectures, and messaging queues (Kafka, RabbitMQ).
- Excellent problem-solving, communication, and system design skills.
- 8+ years of Data Engineering experience
- Strong SQL and Redshift experience
- CI/CD and orchestration experience using Bitbucket, Jenkins and Control-M
- Reporting experience preferably Tableau
- Location – Pune, Hyderabad, Bengaluru
Integration Developer
ROLE TITLE
Integration Developer
ROLE LOCATION(S)
Bangalore/Hyderabad/Chennai/Coimbatore/Noida/Kolkata/Pune/Indore
ROLE SUMMARY
The Integration Developer is a key member of the operations team, responsible for ensuring the smooth integration and functioning of various systems and software within the organization. This role involves technical support, system troubleshooting, performance monitoring, and assisting with the implementation of integration solutions.
ROLE RESPONSIBILITIES
· Design, develop, and maintain integration solutions using Spring Framework, Apache Camel, and other integration patterns such as RESTful APIs, SOAP services, file-based FTP/SFTP, and OAuth authentication.
· Collaborate with architects and cross-functional teams to design integration solutions that are scalable, secure, and aligned with business requirements.
· Resolve complex integration issues, performance bottlenecks, and data discrepancies across multiple systems. Support Production issues and fixes.
· Document integration processes, technical designs, APIs, and workflows to ensure clarity and ease of use.
· Participate in on-call rotation to provide 24/7 support for critical production issues.
· Apply source code / version control management practices in a collaborative work environment.
TECHNICAL QUALIFICATIONS
· 5+ years of experience in Java development with strong expertise in Spring Framework and Apache Camel for building enterprise-grade integrations.
· Proficient with Azure DevOps (ADO) for version control, CI/CD pipeline implementation, and project management.
· Hands-on experience with RESTful APIs, SOAP services, and file-based integrations using FTP and SFTP protocols.
· Strong analytical and troubleshooting skills for resolving complex integration and system issues.
· Experience in Azure Services, including Azure Service Bus, Azure Kubernetes Service (AKS), Azure Container Apps, and ideally Azure API Management (APIM) is a plus.
· Good understanding of containerization and cloud-native development, with experience in using Docker, Kubernetes, and Azure AKS.
· Experience with OAuth for secure authentication and authorization in integration solutions (a conceptual sketch follows this list).
· Strong experience using GitHub for source control.
· Strong background in SQL databases (e.g., T-SQL, Stored Procedures) and working with data in an integration context.
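To illustrate the OAuth point above in the simplest terms, here is a hedged Python sketch of the client-credentials grant; the endpoints and credentials are placeholders, and the real integrations here would typically go through Spring/Camel components rather than this code:

```python
# Conceptual sketch: OAuth2 client-credentials grant, then a bearer-token API call.
# All URLs and credentials are hypothetical placeholders.
import requests

TOKEN_URL = "https://auth.example.com/oauth2/token"   # hypothetical
API_URL = "https://api.example.com/v1/orders"         # hypothetical

def get_token(client_id: str, client_secret: str) -> str:
    resp = requests.post(
        TOKEN_URL,
        data={"grant_type": "client_credentials"},
        auth=(client_id, client_secret),              # HTTP Basic client auth
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def fetch_orders(token: str) -> list:
    resp = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()
```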
GENERAL QUALIFICATIONS
· Excellent analytical and problem-solving skills, with a keen attention to detail.
· Effective communication skills, with the ability to collaborate with technical and non-technical stakeholders.
· Experience working in a fast paced, production support environment with a focus on incident management and resolution.
· Experience in the insurance domain is considered a plus.
EDUCATION REQUIREMENTS
· Bachelor’s degree in Computer Science, Information Technology, or related field.
Shift timings : Afternoon
Job Summary
We are seeking an experienced Senior Java Developer with strong expertise in legacy system migration, server management, and deployment. The candidate will be responsible for maintaining, enhancing, and migrating an existing Java/JSF (PrimeFaces), EJB, REST API, and SQL Server-based application to a modern Spring Boot architecture. The role involves ensuring smooth production deployments, troubleshooting server issues, and optimizing the existing infrastructure.
Key Responsibilities
● Maintain & Enhance the existing Java, JSF (PrimeFaces), EJB, REST API, and SQL Server application.
● Migrate the legacy system to Spring Boot while ensuring minimal downtime.
● Manage deployments using Ansible, GlassFish/Payara, and deployer.sh scripts.
● Optimize and troubleshoot server performance (Apache, Payara, GlassFish).
● Handle XML file generation, email integrations, and REST API maintenance.
● Database management (SQL Server) including query optimization and schema updates.
● Collaborate with teams to ensure smooth transitions during migration.
● Automate CI/CD pipelines using Maven, Ansible, and shell scripts.
● Document migration steps, deployment processes, and system architecture.
Required Skills & Qualifications
● 8+ years of hands-on experience with Java, JSF (PrimeFaces), EJB, and REST APIs.
● Strong expertise in Spring Boot (migration experience from legacy Java is a must).
● Experience with Payara/GlassFish server management and deployment.
● Proficient in Apache, Ansible, and shell scripting (deployer.sh).
● Solid knowledge of SQL Server (queries, stored procedures, optimization).
● Familiarity with XML processing, email integrations, and Maven builds.
● Experience in production deployments, server troubleshooting, and performance tuning.
● Ability to work independently and lead migration efforts.
Preferred Skills
● Knowledge of microservices architecture (helpful for modernization).
● Familiarity with cloud platforms (AWS/Azure) is a plus.
Job Title: Python Developer (Full Time)
Location: Hyderabad (Onsite)
Interview: Virtual and Face to Face Interview (Last round)
Experience Required: 4 + Years
Working Days: 5 Days
About the Role
We are seeking a highly skilled Lead Python Developer with a strong background in building scalable and secure applications. The ideal candidate will have hands-on expertise in Python frameworks, API integrations, and modern application architectures. This role requires a tech leader who can balance innovation, performance, and compliance while driving successful project delivery.
Key Responsibilities
- Application Development: Architect and develop robust, high-performance applications using Django, Flask, and FastAPI.
- API Integration: Design and implement seamless integration with third-party APIs (including travel-related APIs, payment gateways, and external service providers); see the sketch after this list.
- Data Management: Develop and optimize ETL pipelines for structured and unstructured data using data lakes and distributed storage solutions.
- Microservices Architecture: Build modular, scalable applications using microservices principles for independent deployment and high availability.
- Performance Optimization: Enhance application performance through load balancing, caching, and query optimization to deliver superior user experiences.
- Security & Compliance: Apply secure coding practices, implement data encryption, and ensure compliance with industry security and privacy standards (e.g., PCI DSS, GDPR).
- Automation & Deployment: Utilize CI/CD pipelines, Docker/Kubernetes, and monitoring tools for automated testing, deployment, and production monitoring.
- Collaboration: Partner with front-end developers, product managers, and stakeholders to deliver user-centric, business-aligned solutions.
Requirements
Education
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
Technical Expertise
- 4+ years of hands-on experience with Python frameworks (Django, Flask, FastAPI).
- Proficiency in RESTful APIs, GraphQL, and asynchronous programming.
- Strong knowledge of SQL/NoSQL databases (PostgreSQL, MongoDB) and big data tools (Spark, Kafka).
- Familiarity with Kibana, Grafana, Prometheus for monitoring and visualization.
- Experience with AWS, Azure, or Google Cloud, containerization (Docker, Kubernetes), and CI/CD tools (Jenkins, GitLab CI).
- Working knowledge of testing tools: PyTest, Selenium, SonarQube.
- Experience with API integrations, booking flows, and payment gateway integrations (travel domain knowledge is a plus, but not mandatory).
Soft Skills
- Strong problem-solving and analytical skills.
- Excellent communication, presentation, and teamwork abilities.
- Proactive, ownership-driven mindset with the ability to perform under pressure.
Job Description :
We are seeking a talented and experienced Full Stack Developer with 6+ years of experience to join our dynamic team in Hyderabad. The ideal candidate will have a passion for building scalable and efficient web applications, a strong understanding of modern frameworks and technologies, and a keen eye for user experience and design.
Key Responsibilities :
- Design, develop, and maintain web-based applications using React JS, NodeJS and other modern frameworks.
- Develop hybrid mobile applications and responsive web interfaces using Bootstrap and JavaScript.
- Build and optimize back-end services with frameworks such as Express.js or Restify.
- Work with SQL databases, including schema design and query optimization.
- Utilize ORM tools like Sequelize for database management.
- Implement real-time communication features and ensure browser compatibility.
- Collaborate with cross-functional teams to participate in the product development lifecycle, including prototyping, testing, and deployment.
- Adapt to and learn alternative technologies based on project requirements.
Required Skills & Experience :
- 6+ years of experience in full-stack web development.
- Proficient in Angular, NodeJS, React.JS, JavaScript, and TypeScript
- Strong experience with Express.js or Restify frameworks.
- Solid understanding of SQL databases and ORM tools like Sequelize.
- Knowledge of responsive design principles and hands-on experience in developing responsive web applications.
- Familiarity with React Native for mobile development (a plus)
- Strong understanding of real-time communication technologies.
Additional Skills & Experience :
- Exposure to .NET
- Experience with NoSQL databases such as MongoDB or Cassandra.
- Awareness of internationalization (i18n) and the latest trends in UI/UX design.
- Familiarity with other JavaScript libraries/frameworks like VueJS.
- Hands-on experience with implementing payment gateways for different regions.
- Excellent facilitation, verbal, and written communication skills.
- Eagerness to contribute to functional and user experience design discussions.
Education
B.Tech/M.Tech in CSE/IT/ECE
🚀 Hiring: Tableau Developer
⭐ Experience: 5+ Years
📍 Location: Pune, Gurgaon, Bangalore, Chennai, Hyderabad
⭐ Work Mode:- Hybrid
⏱️ Notice Period: Immediate Joiners or 15 Days
(Only immediate joiners & candidates serving notice period)
We are looking for a skilled Tableau Developer to join our team. The ideal candidate will have hands-on experience in creating, maintaining, and optimizing dashboards, reports, and visualizations that enable data-driven decision-making across the organization.
⭐ Key Responsibilities:
✅Develop and maintain Tableau dashboards & reports
✅Translate business needs into data visualizations
✅Work with SQL & multiple data sources for insights
✅Optimize dashboards for performance & usability
✅Collaborate with stakeholders for BI solutions
We are seeking a skilled SQL Developer to join our team. This role serves as a key bridge between insurance operations and technical solutions, ensuring business requirements are accurately translated into efficient system functionality. The SQL Developer will play a critical part in maintaining and enhancing underwriting software products and system integrations—helping deliver reliable, high-quality solutions to clients in the insurtech space.
The ideal candidate possesses strong SQL expertise, advanced data mapping capabilities, and hands-on experience working with APIs, JSON, XML, and other data exchange formats. Experience with insurance technology platforms, such as ConceptOne or similar underwriting systems, is preferred. In this role, you will regularly develop, maintain, and troubleshoot stored procedures and functions, perform data validation, support integration efforts across multiple systems, and configure insurance workflows. You will work closely with business analysts, underwriters, and technical teams to ensure smooth product updates and continuous improvement of system functionality.
What We’re Looking For:
- 3+ years of experience in a technical, insurance, or insurtech-focused role
- Strong proficiency in writing SQL, including complex queries, stored procedures, and performance tuning
- Expertise in data mapping, data validation, and reporting
- Experience working with APIs, JSON, XML, and system-to-system integrations
- Strong analytical and problem-solving skills with the ability to troubleshoot and optimize complex workflows
- Clear and effective communication skills, able to translate technical concepts for non-technical stakeholders
- Ability to work independently and manage multiple tasks in a fast-paced environment
- Keen attention to detail and commitment to delivering accurate, high-quality results
Bonus:
- Hands-on experience with underwriting or policy administration systems (e.g., ConceptOne or similar platforms)
- Familiarity with core insurance processes, such as policy issuance, endorsements, raters, claims, and reporting
- Experience with the U.S. P&C (Property & Casualty) insurance market
What You’ll Be Doing:
- Develop and optimize SQL stored procedures, functions, and triggers to support underwriting and compliance requirements (a small example of such query work follows this list)
- Create and maintain reports, quote covers, and validations or map and configure forms, raters, and system workflows to ensure accurate data processes
- Set up, troubleshoot, and optimize underwriting platforms (ConceptOne/others) for performance and accuracy
- Manage integrations with APIs, JSON, and XML to connect external services and streamline data exchange
- Collaborate with BAs, QAs, and Developers to translate requirements, test outputs, and resolve issues
- Provide technical support and training to internal teams and clients to ensure effective system usage
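For a concrete feel of the SQL work referenced above, here is a self-contained sketch using SQLite as a stand-in for the production database: a CTE joined back to a base table to compute an effective premium. Table and column names are invented for the example:

```python
# Self-contained illustration of CTE + join style query work (SQLite stand-in).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE policies (policy_id INTEGER PRIMARY KEY, state TEXT, premium REAL);
    CREATE TABLE endorsements (policy_id INTEGER, delta REAL);
    INSERT INTO policies VALUES (1, 'TX', 1200.0), (2, 'CA', 950.0);
    INSERT INTO endorsements VALUES (1, 150.0), (1, -50.0);
""")

query = """
WITH endorsement_totals AS (
    SELECT policy_id, SUM(delta) AS total_delta
    FROM endorsements
    GROUP BY policy_id
)
SELECT p.policy_id,
       p.state,
       p.premium + COALESCE(e.total_delta, 0) AS effective_premium
FROM policies p
LEFT JOIN endorsement_totals e ON e.policy_id = p.policy_id;
"""

for row in conn.execute(query):
    print(row)   # (1, 'TX', 1300.0) and (2, 'CA', 950.0)
```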
Required Skills/Experience:
- 6+ years of experience in designing and developing enterprise and/or consumer-facing applications using technologies and frameworks like JavaScript, Node.js (Javascript), ReactJS, Angular, SCSS, CSS, React Native
- 3+ years experience in leading teams (guide, design, track), taking responsibilities to deliver as per the agreed-upon schedules
- Hands-on experience with SQL and NoSQL databases
- Hands-on experience working in Linux OS
- Very good debugging and problem resolution experience
- Experience developing responsive web applications
- Very good communication (verbal and written) to interact with our customers
- Ability and interest to learn alternative technologies based on need
- Experienced in product development lifecycle (prototyping, hardening, testing etc.)
Additional Skills/Experience:
- Working experience with Python and NoSQL databases such as MongoDB, Cassandra
- Eagerness to participate in product functional and user experience designs
- Experience in AI, ML, NLP, and Predictive Analytics domains
- Familiarity with i18n, latest trends in UI and UX designs
- Experience with implementation of payment gateways applicable in different countries
- Experience with CI/CD, Jenkins, Nginx
· 5 years of experience as a Product Specialist, Business Analyst, or any other occupation/title providing experience with business process and data analysis
· Proven experience with and understanding of relational databases, and ability to construct basic to intermediate query logic
· 2 years of experience in asset management and/or fintech domain in R&D or product capacity
Responsibilities
• Enhance and maintain our main web application, ASCLOGS, built on ASP.NET MVC (.NET Framework 4.8) with features such as authentication, PDF generation via IronPdf, audit logging with log4net, Twilio SMS integration, and data access through PetaPoco.
• Support multiple companion projects including:
  • CopyForm, an MVC tool that copies form templates between SQL Server databases using AJAX and PetaPoco data access.
  • SQLImportApp, a WinForms importer leveraging ExcelDataReader and Z.Dapper.Plus for bulk inserts.
  • DEAVerification, a WinForms app automating data retrieval via Selenium WebDriver and storing results with PetaPoco.
  • UniversalScrapperAPI, an ASP.NET Web API that scrapes licensing information using Selenium and logs results with log4net.
  • HL7DocAssistantSync, a VB.NET library for HL7 message processing and PDF generation with PdfSharp.
  • ChatGPT Implementation, a .NET 8 Web API example showing how we integrate with OpenAI's ChatGPT service.
  • S3MicroService, a .NET 8 Web API using AWS SDK packages (AWSSDK.S3, AWSSDK.Extensions.NETCore.Setup) and Entity Framework Core for storage.
• Maintain PowerShell utilities used for onboarding tasks such as document-to-PDF conversion and CSV generation.
• Review existing code to improve reliability, enhance testability, and refactor large code files (for example, BusinessLayer/BusinessLayer.cs is roughly 28k lines).
• Work closely with stakeholders to gather requirements for new features and ensure compatibility with our SQL Server backend.
• Assist in modernizing legacy components and implementing best practices for security, error handling, logging, and deployment automation.
Required Skills
• 10+ years of development experience
• Extensive experience with C# and the .NET ecosystem, including both legacy .NET Framework and modern .NET Core / .NET 8.
• Solid understanding of ASP.NET MVC, Web API, and Windows Forms development.
• Familiarity with PetaPoco, Entity Framework Core, and SQL Server.
• Experience integrating third-party services such as Twilio, OpenAI, Selenium WebDriver.
• Ability to write and troubleshoot PowerShell scripts for automation tasks.
• Comfortable navigating large codebases and improving code quality through refactoring, unit testing, and documentation.
• Proficiency with version control (Git) and the Visual Studio toolchain.
Preferred Skills
• Background in healthcare or regulated industries, since many applications manage sensitive data (e.g., DEA verification, HL7 messaging).
• Knowledge of PDF generation libraries (IronPdf, PdfSharp) and logging frameworks (log4net).
• Experience with CI/CD pipelines for .NET applications.
• Nice to have: unit testing frameworks (e.g., NUnit, MSTest).
Location & Work Environment
This role requires working with Visual Studio on Windows for the .NET Framework solutions and .NET 8 projects. Experience with IIS or IIS Express is helpful for local development and testing.
- 5-10 years of experience in ETL testing, Snowflake, and DWH concepts.
- Strong SQL knowledge & debugging skills are a must.
- Experience with Azure and Snowflake testing is a plus
- Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus
- Strong data warehousing concepts; experience with ETL tools such as Talend Cloud Data Integration and Pentaho/Kettle
- Experience with JIRA and the Xray defect management tool is good to have.
- Exposure to financial domain knowledge is considered a plus
- Test data readiness (data quality) and address code or data issues (a simple illustration follows this list)
- Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions
- Demonstrate strong collaborative experience across regions (APAC, EMEA and NA) to effectively and efficiently identify root cause of code/data issues and come up with a permanent solution
- Prior experience with State Street and Charles River Development (CRD) considered a plus
- Experience in tools such as PowerPoint, Excel, SQL
- Exposure to Third party data providers such as Bloomberg, Reuters, MSCI and other Rating agencies is a plus
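A simple illustration of the data-readiness testing mentioned above: reconciling row counts and checking for unexpected nulls between a source and a target table. SQLite stands in for the real warehouse purely for demonstration; table names are invented:

```python
# Basic data-readiness checks: row-count reconciliation and null detection.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_trades (trade_id INTEGER, amount REAL);
    CREATE TABLE tgt_trades (trade_id INTEGER, amount REAL);
    INSERT INTO src_trades VALUES (1, 10.0), (2, 20.0), (3, NULL);
    INSERT INTO tgt_trades VALUES (1, 10.0), (2, 20.0);
""")

src_count = conn.execute("SELECT COUNT(*) FROM src_trades").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt_trades").fetchone()[0]
null_amounts = conn.execute(
    "SELECT COUNT(*) FROM src_trades WHERE amount IS NULL"
).fetchone()[0]

if src_count != tgt_count:
    print(f"FAIL: row count mismatch ({src_count} source vs {tgt_count} target)")
if null_amounts:
    print(f"FAIL: {null_amounts} source row(s) have NULL amount")
```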
Key Attributes include:
- Team player with professional and positive approach
- Creative, innovative and able to think outside of the box
- Strong attention to detail during root cause analysis and defect issue resolution
- Self-motivated & self-sufficient
- Effective communicator both written and verbal
- Brings a high level of energy with enthusiasm to generate excitement and motivate the team
- Able to work under pressure with tight deadlines and/or multiple projects
- Experience in negotiation and conflict resolution
🔍 Job Description:
We are looking for an experienced and highly skilled Technical Lead to guide the development and enhancement of a large-scale Data Observability solution built on AWS. This platform is pivotal in delivering monitoring, reporting, and actionable insights across the client's data landscape.
The Technical Lead will drive end-to-end feature delivery, mentor junior engineers, and uphold engineering best practices. The position reports to the Programme Technical Lead / Architect and involves close collaboration to align on platform vision, technical priorities, and success KPIs.
🎯 Key Responsibilities:
- Lead the design, development, and delivery of features for the data observability solution.
- Mentor and guide junior engineers, promoting technical growth and engineering excellence.
- Collaborate with the architect to align on platform roadmap, vision, and success metrics.
- Ensure high quality, scalability, and performance in data engineering solutions.
- Contribute to code reviews, architecture discussions, and operational readiness.
🔧 Primary Must-Have Skills (Non-Negotiable):
- 5+ years in Data Engineering or Software Engineering roles.
- 3+ years in a technical team or squad leadership capacity.
- Deep expertise in AWS Data Services: Glue, EMR, Kinesis, Lambda, Athena, S3.
- Advanced programming experience with PySpark, Python, and SQL.
- Proven experience in building scalable, production-grade data pipelines on cloud platforms.
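As an informal sketch of the PySpark experience listed above, the snippet below computes basic data-quality metrics over a Parquet dataset, the kind of signal a data observability platform might publish; the S3 path and column names are placeholders, not details of the client's platform:

```python
# Hedged sketch: simple data-quality metrics over a Parquet dataset in PySpark.
# Path and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-metrics").getOrCreate()

df = spark.read.parquet("s3://example-bucket/curated/orders/")  # placeholder path

metrics = df.agg(
    F.count(F.lit(1)).alias("row_count"),
    F.sum(F.col("order_id").isNull().cast("int")).alias("null_order_ids"),
    F.countDistinct("order_id").alias("distinct_order_ids"),
).collect()[0]

# A real pipeline might push these to CloudWatch custom metrics for alerting.
print(metrics.asDict())
```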
🚀 Hiring: Manual Tester
⭐ Experience: 5+ Years
📍 Location: Pan India
⭐ Work Mode:- Hybrid
⏱️ Notice Period: Immediate Joiners
(Only immediate joiners & candidates serving notice period)
Must-Have Skills:
✅5+ years of experience in Manual Testing
✅Solid experience in ETL, Database, and Report Testing
✅Strong expertise in SQL queries, RDBMS concepts, and DML/DDL operations
✅Working knowledge of BI tools such as Power BI
✅Ability to write effective Test Cases and Test Scenarios

Product company for financial operations automation platform
Mandatory Criteria (cannot be neglected during screening):
- Candidate must have project management experience.
- Strong hands-on experience with SQL, including the ability to write, optimize, and debug complex queries (joins, CTEs, subqueries).
- Must have experience in Treasury Module.
- Should have a basic understanding of accounting principles and financial workflows
- 3+ years of implementation experience is required.
- Looking for candidates from fintech companies ONLY. (Candidates should have strong knowledge of fintech products, financial workflows, and integrations.)
- Candidate should have Hands-on experience with tools such as Jira, Confluence, Excel, and project management platforms.
- Candidate should have Experience in managing multi-stakeholder projects from scratch.
Position Overview
We are looking for an experienced Implementation Lead to drive the onboarding and implementation of our platform for new and existing fintech clients. This role is ideal for someone with a strong understanding of financial systems, implementation methodologies, and client management. You’ll collaborate closely with product, engineering, and customer success teams to ensure timely, accurate, and seamless deployments.
Key Responsibilities
- Lead end-to-end implementation projects for enterprise fintech clients
- Translate client requirements into detailed implementation plans and configure solutions accordingly.
- Write and optimize complex SQL queries for data analysis, validation, and integration
- Oversee ETL processes – extract, transform, and load financial data across systems
- Collaborate with cross-functional teams including Product, Engineering, and Support
- Ensure timely, high-quality delivery across multiple stakeholders and client touchpoints
- Document processes, client requirements, and integration flows in detail.
Required Qualifications
- Bachelor’s degree in Finance, Business Administration, Information Systems, or related field
- 3+ years of hands-on implementation/project management experience
- Proven experience delivering projects in Fintech, SaaS, or ERP environments
- Strong understanding of accounting principles and financial workflows
- Hands-on SQL experience, including the ability to write and debug complex queries (joins, CTEs, subqueries)
- Experience working with ETL pipelines or data migration processes
- Proficiency in tools like Jira, Confluence, Excel, and project tracking systems
- Strong communication and stakeholder management skills
- Ability to manage multiple projects simultaneously and drive client success
Preferred Qualifications
- Prior experience implementing financial automation tools (e.g., SAP, Oracle, Anaplan, Blackline)
- Familiarity with API integrations and basic data mapping
- Experience in agile/scrum-based implementation environments
- Exposure to reconciliation, book closure, AR/AP, and reporting systems
- PMP, CSM, or similar certifications
Skills & Competencies
Functional Skills
- Financial process knowledge (e.g., reconciliation, accounting, reporting)
- Business analysis and solutioning
- Client onboarding and training
- UAT coordination
- Documentation and SOP creation
Project Skills
- Project planning and risk management
- Task prioritization and resource coordination
- KPI tracking and stakeholder reporting
Soft Skills
- Cross-functional collaboration
- Communication with technical and non-technical teams
- Attention to detail and customer empathy
- Conflict resolution and crisis management
What We Offer
- An opportunity to shape fintech implementations across fast-growing companies
- Work in a dynamic environment with cross-functional experts
- Competitive compensation and rapid career growth
- A collaborative and meritocratic culture
Job Title: PostgreSQL Database Administrator
Experience: 6–8 Years
Work Mode: Hybrid
Locations: Hyderabad / Pune
Joiners: Only immediate joiners & candidates who have completed notice period
Required Skills
- Strong hands-on experience in PostgreSQL administration (6+ years).
- Excellent understanding of SQL and query optimization techniques.
- Deep knowledge of database services, architecture, and internals.
- Experience in performance tuning at both DB and OS levels.
- Familiarity with DataGuard or similar high-availability solutions.
- Strong experience in job scheduling and automation.
- Comfortable with installing, configuring, and upgrading PostgreSQL.
- Basic to intermediate knowledge of Linux system administration.
- Hands-on experience with shell scripting for automation and monitoring tasks.
Key Responsibilities
- Administer and maintain PostgreSQL databases with 6+ years of hands-on experience.
- Write and optimize complex SQL queries for performance and scalability.
- Manage database storage structures and ensure optimal disk usage and performance.
- Monitor, analyze, and resolve database performance issues using tools and logs (a monitoring sketch follows this list).
- Perform database tuning, configuration adjustments, and query optimization.
- Plan, schedule, and automate jobs using cron or other job scheduling tools at DB and OS levels.
- Install and upgrade PostgreSQL database software to new versions as required.
- Manage high availability and disaster recovery setups, including replication and DataGuard administration (or equivalent techniques).
- Perform regular database backups and restorations to ensure data integrity and availability.
- Apply security patches and updates on time.
- Collaborate with developers for schema design, stored procedures, and access privileges.
- Document configurations, processes, and performance tuning results.
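A brief, hedged example of the monitoring work described above: flagging queries running longer than five minutes via pg_stat_activity. The DSN, threshold, and monitoring role are illustrative assumptions, and psycopg2 is assumed to be available:

```python
# Illustrative monitor: list non-idle queries running longer than 5 minutes.
import psycopg2

conn = psycopg2.connect("dbname=appdb user=monitor host=localhost")  # placeholder DSN
with conn.cursor() as cur:
    cur.execute("""
        SELECT pid, now() - query_start AS runtime, state, left(query, 80)
        FROM pg_stat_activity
        WHERE state <> 'idle'
          AND now() - query_start > interval '5 minutes'
        ORDER BY runtime DESC;
    """)
    for pid, runtime, state, query in cur.fetchall():
        print(f"pid={pid} runtime={runtime} state={state} query={query!r}")
conn.close()
```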
🚀 Hiring: Postgres DBA at Deqode
⭐ Experience: 6+ Years
📍 Location: Pune & Hyderabad
⭐ Work Mode:- Hybrid
⏱️ Notice Period: Immediate Joiners
(Only immediate joiners & candidates serving notice period)
Looking for an experienced Postgres DBA with:-
✅ 6+ years in Postgres & strong SQL skills
✅ Good understanding of database services & storage management
✅ Performance tuning & monitoring expertise
✅ Knowledge of Dataguard admin, backups, upgrades
✅ Basic Linux admin & shell scripting
Immediate Hiring for Business Analyst
Position: Business Analyst
Experience: 5 - 8 Years
Location: Hyderabad
Job Summary:
We are seeking a motivated and detail-oriented Business Analyst with 5 years of experience in the Travel domain. The ideal candidate will have a strong understanding of the travel industry, including airlines, travel agencies, and online booking systems. You will work closely with cross-functional teams to gather business requirements, analyze processes, and deliver solutions that improve customer experience and operational efficiency.
Key Responsibilities:
- Requirement Gathering & Analysis: Collaborate with stakeholders to gather, document, and analyze business requirements, ensuring alignment with business goals.
- Process Improvement: Identify opportunities for process improvement and optimization in travel booking, ticketing, and customer support systems.
- Stakeholder Communication: Act as the bridge between the business stakeholders and technical teams, ensuring clear communication of requirements, timelines, and deliverables.
- Solution Design: Participate in the design and development of solutions, collaborating with IT and development teams to ensure business needs are met.
- Data Analysis: Analyze data related to customer journeys, bookings, and cancellations to identify trends and insights for decision-making.
- Documentation: Prepare detailed documentation including business requirements documents (BRD), user stories, process flows, and functional specifications.
- Testing & Validation: Support testing teams during User Acceptance Testing (UAT) to ensure solutions meet business needs, and facilitate issue resolution.
- Market Research: Stay up to date with travel industry trends, customer preferences, and competitor offerings to ensure innovative solutions are delivered.
Qualifications & Skills:
- Education: Bachelor’s degree in Business Administration, Information Technology, or a related field.
- Experience:
- 5 years of experience as a Business Analyst in the travel industry.
- Hands-on experience in working with travel booking systems (GDS, OTA) is highly preferred.
- Domain Knowledge:
- Strong understanding of the travel industry, including booking engines, reservations, ticketing, cancellations, and customer support.
- Familiarity with industry-specific regulations and best practices.
- Analytical Skills: Excellent problem-solving skills with the ability to analyze complex data and business processes.
- Technical Skills:
- Proficiency in Microsoft Office (Word, Excel, PowerPoint).
- Knowledge of SQL or data visualization tools (Power BI, Tableau) is a plus.
- Communication: Strong verbal and written communication skills with the ability to convey complex information clearly.
- Attention to Detail: Strong focus on accuracy and quality of work, ensuring that solutions meet business requirements.
Preferred:
- Prior experience with Agile methodologies.
- Certification in Business Analysis (CBAP or similar).
- A minimum of 4-10 years of experience in data integration/orchestration services, service architecture, and providing data-driven solutions for client requirements
- Experience with Microsoft Azure cloud, Snowflake SQL, and database query/performance tuning.
- Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus
- Strong data warehousing concepts; ETL tools such as Talend Cloud Data Integration are a must
- Exposure to financial domain knowledge is considered a plus.
- Cloud managed services such as GitHub source control and MS Azure/DevOps are considered a plus.
- Prior experience with State Street and Charles River Development (CRD) considered a plus.
- Experience in tools such as Visio, PowerPoint, Excel.
- Exposure to Third party data providers such as Bloomberg, Reuters, MSCI and other Rating agencies is a plus.
- Strong SQL knowledge and debugging skills are a must.
Job description
🔧 Key Responsibilities:
- Design and implement robust backend services using Node.js.
- Develop and maintain RESTful APIs to support front-end applications and third-party integrations
- Manage and optimize SQL/NoSQL databases (e.g., PostgreSQL, MongoDB, Snowflake)
- Collaborate with front-end developers to ensure seamless integration and data flow
- Implement caching, logging, and monitoring strategies for performance and reliability
- Ensure application security, scalability, and maintainability
- Participate in code reviews, architecture discussions, and agile ceremonies
✅ Required Skills:
- Proficiency in backend programming languages (Node.js, Java, .NET Core)
- Experience with API development and tools like Postman, Swagger
- Strong understanding of database design and query optimization
- Familiarity with microservices architecture and containerization (Docker, Kubernetes)
- Knowledge of cloud platforms (Azure, AWS) and CI/CD pipelines.
About Cognida.ai:
Our Purpose is to boost your competitive advantage using AI and Analytics.
We Deliver tangible business impact with data-driven insights powered by AI. Drive revenue growth, increase profitability and improve operational efficiencies.
We Are technologists with keen business acumen - Forever curious, always on the front lines of technological advancements. Applying our latest learnings, and tools to solve your everyday business challenges.
We Believe the power of AI should not be the exclusive preserve of the few. Every business, regardless of its size or sector deserves the opportunity to harness the power of AI to make better decisions and drive business value.
We See a world where our AI and Analytics solutions democratise decision intelligence for all businesses. With Cognida.ai, our motto is ‘No enterprise left behind’.
Position: Python Fullstack Architect
Location: Hyderabad
Job Summary
We’re seeking a seasoned Python Fullstack Architect with 15+ years of experience to lead solution design, mentor teams, and drive technical excellence across projects. You'll work closely with stakeholders, contribute to architecture governance, and integrate modern technologies across the stack.
Key Responsibilities
- Design and review Python-based fullstack solution architectures.
- Guide development teams on best practices, modern frameworks, and cloud-native patterns.
- Engage with clients to translate business needs into scalable technical solutions.
- Stay current with tech trends and contribute to internal innovation initiatives.
Required Skills
- Strong expertise in Python (Django/Flask/FastAPI) and frontend frameworks (React, Angular, etc.).
- Cloud experience (AWS, Azure, or GCP) and DevOps/CI-CD setup.
- Familiarity with enterprise tools: RabbitMQ, Kafka, OAuth2, PostgreSQL, MongoDB.
- Solid understanding of microservices, API design, batch/stream processing.
- Strong leadership, mentoring, and architectural problem-solving skills.
Position Summary:
As a CRM ETL Developer, you will be responsible for the analysis, transformation, and integration of data from legacy and external systems into the CRM application. This includes developing ETL/ELT workflows, ensuring data quality through cleansing and survivorship rules, and supporting daily production loads. You will work in an Agile environment and play a vital role in building scalable, high-quality data integration solutions.
Key Responsibilities:
- Analyze data from legacy and external systems; develop ETL/ELT pipelines to ingest and process data.
- Cleanse, transform, and apply survivorship rules before loading into the CRM platform (a brief illustration follows this list).
- Monitor, support, and troubleshoot production data loads (Tier 1 & Tier 2 support).
- Contribute to solution design, development, integration, and scaling of new/existing systems.
- Promote and implement best practices in data integration, performance tuning, and Agile development.
- Lead or support design reviews, technical documentation, and mentoring of junior developers.
- Collaborate with business analysts, QA, and cross-functional teams to resolve defects and clarify requirements.
- Deliver working solutions via quick POCs or prototypes for business scenarios.
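To make the survivorship responsibility above concrete, here is a hedged pandas sketch (standing in for the actual Siebel EIM workflow): per customer key, keep the most recently updated non-null value of each attribute. All column names are invented:

```python
# Survivorship sketch: newest non-null value per attribute wins for each key.
import pandas as pd

records = pd.DataFrame({
    "cust_id":    [101, 101, 102],
    "email":      ["old@x.com", None, "b@y.com"],
    "phone":      [None, "555-0101", "555-0202"],
    "updated_at": pd.to_datetime(["2024-01-01", "2024-03-01", "2024-02-01"]),
})

# Sort newest-first so groupby().first() picks the latest non-null value
# of every column for each customer.
golden = (
    records.sort_values("updated_at", ascending=False)
           .groupby("cust_id", as_index=False)
           .first()
)
print(golden)   # cust 101 survives with phone 555-0101 and email old@x.com
```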
Technical Skills:
- ETL/ELT Tools: 5+ years of hands-on experience in ETL processes using Siebel EIM.
- Programming & Databases: Strong SQL & PL/SQL development; experience with Oracle and/or SQL Server.
- Data Integration: Proven experience in integrating disparate data systems.
- Data Modelling: Good understanding of relational, dimensional modelling, and data warehousing concepts.
- Performance Tuning: Skilled in application and SQL query performance optimization.
- CRM Systems: Familiarity with Siebel CRM, Siebel Data Model, and Oracle SOA Suite is a plus.
- DevOps & Agile: Strong knowledge of DevOps pipelines and Agile methodologies.
- Documentation: Ability to write clear technical design documents and test cases.
Soft Skills & Attributes:
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal abilities.
- Experience working with cross-functional, globally distributed teams.
- Proactive mindset and eagerness to learn new technologies.
- Detail-oriented with a focus on reliability and accuracy.
Preferred Qualifications:
- Bachelor’s degree in Computer Science, Information Systems, or a related field.
- Experience in Tier 1 & Tier 2 application support roles.
- Exposure to real-time data integration systems is an advantage.
Position : Senior Data Analyst
Experience Required : 5 to 8 Years
Location : Hyderabad or Bangalore (Work Mode: Hybrid – 3 Days WFO)
Shift Timing : 11:00 AM – 8:00 PM IST
Notice Period : Immediate Joiners Only
Job Summary :
We are seeking a highly analytical and experienced Senior Data Analyst to lead complex data-driven initiatives that influence key business decisions.
The ideal candidate will have a strong foundation in data analytics, cloud platforms, and BI tools, along with the ability to communicate findings effectively across cross-functional teams. This role also involves mentoring junior analysts and collaborating closely with business and tech teams.
Key Responsibilities :
- Lead the design, execution, and delivery of advanced data analysis projects.
- Collaborate with stakeholders to identify KPIs, define requirements, and develop actionable insights.
- Create and maintain interactive dashboards, reports, and visualizations.
- Perform root cause analysis and uncover meaningful patterns from large datasets.
- Present analytical findings to senior leaders and non-technical audiences.
- Maintain data integrity, quality, and governance in all reporting and analytics solutions.
- Mentor junior analysts and support their professional development.
- Coordinate with data engineering and IT teams to optimize data pipelines and infrastructure.
Must-Have Skills :
- Strong proficiency in SQL and Databricks
- Hands-on experience with cloud data platforms (AWS, Azure, or GCP)
- Sound understanding of data warehousing concepts and BI best practices
Good-to-Have :
- Experience with AWS
- Exposure to machine learning and predictive analytics
- Industry-specific analytics experience (preferred but not mandatory)
🚀 Blitz Drive : .NET Full Stack Developer – In-Person Interviews on 18th June 2025 | Hyderabad
- We are conducting a Blitz Hiring Drive for the position of .NET Full Stack Developer on 18th June 2025 (Wednesday) at Hyderabad. This will be an in-person interview process.
🔍 Job Details :
- Position : .NET Full Stack Developer
- Experience : 3 to 8 Years
- Number of Positions : 6
- Job Location : Hyderabad (Onsite – In-Person Interview)
- Interview Date : 18th June 2025
- Notice Period : Immediate to 15 days preferred
✅ Mandatory Skills :
Core .NET, Angular (v8+), SQL (complex queries, stored procedures), REST API development, Entity Framework, LINQ, RxJS, and Dependency Injection.
🛠️ Technical Skill Requirements :
- Frontend : Angular (v8+), RxJS, TypeScript, Bootstrap 5, Reactive/Template Forms, Telerik Kendo, NX mono repo
- Backend : Core .NET, REST APIs, Entity Framework, LINQ, Middleware, Auth, DI, OOPS
- Database : SQL Server, Complex Queries, Joins, Stored Procedures, Performance Tuning
- Good to Have : Git, Cloud Basics (Azure/AWS), CI/CD understanding
Job Title : Cognos BI Developer
Experience : 6+ Years
Location : Bangalore / Hyderabad (Hybrid)
Notice Period : Immediate Joiners Preferred (Candidates serving notice with 10–15 days left can be considered)
Interview Mode : Virtual
Job Description :
We are seeking an experienced Cognos BI Developer with strong data modeling, dashboarding, and reporting expertise to join our growing team. The ideal candidate should have a solid background in business intelligence, data visualization, and performance analysis, and be comfortable working in a hybrid setup from Bangalore or Hyderabad.
Mandatory Skills :
Cognos BI, Framework Manager, Cognos Dashboarding, SQL, Data Modeling, Report Development (charts, lists, cross tabs, maps), ETL Concepts, KPIs, Drill-through, Macros, Prompts, Filters, Calculations.
Key Responsibilities :
- Understand business requirements in the BI context and design data models using Framework Manager to transform raw data into meaningful insights.
- Develop interactive dashboards and reports using Cognos Dashboard.
- Identify and define KPIs and create reports to monitor them effectively.
- Analyze data and present actionable insights to support business decision-making.
- Translate business requirements into technical specifications and determine timelines for execution.
- Design and develop models in Framework Manager, publish packages, manage security, and create reports based on these packages.
- Develop various types of reports, including charts, lists, cross tabs, and maps, and design dashboards combining multiple reports.
- Implement reports using macros, prompts, filters, and calculations.
- Perform data warehouse development activities and ensure seamless data flow.
- Write and optimize SQL queries to investigate data and resolve performance issues.
- Utilize Cognos features such as master-detail reports, drill-throughs, bookmarks, and page sets.
- Analyze and improve ETL processes to enhance data integration.
- Apply technical enhancements to existing BI systems to improve their performance and usability.
- Possess solid understanding of database fundamentals, including relational and multidimensional database design.
- Hands-on experience with Cognos Data Modules (data modeling) and dashboarding.
Job Title : Python Data Engineer
Experience : 4+ Years
Location : Bangalore / Hyderabad (On-site)
Job Summary :
We are seeking a skilled Python Data Engineer to work on cloud-native data platforms and backend services.
The role involves building scalable APIs, working with diverse data systems, and deploying containerized services using modern cloud infrastructure.
Mandatory Skills : Python, AWS, RESTful APIs, Microservices, SQL/PostgreSQL/NoSQL, Docker, Kubernetes, CI/CD (Jenkins/GitLab CI/AWS CodePipeline)
Key Responsibilities :
- Design, develop, and maintain backend systems using Python.
- Build and manage RESTful APIs and microservices architectures.
- Work extensively with AWS cloud services for deployment and data storage (see the sketch after this list).
- Implement and manage SQL, PostgreSQL, and NoSQL databases.
- Containerize applications using Docker and orchestrate with Kubernetes.
- Set up and maintain CI/CD pipelines using Jenkins, GitLab CI, or AWS CodePipeline.
- Collaborate with teams to ensure scalable and reliable software delivery.
- Troubleshoot and optimize application performance.
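As a minimal sketch of the AWS storage work above, the snippet below publishes a local file to S3 with boto3; the bucket name and key layout are assumptions, and credentials are expected to come from the environment:

```python
# Minimal S3 publish sketch using boto3's managed (multipart-aware) upload.
import boto3

s3 = boto3.client("s3")

def publish_result(local_path: str, bucket: str = "example-data-bucket") -> str:
    """Upload a file and return its s3:// URI. Bucket/key are placeholders."""
    key = f"results/{local_path.rsplit('/', 1)[-1]}"
    s3.upload_file(local_path, bucket, key)
    return f"s3://{bucket}/{key}"
```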
Must-Have Skills :
- 4+ years of hands-on experience in Python backend development.
- Strong experience with AWS cloud infrastructure.
- Proficiency in building microservices and APIs.
- Good knowledge of relational and NoSQL databases.
- Experience with Docker and Kubernetes.
- Familiarity with CI/CD tools and DevOps processes.
- Strong problem-solving and collaboration skills.
Job Title : Oracle Analytics Cloud (OAC) / Fusion Data Intelligence (FDI) Specialist
Experience : 3 to 8 years
Location : All USI locations – Hyderabad, Bengaluru, Mumbai, Gurugram (preferred) and Pune, Chennai, Kolkata
Work Mode : Hybrid Only (2-3 days from office or all 5 days from office)
Mandatory Skills : Oracle Analytics Cloud (OAC), Fusion Data Intelligence (FDI), RPD, OAC Reports, Data Visualizations, SQL, PL/SQL, Oracle Databases, ODI, Oracle Cloud Infrastructure (OCI), DevOps tools, Agile methodology.
Key Responsibilities :
- Design, develop, and maintain solutions using Oracle Analytics Cloud (OAC).
- Build and optimize complex RPD models, OAC reports, and data visualizations.
- Utilize SQL and PL/SQL for data querying and performance optimization.
- Develop and manage applications hosted on Oracle Cloud Infrastructure (OCI).
- Support Oracle Cloud migrations, OBIEE upgrades, and integration projects.
- Collaborate with teams using the ODI (Oracle Data Integrator) tool for ETL processes.
- Implement cloud scripting using CURL for Oracle Cloud automation.
- Contribute to the design and implementation of Business Continuity and Disaster Recovery strategies for cloud applications.
- Participate in Agile development processes and DevOps practices including CI/CD and deployment orchestration.
Required Skills :
- Strong hands-on expertise in Oracle Analytics Cloud (OAC) and/or Fusion Data Intelligence (FDI).
- Deep understanding of data modeling, reporting, and visualization techniques.
- Proficiency in SQL, PL/SQL, and relational databases on Oracle.
- Familiarity with DevOps tools, version control, and deployment automation.
- Working knowledge of Oracle Cloud services, scripting, and monitoring.
Good to Have :
- Prior experience in OBIEE to OAC migrations.
- Exposure to data security models and cloud performance tuning.
- Certification in Oracle Cloud-related technologies.
Job Title : IBM Sterling Integrator Developer
Experience : 3 to 5 Years
Locations : Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, Pune
Employment Type : Full-Time
Job Description :
We are looking for a skilled IBM Sterling Integrator Developer with 3–5 years of experience to join our team across multiple locations.
The ideal candidate should have strong expertise in IBM Sterling and integration, along with scripting and database proficiency.
Key Responsibilities :
- Develop, configure, and maintain IBM Sterling Integrator solutions.
- Design and implement integration solutions using IBM Sterling.
- Collaborate with cross-functional teams to gather requirements and provide solutions.
- Work with custom languages and scripting to enhance and automate integration processes.
- Ensure optimal performance and security of integration systems.
Must-Have Skills :
- Hands-on experience with IBM Sterling Integrator and associated integration tools.
- Proficiency in at least one custom scripting language.
- Strong command over Shell scripting, Python, and SQL (mandatory).
- Good understanding of EDI standards and protocols is a plus.
Interview Process :
- 2 Rounds of Technical Interviews.
Additional Information :
- Open to candidates from Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, and Pune.
Position Summary
Designing, developing, and debugging new and existing software using Microsoft .Net tools and database platforms. Work in every level of our technology department, providing solutions to meet the needs of our end-users. A Software Engineer will also document solutions and designs, test code modifications and provide mentoring to programmers and junior developers.
Responsibilities:
- Work closely with senior engineers to develop high-quality software solutions
- Collaborate with team members to analyze user requirements and design software solutions
- Participate in agile development processes, including sprint planning, daily stand-ups, and sprint reviews
- Write clean, maintainable, and efficient code
- Develop and maintain unit tests to ensure code quality
- Contribute to the design and architecture of microservices-based applications
- Collaborate with cross-functional teams to troubleshoot, debug, and optimize software applications
Requirements:
- 5-10 years of experience in software development
- Proficiency in C#, .NET, Microservices architecture, SQL, and Azure services
- Experience with Agile development methodologies
- Strong problem-solving skills and attention to detail
- Ability to work independently and as part of a team
- Excellent communication skills and willingness to ask questions and seek guidance when needed
Additional Preferred Skills:
- Familiarity with other programming languages and technologies is a plus
About the Position
Our Expectations
• Strong knowledge of Microsoft Dynamics CRM architecture; in-depth hands-on knowledge of the Microsoft Dynamics CRM platform, the entity model, security model, and web services.
• Hands-on experience architecting solutions that involve CRM customization, including server-side code, custom business logic, integration modules, workflow assemblies, and plug-ins; experience automating business processes with workflows.
• Should be able to automate the CRM for converting quotes to orders, fetched from our websites or entered manually.
• Must understand the functionality of Microsoft CRM modules and customizing objects (forms, classes, data dictionary, and reporting).
• Should be well versed with Email Router configuration and Reporting Extensions.
• Customize the ticketing tool in CRM for internal use as well as for clients.
• Experience in implementation and integration between CRM and other tools like GP, AX, etc.
• Should have good written and verbal communication.
• Exposure to the following Great Plains modules: Project Management & Accounting, Travel & Expenses, Procurement & Sourcing, Accounts Payable, Accounts Receivable, Inventory Management, Human Resources, System Administration.
• Involvement in numerous customizations per client requirements in both Rich and Enterprise portal developments to enhance the standard functionality of the Sales, Project Management, Expense, Home, Timesheets, Purchase, and Inventory modules; performed unit testing and prepared unit test documents.
• Exceptional communication, analytical, interpersonal, and problem-solving skills. Dedicated and highly ambitious to achieve personal as well as organizational goals.
Must Have:
• 8+ years of experience with Microsoft Dynamics CRM (architecture, development, rollout, maintenance, data management, migration)
• C#
• SQL
• Proficient in English
• Functional requirement gathering, technical specification writing, system implementation, & integration development.
Salary as per industry standards or a 30% hike on current CTC
Responsibilities:
- Work closely with senior engineers to develop high-quality software solutions
- Collaborate with team members to analyze user requirements and design software solutions
- Participate in agile development processes, including sprint planning, daily stand-ups, and sprint reviews
- Write clean, maintainable, and efficient code
- Develop and maintain unit tests to ensure code quality
- Contribute to the design and architecture of microservices-based applications
- Collaborate with cross-functional teams to troubleshoot, debug, and optimize software applications
Requirements:
- 8+ years of experience in software development
- Proficiency in C#, .NET, Microservices architecture, SQL, and Azure services
- Experience with Agile development methodologies
- Strong problem-solving skills and attention to detail
- Ability to work independently and as part of a team
- Excellent communication skills and willingness to ask questions and seek guidance when needed
Additional Preferred Skills:
- Familiarity with other programming languages and technologies is a plus
We are looking for a highly skilled Senior Software Engineer with over 5 years of experience in full stack development using React.js and Node.js. As a senior member of our engineering team, you’ll take ownership of complex technical challenges, influence architecture decisions, mentor junior developers, and contribute to high-impact products.
Key Responsibilities:
Design, build, and maintain scalable web applications using React.js (frontend) and Node.js (backend).
Architect robust, secure, and scalable backend APIs and frontend components.
Collaborate closely with Product Managers, Designers, and DevOps to deliver end-to-end features.
Conduct code reviews, enforce best practices, and guide junior developers.
Optimize application performance, scalability, and responsiveness.
Troubleshoot, debug, and upgrade existing systems.
Stay current with new technologies and advocate for continuous improvement.
Required Qualifications:
Bachelor’s or Master’s degree in Computer Science, Engineering, or related field.
5+ years of experience in full stack development.
Strong expertise in React.js and related libraries (Redux, Hooks, etc.).
In-depth experience with Node.js, Express.js, and RESTful APIs.
Proficiency with JavaScript/TypeScript and modern frontend tooling (Webpack, Babel, etc.).
Experience with relational and NoSQL databases (e.g., PostgreSQL, MongoDB).
Solid understanding of CI/CD, testing (Jest, Mocha), and version control (Git).
Familiarity with cloud services (AWS/GCP/Azure) and containerization (Docker, Kubernetes) is a plus.
Excellent communication and problem-solving skills.
Nice to Have:
Experience with microservices architecture.
Knowledge of GraphQL.
Exposure to serverless computing.
Prior experience working in Agile/Scrum teams.
Work Mode: Hybrid
Need B.Tech, BE, M.Tech, ME candidates - Mandatory
Must-Have Skills:
● Educational Qualification :- B.Tech, BE, M.Tech, ME in any field.
● Minimum of 3 years of proven experience as a Data Engineer.
● Strong proficiency in Python programming language and SQL.
● Experience in DataBricks and setting up and managing data pipelines, data warehouses/lakes.
● Good comprehension and critical thinking skills.
● Kindly note: the salary bracket will vary according to the candidate's experience:
- Experience from 4 yrs to 6 yrs - salary up to 22 LPA
- Experience from 5 yrs to 8 yrs - salary up to 30 LPA
- Experience more than 8 yrs - salary up to 40 LPA