50+ SQL Jobs in Hyderabad | SQL Job openings in Hyderabad
Apply to 50+ SQL Jobs in Hyderabad on CutShort.io. Explore the latest SQL Job opportunities across top companies like Google, Amazon & Adobe.
About the company:
Aptroid Consulting (India) Pvt Ltd is a Web Development company focused on helping marketers transform the customer experience: increasing engagement and driving revenue by using customer data to inform every interaction, in real time and tailored to each individual's behavior.
About the Role:
We’re looking for a Senior Java & PHP Developer to join our backend engineering team that powers high-throughput, large-scale SaaS platforms — delivering billions of personalized communications and marketing events every month. You’ll work on mission-critical services that drive automation, data intelligence, and real-time campaign delivery across global clients. You’ll play a key role in designing scalable APIs, improving platform performance, and mentoring developers — while working closely with distributed teams aligned to US (EST) time zones.
Key Responsibilities:
- Architect, design, and develop scalable backend services using Java (Spring Boot, Microservices) and PHP (Laravel/Symfony/custom frameworks).
- Lead system design and architecture reviews, ensuring clean code, maintainability, and adherence to best practices.
- Drive API integrations, microservices deployment, and modernization of legacy components.
- Collaborate with product managers, DevOps, and data engineering teams to deliver high-impact features and performance improvements.
- Build, maintain, and monitor data-intensive systems that handle large message/event volumes with high reliability.
- Implement strong observability practices (metrics, tracing, alerting) and contribute to production incident reviews.
- Perform code reviews, mentor junior engineers, and advocate engineering excellence.
- Work collaboratively across global teams during EST business hours for sprint planning, releases, and incident response.
Required Qualifications:
- 6–9 years of professional backend development experience.
- Expert in Java (Spring Boot, REST APIs, concurrency, JVM tuning) and PHP (Laravel/Symfony, Composer, PSR standards).
- Strong experience with MySQL / PostgreSQL, Redis, and NoSQL systems.
- Familiarity with AWS services (S3, Lambda, ECS/EKS, CloudWatch) and CI/CD pipelines (Jenkins, GitLab, GitHub Actions).
- Hands-on experience in scalable, distributed architectures and performance optimization.
- Strong debugging, profiling, and system performance tuning capabilities.
- Proven ability to deliver reliable, secure, and production-ready code in fast-paced agile environments.
- Excellent communication skills; able to coordinate with global teams across time zones.
Preferred Skills:
- Exposure to Kafka, RabbitMQ, or other event-driven systems.
- Familiarity with containerization (Docker) and orchestration (Kubernetes).
- Experience integrating with third-party APIs and external SDKs.
- Prior experience in AdTech, Martech, or high-volume SaaS platforms (similar to Sailthru/Zeta Global ecosystem).
- Knowledge of Python/Go for cross-service utilities or internal tooling is a plus.
What We Offer:
- Opportunity to work on high-scale enterprise systems with real-world impact.
- Exposure to global engineering practices and advanced cloud architectures.
- Collaborative culture with technical ownership and the freedom to innovate.
- Competitive compensation aligned with experience and global standards.
Who we are:
Kanerika Inc. is a premier global software products and services firm that specializes in providing innovative solutions and services for data-driven enterprises. Our focus is to empower businesses to achieve their digital transformation goals and maximize their business impact through the effective use of data and AI. We leverage cutting-edge technologies in data analytics, data governance, AI-ML, GenAI/ LLM and industry best practices to deliver custom solutions that help organizations optimize their operations, enhance customer experiences, and drive growth.
Awards and Recognitions
Kanerika has won several awards over the years, including:
- Best Place to Work 2023 by Great Place to Work®
- Top 10 Most Recommended RPA Start-Ups in 2022 by RPA Today
- NASSCOM Emerge 50 Award in 2014
- Frost & Sullivan India 2021 Technology Innovation Award for its Kompass composable solution architecture
Kanerika has also been recognized for its commitment to customer privacy and data security, having achieved ISO 27701, SOC 2, and GDPR compliance.
Working for us
Kanerika is rated 4.6/5 on Glassdoor, for many good reasons. We truly value our employees' growth, well-being, and diversity, and people’s experiences bear this out. At Kanerika, we offer a host of enticing benefits that create an environment where you can thrive both personally and professionally. From our inclusive hiring practices and mandatory training on creating a safe work environment to our flexible working hours and generous parental leave, we prioritize the well-being and success of our employees. Our commitment to professional development is evident through our mentorship programs, job training initiatives, and support for professional certifications. Additionally, our company-sponsored outings and various time-off benefits ensure a healthy work-life balance. Join us at Kanerika and become part of a vibrant and diverse community where your talents are recognized, your growth is nurtured, and your contributions make a real impact. See the benefits section below for the perks you’ll get while working for Kanerika.
Locations
We are located in Austin (USA), Singapore, Hyderabad, Indore and Ahmedabad (India).
Job Location: Hyderabad, Indore and Ahmedabad.
Role:
We are looking for a highly skilled Full Stack .NET Developer with strong hands-on experience in C#, .NET Core, ASP.NET Core, Web API, and microservices architecture. The ideal candidate is proficient in developing scalable, high-performing applications using SQL Server, NoSQL databases, and Entity Framework (v6+), and brings excellent troubleshooting, problem-solving, and communication skills, with the ability to collaborate effectively with cross-functional and international teams, including US counterparts.
Technical Skills
- Programming Languages: C#, TypeScript, JavaScript
- Frameworks & Technologies: .NET Core, ASP.NET Core, Web API, Angular (v10+), Entity Framework (v6+), Microservices Architecture
- Databases: SQL Server, NoSQL
- Cloud Platform: Microsoft Azure
- Design & Architecture: OOPs Concepts, Design Patterns, Reusable Libraries, Microservices Implementation
- Front-End Development: Angular Material, HTML5, CSS3, Responsive UI Development
- Additional Skills: Excellent troubleshooting abilities, strong communication (verbal & written), and effective collaboration with US counterparts
What You’ll Bring:
- Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent work experience.
- 2-5 years of experience.
- Proven experience delivering high-quality web applications.
Mandatory Skills
- Strong hands-on experience with C#, SQL Server, OOP concepts, and microservices architecture.
- Solid experience with .NET Core, ASP.NET Core, Web API, SQL, NoSQL, Entity Framework 6 or above, Azure, and applying design patterns.
- Strong proficiency in the Angular framework (v10+ preferred) and TypeScript, with a solid understanding of HTML5, CSS3, and JavaScript.
- Skilled at writing reusable libraries, with experience using Angular Material or other UI component libraries.
- Excellent communication skills, both oral and written.
- Excellent troubleshooting skills and the ability to communicate clearly with US counterparts.
Preferred Skills (Nice to Have):
- Self-starter with solid analytical and problem-solving skills.
- Willingness to work extra hours to meet deliverables.
- Understanding of Agile/Scrum Methodologies.
- Exposure to cloud platforms such as AWS/Azure.
Why join us?
- Work with a passionate and innovative team in a fast-paced, growth-oriented environment.
- Gain hands-on experience in content marketing with exposure to real-world projects.
- Opportunity to learn from experienced professionals and enhance your marketing skills.
- Contribute to exciting initiatives and make an impact from day one.
- Competitive stipend and potential for growth within the company.
Employee Benefits
1. Culture:
- Open Door Policy: Encourages open communication and accessibility to management.
- Open Office Floor Plan: Fosters a collaborative and interactive work environment.
- Flexible Working Hours: Allows employees to have flexibility in their work schedules.
- Employee Referral Bonus: Rewards employees for referring qualified candidates.
- Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.
2. Inclusivity and Diversity:
- Hiring practices that promote diversity: Ensures a diverse and inclusive workforce.
- Mandatory POSH training: Promotes a safe and respectful work environment.
3. Health Insurance and Wellness Benefits:
- GMC and Term Insurance: Offers medical coverage and financial protection.
- Health Insurance: Provides coverage for medical expenses.
- Disability Insurance: Offers financial support in case of disability.
4. Child Care & Parental Leave Benefits:
- Company-sponsored family events: Creates opportunities for employees and their families to bond.
- Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child.
- Family Medical Leave: Offers leave for employees to take care of family members' medical needs.
5. Perks and Time-Off Benefits:
- Company-sponsored outings: Organizes recreational activities for employees.
- Gratuity: Provides a monetary benefit as a token of appreciation.
- Provident Fund: Helps employees save for retirement.
- Generous PTO: Offers more than the industry standard for paid time off.
- Paid sick days: Allows employees to take paid time off when they are unwell.
- Paid holidays: Gives employees paid time off for designated holidays.
- Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.
6. Professional Development Benefits:
- L&D with FLEX- Enterprise Learning Repository: Provides access to a learning repository for professional development.
- Mentorship Program: Offers guidance and support from experienced professionals.
- Job Training: Provides training to enhance job-related skills.
- Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
- Promote from Within: Encourages internal growth and advancement opportunities.
About the Role:
We are looking for a highly skilled Full Stack .NET Developer with strong hands-on experience in C#, .NET Core, ASP.NET Core, Web API, and microservices architecture. The ideal candidate is proficient in developing scalable, high-performing applications using SQL Server, NoSQL databases, and Entity Framework (v6+), and brings excellent troubleshooting, problem-solving, and communication skills, with the ability to collaborate effectively with cross-functional and international teams, including US counterparts.
Technical Skills:
- Programming Languages: C#, TypeScript, JavaScript
- Frameworks & Technologies: .NET Core, ASP.NET Core, Web API, Angular (v10+), Entity Framework (v6+), Microservices Architecture
- Databases: SQL Server, NoSQL
- Cloud Platform: Microsoft Azure
- Design & Architecture: OOPs Concepts, Design Patterns, Reusable Libraries, Microservices Implementation
- Front-End Development: Angular Material, HTML5, CSS3, Responsive UI Development
- Additional Skills: Excellent troubleshooting abilities, strong communication (verbal & written), and effective collaboration with US counterparts
What You’ll Bring:
- Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent work experience.
- 6+ years of experience
- Proven experience delivering high-quality web applications.
Mandatory Skills:
- Strong hands-on experience with C#, SQL Server, OOP concepts, and microservices architecture.
- Solid experience with .NET Core, ASP.NET Core, Web API, SQL, NoSQL, Entity Framework 6 or above, Azure, and applying design patterns.
- Strong proficiency in the Angular framework (v10+ preferred) and TypeScript, with a solid understanding of HTML5, CSS3, and JavaScript.
- Skilled at writing reusable libraries, with experience using Angular Material or other UI component libraries.
- Excellent communication skills, both oral and written.
- Excellent troubleshooting skills and the ability to communicate clearly with US counterparts.
Preferred Skills (Nice to Have):
- Self-starter with solid analytical and problem-solving skills; willingness to work extra hours to meet deliverables.
- Understanding of Agile/Scrum Methodologies.
- Exposure to cloud platforms such as AWS/Azure.
ROLES AND RESPONSIBILITIES:
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
IDEAL CANDIDATE:
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
PREFERRED:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.
We are seeking a motivated Data Analyst to support business operations by analyzing data, preparing reports, and delivering meaningful insights. The ideal candidate should be comfortable working with data, identifying patterns, and presenting findings in a clear and actionable way.
Key Responsibilities:
- Collect, clean, and organize data from internal and external sources
- Analyze large datasets to identify trends, patterns, and opportunities (see the sketch after this list)
- Prepare regular and ad-hoc reports for business stakeholders
- Create dashboards and visualizations using tools like Power BI or Tableau
- Work closely with cross-functional teams to understand data requirements
- Ensure data accuracy, consistency, and quality across reports
- Document data processes and analysis methods
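For a flavor of the analysis involved, here is a minimal pandas sketch that aggregates a hypothetical orders file into a monthly revenue trend. The file name and columns ("orders.csv", "order_date", "amount") are assumptions for illustration, not part of the role.

```python
# Hypothetical sketch: monthly revenue trend from an orders CSV.
import pandas as pd

df = pd.read_csv("orders.csv", parse_dates=["order_date"])

# Aggregate revenue into month-start buckets to expose the trend.
monthly = df.set_index("order_date").resample("MS")["amount"].sum()

# Month-over-month growth highlights inflection points worth reporting.
growth = monthly.pct_change()
print(monthly.tail())
print(growth.tail())
```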
Job Description
Role: Data Analyst
Experience: 6 - 9 Years
Location: Hyderabad
Work Mode: Work from Office (5 days)
Overview
We are seeking a highly skilled Data Analyst with 6+ years of experience in analytics, data modeling, and advanced SQL. The ideal candidate has strong expertise in building scalable data models using dbt, writing efficient Python scripts, and delivering high-quality insights that support data-driven decision-making.
Key Responsibilities
Design, develop, and maintain data models using dbt (Core and dbt Cloud).
Build and optimize complex SQL queries to support reporting, analytics, and data pipelines.
Write Python scripts for data transformation, automation, and analytics workflows.
Ensure data quality, integrity, and consistency across multiple data sources.
Collaborate with cross-functional teams (Engineering, Product, Business) to understand data needs.
Develop dashboards and reports to visualize insights (using tools such as Tableau, Looker, or Power BI).
Perform deep-dive exploratory analysis to identify trends, patterns, and business opportunities.
Document data models, pipelines, and processes.
Contribute to scaling the analytics stack and improving data architecture.
Required Qualifications
6 - 9 years of hands-on experience in data analytics or data engineering.
Expert-level skills in SQL (complex joins, window functions, performance tuning); see the sketch after this list.
Strong experience building and maintaining dbt data models.
Proficiency in Python for data manipulation, scripting, and automation.
Solid understanding of data warehousing concepts (e.g., dimensional modeling, ELT/ETL pipelines).
Familiarity with cloud data platforms (Snowflake, BigQuery, Redshift, etc.).
Strong analytical thinking and problem-solving skills.
Excellent communication skills with the ability to present insights to stakeholders.
Trino and lakehouse architecture experience is good to have.
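For illustration, the following self-contained sketch shows the kind of window-function SQL this role expects, run from Python against an in-memory SQLite database. The sales table and its columns are hypothetical.

```python
# Window-function SQL illustration against an in-memory SQLite database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, month TEXT, revenue REAL);
INSERT INTO sales VALUES
  ('east', '2024-01', 100), ('east', '2024-02', 130),
  ('west', '2024-01', 90),  ('west', '2024-02', 80);
""")

# Running total and revenue rank per region, via window functions.
query = """
SELECT region, month, revenue,
       SUM(revenue) OVER (PARTITION BY region ORDER BY month) AS running_total,
       RANK() OVER (PARTITION BY region ORDER BY revenue DESC) AS rev_rank
FROM sales
ORDER BY region, month;
"""
for row in conn.execute(query):
    print(row)
```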
Required Skills: CI/CD Pipeline, Kubernetes, SQL Database, Excellent Communication & Stakeholder Management, Python
Criteria:
Looking for candidates with a 15-day notice period (30 days maximum).
Looking for candidates from the Hyderabad location only.
Looking for candidates from EPAM only.
1. 4+ years of software development experience
2. Strong experience with Kubernetes, Docker, and CI/CD pipelines in cloud-native environments.
3. Hands-on with NATS for event-driven architecture and streaming.
4. Skilled in microservices, RESTful APIs, and containerized app performance optimization.
5. Strong in problem-solving, team collaboration, clean code practices, and continuous learning.
6. Proficient in Python (Flask) for building scalable applications and APIs.
7. Focus: Java, Python, Kubernetes, Cloud-native development
8. SQL database
Description
Position Overview
We are seeking a skilled Developer to join our engineering team. The ideal candidate will have strong expertise in Java and Python ecosystems, with hands-on experience in modern web technologies, messaging systems, and cloud-native development using Kubernetes.
Key Responsibilities
- Design, develop, and maintain scalable applications using Java and Spring Boot framework
- Build robust web services and APIs using Python and the Flask framework (see the sketch after this list)
- Implement event-driven architectures using NATS messaging server
- Deploy, manage, and optimize applications in Kubernetes environments
- Develop microservices following best practices and design patterns
- Collaborate with cross-functional teams to deliver high-quality software solutions
- Write clean, maintainable code with comprehensive documentation
- Participate in code reviews and contribute to technical architecture decisions
- Troubleshoot and optimize application performance in containerized environments
- Implement CI/CD pipelines and follow DevOps best practices
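As an illustration of the Python/Flask side of this stack, here is a minimal service sketch. The endpoints and payloads are invented for the example, not a real specification.

```python
# Minimal Flask service sketch; routes and payloads are hypothetical.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.get("/health")
def health():
    # Liveness endpoint, useful behind Kubernetes probes.
    return jsonify(status="ok")

@app.post("/orders")
def create_order():
    payload = request.get_json(force=True)
    # Real code would validate and persist; here we simply echo back.
    return jsonify(received=payload), 201

if __name__ == "__main__":
    app.run(port=8080)
```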
Required Qualifications
- Bachelor's degree in Computer Science, Information Technology, or related field
- 4+ years of experience in software development
- Strong proficiency in Java with deep understanding of web technology stack
- Hands-on experience developing applications with Spring Boot framework
- Solid understanding of Python programming language with practical Flask framework experience
- Working knowledge of NATS server for messaging and streaming data
- Experience deploying and managing applications in Kubernetes
- Understanding of microservices architecture and RESTful API design
- Familiarity with containerization technologies (Docker)
- Experience with version control systems (Git)
Skills & Competencies
- Technical Skills: Java (Spring Boot, Spring Cloud, Spring Security)
- Python (Flask, SQLAlchemy, REST APIs)
- NATS messaging patterns (pub/sub, request/reply, queue groups); see the sketch after this list
- Kubernetes (deployments, services, ingress, ConfigMaps, Secrets)
- Web technologies (HTTP, REST, WebSocket, gRPC)
- Container orchestration and management
- Soft Skills: Problem-solving and analytical thinking
- Strong communication and collaboration
- Self-motivated with ability to work independently
- Attention to detail and code quality
- Continuous learning mindset
- Team player with mentoring capabilities
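For context, here is a minimal sketch of the NATS patterns listed above (pub/sub with a queue group; request/reply is analogous), assuming the nats-py client and a locally running NATS server. Subject and queue names are illustrative.

```python
# Hedged NATS pub/sub sketch using the nats-py client and a local server.
import asyncio
import nats

async def main():
    nc = await nats.connect("nats://localhost:4222")

    async def handler(msg):
        print(f"received on {msg.subject}: {msg.data.decode()}")

    # Queue group: multiple subscribers share work on the same subject.
    await nc.subscribe("orders.created", queue="workers", cb=handler)
    await nc.publish("orders.created", b"order-42")
    await nc.flush()
    await asyncio.sleep(0.1)  # let the callback fire before closing
    await nc.drain()

asyncio.run(main())
```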
1. Functional Testing & Validation
- Web Application Testing: Design, document, and execute comprehensive functional test plans and test cases for complex, highly interactive web applications, ensuring they meet specified requirements and provide an excellent user experience.
- Backend API Testing: Possess deep expertise in validating backend RESTful and/or SOAP APIs. This includes testing request/response payloads, status codes, data integrity, security, and robust error handling mechanisms.
- Data Validation with SQL: Write and execute complex SQL queries (joins, aggregations, conditional logic) to perform backend data checks, verify application states, and ensure data integrity across integration points.
2. UI Automation (Playwright & TypeScript):
- Design, develop, and maintain robust, scalable, and reusable UI automation scripts using Playwright and TypeScript.
- Integrate automation suites into Continuous Integration/Continuous Deployment (CI/CD) pipelines.
- Implement advanced automation patterns and frameworks (e.g., Page Object Model) to enhance maintainability; a sketch follows this list.
- Prompt-Based Automation: Demonstrate familiarity or hands-on experience with emerging AI-driven or prompt-based automation approaches and tools to accelerate test case generation and execution.
- API Automation: Develop and maintain automated test suites for APIs to ensure reliability and performance.
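A minimal Page Object Model sketch follows. The role calls for Playwright with TypeScript; this example uses Playwright's Python bindings to stay consistent with the other sketches on this page, and the selectors and URLs are hypothetical. The pattern translates to TypeScript directly.

```python
# Page Object Model sketch with Playwright's Python bindings.
# Selectors and the target URLs are hypothetical.
from playwright.sync_api import Page, sync_playwright

class LoginPage:
    def __init__(self, page: Page):
        self.page = page

    def open(self):
        self.page.goto("https://app.example.com/login")

    def login(self, user: str, password: str):
        self.page.fill("#username", user)
        self.page.fill("#password", password)
        self.page.click("button[type=submit]")

with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    page = browser.new_page()
    login = LoginPage(page)
    login.open()
    login.login("qa-user", "secret")
    assert page.url.endswith("/dashboard")  # hypothetical post-login URL
    browser.close()
```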
3. Performance & Load Testing
- JMeter Proficiency: Utilize Apache JMeter to design, script, and execute robust API load testing and stress testing scenarios.
- Analyse performance metrics, identify bottlenecks (e.g., response time, throughput), and provide actionable reports to development teams.
🛠️ Required Skills and Qualifications
- Experience: 4+ years of professional experience in Quality Assurance and Software Testing, with a strong focus on automation.
- Automation Stack: Expert-level proficiency in developing and maintaining automation scripts using Playwright and TypeScript.
- Testing Tools: Proven experience with API testing tools (e.g., Postman, Swagger) and strong functional testing methodologies.
- Database Skills: Highly proficient in writing and executing complex SQL queries for data validation and backend verification.
- Performance: Hands-on experience with Apache JMeter for API performance and load testing.
- Communication: Excellent communication and collaboration skills to work effectively with cross-functional teams (Developers, Product Managers).
- Problem-Solving: Strong analytical and debugging skills to efficiently isolate and report defects.

Global digital transformation solutions provider.
Job Description – Senior Technical Business Analyst
Location: Trivandrum (Preferred) | Open to any location in India
Shift Timings: an 8-hour window between 7:30 PM IST and 4:30 AM IST
About the Role
We are seeking highly motivated and analytically strong Senior Technical Business Analysts who can work seamlessly with business and technology stakeholders to convert a one-line problem statement into a well-defined project or opportunity. This role is ideal for professionals with a strong foundation in data analytics, data engineering, data visualization, and data science, along with a strong drive to learn, collaborate, and grow in a dynamic, fast-paced environment.
As a Technical Business Analyst, you will be responsible for translating complex business challenges into actionable user stories, analytical models, and executable tasks in Jira. You will work across the entire data lifecycle—from understanding business context to delivering insights, solutions, and measurable outcomes.
Key Responsibilities
Business & Analytical Responsibilities
- Partner with business teams to understand one-line problem statements and translate them into detailed business requirements, opportunities, and project scope.
- Conduct exploratory data analysis (EDA) to uncover trends, patterns, and business insights (see the sketch after this list).
- Create documentation including Business Requirement Documents (BRDs), user stories, process flows, and analytical models.
- Break down business needs into concise, actionable, and development-ready user stories in Jira.
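As a small illustration of the EDA step, here is a sketch assuming a hypothetical transactions CSV with "txn_date", "segment", and "amount" columns.

```python
# Hypothetical EDA pass over a transactions CSV.
import pandas as pd

df = pd.read_csv("transactions.csv", parse_dates=["txn_date"])

print(df.shape)          # volume check
print(df.dtypes)         # schema sanity
print(df.isna().mean())  # missing-value ratio per column
print(df.describe())     # distribution of numeric fields

# A segment-level view often surfaces the first business insight.
print(df.groupby("segment")["amount"].agg(["count", "mean", "sum"]))
```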
Data & Technical Responsibilities
- Collaborate with data engineering teams to design, review, and validate data pipelines, data models, and ETL/ELT workflows.
- Build dashboards, reports, and data visualizations using leading BI tools to communicate insights effectively.
- Apply foundational data science concepts such as statistical analysis, predictive modeling, and machine learning fundamentals.
- Validate and ensure data quality, consistency, and accuracy across datasets and systems.
Collaboration & Execution
- Work closely with product, engineering, BI, and operations teams to support the end-to-end delivery of analytical solutions.
- Assist in development, testing, and rollout of data-driven solutions.
- Present findings, insights, and recommendations clearly and confidently to both technical and non-technical stakeholders.
Required Skillsets
Core Technical Skills
- 6+ years of Technical Business Analyst experience within 8+ years of overall professional experience.
- Data Analytics: SQL, descriptive analytics, business problem framing.
- Data Engineering (Foundational): Understanding of data warehousing, ETL/ELT processes, cloud data platforms (AWS/GCP/Azure preferred).
- Data Visualization: Experience with Power BI, Tableau, or equivalent tools.
- Data Science (Basic/Intermediate): Python/R, statistical methods, fundamentals of ML algorithms.
Soft Skills
- Strong analytical thinking and structured problem-solving capability.
- Ability to convert business problems into clear technical requirements.
- Excellent communication, documentation, and presentation skills.
- High curiosity, adaptability, and eagerness to learn new tools and techniques.
Educational Qualifications
- BE/B.Tech or equivalent in:
- Computer Science / IT
- Data Science
What We Look For
- Demonstrated passion for data and analytics through projects and certifications.
- Strong commitment to continuous learning and innovation.
- Ability to work both independently and in collaborative team environments.
- Passion for solving business problems using data-driven approaches.
- Proven ability (or aptitude) to convert a one-line business problem into a structured project or opportunity.
Why Join Us?
- Exposure to modern data platforms, analytics tools, and AI technologies.
- A culture that promotes innovation, ownership, and continuous learning.
- Supportive environment to build a strong career in data and analytics.
Skills: Data Analytics, Business Analysis, SQL
Must-Haves
Technical Business Analyst (6+ years), SQL, Data Visualization (Power BI, Tableau), Data Engineering (ETL/ELT, cloud platforms), Python/R
Notice period - 0 to 15 days (Max 30 Days)
Educational Qualifications: BE/B.Tech or equivalent in Computer Science / IT / Data Science
Location: Trivandrum (Preferred) | Open to any location in India
Shift Timings: an 8-hour window between 7:30 PM IST and 4:30 AM IST
About the Role
We are looking for a strong, self-driven QA Engineer who can perform a hybrid role in the new testing paradigm — acting as both a Business Analyst (BA) and a Quality Assurance (QA) professional. The ideal candidate should be capable of understanding business needs under direction, translating them into clear requirements, and then validating them through effective QA practices.
This role requires someone who can leverage AI tools extensively to automate and optimize both requirements documentation and QA activities, reducing manual effort while improving speed and accuracy.
Key Responsibilities
Business Analysis Responsibilities
- Work under direction to understand business problems, workflows, and client expectations
- Elicit, analyze, and document business and functional requirements
- Create and maintain BRDs, FRDs, user stories, acceptance criteria, and process flows
- Collaborate with stakeholders, developers, and product teams to clarify requirements
- Use AI tools to assist with requirement generation, refinement, documentation, and validation
Quality Assurance Responsibilities
- Design, develop, and execute manual and automated test cases based on documented requirements
- Perform functional, regression, smoke, sanity, and UAT testing
- Ensure traceability between requirements and test cases
- Identify, log, track, and retest defects using defect tracking tools
- Collaborate closely with development teams to ensure quality delivery
- Use AI-powered QA tools to automate test case creation, execution, and maintenance
AI & Automation Focus
- Use AI tools to:
- Generate and refine requirements and user stories
- Auto-create test cases from requirements
- Optimize regression test suites
- Perform test data generation and defect analysis
- Continuously identify areas where AI can reduce manual effort and improve efficiency
- Ensure quality, accuracy, and business alignment of AI-generated outputs
Required Skills & Qualifications
- 1–3 years of experience in QA / Software Testing, with exposure to Business Analysis activities
- Strong understanding of SDLC, STLC, and Agile methodologies
- Proven ability to understand requirements and translate them into effective test scenarios
- Experience with QA Automation tools (Selenium, Cypress, Playwright, or similar)
- Hands-on experience using AI tools for QA and documentation (AI test generators, AI copilots, testRigor, Gen AI tools, etc.)
- Good knowledge of test case design techniques and requirement traceability
- Basic to intermediate knowledge of programming/scripting languages (Java, JavaScript, or Python)
- Experience with API testing (Postman or similar tools); see the sketch after this list
- Familiarity with JIRA, Confluence, or similar tools
- Strong analytical, problem-solving, and documentation skills
- Ability to take direction, work independently, and deliver with minimal supervision
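For a flavor of the API-testing skill set, here is a short pytest-style sketch using the requests library. The base URL, endpoint, and expected fields are hypothetical.

```python
# Hypothetical API tests: run with pytest; endpoint and fields are invented.
import requests

BASE_URL = "https://api.example.com"

def test_get_user_returns_expected_shape():
    resp = requests.get(f"{BASE_URL}/users/42", timeout=5)
    assert resp.status_code == 200
    body = resp.json()
    # Validate payload shape and data integrity, not just the status code.
    assert body["id"] == 42
    assert "email" in body

def test_unknown_user_returns_404():
    resp = requests.get(f"{BASE_URL}/users/does-not-exist", timeout=5)
    assert resp.status_code == 404
```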
Educational Qualifications
- B.Tech / B.E in IT, CSE, AI/ML, ECE
- M.Tech / M.E in IT, CSE, AI/ML, ECE
- Strong academic foundation in programming, software engineering, or testing concepts is preferred
- Certifications in Software Testing, Automation, or AI tools (optional but an added advantage)
1. Job Responsibilities:
Backend Development (.NET)
- Design and implement ASP.NET Core WebAPIs
- Design and implement background jobs using Azure Function Apps
- Optimize performance for long-running operations, ensuring high concurrency and system stability.
- Develop efficient and scalable task scheduling solutions to execute periodic tasks
Frontend Development (React)
- Build high-performance, maintainable React applications and optimize component rendering.
- Continuously improve front-end performance using best practices
Deployment & Operations
- Deploy React applications on Azure platforms (Azure Web Apps), ensuring smooth and reliable delivery.
- Collaborate with DevOps teams to enhance CI/CD pipelines and improve deployment efficiency.
2. Job Requirements:
Tech Stack:
- Backend: ASP.NET Core Web API, C#
- Frontend: React, JavaScript/TypeScript, Redux or other state management libraries
- Azure: Function Apps, Web Apps, Logic Apps
- Database: Cosmos DB, SQL Server
- Strong knowledge of asynchronous programming, performance optimization, and task scheduling
- Proficiency in React performance optimization techniques, understanding of virtual DOM and component lifecycle.
- Experience with cloud deployment, preferably Azure App Service or Azure Static Web Apps.
- Familiarity with Git and CI/CD workflows, with strong coding standards.
3. Project Background:
Mission: Transform Microsoft Cloud customers into fans by delivering exceptional support and engagement.
- Scope:
- Customer reliability engineering
- Advanced cloud engineering and supportability
- Business management and operations
- Product and platform orchestration
- Activities:
- Technical skilling programs
- AI strategy for customer experience
- Handling escalations and service reliability issues
4. Project Highlights:
React.js, ASP.NET Core Web API, Azure Function Apps, Cosmos DB
We are seeking a Technical Lead with strong expertise in backend engineering, real-time data streaming, and platform/infrastructure development to lead the architecture and delivery of our on-premise systems.
You will design and build high-throughput streaming pipelines (Apache Pulsar, Apache Flink), backend services (FastAPI), data storage models (MongoDB, ClickHouse), and internal dashboards/tools (Angular).
In this role, you will guide engineers, drive architectural decisions, and ensure reliable systems deployed on Docker + Kubernetes clusters.
Key Responsibilities
1. Technical Leadership & Architecture
- Own the end-to-end architecture for backend, streaming, and data systems.
- Drive system design decisions for ingestion, processing, storage, and DevOps.
- Review code, enforce engineering best practices, and ensure production readiness.
- Collaborate closely with founders and domain experts to translate requirements into technical deliverables.
2. Data Pipeline & Streaming Systems
- Architect and implement real-time, high-throughput data pipelines using Apache Pulsar and Apache Flink (see the sketch after this list).
- Build scalable ingestion, enrichment, and stateful processing workflows.
- Integrate multi-sensor maritime data into reliable, unified streaming systems.
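For illustration, here is a minimal produce/consume round trip with the pulsar-client Python library against a local broker. The topic, subscription name, and payload are hypothetical.

```python
# Hedged Pulsar sketch: one produce/consume round trip on a local broker.
import pulsar

client = pulsar.Client("pulsar://localhost:6650")

producer = client.create_producer("persistent://public/default/ais-positions")
producer.send(b'{"mmsi": 123456789, "lat": 17.4, "lon": 78.5}')

consumer = client.subscribe(
    "persistent://public/default/ais-positions",
    subscription_name="enrichment-workers",
)
msg = consumer.receive(timeout_millis=5000)
print(msg.data())
consumer.acknowledge(msg)  # at-least-once: ack only after processing

client.close()
```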
3. Backend Services & Platform Engineering
- Lead development of microservices and internal APIs using FastAPI (or equivalent backend frameworks); see the sketch after this list.
- Build orchestration, ETL, and system-control services.
- Optimize backend systems for latency, throughput, resilience, and long-term maintainability.
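A minimal FastAPI sketch of such an internal API follows; the route and model are invented for illustration.

```python
# Hypothetical internal control-plane API sketched with FastAPI.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class PipelineJob(BaseModel):
    source: str
    destination: str

@app.post("/jobs")
async def submit_job(job: PipelineJob):
    # Real code would enqueue the job (e.g., publish to Pulsar) and persist state.
    return {"status": "accepted", "source": job.source}

@app.get("/healthz")
async def healthz():
    return {"ok": True}
```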
4. Data Storage & Modeling
- Design scalable, efficient data models using MongoDB, ClickHouse, and other on-prem databases.
- Implement indexing, partitioning, retention, and lifecycle strategies for large datasets.
- Ensure high-performance APIs and analytics workflows.
5. Infrastructure, DevOps & Containerization
- Deploy and manage distributed systems using Docker and Kubernetes.
- Own observability, monitoring, logging, and alerting for all critical services.
- Implement CI/CD pipelines tailored for on-prem and hybrid cloud environments.
6. Team Management & Mentorship
- Provide technical guidance to engineers across backend, data, and DevOps teams.
- Break down complex tasks, review designs, and ensure high-quality execution.
- Foster a culture of clarity, ownership, collaboration, and engineering excellence.
Required Skills & Experience
- 5–10+ years of strong software engineering experience.
- Expertise with streaming platforms like Apache Pulsar, Apache Flink, or similar technologies.
- Strong backend engineering proficiency — preferably FastAPI, Python, Java, or Scala.
- Hands-on experience with MongoDB and ClickHouse.
- Solid experience deploying, scaling, and managing services on Docker + Kubernetes.
- Strong understanding of distributed systems, high-performance data flows, and system tuning.
- Experience working with Angular for internal dashboards is a plus.
- Excellent system-design, debugging, and performance-optimization skills.
- Prior experience owning critical technical components or leading engineering teams.
Nice to Have
- Experience with sensor data (AIS, Radar, SAR, EO/IR).
- Exposure to maritime, defence, or geospatial technology.
- Experience with bare-metal / on-premise deployments.
Review Criteria
- Strong Dremio / Lakehouse Data Architect profile
- 5+ years of experience in Data Architecture / Data Engineering, with minimum 3+ years hands-on in Dremio
- Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
- Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg along with distributed query planning concepts
- Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
- Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
- Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
- Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
- Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies
Preferred
- Preferred (Nice-to-have) – Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments
Job Specific Criteria
- CV Attachment is mandatory
- How many years of experience do you have with Dremio?
- Which is your preferred job location (Mumbai / Bengaluru / Hyderabad / Gurgaon)?
- Are you okay with 3 Days WFO?
- Virtual Interview requires video to be on, are you okay with it?
Role & Responsibilities
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
Ideal Candidate
- Bachelor’s or Master’s in Computer Science, Information Systems, or a related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
As a Data Quality Engineer at PalTech, you will be responsible for designing and executing comprehensive test strategies for end-to-end data validation. Your role will ensure data completeness, accuracy, and integrity across ETL processes, data warehouses, and reporting environments. You will automate data validation using Python, validate fact and dimension tables, large datasets, file ingestions, and data exports, while ensuring adherence to data security standards, including encryption and authorization. This role requires strong analytical abilities, proficiency in SQL and Python, and the capability to collaborate effectively with cross-functional teams to drive continuous improvements through automation and best practices.
Key Responsibilities
- Create test strategies, test plans, business scenarios, and data validation scripts for end-to-end data validation.
- Verify data completeness, accuracy, and integrity throughout ETL processes, data pipelines, and reports.
- Evaluate and monitor the performance of ETL jobs to ensure adherence to defined SLAs.
- Automate data testing processes using Python or other relevant technologies (see the sketch after this list).
- Validate various types of fact and dimension tables within data warehouse environments.
- Apply strong data warehousing (DWH) skills to ensure accurate data modeling and validation.
- Validate large datasets and ensure accuracy across relational databases.
- Validate file ingestions and data exports across different data sources.
- Assess and validate implementation of data security standards (encryption, authorization, anonymization).
- Demonstrate proficiency in SQL, Python, and ETL/ELT validation techniques.
- Validate reports and dashboards built on Power BI, Tableau, or similar platforms.
- Write complex scripts to validate business logic and KPIs across datasets.
- Create test data as required based on business use cases and scenarios.
- Identify, validate, and test corner business cases and edge scenarios.
- Prepare comprehensive test documentation including test cases, test results, and test summary reports.
- Collaborate closely with developers, business analysts, data architects, and other stakeholders.
- Recommend enhancements and implement best practices to strengthen and streamline testing processes.
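As an example of the automated validation described above, here is a small pandas sketch comparing a hypothetical source extract against a warehouse load; file and column names are assumptions.

```python
# Hypothetical automated data-validation checks with pandas.
import pandas as pd

source = pd.read_csv("source_extract.csv")
target = pd.read_csv("warehouse_load.csv")

checks = {
    # Completeness: no rows lost between source and target.
    "row_count_match": len(source) == len(target),
    # Integrity: the business key is unique in the target.
    "no_duplicate_keys": not target["order_id"].duplicated().any(),
    # Accuracy: aggregate measures reconcile within tolerance.
    "amount_total_match": abs(source["amount"].sum() - target["amount"].sum()) < 0.01,
    # Mandatory fields populated.
    "no_null_keys": target["order_id"].notna().all(),
}

failed = [name for name, ok in checks.items() if not ok]
assert not failed, f"Validation failures: {failed}"
```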
Required Skills and Qualifications
- Education: Bachelor’s degree in Computer Science, Information Technology, or a related discipline.
- Technical Expertise: Strong understanding of ETL processes, data warehousing concepts, SQL, and Python.
- Experience: 4–6 years of experience in ETL testing, data validation, and report/dashboard validation; prior experience in automating data validation processes.
- Tools: Hands-on experience with ETL tools such as ADF, DBT, etc., defect tracking systems like JIRA, and reporting platforms such as Power BI or Tableau.
- Soft Skills: Excellent communication and teamwork abilities, with strong analytical and problem-solving skills.
Why Join PalTech?
- Great Place to Work Certified: We prioritize employee well-being and nurture an inclusive, collaborative environment where everyone can thrive.
- Competitive compensation, strong learning, and professional growth opportunities.
As a Data Engineer at PalTech, you will design, develop, and maintain scalable and reliable data pipelines to ensure seamless data flow across systems. You will leverage SQL and leading ETL tools (such as Informatica, ADF, etc.) to support data integration and transformation needs. This role involves building and optimizing data warehouse architectures, performing performance tuning, and ensuring high levels of data quality, accuracy, and consistency throughout the data lifecycle.
You will collaborate closely with cross-functional teams to understand business requirements and translate them into effective data solutions. The ideal candidate should possess strong problem-solving skills, sound knowledge of data architecture principles, and a passion for building clean and efficient data systems.
Key Responsibilities
- Design, develop, and maintain ETL/ELT pipelines using SQL and tools such as Informatica, ADF, etc.
- Build and optimize data warehouse and data lake solutions for reporting, analytics, and operational usage.
- Apply strong understanding of data warehousing concepts to architect scalable data solutions.
- Handle large datasets and design effective load/update strategies (see the sketch after this list).
- Collaborate with data analysts, business users, and data scientists to understand requirements and deliver scalable solutions.
- Implement data quality checks and validation frameworks to ensure data reliability and integrity.
- Perform SQL and ETL performance tuning and optimization.
- Work with structured and semi-structured data from various source systems.
- Monitor, troubleshoot, and resolve issues in data workflows.
- Maintain documentation for data pipelines, data flows, and data definitions.
- Follow best practices in data engineering including security, logging, and error handling.
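To illustrate one load/update strategy, here is an upsert sketch using SQLite's ON CONFLICT clause; a production warehouse would typically use its own MERGE syntax, and the dimension table here is hypothetical.

```python
# Incremental load ("upsert") illustration with SQLite's ON CONFLICT clause.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Asha', 'Hyderabad')")

incoming = [(1, "Asha", "Indore"), (2, "Ravi", "Ahmedabad")]  # changed + new rows
conn.executemany(
    """
    INSERT INTO dim_customer (id, name, city) VALUES (?, ?, ?)
    ON CONFLICT(id) DO UPDATE SET name = excluded.name, city = excluded.city
    """,
    incoming,
)
print(conn.execute("SELECT * FROM dim_customer ORDER BY id").fetchall())
# -> [(1, 'Asha', 'Indore'), (2, 'Ravi', 'Ahmedabad')]
```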
Required Skills & Qualifications
Education:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
Technical Skills:
- Strong proficiency in SQL and data manipulation.
- Hands-on experience with ETL tools (e.g., Informatica, Talend, ADF).
- Experience with cloud data warehouse platforms such as BigQuery, Redshift, or Snowflake.
- Strong understanding of data warehousing concepts and data modeling.
- Proficiency in Python or a similar programming language.
- Experience working with RDBMS platforms (e.g., SQL Server, Oracle).
- Familiarity with version control systems and job schedulers.
Experience:
- 4 to 8 years of relevant experience in data engineering and ETL development.
Soft Skills:
- Strong problem-solving skills.
- Excellent communication and collaboration abilities.
- Ability to work effectively in a cross-functional team environment.
Responsibilities:
Build and optimize batch and streaming data pipelines using Apache Beam (Dataflow); see the sketch after this list
Design and maintain BigQuery datasets using best practices in partitioning, clustering, and materialized views
Develop and manage Airflow DAGs in Cloud Composer for workflow orchestration
Implement SQL-based transformations using Dataform (or dbt)
Leverage Pub/Sub for event-driven ingestion and Cloud Storage for raw/lake layer data architecture
Drive engineering best practices across CI/CD, testing, monitoring, and pipeline observability
Partner with solution architects and product teams to translate data requirements into technical designs
Mentor junior data engineers and support knowledge-sharing across the team
Contribute to documentation, code reviews, sprint planning, and agile ceremonies
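For illustration, here is a minimal batch pipeline with the Apache Beam Python SDK. It runs locally on the DirectRunner; the same code can target Dataflow with the appropriate runner options. The event data is invented.

```python
# Minimal Apache Beam batch pipeline (DirectRunner by default).
import apache_beam as beam

with beam.Pipeline() as p:
    (
        p
        | "Read" >> beam.Create([("checkout", 1), ("search", 1), ("checkout", 1)])
        | "CountPerEvent" >> beam.CombinePerKey(sum)
        | "Format" >> beam.Map(lambda kv: f"{kv[0]}: {kv[1]}")
        | "Print" >> beam.Map(print)
    )
```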
Requirements
2+ years of hands-on experience in data engineering, with at least 2 years on GCP
Proven expertise in BigQuery, Dataflow (Apache Beam), Cloud Composer (Airflow)
Strong programming skills in Python and/or Java
Experience with SQL optimization, data modeling, and pipeline orchestration
Familiarity with Git, CI/CD pipelines, and data quality monitoring frameworks
Exposure to Dataform, dbt, or similar tools for ELT workflows
Solid understanding of data architecture, schema design, and performance tuning
Excellent problem-solving and collaboration skills
Bonus Skills:
GCP Professional Data Engineer certification
Experience with Vertex AI, Cloud Functions, Dataproc, or real-time streaming architectures
Familiarity with data governance tools (e.g., Atlan, Collibra, Dataplex)
Exposure to Docker/Kubernetes, API integration, and infrastructure-as-code (Terraform)
Senior Software Engineer
Location: Hyderabad, India
Who We Are:
Since our inception back in 2006, Navitas has grown to be an industry leader in the digital transformation space, and we’ve served as trusted advisors supporting our client base within the commercial, federal, and state and local markets.
What We Do:
At our very core, we’re a group of problem solvers providing our award-winning technology solutions to drive digital acceleration for our customers! With proven solutions, award-winning technologies, and a team of expert problem solvers, Navitas has consistently empowered customers to use technology as a competitive advantage and deliver cutting-edge transformative solutions.
What You’ll Do:
Build, Innovate, and Own:
- Design, develop, and maintain high-performance microservices in a modern .NET/C# environment.
- Architect and optimize data pipelines and storage solutions that power our AI-driven products.
- Collaborate closely with AI and data teams to bring machine learning models into production systems.
- Build integrations with external services and APIs to enable scalable, interoperable solutions.
- Ensure robust security, scalability, and observability across distributed systems.
- Stay ahead of the curve — evaluating emerging technologies and contributing to architectural decisions for our next-gen platform.
Responsibilities will include but are not limited to:
- Provide technical guidance and code reviews that raise the bar for quality and performance.
- Help create a growth-minded engineering culture that encourages experimentation, learning, and accountability.
What You’ll Need:
- Bachelor’s degree in Computer Science or equivalent practical experience.
- 8+ years of professional experience, including 5+ years designing and maintaining scalable backend systems using C#/.NET and microservices architecture.
- Strong experience with SQL and NoSQL data stores.
- Solid hands-on knowledge of cloud platforms (AWS, GCP, or Azure).
- Proven ability to design for performance, reliability, and security in data-intensive systems.
- Excellent communication skills and ability to work effectively in a global, cross-functional environment.
Set Yourself Apart With:
- Startup experience - specifically in building product from 0-1
- Exposure to AI/ML-powered systems, data engineering, or large-scale data processing.
- Experience in healthcare or fintech domains.
- Familiarity with modern DevOps practices, CI/CD pipelines, and containerization (Docker/Kubernetes).
Equal Employer/Veterans/Disabled
Navitas Business Consulting is an affirmative action and equal opportunity employer. If reasonable accommodation is needed to participate in the job application or interview process, to perform essential job functions, and/or to receive other benefits and privileges of employment, please contact Navitas Human Resources.
Navitas is an equal opportunity employer. We provide employment and opportunities for advancement, compensation, training, and growth according to individual merit, without regard to race, color, religion, sex (including pregnancy), national origin, sexual orientation, gender identity or expression, marital status, age, genetic information, disability, veteran or military status, or any other characteristic protected under applicable Federal, state, or local law. Our goal is for each staff member to have the opportunity to grow to the limits of their abilities and to achieve personal and organizational objectives. We will support positive programs for equal treatment of all staff and full utilization of all qualified employees at all levels within Navitas.
Who We Are
At Sonatype, we help organizations build better, more secure software by enabling them to understand and control their software supply chains. Our products are trusted by thousands of engineering teams globally, providing critical insights into dependency health, license risk, and software security. We’re passionate about empowering developers—and we back it with data.
The Opportunity
We’re looking for a Data Engineer with full stack expertise to join our growing Data Platform team. This role blends data engineering, microservices, and full-stack development to deliver end-to-end services that power analytics, machine learning, and advanced search across Sonatype.
You will design and build data-driven microservices and workflows using Java, Python, and Spring Batch, implement frontends for data workflows, and deploy everything through CI/CD pipelines into AWS ECS/Fargate. You’ll also ensure services are monitorable, debuggable, and reliable at scale, while clearly documenting designs with Mermaid-based sequence and dataflow diagrams.
This is a hands-on engineering role for someone who thrives at the intersection of data systems, fullstack development, ML, and cloud-native platforms.
What You’ll Do
- Design, build, and maintain data pipelines, ETL/ELT workflows, and scalable microservices.
- Develop complex web-scraping solutions (Playwright) and real-time pipelines (Kafka/queues/Flink).
- Develop end-to-end microservices with backend skills in Java (5+ years), Python (5+ years), and Spring Batch (2+ years), plus a frontend framework (React or any).
- Deploy, publish, and operate services in AWS ECS/Fargate using CI/CD pipelines (Jenkins, GitOps).
- Architect and optimize data storage models in SQL (MySQL, PostgreSQL) and NoSQL stores.
- Implement web scraping and external data ingestion pipelines.
- Enable Databricks and PySpark-based workflows for large-scale analytics.
- Build advanced data search capabilities (fuzzy matching, vector similarity search, semantic retrieval); see the sketch after this list.
- Apply ML techniques (scikit-learn, classification algorithms, predictive modeling) to data-driven solutions.
- Implement observability, debugging, monitoring, and alerting for deployed services.
- Create Mermaid sequence diagrams, flowcharts, and dataflow diagrams to document system architecture and workflows.
- Drive best practices in fullstack data service development, including architecture, testing, and documentation.
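To make the fuzzy-matching item above concrete, here is a minimal sketch using scikit-learn (named in this role's ML stack): character n-gram TF-IDF plus cosine similarity to match a misspelled package coordinate against a catalog. The catalog, query, and n-gram settings are illustrative assumptions, not Sonatype's implementation.

```python
# Minimal sketch: fuzzy name matching with character n-gram TF-IDF
# and cosine similarity. Corpus and query are hypothetical examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

catalog = [
    "org.apache.commons:commons-lang3",
    "com.fasterxml.jackson.core:jackson-databind",
    "org.springframework:spring-core",
]
query = ["org.apache.commons:common-lang3"]  # note the typo to be matched

vectorizer = TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4))
catalog_vecs = vectorizer.fit_transform(catalog)
query_vec = vectorizer.transform(query)

scores = cosine_similarity(query_vec, catalog_vecs)[0]
best = max(range(len(catalog)), key=lambda i: scores[i])
print(f"best match: {catalog[best]} (score={scores[best]:.2f})")
```

Character n-grams (rather than word tokens) are what make this robust to typos and small spelling variants.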
What We’re Looking For
Minimum Qualifications
- 2+ years of experience in a Data Engineer or backend software engineering role
- Strong programming skills in Python, Scala, or Java
- Hands-on experience with HBase or similar NoSQL columnar stores
- Hands-on experience with distributed data systems like Spark, Kafka, or Flink
- Proficient in writing complex SQL and optimizing queries for performance
- Experience building and maintaining robust ETL/ELT pipelines in production
- Familiarity with workflow orchestration tools (Airflow, Dagster, or similar); a minimal DAG sketch follows this list
- Understanding of data modeling techniques (star schema, dimensional modeling, etc.)
- Familiarity with CI/CD pipelines (Jenkins or similar)
- Ability to visualize and communicate architectures using Mermaid diagrams
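For orientation, a minimal sketch of the kind of Airflow DAG the orchestration item above refers to, assuming Airflow 2.4+; the dag_id, schedule, and task body are hypothetical placeholders.

```python
# Minimal Airflow DAG sketch: one daily Python task.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pulling source data...")  # placeholder for a real extract step


with DAG(
    dag_id="example_etl",          # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",             # `schedule` requires Airflow 2.4+
    catchup=False,
) as dag:
    PythonOperator(task_id="extract", python_callable=extract)
```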
Bonus Points
- Experience working with Databricks, dbt, Terraform, or Kubernetes
- Familiarity with streaming data pipelines or real-time processing
- Exposure to data governance frameworks and tools
- Experience supporting data products or ML pipelines in production
- Strong understanding of data privacy, security, and compliance best practices
Why You’ll Love Working Here
- Data with purpose: Work on problems that directly impact how the world builds secure software
- Modern tooling: Leverage the best of open-source and cloud-native technologies
- Collaborative culture: Join a passionate team that values learning, autonomy, and impact
API Developer (.NET Core 8/9)
Location: Hyderabad/Vijayawada- India
Navitas is seeking a Senior API Developer (.NET Core 8/9) to join our development team in building robust, high-performance microservices and APIs. You will play a key role in designing scalable, secure, and maintainable backend services that power our web and mobile applications. In this role, you will collaborate with product managers, front-end developers, and DevOps engineers to deliver seamless digital experiences and ensure smooth partner integration. This is a mission-critical position that contributes directly to our organization’s digital transformation initiatives.
Responsibilities will include but are not limited to:
- Microservices & API Development: Design, develop, and maintain RESTful APIs and microservices using .NET Core 8/9 and ASP.NET Core Web API.
- API Design & Documentation: Create secure, versioned, and well-documented endpoints for internal and external consumption.
- Asynchronous Processing: Build and manage background jobs and message-driven workflows using Azure Service Bus and Azure Storage Queues.
- Authentication & Security: Implement OAuth2.0, JWT, and Azure AD for securing APIs; enforce best practices for secure coding (a conceptual token-validation sketch follows this list).
- Caching Integration: Enhance performance through caching mechanisms (Redis, in-memory caching).
- Performance Optimization: Profile APIs and database queries to identify bottlenecks; tune services for speed, scalability, and resilience.
- Clean Code & Architecture: Follow SOLID principles, Clean Architecture, and domain-driven design to write modular, testable code.
- Technical Collaboration: Participate in Agile development processes; collaborate with cross-functional teams to plan and deliver solutions.
- Troubleshooting & Maintenance: Use debugging tools and logging strategies to maintain uptime and resolve production issues.
- Documentation: Maintain clear, accessible technical documentation for services, endpoints, and integration requirements.
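The token-validation responsibility above is stack-agnostic; this minimal sketch shows the concept in Python with PyJWT (in this role's .NET stack, the JwtBearer middleware plays a similar role). The secret, issuer, and audience are hypothetical.

```python
# Conceptual sketch of bearer-token validation with PyJWT.
# Secret, issuer, and audience are hypothetical dev-only values.
import jwt  # PyJWT

SECRET = "dev-only-secret"

def validate(token: str) -> dict:
    # Raises jwt.InvalidTokenError on bad signature, issuer, or audience.
    return jwt.decode(
        token,
        SECRET,
        algorithms=["HS256"],
        issuer="https://auth.example.com",
        audience="orders-api",
    )

claims = {"sub": "user-1", "iss": "https://auth.example.com", "aud": "orders-api"}
token = jwt.encode(claims, SECRET, algorithm="HS256")
print(validate(token)["sub"])
```

Production setups would verify asymmetric signatures (RS256) against the identity provider's published keys rather than a shared secret.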
What You’ll Need:
- Bachelor’s degree in Computer Science, Information Systems, or a related technical field.
- 8+ years of backend development experience using .NET Core (6+ preferred, experience with 8/9 strongly desired).
- Strong understanding of RESTful API design, versioning, and integration.
- Experience with Clean Architecture and Domain-Driven Design (DDD).
- Deep knowledge of SOLID principles, design patterns, and reusable code practices.
- Skilled in SQL Server, including schema design, query tuning, and optimization.
- Proficiency in Entity Framework Core and Dapper for data access.
- Familiarity with API security standards (OAuth2.0, JWT, API keys).
- Experience writing unit/integration tests using xUnit, Moq, or similar frameworks.
- Basic experience with Azure services, including message queues and storage.
- Proficiency with Git, Agile workflows, and collaboration tools.
- Strong communication and problem-solving skills.
Set Yourself Apart With:
- Hands-on experience with Azure components (e.g., Service Bus, Functions, App Services, AKS).
- Experience with Azure Application Insights, Datadog, or other observability tools.
- Familiarity with Docker, containerization, and CI/CD pipelines.
- Performance testing and load testing experience.
- Familiarity with Postman, Swagger/OpenAPI, and other dev/test tools.
- Exposure to Agile/Scrum methodologies and sprint planning processes.
Equal Employer/Veterans/Disabled
Navitas Business Consulting is an affirmative action and equal opportunity employer. If reasonable accommodation is needed to participate in the job application or interview process, to perform essential job functions, and/or to receive other benefits and privileges of employment, please contact Navitas Human Resources.
Navitas is an equal opportunity employer. We provide employment and opportunities for advancement, compensation, training, and growth according to individual merit, without regard to race, color, religion, sex (including pregnancy), national origin, sexual orientation, gender identity or expression, marital status, age, genetic information, disability, veteran or military status, or any other characteristic protected under applicable Federal, state, or local law. Our goal is for each staff member to have the opportunity to grow to the limits of their abilities and to achieve personal and organizational objectives. We will support positive programs for equal treatment of all staff and full utilization of all qualified employees at all levels within Navitas.
About the Role:
We are seeking a highly skilled Integration Specialist / Full Stack Developer with strong experience in .NET Core, API integrations, and modern front-end development. The ideal candidate will build and integrate scalable web and mobile applications, manage end-to-end delivery, and ensure smooth data exchange across platforms.
Key Responsibilities:
- Design, develop, and maintain backend APIs using .NET Core / C#.
- Build and integrate REST and SOAP-based services (JSON, XML, OAuth2, JWT, API Key).
- Implement file-based integrations (flat file, CSV, Excel, XML, JSON) and manage FTP/SFTP transfers; see the CSV-to-JSON sketch after this list.
- Work with databases such as MSSQL, PostgreSQL, Oracle, and SQLite — including writing queries, stored procedures, and using ADO.NET.
- Handle data serialization/deserialization using Newtonsoft.Json or System.Text.Json.
- Implement robust error handling and logging with Serilog, NLog, or log4net.
- Automate and schedule processes using Quartz.NET, Hangfire, or Windows Task Scheduler.
- Manage version control and CI/CD pipelines via Git and Azure DevOps.
- Develop front-end interfaces with React and React Native ensuring responsive, modular UI.
- Implement offline-first functionality for mobile apps (sync logic, caching, etc.).
- Collaborate with cross-functional teams or independently handle full project ownership.
- Utilize AI-assisted development tools (e.g., Cursor, GitHub Copilot, Claude Code) to enhance productivity.
- Apply integration best practices including middleware, API gateways, and optionally message queues (MSMQ, RabbitMQ).
- Ensure scalability, security, and performance in all deliverables.
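As a rough illustration of the file-based integration work above, a minimal Python sketch that reads a CSV feed, reshapes records, and emits JSON for a downstream API; the field names and feed contents are hypothetical.

```python
# Minimal file-based integration sketch: CSV in, JSON out.
import csv
import json
from io import StringIO

# Stand-in for a file pulled via FTP/SFTP.
feed = StringIO("id,amount,currency\n1001,250.00,USD\n1002,99.50,EUR\n")

records = [
    {"externalId": row["id"], "amount": float(row["amount"]), "currency": row["currency"]}
    for row in csv.DictReader(feed)
]
payload = json.dumps({"transactions": records}, indent=2)
print(payload)
```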
Key Skills & Technologies:
- Backend: .NET Core, C#, REST/SOAP APIs, WCF, ADO.NET
- Frontend: React, React Native
- Databases: MSSQL, PostgreSQL, Oracle, SQLite
- Tools: Git, Azure DevOps, Hangfire, Quartz.NET, Serilog/NLog
- Integration: JSON, XML, CSV, FTP/SFTP, OAuth2, JWT
- DevOps: CI/CD automation, deployment pipelines
- Optional: Middleware, API Gateway, Message Queues (MSMQ, RabbitMQ)
Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- Minimum 5 years of hands-on experience in software development and integration.
- Proven expertise in designing and implementing scalable applications.
- Strong analytical and problem-solving skills with a proactive approach.
Nice to Have:
- Experience with cloud services (Azure, AWS, GCP).
- Knowledge of containerization tools like Docker or Kubernetes.
- Familiarity with mobile deployment workflows and app store publishing.
Job Summary:
We are looking for technically skilled and customer-oriented SME Voice – Technical Support Associates to provide voice-based support to enterprise clients. The role involves real-time troubleshooting of complex issues across servers, networks, cloud platforms (Azure), databases, and more. Strong communication and problem-solving skills are essential.
Key Responsibilities:
- Provide technical voice support to B2B (enterprise) customers.
- Troubleshoot and resolve issues related to:
- SQL, DNS, VPN, Server Support (Windows/Linux)
- Networking (TCP/IP, routing, firewalls)
- Cloud Services – especially Microsoft Azure
- Application and system-level issues
- Assist with technical configurations and product usage.
- Accurately document cases and escalate unresolved issues.
- Ensure timely resolution while meeting SLAs and quality standards.
Required Skills & Qualifications:
- 2.5 to 5 years in technical support (voice-based, B2B preferred)
Proficiency in:
- SQL, DNS, VPN, Server Support
- Networking, Microsoft Azure
- Basic understanding of coding/scripting
- Strong troubleshooting and communication skills
- Ability to work in a 24x7 rotational shift environment
Job Title: Mid-Level .NET Developer (Agile/SCRUM)
Location: Mohali, Bangalore, Pune, Navi Mumbai, Chennai, Hyderabad, Panchkula, Gurugram (Delhi NCR), Dehradun
Night Shift from 6:30 pm to 3:30 am IST
Experience: 5+ Years
Job Summary:
We are seeking a proactive and detail-oriented Mid-Level .NET Developer to join our dynamic team. The ideal candidate will be responsible for designing, developing, and maintaining high-quality applications using Microsoft technologies with a strong emphasis on .NET Core, C#, Web API, and modern front-end frameworks. You will collaborate with cross-functional teams in an Agile/SCRUM environment and participate in the full software development lifecycle—from requirements gathering to deployment—while ensuring adherence to best coding and delivery practices.
Key Responsibilities:
- Design, develop, and maintain applications using C#, .NET, .NET Core, MVC, and databases such as SQL Server, PostgreSQL, and MongoDB.
- Create responsive and interactive user interfaces using JavaScript, TypeScript, Angular, HTML, and CSS.
- Develop and integrate RESTful APIs for multi-tier, distributed systems.
- Participate actively in Agile/SCRUM ceremonies, including sprint planning, daily stand-ups, and retrospectives.
- Write clean, efficient, and maintainable code following industry best practices.
- Conduct code reviews to ensure high-quality and consistent deliverables.
- Assist in configuring and maintaining CI/CD pipelines (Jenkins or similar tools).
- Troubleshoot, debug, and resolve application issues effectively.
- Collaborate with QA and product teams to validate requirements and ensure smooth delivery.
- Support release planning and deployment activities.
Required Skills & Qualifications:
- 4–6 years of professional experience in .NET development.
- Strong proficiency in C#, .NET Core, MVC, and relational databases such as SQL Server.
- Working knowledge of NoSQL databases like MongoDB.
- Solid understanding of JavaScript/TypeScript and the Angular framework.
- Experience in developing and integrating RESTful APIs.
- Familiarity with Agile/SCRUM methodologies.
- Basic knowledge of CI/CD pipelines and Git version control.
- Hands-on experience with AWS cloud services.
- Strong analytical, problem-solving, and debugging skills.
- Excellent communication and collaboration skills.
Preferred / Nice-to-Have Skills:
- Advanced experience with AWS services.
- Knowledge of Kubernetes or other container orchestration platforms.
- Familiarity with IIS web server configuration and management.
- Experience in the healthcare domain.
- Exposure to AI-assisted code development tools (e.g., GitHub Copilot, ChatGPT).
- Experience with application security and code quality tools such as Snyk or SonarQube.
- Strong understanding of SOLID principles and clean architecture patterns.
Technical Proficiencies:
- ASP.NET Core, ASP.NET MVC
- C#, Entity Framework, Razor Pages
- SQL Server, MongoDB
- REST API, jQuery, AJAX
- HTML, CSS, JavaScript, TypeScript, Angular
- Azure Services, Azure Functions, AWS
- Visual Studio
- CI/CD, Git
We are looking for a highly skilled Sr. Big Data Engineer with 3-5 years of experience in building large-scale data pipelines, real-time streaming solutions, and batch/stream processing systems. The ideal candidate should be proficient in Spark, Kafka, Python, and AWS Big Data services, with hands-on experience in implementing CDC (Change Data Capture) pipelines and integrating multiple data sources and sinks.
Responsibilities
- Design, develop, and optimize batch and streaming data pipelines using Apache Spark and Python.
- Build and maintain real-time data ingestion pipelines leveraging Kafka and AWS Kinesis; a minimal streaming sketch follows this list.
- Implement CDC (Change Data Capture) pipelines using Kafka Connect, Debezium, or similar frameworks.
- Integrate data from multiple sources and sinks (databases, APIs, message queues, file systems, cloud storage).
- Work with AWS Big Data ecosystem: Glue, EMR, Kinesis, Athena, S3, Lambda, Step Functions.
- Ensure pipeline scalability, reliability, and performance tuning of Spark jobs and EMR clusters.
- Develop data transformation and ETL workflows in AWS Glue and manage schema evolution.
- Collaborate with data scientists, analysts, and product teams to deliver reliable and high-quality data solutions.
- Implement monitoring, logging, and alerting for critical data pipelines.
- Follow best practices for data security, compliance, and cost optimization in cloud environments.
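A minimal sketch of the streaming-ingestion pattern above: Spark Structured Streaming reading a Kafka topic (the kind of topic a CDC tool like Debezium would populate). The broker address and topic are hypothetical, and the Kafka connector package must be on the Spark classpath.

```python
# Minimal streaming ingestion sketch: Kafka -> Spark Structured Streaming.
# Requires the spark-sql-kafka connector package at submit time.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("cdc-ingest").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "orders.cdc")                 # hypothetical topic
    .load()
    .select(col("key").cast("string"), col("value").cast("string"))
)

# Console sink for demonstration; a real job would write to S3/Delta/etc.
query = events.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```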
Required Skills & Experience
- Programming: Strong proficiency in Python (PySpark, data frameworks, automation).
- Big Data Processing: Hands-on experience with Apache Spark (batch & streaming).
- Messaging & Streaming: Proficient in Kafka (brokers, topics, partitions, consumer groups) and AWS Kinesis.
- CDC Pipelines: Experience with Debezium / Kafka Connect / custom CDC frameworks.
- AWS Services: AWS Glue, EMR, S3, Athena, Lambda, IAM, CloudWatch.
- ETL/ELT Workflows: Strong knowledge of data ingestion, transformation, partitioning, schema management.
- Databases: Experience with relational databases (MySQL, Postgres, Oracle) and NoSQL (MongoDB, DynamoDB, Cassandra).
- Data Formats: JSON, Parquet, Avro, ORC, Delta/Iceberg/Hudi.
- Version Control & CI/CD: Git, GitHub/GitLab, Jenkins, or CodePipeline.
- Monitoring/Logging: CloudWatch, Prometheus, ELK/Opensearch.
- Containers & Orchestration (nice-to-have): Docker, Kubernetes, Airflow/Step Functions for workflow orchestration.
Preferred Qualifications
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
- Experience in large-scale data lake / lakehouse architectures.
- Knowledge of data warehousing concepts and query optimization.
- Familiarity with data governance, lineage, and cataloging tools (Glue Data Catalog, Apache Atlas).
- Exposure to ML/AI data pipelines is a plus.
Tools & Technologies (must-have exposure)
- Big Data & Processing: Apache Spark, PySpark, AWS EMR, AWS Glue
- Streaming & Messaging: Apache Kafka, Kafka Connect, Debezium, AWS Kinesis
- Cloud & Storage: AWS (S3, Athena, Lambda, IAM, CloudWatch)
- Programming & Scripting: Python, SQL, Bash
- Orchestration: Airflow / Step Functions
- Version Control & CI/CD: Git, Jenkins/CodePipeline
- Data Formats: Parquet, Avro, ORC, JSON, Delta, Iceberg, Hudi
Key Responsibilities
- Design, develop, and maintain scalable microservices and RESTful APIs using Python (Flask, FastAPI, or Django); a minimal FastAPI sketch follows this list.
- Architect data models for SQL and NoSQL databases (PostgreSQL, ClickHouse, MongoDB, DynamoDB) to optimize performance and reliability.
- Implement efficient and secure data access layers, caching, and indexing strategies.
- Collaborate closely with product and frontend teams to deliver seamless user experiences.
- Build responsive UI components using HTML, CSS, JavaScript, and frameworks like React or Angular.
- Ensure system reliability, observability, and fault tolerance across services.
- Lead code reviews, mentor junior engineers, and promote engineering best practices.
- Contribute to DevOps and CI/CD workflows for smooth deployments and testing automation.
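For reference, a minimal FastAPI sketch of the kind of endpoint described above; the route, payload, and in-memory store are illustrative assumptions, not a real service design.

```python
# Minimal FastAPI sketch: one read endpoint backed by an in-memory store.
# Run with: uvicorn app:app
from fastapi import FastAPI, HTTPException

app = FastAPI()
ITEMS = {1: {"id": 1, "name": "widget"}}  # stand-in for a real data layer

@app.get("/items/{item_id}")
async def get_item(item_id: int) -> dict:
    item = ITEMS.get(item_id)
    if item is None:
        raise HTTPException(status_code=404, detail="item not found")
    return item
```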
Required Skills & Experience
- 10+ years of professional software development experience.
- Strong proficiency in Python, with deep understanding of OOP, asynchronous programming, and performance optimization.
- Proven expertise in building FastAPI-based microservices architectures.
- Solid understanding of SQL and NoSQL data modeling, query optimization, and schema design.
- Excellent hands-on frontend proficiency with HTML, CSS, JavaScript, and a modern framework (React, Angular, or Vue).
- Experience working with cloud platforms (AWS, GCP, or Azure) and containerized deployments (Docker, Kubernetes).
- Familiarity with distributed systems, event-driven architectures, and messaging queues (Kafka, RabbitMQ).
- Excellent problem-solving, communication, and system design skills.
- 8+ years of Data Engineering experience
- Strong SQL and Redshift experience
- CI/CD and orchestration experience using Bitbucket, Jenkins and Control-M
- Reporting experience preferably Tableau
- Location – Pune, Hyderabad, Bengaluru
Integration Developer
ROLE TITLE
Integration Developer
ROLE LOCATION(S)
Bangalore/Hyderabad/Chennai/Coimbatore/Noida/Kolkata/Pune/Indore
ROLE SUMMARY
The Integration Developer is a key member of the operations team, responsible for ensuring the smooth integration and functioning of various systems and software within the organization. This role involves technical support, system troubleshooting, performance monitoring, and assisting with the implementation of integration solutions.
ROLE RESPONSIBILITIES
· Design, develop, and maintain integration solutions using Spring Framework, Apache Camel, and other integration patterns such as RESTful APIs, SOAP services, file-based FTP/SFTP, and OAuth authentication.
· Collaborate with architects and cross-functional teams to design integration solutions that are scalable, secure, and aligned with business requirements.
· Resolve complex integration issues, performance bottlenecks, and data discrepancies across multiple systems. Support Production issues and fixes.
· Document integration processes, technical designs, APIs, and workflows to ensure clarity and ease of use.
· Participate in on-call rotation to provide 24/7 support for critical production issues.
· Develop source code and manage version control in a collaborative work environment.
TECHNICAL QUALIFICATIONS
· 5+ years of experience in Java development with strong expertise in Spring Framework and Apache Camel for building enterprise-grade integrations.
· Proficient with Azure DevOps (ADO) for version control, CI/CD pipeline implementation, and project management.
· Hands-on experience with RESTful APIs, SOAP services, and file-based integrations using FTP and SFTP protocols.
· Strong analytical and troubleshooting skills for resolving complex integration and system issues.
· Experience in Azure Services, including Azure Service Bus, Azure Kubernetes Service (AKS), Azure Container Apps, and ideally Azure API Management (APIM) is a plus.
· Good understanding of containerization and cloud-native development, with experience in using Docker, Kubernetes, and Azure AKS.
· Experience with OAuth for secure authentication and authorization in integration solutions.
· Strong experience using GitHub for source control.
· Strong background in SQL databases (e.g., T-SQL, Stored Procedures) and working with data in an integration context.
GENERAL QUALIFICATIONS
· Excellent analytical and problem-solving skills, with a keen attention to detail.
· Effective communication skills, with the ability to collaborate with technical and non-technical stakeholders.
· Experience working in a fast-paced, production support environment with a focus on incident management and resolution.
· Experience in the insurance domain is considered a plus.
EDUCATION REQUIREMENTS
· Bachelor’s degree in Computer Science, Information Technology, or related field.
Shift timings : Afternoon
Job Summary
We are seeking an experienced Senior Java Developer with strong expertise in legacy system migration, server management, and deployment. The candidate will be responsible for maintaining, enhancing, and migrating an existing Java/JSF (PrimeFaces), EJB, REST API, and SQL Server-based application to a modern Spring Boot architecture. The role involves ensuring smooth production deployments, troubleshooting server issues, and optimizing the existing infrastructure.
Key Responsibilities
● Maintain & Enhance the existing Java, JSF (PrimeFaces), EJB, REST API, and SQL Server application.
● Migrate the legacy system to Spring Boot while ensuring minimal downtime.
● Manage deployments using Ansible, GlassFish/Payara, and deployer.sh scripts.
● Optimize and troubleshoot server performance (Apache, Payara, GlassFish).
● Handle XML file generation, email integrations, and REST API maintenance.
● Database management (SQL Server) including query optimization and schema updates.
● Collaborate with teams to ensure smooth transitions during migration.
● Automate CI/CD pipelines using Maven, Ansible, and shell scripts.
● Document migration steps, deployment processes, and system architecture.
Required Skills & Qualifications
● 8+ years of hands-on experience with Java, JSF (PrimeFaces), EJB, and REST APIs.
● Strong expertise in Spring Boot (migration experience from legacy Java is a must).
● Experience with Payara/GlassFish server management and deployment.
● Proficient in Apache, Ansible, and shell scripting (deployer.sh).
● Solid knowledge of SQL Server (queries, stored procedures, optimization).
● Familiarity with XML processing, email integrations, and Maven builds.
● Experience in production deployments, server troubleshooting, and performance tuning.
● Ability to work independently and lead migration efforts.
Preferred Skills
● Knowledge of microservices architecture (helpful for modernization).
● Familiarity with cloud platforms (AWS/Azure) is a plus.
Job Title: Python Developer (Full Time)
Location: Hyderabad (Onsite)
Interview: Virtual and Face to Face Interview (Last round)
Experience Required: 4+ Years
Working Days: 5 Days
About the Role
We are seeking a highly skilled Lead Python Developer with a strong background in building scalable and secure applications. The ideal candidate will have hands-on expertise in Python frameworks, API integrations, and modern application architectures. This role requires a tech leader who can balance innovation, performance, and compliance while driving successful project delivery.
Key Responsibilities
- Application Development: Architect and develop robust, high-performance applications using Django, Flask, and FastAPI.
- API Integration: Design and implement seamless integration with third-party APIs (including travel-related APIs, payment gateways, and external service providers).
- Data Management: Develop and optimize ETL pipelines for structured and unstructured data using data lakes and distributed storage solutions.
- Microservices Architecture: Build modular, scalable applications using microservices principles for independent deployment and high availability.
- Performance Optimization: Enhance application performance through load balancing, caching, and query optimization to deliver superior user experiences (a minimal cache-aside sketch follows this list).
- Security & Compliance: Apply secure coding practices, implement data encryption, and ensure compliance with industry security and privacy standards (e.g., PCI DSS, GDPR).
- Automation & Deployment: Utilize CI/CD pipelines, Docker/Kubernetes, and monitoring tools for automated testing, deployment, and production monitoring.
- Collaboration: Partner with front-end developers, product managers, and stakeholders to deliver user-centric, business-aligned solutions.
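A minimal cache-aside sketch of the caching pattern named under Performance Optimization, using redis-py; the connection details, key scheme, TTL, and database stub are hypothetical.

```python
# Minimal cache-aside sketch with redis-py: check the cache, fall back to
# the database, then populate the cache with a TTL.
import json
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def load_profile_from_db(user_id: int) -> dict:
    return {"id": user_id, "name": "demo user"}  # placeholder for a real query

def get_profile(user_id: int) -> dict:
    key = f"profile:{user_id}"
    cached = r.get(key)
    if cached is not None:
        return json.loads(cached)           # cache hit
    profile = load_profile_from_db(user_id)
    r.setex(key, 300, json.dumps(profile))  # cache for 5 minutes
    return profile
```

The TTL bounds staleness; writes would typically invalidate or overwrite the key.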
Requirements
Education
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
Technical Expertise
- 4+ years of hands-on experience with Python frameworks (Django, Flask, FastAPI).
- Proficiency in RESTful APIs, GraphQL, and asynchronous programming.
- Strong knowledge of SQL/NoSQL databases (PostgreSQL, MongoDB) and big data tools (Spark, Kafka).
- Familiarity with Kibana, Grafana, Prometheus for monitoring and visualization.
- Experience with AWS, Azure, or Google Cloud, containerization (Docker, Kubernetes), and CI/CD tools (Jenkins, GitLab CI).
- Working knowledge of testing tools: PyTest, Selenium, SonarQube.
- Experience with API integrations, booking flows, and payment gateway integrations (travel domain knowledge is a plus, but not mandatory).
Soft Skills
- Strong problem-solving and analytical skills.
- Excellent communication, presentation, and teamwork abilities.
- Proactive, ownership-driven mindset with the ability to perform under pressure.
Job Description :
We are seeking a talented and experienced Full Stack Developer with 6+ years of experience to join our dynamic team in Hyderabad. The ideal candidate will have a passion for building scalable and efficient web applications, a strong understanding of modern frameworks and technologies, and a keen eye for user experience and design.
Key Responsibilities :
- Design, develop, and maintain web-based applications using React JS, NodeJS and other modern frameworks.
- Develop hybrid mobile applications and responsive web interfaces using Bootstrap and JavaScript.
- Build and optimize back-end services with frameworks such as Express.js or Restify.
- Work with SQL databases, including schema design and query optimization.
- Utilize ORM tools like Sequelize for database management.
- Implement real-time communication features and ensure browser compatibility.
- Collaborate with cross-functional teams to participate in the product development lifecycle, including prototyping, testing, and deployment.
- Adapt to and learn alternative technologies based on project requirements.
Required Skills & Experience :
- 6+ years of experience in full-stack web development.
- Proficient in Angular, NodeJS, ReactJS, JavaScript, and TypeScript
- Strong experience with Express.js or Restify frameworks.
- Solid understanding of SQL databases and ORM tools like Sequelize.
- Knowledge of responsive design principles and hands-on experience in developing responsive web applications.
- Familiarity with React Native for mobile development (a plus)
- Strong understanding of real-time communication technologies.
Additional Skills & Experience :
- Exposure to .NET
- Experience with NoSQL databases such as MongoDB or Cassandra.
- Awareness of internationalization (i18n) and the latest trends in UI/UX design.
- Familiarity with other JavaScript libraries/frameworks like VueJS.
- Hands-on experience with implementing payment gateways for different regions.
- Excellent facilitation, verbal, and written communication skills.
- Eagerness to contribute to functional and user experience design discussions.
Education
B.Tech/M.Tech in CSE/IT/ECE
🚀 Hiring: Tableau Developer
⭐ Experience: 5+ Years
📍 Location: Pune, Gurgaon, Bangalore, Chennai, Hyderabad
⭐ Work Mode:- Hybrid
⏱️ Notice Period: Immediate Joiners or 15 Days
(Only immediate joiners & candidates serving notice period)
We are looking for a skilled Tableau Developer to join our team. The ideal candidate will have hands-on experience in creating, maintaining, and optimizing dashboards, reports, and visualizations that enable data-driven decision-making across the organization.
⭐ Key Responsibilities:
✅Develop and maintain Tableau dashboards & reports
✅Translate business needs into data visualizations
✅Work with SQL & multiple data sources for insights
✅Optimize dashboards for performance & usability
✅Collaborate with stakeholders for BI solutions
We are seeking a skilled SQL Developer to join our team. This role serves as a key bridge between insurance operations and technical solutions, ensuring business requirements are accurately translated into efficient system functionality. The SQL Developer will play a critical part in maintaining and enhancing underwriting software products and system integrations—helping deliver reliable, high-quality solutions to clients in the insurtech space.
The ideal candidate possesses strong SQL expertise, advanced data mapping capabilities, and hands-on experience working with APIs, JSON, XML, and other data exchange formats. Experience with insurance technology platforms, such as ConceptOne or similar underwriting systems, is preferred. In this role, you will regularly develop, maintain, and troubleshoot stored procedures and functions, perform data validation, support integration efforts across multiple systems, and configure insurance workflows. You will work closely with business analysts, underwriters, and technical teams to ensure smooth product updates and continuous improvement of system functionality.
What We’re Looking For:
- 3+ years of experience in a technical, insurance, or insurtech-focused role
- Strong proficiency in writing SQL, including complex queries, stored procedures, and performance tuning
- Expertise in data mapping, data validation, and reporting
- Experience working with APIs, JSON, XML, and system-to-system integrations
- Strong analytical and problem-solving skills with the ability to troubleshoot and optimize complex workflows
- Clear and effective communication skills, able to translate technical concepts for non-technical stakeholders
- Ability to work independently and manage multiple tasks in a fast-paced environment
- Keen attention to detail and commitment to delivering accurate, high-quality results
Bonus:
- Hands-on experience with underwriting or policy administration systems (e.g., ConceptOne or similar platforms)
- Familiarity with core insurance processes, such as policy issuance, endorsements, raters, claims, and reporting
- Experience with U.S. P&C (Property & Casualty) insurance
What You’ll Be Doing:
- Develop and optimize SQL stored procedures, functions, and triggers to support underwriting and compliance requirements (see the sketch after this list)
- Create and maintain reports, quote covers, and validations; map and configure forms, raters, and system workflows to ensure accurate data processing
- Set up, troubleshoot, and optimize underwriting platforms (ConceptOne/others) for performance and accuracy
- Manage integrations with APIs, JSON, and XML to connect external services and streamline data exchange
- Collaborate with BAs, QAs, and Developers to translate requirements, test outputs, and resolve issues
- Provide technical support and training to internal teams and clients to ensure effective system usage
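To ground the SQL expectations above, a minimal sketch of a validation-style query with a CTE and a join, run against an in-memory SQLite database so it is self-contained; the schema, data, and tolerance are hypothetical, not a real underwriting model.

```python
# Minimal sketch: flag policies whose endorsement totals drift from the
# billed premium, using a CTE + join. Schema and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE policy (id INTEGER PRIMARY KEY, premium REAL);
    CREATE TABLE endorsement (policy_id INTEGER, delta REAL);
    INSERT INTO policy VALUES (1, 1000.0), (2, 500.0);
    INSERT INTO endorsement VALUES (1, 1000.0), (2, 450.0);
""")

rows = conn.execute("""
    WITH totals AS (
        SELECT policy_id, SUM(delta) AS total
        FROM endorsement
        GROUP BY policy_id
    )
    SELECT p.id, p.premium, t.total
    FROM policy p JOIN totals t ON t.policy_id = p.id
    WHERE ABS(p.premium - t.total) > 0.01
""").fetchall()
print(rows)  # [(2, 500.0, 450.0)] -> policy 2 is out of balance
```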
Required Skills/Experience:
- 6+ years of experience in designing and developing enterprise and/or consumer-facing applications using technologies and frameworks like JavaScript, Node.js, ReactJS, Angular, SCSS, CSS, React Native
- 3+ years experience in leading teams (guide, design, track), taking responsibilities to deliver as per the agreed-upon schedules
- Hands-on experience with SQL and NoSQL databases
- Hands-on experience working in Linux OS
- Very good debugging and problem resolution experience
- Experience developing responsive web applications
- Very good communication (verbal and written) to interact with our customers
- Ability and interest to learn alternative technologies based on need
- Experienced in product development lifecycle (prototyping, hardening, testing etc.)
Additional Skills/Experience:
- Working experience with Python and NoSQL databases such as MongoDB, Cassandra
- Eagerness to participate in product functional and user experience designs
- Experience in AI, ML, NLP, and Predictive Analytics domains
- Familiarity with i18n, latest trends in UI and UX designs
- Experience with implementation of payment gateways applicable in different countries
- Experience with CI/CD, Jenkins, Nginx
· 5 years of experience as a Product Specialist, Business Analyst, or any other occupation/title providing experience with business process and data analysis
· Proven experience with and understanding of relational databases, and ability to construct basic to intermediate query logic
· 2 years of experience in asset management and/or fintech domain in R&D or product capacity
Responsibilities
• Enhance and maintain our main web application, ASCLOGS, built on ASP.NET MVC (.NET Framework 4.8) with features such as authentication, PDF generation via IronPdf, audit logging with log4net, Twilio SMS integration, and data access through PetaPoco.
• Support multiple companion projects including:
  • CopyForm, an MVC tool that copies form templates between SQL Server databases using AJAX and PetaPoco data access.
  • SQLImportApp, a WinForms importer leveraging ExcelDataReader and Z.Dapper.Plus for bulk inserts.
  • DEAVerification, a WinForms app automating data retrieval via Selenium WebDriver and storing results with PetaPoco.
  • UniversalScrapperAPI, an ASP.NET Web API that scrapes licensing information using Selenium and logs results with log4net.
  • HL7DocAssistantSync, a VB.NET library for HL7 message processing and PDF generation with PdfSharp.
  • ChatGPT Implementation, a .NET 8 Web API example showing how we integrate with OpenAI's ChatGPT service.
  • S3MicroService, a .NET 8 Web API using AWS SDK packages (AWSSDK.S3, AWSSDK.Extensions.NETCore.Setup) and Entity Framework Core for storage.
• Maintain PowerShell utilities used for onboarding tasks such as document-to-PDF conversion and CSV generation.
• Review existing code to improve reliability, enhance testability, and refactor large code files (for example, BusinessLayer/BusinessLayer.cs is roughly 28k lines).
• Work closely with stakeholders to gather requirements for new features and ensure compatibility with our SQL Server backend.
• Assist in modernizing legacy components and implementing best practices for security, error handling, logging, and deployment automation.
Required Skills
• 10+ years of development experience
• Extensive experience with C# and the .NET ecosystem, including both legacy .NET Framework and modern .NET Core / .NET 8.
• Solid understanding of ASP.NET MVC, Web API, and Windows Forms development.
• Familiarity with PetaPoco, Entity Framework Core, and SQL Server.
• Experience integrating third-party services such as Twilio, OpenAI, and Selenium WebDriver.
• Ability to write and troubleshoot PowerShell scripts for automation tasks.
• Comfortable navigating large codebases and improving code quality through refactoring, unit testing, and documentation.
• Proficiency with version control (Git) and the Visual Studio toolchain.
Preferred Skills
• Background in healthcare or regulated industries, since many applications manage sensitive data (e.g., DEA verification, HL7 messaging).
• Knowledge of PDF generation libraries (IronPdf, PdfSharp) and logging frameworks (log4net).
• Experience with CI/CD pipelines for .NET applications.
• Nice to have: unit-testing frameworks (e.g., NUnit, MSTest).
Location & Work Environment
This role requires working with Visual Studio on Windows for the .NET Framework solutions and .NET 8 projects. Experience with IIS or IIS Express is helpful for local development and testing.
- 5-10 years of experience in ETL Testing, Snowflake, and DWH concepts.
- Strong SQL knowledge & debugging skills are a must.
- Experience in Azure and Snowflake testing is a plus
- Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus
- Strong data warehousing concepts; ETL tools such as Talend Cloud Data Integration and Pentaho/Kettle
- Experience in JIRA and the Xray defect management tool is good to have.
- Exposure to financial domain knowledge is considered a plus
- Testing data readiness (data quality) and addressing code or data issues
- Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions
- Demonstrated strong collaborative experience across regions (APAC, EMEA, and NA) to effectively and efficiently identify root causes of code/data issues and arrive at permanent solutions
- Prior experience with State Street and Charles River Development (CRD) considered a plus
- Experience in tools such as PowerPoint, Excel, SQL
- Exposure to Third party data providers such as Bloomberg, Reuters, MSCI and other Rating agencies is a plus
Key Attributes include:
- Team player with professional and positive approach
- Creative, innovative and able to think outside of the box
- Strong attention to detail during root cause analysis and defect issue resolution
- Self-motivated & self-sufficient
- Effective communicator both written and verbal
- Brings a high level of energy with enthusiasm to generate excitement and motivate the team
- Able to work under pressure with tight deadlines and/or multiple projects
- Experience in negotiation and conflict resolution
🔍 Job Description:
We are looking for an experienced and highly skilled Technical Lead to guide the development and enhancement of a large-scale Data Observability solution built on AWS. This platform is pivotal in delivering monitoring, reporting, and actionable insights across the client's data landscape.
The Technical Lead will drive end-to-end feature delivery, mentor junior engineers, and uphold engineering best practices. The position reports to the Programme Technical Lead / Architect and involves close collaboration to align on platform vision, technical priorities, and success KPIs.
🎯 Key Responsibilities:
- Lead the design, development, and delivery of features for the data observability solution.
- Mentor and guide junior engineers, promoting technical growth and engineering excellence.
- Collaborate with the architect to align on platform roadmap, vision, and success metrics.
- Ensure high quality, scalability, and performance in data engineering solutions.
- Contribute to code reviews, architecture discussions, and operational readiness.
🔧 Primary Must-Have Skills (Non-Negotiable):
- 5+ years in Data Engineering or Software Engineering roles.
- 3+ years in a technical team or squad leadership capacity.
- Deep expertise in AWS Data Services: Glue, EMR, Kinesis, Lambda, Athena, S3 (see the Athena sketch after this list).
- Advanced programming experience with PySpark, Python, and SQL.
- Proven experience in building scalable, production-grade data pipelines on cloud platforms.
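As an illustration of the AWS services listed above, a minimal boto3 sketch that runs an Athena query over S3 data and polls for completion; the region, database, query, and output bucket are hypothetical placeholders.

```python
# Minimal boto3/Athena sketch: submit a query, poll, print results.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")

run = athena.start_query_execution(
    QueryString="SELECT event_type, COUNT(*) FROM events GROUP BY event_type",
    QueryExecutionContext={"Database": "observability"},          # hypothetical
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
qid = run["QueryExecutionId"]

while True:
    status = athena.get_query_execution(QueryExecutionId=qid)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    for row in athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])
```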
🚀 Hiring: Manual Tester
⭐ Experience: 5+ Years
📍 Location: Pan India
⭐ Work Mode:- Hybrid
⏱️ Notice Period: Immediate Joiners
(Only immediate joiners & candidates serving notice period)
Must-Have Skills:
✅5+ years of experience in Manual Testing
✅Solid experience in ETL, Database, and Report Testing
✅Strong expertise in SQL queries, RDBMS concepts, and DML/DDL operations
✅Working knowledge of BI tools such as Power BI
✅Ability to write effective Test Cases and Test Scenarios

Product company for financial operations automation platform
Mandatory Criteria (cannot be neglected during screening):
- Candidates must have project management experience.
- Strong hands-on experience with SQL, including the ability to write, optimize, and debug complex queries (joins, CTEs, subqueries).
- Must have experience with the Treasury module.
- Should have a basic understanding of accounting principles and financial workflows.
- 3+ years of implementation experience is required.
- Looking for candidates from fintech companies ONLY (candidates should have strong knowledge of fintech products, financial workflows, and integrations).
- Candidates should have hands-on experience with tools such as Jira, Confluence, Excel, and project management platforms.
- Candidates should have experience managing multi-stakeholder projects from scratch.
Position Overview
We are looking for an experienced Implementation Lead to drive the onboarding and implementation of our platform for new and existing fintech clients. This role is ideal for someone with a strong understanding of financial systems, implementation methodologies, and client management. You’ll collaborate closely with product, engineering, and customer success teams to ensure timely, accurate, and seamless deployments.
Key Responsibilities
- Lead end-to-end implementation projects for enterprise fintech clients
- Translate client requirements into detailed implementation plans and configure solutions accordingly.
- Write and optimize complex SQL queries for data analysis, validation, and integration
- Oversee ETL processes – extract, transform, and load financial data across systems (a minimal reconciliation sketch follows this list)
- Collaborate with cross-functional teams including Product, Engineering, and Support
- Ensure timely, high-quality delivery across multiple stakeholders and client touchpoints
- Document processes, client requirements, and integration flows in detail.
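A minimal sketch of the kind of load validation the ETL item above implies: reconciling per-entity row counts between source and target after a run. The entities and counts are hypothetical.

```python
# Minimal post-load reconciliation sketch: compare source vs. target counts.
source_counts = {"invoices": 1200, "payments": 980}   # hypothetical
target_counts = {"invoices": 1200, "payments": 975}   # hypothetical

mismatches = {
    entity: (src, target_counts.get(entity, 0))
    for entity, src in source_counts.items()
    if src != target_counts.get(entity, 0)
}
for entity, (src, tgt) in mismatches.items():
    print(f"{entity}: source={src} target={tgt} missing={src - tgt}")
# payments: source=980 target=975 missing=5
```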
Required Qualifications
- Bachelor’s degree in Finance, Business Administration, Information Systems, or related field
- 3+ years of hands-on implementation/project management experience
- Proven experience delivering projects in Fintech, SaaS, or ERP environments
- Strong understanding of accounting principles and financial workflows
- Hands-on SQL experience, including the ability to write and debug complex queries (joins, CTEs, subqueries)
- Experience working with ETL pipelines or data migration processes
- Proficiency in tools like Jira, Confluence, Excel, and project tracking systems
- Strong communication and stakeholder management skills
- Ability to manage multiple projects simultaneously and drive client success
Preferred Qualifications
- Prior experience implementing financial automation tools (e.g., SAP, Oracle, Anaplan, Blackline)
- Familiarity with API integrations and basic data mapping
- Experience in agile/scrum-based implementation environments
- Exposure to reconciliation, book closure, AR/AP, and reporting systems
- PMP, CSM, or similar certifications
Skills & Competencies
Functional Skills
- Financial process knowledge (e.g., reconciliation, accounting, reporting)
- Business analysis and solutioning
- Client onboarding and training
- UAT coordination
- Documentation and SOP creation
Project Skills
- Project planning and risk management
- Task prioritization and resource coordination
- KPI tracking and stakeholder reporting
Soft Skills
- Cross-functional collaboration
- Communication with technical and non-technical teams
- Attention to detail and customer empathy
- Conflict resolution and crisis management
What We Offer
- An opportunity to shape fintech implementations across fast-growing companies
- Work in a dynamic environment with cross-functional experts
- Competitive compensation and rapid career growth
- A collaborative and meritocratic culture
Job Title: PostgreSQL Database Administrator
Experience: 6–8 Years
Work Mode: Hybrid
Locations: Hyderabad / Pune
Joiners: Only immediate joiners & candidates who have completed notice period
Required Skills
- Strong hands-on experience in PostgreSQL administration (6+ years).
- Excellent understanding of SQL and query optimization techniques.
- Deep knowledge of database services, architecture, and internals.
- Experience in performance tuning at both DB and OS levels.
- Familiarity with DataGuard or similar high-availability solutions.
- Strong experience in job scheduling and automation.
- Comfortable with installing, configuring, and upgrading PostgreSQL.
- Basic to intermediate knowledge of Linux system administration.
- Hands-on experience with shell scripting for automation and monitoring tasks.
Key Responsibilities
- Administer and maintain PostgreSQL databases with 6+ years of hands-on experience.
- Write and optimize complex SQL queries for performance and scalability.
- Manage database storage structures and ensure optimal disk usage and performance.
- Monitor, analyze, and resolve database performance issues using tools and logs.
- Perform database tuning, configuration adjustments, and query optimization (see the EXPLAIN ANALYZE sketch after this list).
- Plan, schedule, and automate jobs using cron or other job scheduling tools at DB and OS levels.
- Install and upgrade PostgreSQL database software to new versions as required.
- Manage high availability and disaster recovery setups, including replication and DataGuard administration (or equivalent techniques).
- Perform regular database backups and restorations to ensure data integrity and availability.
- Apply security patches and updates on time.
- Collaborate with developers for schema design, stored procedures, and access privileges.
- Document configurations, processes, and performance tuning results.
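A minimal sketch of the query-tuning loop described above: capture an EXPLAIN (ANALYZE) plan via psycopg2 and read it for scan choices. The DSN and query are hypothetical.

```python
# Minimal query-tuning sketch: fetch a PostgreSQL execution plan.
import psycopg2

conn = psycopg2.connect("dbname=appdb user=dba host=localhost")  # hypothetical DSN
with conn.cursor() as cur:
    cur.execute(
        "EXPLAIN (ANALYZE, BUFFERS) SELECT * FROM orders WHERE customer_id = %s",
        (42,),
    )
    for (line,) in cur.fetchall():
        print(line)  # a Seq Scan on a large table suggests a missing index
conn.close()
```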
🚀 Hiring: Postgres DBA at Deqode
⭐ Experience: 6+ Years
📍 Location: Pune & Hyderabad
⭐ Work Mode:- Hybrid
⏱️ Notice Period: Immediate Joiners
(Only immediate joiners & candidates serving notice period)
Looking for an experienced Postgres DBA with:-
✅ 6+ years in Postgres & strong SQL skills
✅ Good understanding of database services & storage management
✅ Performance tuning & monitoring expertise
✅ Knowledge of Dataguard admin, backups, upgrades
✅ Basic Linux admin & shell scripting
Immediate Hiring for Business Analyst
Position: Business Analyst
Experience: 5-8 Years
Location: Hyderabad
Job Summary:
We are seeking a motivated and detail-oriented Business Analyst with 5 years of experience in the Travel domain. The ideal candidate will have a strong understanding of the travel industry, including airlines, travel agencies, and online booking systems. You will work closely with cross-functional teams to gather business requirements, analyze processes, and deliver solutions that improve customer experience and operational efficiency.
Key Responsibilities:
- Requirement Gathering & Analysis: Collaborate with stakeholders to gather, document, and analyze business requirements, ensuring alignment with business goals.
- Process Improvement: Identify opportunities for process improvement and optimization in travel booking, ticketing, and customer support systems.
- Stakeholder Communication: Act as the bridge between the business stakeholders and technical teams, ensuring clear communication of requirements, timelines, and deliverables.
- Solution Design: Participate in the design and development of solutions, collaborating with IT and development teams to ensure business needs are met.
- Data Analysis: Analyze data related to customer journeys, bookings, and cancellations to identify trends and insights for decision-making.
- Documentation: Prepare detailed documentation including business requirements documents (BRD), user stories, process flows, and functional specifications.
- Testing & Validation: Support testing teams during User Acceptance Testing (UAT) to ensure solutions meet business needs, and facilitate issue resolution.
- Market Research: Stay up to date with travel industry trends, customer preferences, and competitor offerings to ensure innovative solutions are delivered.
Qualifications & Skills:
- Education: Bachelor’s degree in Business Administration, Information Technology, or a related field.
- Experience:
- 5 years of experience as a Business Analyst in the travel industry.
- Hands-on experience in working with travel booking systems (GDS, OTA) is highly preferred.
- Domain Knowledge:
- Strong understanding of the travel industry, including booking engines, reservations, ticketing, cancellations, and customer support.
- Familiarity with industry-specific regulations and best practices.
- Analytical Skills: Excellent problem-solving skills with the ability to analyze complex data and business processes.
- Technical Skills:
- Proficiency in Microsoft Office (Word, Excel, PowerPoint).
- Knowledge of SQL or data visualization tools (Power BI, Tableau) is a plus.
- Communication: Strong verbal and written communication skills with the ability to convey complex information clearly.
- Attention to Detail: Strong focus on accuracy and quality of work, ensuring that solutions meet business requirements.
Preferred:
- Prior experience with Agile methodologies.
- Certification in Business Analysis (CBAP or similar).
- A minimum of 4-10 years of experience in data integration/orchestration services, service architecture, and providing data-driven solutions for client requirements
- Experience with Microsoft Azure cloud, Snowflake SQL, and database query/performance tuning.
- Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus
- Strong data warehousing concepts and ETL tools such as Talend Cloud Data Integration are a must
- Exposure to the financial domain knowledge is considered a plus.
- Experience with managed cloud services such as GitHub source control and MS Azure/DevOps is considered a plus.
- Prior experience with State Street and Charles River Development (CRD) considered a plus.
- Experience in tools such as Visio, PowerPoint, Excel.
- Exposure to Third party data providers such as Bloomberg, Reuters, MSCI and other Rating agencies is a plus.
- Strong SQL knowledge and debugging skills is a must.
Job description
🔧 Key Responsibilities:
- Design and implement robust backend services using Node.js.
- Develop and maintain RESTful APIs to support front-end applications and third-party integrations
- Manage and optimize SQL/NoSQL databases (e.g., PostgreSQL, MongoDB, Snowflake)
- Collaborate with front-end developers to ensure seamless integration and data flow
- Implement caching, logging, and monitoring strategies for performance and reliability
- Ensure application security, scalability, and maintainability
- Participate in code reviews, architecture discussions, and agile ceremonies
✅ Required Skills:
- Proficiency in backend programming languages (Node.js, Java, .NET Core)
- Experience with API development and tools like Postman, Swagger
- Strong understanding of database design and query optimization
- Familiarity with microservices architecture and containerization (Docker, Kubernetes)
- Knowledge of cloud platforms (Azure, AWS) and CI/CD pipelines.
About Cognida.ai:
Our Purpose is to boost your competitive advantage using AI and Analytics.
We Deliver tangible business impact with data-driven insights powered by AI. Drive revenue growth, increase profitability and improve operational efficiencies.
We Are technologists with keen business acumen - Forever curious, always on the front lines of technological advancements. Applying our latest learnings, and tools to solve your everyday business challenges.
We Believe the power of AI should not be the exclusive preserve of the few. Every business, regardless of its size or sector deserves the opportunity to harness the power of AI to make better decisions and drive business value.
We See a world where our AI and Analytics solutions democratise decision intelligence for all businesses. With Cognida.ai, our motto is ‘No enterprise left behind’.
Position: Python Fullstack Architect
Location: Hyderabad
Job Summary
We’re seeking a seasoned Python Fullstack Architect with 15+ years of experience to lead solution design, mentor teams, and drive technical excellence across projects. You'll work closely with stakeholders, contribute to architecture governance, and integrate modern technologies across the stack.
Key Responsibilities
- Design and review Python-based fullstack solution architectures.
- Guide development teams on best practices, modern frameworks, and cloud-native patterns.
- Engage with clients to translate business needs into scalable technical solutions.
- Stay current with tech trends and contribute to internal innovation initiatives.
Required Skills
- Strong expertise in Python (Django/Flask/FastAPI) and frontend frameworks (React, Angular, etc.).
- Cloud experience (AWS, Azure, or GCP) and DevOps/CI-CD setup.
- Familiarity with enterprise tools: RabbitMQ, Kafka, OAuth2, PostgreSQL, MongoDB.
- Solid understanding of microservices, API design, batch/stream processing.
- Strong leadership, mentoring, and architectural problem-solving skills.
Position Summary:
As a CRM ETL Developer, you will be responsible for the analysis, transformation, and integration of data from legacy and external systems into the CRM application. This includes developing ETL/ELT workflows, ensuring data quality through cleansing and survivorship rules, and supporting daily production loads. You will work in an Agile environment and play a vital role in building scalable, high-quality data integration solutions.
Key Responsibilities:
- Analyze data from legacy and external systems; develop ETL/ELT pipelines to ingest and process data.
- Cleanse, transform, and apply survivorship rules before loading into the CRM platform (a minimal survivorship sketch follows this list).
- Monitor, support, and troubleshoot production data loads (Tier 1 & Tier 2 support).
- Contribute to solution design, development, integration, and scaling of new/existing systems.
- Promote and implement best practices in data integration, performance tuning, and Agile development.
- Lead or support design reviews, technical documentation, and mentoring of junior developers.
- Collaborate with business analysts, QA, and cross-functional teams to resolve defects and clarify requirements.
- Deliver working solutions via quick POCs or prototypes for business scenarios.
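To illustrate the survivorship step above, a minimal pandas sketch that keeps, per customer, the most recent non-null value of each attribute across source records; the column names and data are hypothetical, and real rules often also weight source-system trust.

```python
# Minimal survivorship sketch: most recent non-null value wins per attribute.
import pandas as pd

records = pd.DataFrame({
    "customer_id": [1, 1, 2],
    "email": ["old@x.com", None, "b@y.com"],
    "phone": [None, "555-0100", None],
    "updated_at": pd.to_datetime(["2024-01-01", "2024-06-01", "2024-03-01"]),
})

golden = (
    records.sort_values("updated_at")          # oldest first
    .groupby("customer_id")[["email", "phone"]]
    .last()                                    # last non-null value per column
)
print(golden)
# customer 1 -> email old@x.com (only non-null), phone 555-0100 (latest)
```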
Technical Skills:
- ETL/ELT Tools: 5+ years of hands-on experience in ETL processes using Siebel EIM.
- Programming & Databases: Strong SQL & PL/SQL development; experience with Oracle and/or SQL Server.
- Data Integration: Proven experience in integrating disparate data systems.
- Data Modelling: Good understanding of relational, dimensional modelling, and data warehousing concepts.
- Performance Tuning: Skilled in application and SQL query performance optimization.
- CRM Systems: Familiarity with Siebel CRM, Siebel Data Model, and Oracle SOA Suite is a plus.
- DevOps & Agile: Strong knowledge of DevOps pipelines and Agile methodologies.
- Documentation: Ability to write clear technical design documents and test cases.
Soft Skills & Attributes:
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal abilities.
- Experience working with cross-functional, globally distributed teams.
- Proactive mindset and eagerness to learn new technologies.
- Detail-oriented with a focus on reliability and accuracy.
Preferred Qualifications:
- Bachelor’s degree in Computer Science, Information Systems, or a related field.
- Experience in Tier 1 & Tier 2 application support roles.
- Exposure to real-time data integration systems is an advantage.
Position : Senior Data Analyst
Experience Required : 5 to 8 Years
Location : Hyderabad or Bangalore (Work Mode: Hybrid – 3 Days WFO)
Shift Timing : 11:00 AM – 8:00 PM IST
Notice Period : Immediate Joiners Only
Job Summary :
We are seeking a highly analytical and experienced Senior Data Analyst to lead complex data-driven initiatives that influence key business decisions.
The ideal candidate will have a strong foundation in data analytics, cloud platforms, and BI tools, along with the ability to communicate findings effectively across cross-functional teams. This role also involves mentoring junior analysts and collaborating closely with business and tech teams.
Key Responsibilities :
- Lead the design, execution, and delivery of advanced data analysis projects.
- Collaborate with stakeholders to identify KPIs, define requirements, and develop actionable insights.
- Create and maintain interactive dashboards, reports, and visualizations.
- Perform root cause analysis and uncover meaningful patterns from large datasets.
- Present analytical findings to senior leaders and non-technical audiences.
- Maintain data integrity, quality, and governance in all reporting and analytics solutions.
- Mentor junior analysts and support their professional development.
- Coordinate with data engineering and IT teams to optimize data pipelines and infrastructure.
Must-Have Skills :
- Strong proficiency in SQL and Databricks (an illustrative sketch follows this list)
- Hands-on experience with cloud data platforms (AWS, Azure, or GCP)
- Sound understanding of data warehousing concepts and BI best practices
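As a hypothetical illustration of the SQL/Databricks work involved, the PySpark sketch below rolls a fictional orders table up into monthly KPIs. The table and column names are assumptions, not a real schema.

```python
# Hypothetical KPI rollup in PySpark, the kind of Databricks work this
# role involves. "sales.orders" and its columns are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kpi-demo").getOrCreate()

orders = spark.table("sales.orders")  # assumes such a table exists

monthly_revenue = (
    orders.withColumn("month", F.date_trunc("month", F.col("order_date")))
          .groupBy("month")
          .agg(F.sum("amount").alias("revenue"),
               F.countDistinct("customer_id").alias("active_customers"))
          .orderBy("month")
)
monthly_revenue.show()
```

The same rollup could be expressed in plain SQL; a result like this would typically feed the dashboards and KPI reports described in the responsibilities above.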
Good-to-Have :
- Deeper, AWS-specific experience (beyond the general cloud platform exposure listed above)
- Exposure to machine learning and predictive analytics
- Industry-specific analytics experience (preferred but not mandatory)
🚀 Blitz Drive : .NET Full Stack Developer – In-Person Interviews on 18th June 2025 | Hyderabad
We are conducting a Blitz Hiring Drive for the position of .NET Full Stack Developer on 18th June 2025 (Wednesday) at Hyderabad. This will be an in-person interview process.
🔍 Job Details :
- Position : .NET Full Stack Developer
- Experience : 3 to 8 Years
- Number of Positions : 6
- Job Location : Hyderabad (Onsite – In-Person Interview)
- Interview Date : 18th June 2025
- Notice Period : Immediate to 15 days preferred
✅ Mandatory Skills :
.NET Core, Angular (v8+), SQL (complex queries, stored procedures), REST API development, Entity Framework, LINQ, RxJS, and Dependency Injection.
🛠️ Technical Skill Requirements :
- Frontend : Angular (v8+), RxJS, TypeScript, Bootstrap 5, Reactive/Template-Driven Forms, Telerik Kendo UI, Nx monorepo
- Backend : .NET Core, REST APIs, Entity Framework, LINQ, Middleware, Authentication, Dependency Injection, OOP
- Database : SQL Server, Complex Queries, Joins, Stored Procedures, Performance Tuning
- Good to Have : Git, Cloud Basics (Azure/AWS), CI/CD understanding
Job Title : Cognos BI Developer
Experience : 6+ Years
Location : Bangalore / Hyderabad (Hybrid)
Notice Period : Immediate Joiners Preferred (Candidates serving notice with 10–15 days left can be considered)
Interview Mode : Virtual
Job Description :
We are seeking an experienced Cognos BI Developer with strong data modeling, dashboarding, and reporting expertise to join our growing team. The ideal candidate should have a solid background in business intelligence, data visualization, and performance analysis, and be comfortable working in a hybrid setup from Bangalore or Hyderabad.
Mandatory Skills :
Cognos BI, Framework Manager, Cognos Dashboarding, SQL, Data Modeling, Report Development (charts, lists, cross tabs, maps), ETL Concepts, KPIs, Drill-through, Macros, Prompts, Filters, Calculations.
Key Responsibilities :
- Understand business requirements in the BI context and design data models using Framework Manager to transform raw data into meaningful insights.
- Develop interactive dashboards and reports using Cognos Dashboard.
- Identify and define KPIs and create reports to monitor them effectively.
- Analyze data and present actionable insights to support business decision-making.
- Translate business requirements into technical specifications and determine timelines for execution.
- Design and develop models in Framework Manager, publish packages, manage security, and create reports based on these packages.
- Develop various types of reports, including charts, lists, cross tabs, and maps, and design dashboards combining multiple reports.
- Implement reports using macros, prompts, filters, and calculations.
- Perform data warehouse development activities and ensure seamless data flow.
- Write and optimize SQL queries to investigate data and resolve performance issues.
- Utilize Cognos features such as master-detail reports, drill-throughs, bookmarks, and page sets.
- Analyze and improve ETL processes to enhance data integration.
- Apply technical enhancements to existing BI systems to improve their performance and usability.
- Apply a solid understanding of database fundamentals, including relational and multidimensional database design.
- Bring hands-on experience with Cognos Data Modules (data modeling) and dashboarding.
Job Title : Python Data Engineer
Experience : 4+ Years
Location : Bangalore / Hyderabad (On-site)
Job Summary :
We are seeking a skilled Python Data Engineer to work on cloud-native data platforms and backend services.
The role involves building scalable APIs, working with diverse data systems, and deploying containerized services using modern cloud infrastructure.
Mandatory Skills : Python, AWS, RESTful APIs, Microservices, SQL/PostgreSQL/NoSQL, Docker, Kubernetes, CI/CD (Jenkins/GitLab CI/AWS CodePipeline)
Key Responsibilities :
- Design, develop, and maintain backend systems using Python.
- Build and manage RESTful APIs and microservices architectures.
- Work extensively with AWS cloud services for deployment and data storage.
- Implement and manage SQL, PostgreSQL, and NoSQL databases.
- Containerize applications using Docker and orchestrate with Kubernetes.
- Set up and maintain CI/CD pipelines using Jenkins, GitLab CI, or AWS CodePipeline.
- Collaborate with teams to ensure scalable and reliable software delivery.
- Troubleshoot and optimize application performance.
Must-Have Skills :
- 4+ years of hands-on experience in Python backend development.
- Strong experience with AWS cloud infrastructure.
- Proficiency in building microservices and APIs (see the sketch at the end of this listing).
- Good knowledge of relational and NoSQL databases.
- Experience with Docker and Kubernetes.
- Familiarity with CI/CD tools and DevOps processes.
- Strong problem-solving and collaboration skills.
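Here is the sketch referenced above: a minimal, hypothetical Python ingestion script combining the stack in this listing, pulling a CSV from S3 with boto3 and loading it into PostgreSQL with psycopg2. The bucket, key, table, and connection details are placeholders, not real resources.

```python
# Hypothetical ingestion sketch: read a CSV from S3 and load its rows
# into PostgreSQL. All resource names below are placeholders.
import csv
import io

import boto3
import psycopg2

s3 = boto3.client("s3")
obj = s3.get_object(Bucket="example-bucket", Key="events/2025-06-01.csv")
data = obj["Body"].read().decode("utf-8")
rows = list(csv.reader(io.StringIO(data)))  # assumes two columns per row

conn = psycopg2.connect("dbname=events user=etl host=localhost")
with conn, conn.cursor() as cur:  # commits on success, rolls back on error
    cur.executemany(
        "INSERT INTO raw_events (event_id, payload) VALUES (%s, %s)",
        rows,
    )
conn.close()
```

In the role itself, a job like this would be containerized with Docker and wired into one of the CI/CD pipelines named in the mandatory skills.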