50+ SQL Jobs in Bangalore (Bengaluru)
Apply to 50+ SQL jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest SQL job opportunities across top companies like Google, Amazon, and Adobe.


Role overview
- Overall 5 to 7 years of experience; Node.js experience is a must.
- At least 3 years of experience, or a couple of large-scale products delivered, on microservices.
- Strong design skills in microservices and AWS platform infrastructure.
- Excellent programming skills in Python, Node.js, and Java.
- Hands-on development of REST APIs.
- Good understanding of the nuances of distributed systems, scalability, and availability.
What would you do here
- Work as a Backend Developer on cloud web applications.
- Be part of the team working on various types of web applications related to mortgage finance.
- Solve the real-world problem of designing, implementing, and helping develop a new enterprise-class product from the ground up.
- Bring expertise in AWS cloud infrastructure and microservices architecture around the AWS service stack, including Lambda, SQS, SNS, and MySQL databases, along with Docker and containerized solutions/applications.
- Be experienced in relational and NoSQL databases and scalable design.
- Solve challenging problems by developing elegant, maintainable code.
- Deliver rapid iterations of software based on user feedback and metrics.
- Help the team make key decisions on our product and technology direction.
- You actively contribute to the adoption of frameworks, standards, and new technologies.
Job Title : Oracle Analytics Cloud (OAC) / Fusion Data Intelligence (FDI) Specialist
Experience : 3 to 8 years
Location : All USI locations – Hyderabad, Bengaluru, Mumbai, Gurugram (preferred) and Pune, Chennai, Kolkata
Work Mode : Hybrid only (2-3 days from office, or all 5 days from office)
Mandatory Skills : Oracle Analytics Cloud (OAC), Fusion Data Intelligence (FDI), RPD, OAC Reports, Data Visualizations, SQL, PL/SQL, Oracle Databases, ODI, Oracle Cloud Infrastructure (OCI), DevOps tools, Agile methodology.
Key Responsibilities :
- Design, develop, and maintain solutions using Oracle Analytics Cloud (OAC).
- Build and optimize complex RPD models, OAC reports, and data visualizations.
- Utilize SQL and PL/SQL for data querying and performance optimization (see the sketch after this list).
- Develop and manage applications hosted on Oracle Cloud Infrastructure (OCI).
- Support Oracle Cloud migrations, OBIEE upgrades, and integration projects.
- Collaborate with teams using the ODI (Oracle Data Integrator) tool for ETL processes.
- Implement cloud scripting using cURL for Oracle Cloud automation.
- Contribute to the design and implementation of Business Continuity and Disaster Recovery strategies for cloud applications.
- Participate in Agile development processes and DevOps practices including CI/CD and deployment orchestration.
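For illustration only, here is a minimal sketch of the SQL-querying side of this role using the python-oracledb driver; the connection details, table, and columns are invented placeholders, not details from the posting.

    import oracledb  # python-oracledb driver, assumed installed

    # Hypothetical connection details for illustration only.
    conn = oracledb.connect(user="analytics", password="***", dsn="dbhost/orclpdb1")
    with conn.cursor() as cur:
        # An aggregate query of the kind that typically backs an OAC report.
        cur.execute("""
            SELECT region, SUM(amount) AS total_sales
            FROM sales
            GROUP BY region
            ORDER BY total_sales DESC
        """)
        for region, total in cur:
            print(region, total)
    conn.close()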
Required Skills :
- Strong hands-on expertise in Oracle Analytics Cloud (OAC) and/or Fusion Data Intelligence (FDI).
- Deep understanding of data modeling, reporting, and visualization techniques.
- Proficiency in SQL, PL/SQL, and relational databases on Oracle.
- Familiarity with DevOps tools, version control, and deployment automation.
- Working knowledge of Oracle Cloud services, scripting, and monitoring.
Good to Have :
- Prior experience in OBIEE to OAC migrations.
- Exposure to data security models and cloud performance tuning.
- Certification in Oracle Cloud-related technologies.

Job Title : IBM Sterling Integrator Developer
Experience : 3 to 5 Years
Locations : Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, Pune
Employment Type : Full-Time
Job Description :
We are looking for a skilled IBM Sterling Integrator Developer with 3–5 years of experience to join our team across multiple locations.
The ideal candidate should have strong expertise in IBM Sterling and integration, along with scripting and database proficiency.
Key Responsibilities :
- Develop, configure, and maintain IBM Sterling Integrator solutions.
- Design and implement integration solutions using IBM Sterling.
- Collaborate with cross-functional teams to gather requirements and provide solutions.
- Work with custom languages and scripting to enhance and automate integration processes.
- Ensure optimal performance and security of integration systems.
Must-Have Skills :
- Hands-on experience with IBM Sterling Integrator and associated integration tools.
- Proficiency in at least one custom scripting language.
- Strong command over Shell scripting, Python, and SQL (mandatory).
- Good understanding of EDI standards and protocols is a plus.
Interview Process :
- 2 Rounds of Technical Interviews.
Additional Information :
- Open to candidates from Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, and Pune.

Job Summary:
As an AWS Data Engineer, you will be responsible for designing, developing, and maintaining scalable, high-performance data pipelines using AWS services. With 6+ years of experience, you’ll collaborate closely with data architects, analysts, and business stakeholders to build reliable, secure, and cost-efficient data infrastructure across the organization.
Key Responsibilities:
- Design, develop, and manage scalable data pipelines using AWS Glue, Lambda, and other serverless technologies
- Implement ETL workflows and transformation logic using PySpark and Python on AWS Glue (see the sketch after this list)
- Leverage AWS Redshift for warehousing, performance tuning, and large-scale data queries
- Work with AWS DMS and RDS for database integration and migration
- Optimize data flows and system performance for speed and cost-effectiveness
- Deploy and manage infrastructure using AWS CloudFormation templates
- Collaborate with cross-functional teams to gather requirements and build robust data solutions
- Ensure data integrity, quality, and security across all systems and processes
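As a hedged sketch of the Glue/PySpark piece, the skeleton below follows the standard AWS Glue job structure; the S3 paths, column names, and filter are assumptions for illustration, and the awsglue libraries are only available inside a Glue environment.

    import sys
    from awsglue.utils import getResolvedOptions
    from awsglue.context import GlueContext
    from awsglue.job import Job
    from pyspark.context import SparkContext

    # Standard Glue job bootstrap.
    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext())
    spark = glue_context.spark_session
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Read raw CSV from S3, apply a simple transformation, write Parquet back.
    df = spark.read.csv("s3://example-bucket/raw/orders/", header=True, inferSchema=True)
    clean = df.dropDuplicates(["order_id"]).filter("amount > 0")
    clean.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")

    job.commit()

In production Glue jobs the reads and writes usually go through DynamicFrames and the Glue Data Catalog; plain Spark reads are used here only to keep the sketch self-contained.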
Required Skills & Experience:
- 6+ years of experience in Data Engineering with strong AWS expertise
- Proficient in Python and PySpark for data processing and ETL development
- Hands-on experience with AWS Glue, Lambda, DMS, RDS, and Redshift
- Strong SQL skills for building complex queries and performing data analysis
- Familiarity with AWS CloudFormation and infrastructure as code principles
- Good understanding of serverless architecture and cost-optimized design
- Ability to write clean, modular, and maintainable code
- Strong analytical thinking and problem-solving skills

- Strong Snowflake cloud database development experience.
- Knowledge of Spark and Databricks is desirable.
- Strong technical background in data modelling, database design, and optimization for data warehouses, specifically on column-oriented MPP architectures.
- Familiar with technologies relevant to data lakes, such as Snowflake.
- Candidates should have strong ETL and database design/modelling skills.
- Experience creating data pipelines.
- Strong SQL skills, debugging knowledge, and performance tuning experience.
- Experience with Databricks/Azure is an add-on/good to have.
- Experience working with global teams and global application environments.
- Strong understanding of SDLC methodologies, with a track record of high-quality deliverables and data quality; detailed technical design documentation desired.
Job Title : Senior Salesforce Marketing Cloud Developer
Experience : 5–8 years
Location : Bangalore (On-site)
Notice Period : Immediate to 15 Days Only
Job Description :
We are looking for a skilled Senior Salesforce Marketing Cloud Developer with 5 to 8 years of hands-on experience.
The ideal candidate will be responsible for designing and implementing personalized marketing journeys, automation workflows, and multi-channel campaigns using Salesforce Marketing Cloud (SFMC).
This role requires strong expertise in APIs, scripting, and data handling to build scalable, personalized customer experiences.
Must-Have Skills :
- Salesforce Marketing Cloud (Automation Studio, Journey Builder, Email Studio, Mobile Studio)
- API Integration
- Strong SQL skills
- AMPscript & JavaScript
Good-to-Have Skills :
- HTML & CSS
- Interaction Studio / Marketing Cloud Personalization

Contract Duration: 6 months, Contract-to-Hire
Job Description:
We are seeking an experienced and highly skilled Fullstack Developer with expertise in building and maintaining web applications. The ideal candidate will have a strong background in both frontend and backend technologies, with the ability to contribute to every aspect of the development lifecycle. This contract role offers the opportunity to work on cutting-edge projects in a fast-paced environment.
Key Responsibilities:
Develop, test, and deploy scalable and efficient web applications using React for the frontend and Node.js for the backend.
Design and manage relational databases with SQL and implement non-relational databases with MongoDB.
Collaborate with cross-functional teams to define, design, and ship new features.
Implement cloud-based solutions and optimize for performance, security, and scalability using AWS, GCP, or Azure.
Work with DevOps tools to ensure continuous integration and deployment (CI/CD).
Participate in code reviews, provide constructive feedback, and mentor junior developers when necessary.
Troubleshoot, debug, and optimize code to ensure high performance and reliability.
Ensure application security, data protection, and maintainability.
Required Skills & Experience:
5+ years of professional experience as a Fullstack Developer.
Strong proficiency with React and Node.js.
Solid experience in working with SQL (MySQL, PostgreSQL, etc.) and MongoDB.
Proven experience with cloud services like AWS, GCP, or Azure.
Experience with DevOps tools and CI/CD pipelines.
Strong understanding of web application architecture and modern design patterns.
Familiarity with RESTful APIs and integration practices.
Ability to work independently and in an agile team environment.
Desirable Skills:
Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes).
Experience with microservices architecture.
Familiarity with frontend testing frameworks (e.g., Jest, Mocha).
Strong problem-solving skills and a passion for building high-quality software.
Skill Name: ETL Automation Testing
Location: Bangalore, Chennai and Pune
Experience: 5+ Years
Required:
Experience in ETL Automation Testing
Strong experience in PySpark.
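A minimal sketch of what PySpark-based ETL test automation can look like, assuming Parquet outputs and an order_id key (both invented for this example):

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("etl-check").getOrCreate()

    # Placeholder paths for the pre- and post-ETL datasets.
    source = spark.read.parquet("s3://example-bucket/source/orders/")
    target = spark.read.parquet("s3://example-bucket/target/orders/")

    # Row-count parity and duplicate-key checks; real suites add column-level diffs.
    assert source.count() == target.count(), "row counts differ"
    dupes = target.groupBy("order_id").count().filter(F.col("count") > 1)
    assert dupes.count() == 0, "duplicate keys in target"
    print("reconciliation passed")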

Required Skills:
- Hands-on experience with Databricks, PySpark
- Proficiency in SQL, Python, and Spark.
- Understanding of data warehousing concepts and data modeling.
- Experience with CI/CD pipelines and version control (e.g., Git).
- Fundamental knowledge of any cloud services, preferably Azure or GCP.
Good to Have:
- BigQuery
- Experience with performance tuning and data governance.
- Extract Transform Load (ETL) and ETL Tools skills
- Data Modeling and Data Integration expertise
- Data Warehousing knowledge
- Experience in working with SQL databases
- Strong analytical and problem-solving abilities
- Excellent communication and interpersonal skills
- Bachelor's degree in Computer Science, Information Systems, or related field
- Relevant certifications in ETL Testing or Data Warehousing


Job Responsibilities:
- Design, develop, test, and maintain high-performance web applications and backend services using Python.
- Build scalable, secure, and reliable backend systems and APIs.
- Optimize and debug existing codebases to enhance performance and maintainability.
- Collaborate closely with cross-functional teams to gather requirements and deliver high-quality solutions.
- Mentor junior developers, conduct code reviews, and uphold best coding practices.
- Write clear, comprehensive technical documentation for internal and external use.
- Stay current with emerging technologies, tools, and industry trends to continually improve development processes.
Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of hands-on experience in Python development.
- Strong expertise in Flask.
- In-depth understanding of software design principles, architecture, and design patterns.
- Proven experience working with both SQL and NoSQL databases.
- Solid debugging and problem-solving capabilities.
- Effective communication and collaboration skills, with a team-first mindset.
Technical Skills:
- Programming: Python (Advanced)
- Web Frameworks: Flask
- Databases: PostgreSQL, MySQL, MongoDB, Redis
- Version Control: Git
- API Development: RESTful APIs (see the sketch after this list)
- Containerization & Orchestration: Docker, Kubernetes
- Cloud Platforms: AWS or Azure (hands-on experience preferred)
- DevOps: CI/CD pipelines (e.g., Jenkins, GitHub Actions)
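To make the Flask/REST API expectation above concrete, here is a minimal, self-contained sketch; the route, resource name, and in-memory store are illustrative assumptions.

    from flask import Flask, jsonify

    app = Flask(__name__)

    # Toy in-memory store standing in for PostgreSQL/MongoDB.
    ITEMS = {1: {"id": 1, "name": "example"}}

    @app.route("/api/items/<int:item_id>", methods=["GET"])
    def get_item(item_id):
        item = ITEMS.get(item_id)
        if item is None:
            return jsonify({"error": "not found"}), 404
        return jsonify(item)

    if __name__ == "__main__":
        app.run(debug=True)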
Qualifications:
- Must have a Bachelor’s degree in computer science or equivalent.
- Must have at least 5 years' experience as an SDET.
- At least 1 year of leadership experience or experience managing a team.
Responsibilities:
- Design, develop and execute automation scripts using open-source tools.
- Troubleshooting any errors and streamlining the testing procedures.
- Writing and executing detailed test plans, test design & test cases covering feature, integration, regression, certification, system level testing as well as release validation in production.
- Identify, analyze and create detailed records of problems that appear during testing, such as software defects, bugs, functionality issues, and output errors, and work directly with software developers to find solutions and develop retesting procedures.
- Good time-management skills and commitment to meet deadlines.
- Stay up-to-date with new testing tools and test strategies.
- Driving technical projects and providing leadership in an innovative and fast-paced environment.
Requirements:
- Experience in automation (API and UI) as well as manual testing of web applications.
- Experience in frameworks like Playwright / Selenium WebDriver / Robot Framework / REST Assured.
- Must be proficient in performance testing tools like K6 / Gatling / JMeter.
- Must be proficient in Core Java / TypeScript and Java 17.
- Experience in JUnit 5.
- Good to have TypeScript experience.
- Good to have RPA experience using Java or other tools like Robot Framework / Automation Anywhere.
- Experience in SQL (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB).
- Good understanding of software and systems architecture.
- Well acquainted with Agile methodology, the Software Development Life Cycle (SDLC), the Software Test Life Cycle (STLC), and the Automation Test Life Cycle.
- Strong experience in REST-based component testing, back-end, DB, and microservices testing.
Work Location: Jayanagar - Bangalore.
Manual Tester with Crew Domain Knowledge
Skill Set/Experience
- Experience: 5–12 Years
- Locations: Chennai, Bangalore, Trivandrum, Cochin
- Notice Period: Immediate to <30 days
- Very good understanding of airline Crew and Ops business processes
- Experience in UAT and SIT testing including on-site customer interactions (added advantage).
- Strong business/testing skills – use case review; test case creation (interface, functional, UAT, and end-to-end), review, and execution
- Basic knowledge of SQL, UNIX
- Experience in API/Interfaces testing
- Experience with any of the defect tracking systems.
- Exposure to the Selenium test automation tool is an added advantage

Job Summary:
We are looking for a motivated and detail-oriented Data Engineer with 1–2 years of experience to join our data engineering team. The ideal candidate should have solid foundational skills in SQL and Python, along with exposure to building or maintaining data pipelines. You’ll play a key role in helping to ingest, process, and transform data to support various business and analytical needs.
Key Responsibilities:
- Assist in the design, development, and maintenance of scalable and efficient data pipelines.
- Write clean, maintainable, and performance-optimized SQL queries.
- Develop data transformation scripts and automation using Python.
- Support data ingestion processes from various internal and external sources.
- Monitor data pipeline performance and help troubleshoot issues.
- Collaborate with data analysts, data scientists, and other engineers to ensure data quality and consistency.
- Work with cloud-based data solutions and tools (e.g., AWS, Azure, GCP – as applicable).
- Document technical processes and pipeline architecture.
Core Skills Required:
- Proficiency in SQL (data querying, joins, aggregations, performance tuning).
- Experience with Python, especially in the context of data manipulation (e.g., pandas, NumPy) – see the sketch at the end of this posting.
- Exposure to ETL/ELT pipelines and data workflow orchestration tools (e.g., Airflow, Prefect, Luigi – preferred).
- Understanding of relational databases and data warehouse concepts.
- Familiarity with version control systems like Git.
Note: It is mandatory to attend one face-to-face round at the Bangalore office, and the candidate should be based in Bangalore.
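For a flavour of the SQL/Python work described above, a minimal pandas sketch follows; the file names and columns are invented for illustration.

    import pandas as pd

    # Toy ingestion-and-transform step.
    orders = pd.read_csv("orders.csv", parse_dates=["order_date"])
    orders = orders.dropna(subset=["customer_id"])

    # Daily aggregate: the day-to-day shape of junior data-engineering work.
    daily = (
        orders.groupby(orders["order_date"].dt.date)["amount"]
        .agg(total="sum", orders="count")
        .reset_index()
    )
    daily.to_parquet("daily_orders.parquet", index=False)  # needs pyarrow or fastparquet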

We are looking for:
• 2+ years of expertise in software development with one or more of the general programming languages (e.g., Python, Java, C/C++, Go). Experience in Python and Django is recommended.
• Deep understanding of how to build an application with optimized RESTful APIs.
• Knowledge of a web framework like Django or similar with ORM or multi-tier, multi-DB-based data-heavy web application development will help your profile stand out.
• Knowledge of Gen AI tools and technologies is a plus.
• Sound knowledge of SQL queries & DB like PostgreSQL(must) or MySQL. Working knowledge of NoSQL DBs (Elasticsearch, Mongo, Redis, etc.) is a plus.
• Knowledge of graph DB like Neo4j or AWS Neptune adds extra credits to your profile.
• Knowing queue-based messaging frameworks like Celery, RQ, Kafka, etc., and distributed system understanding will be advantageous.
• Understands a programming language's limitations and how to exploit its behavior to the fullest potential.
• Understanding of accessibility and security compliance.
• Ability to communicate complex technical concepts to both technical and non-technical audiences with ease.
• Diversity in skills like version control tools, CI/CD, cloud basics, good debugging skills, and test-driven development will help your profile stand out.

Job Description
• Role: Quality Assurance Engineer – Automation (3–4 yrs)
• Location: Bengaluru
• Type: Full-time
Why this role? Join a fast-moving team that’s pushing test automation into the AI era. You’ll own end-to-end quality for web, mobile and API layers, combining Playwright (or similar) with next-gen, AI-driven test platforms to deliver smarter, faster releases.
What you’ll do
• Build & maintain automation with Playwright, Selenium, Cypress or equivalent (see the sketch below)
• Super-charge coverage using AI-powered tools
• Create, run and optimize manual, API (Postman/Rest Assured) and database (SQL) tests
• Triage results, file defects in Jira, and drive them to closure
What you bring
• 3–4 years’ hands-on automation experience
• Strong with Playwright (preferred) or Selenium/Cypress and one scripting language (JS/TS, Python or Java)
• Familiarity with AI-based testing platforms
• Solid API testing & SQL skills; sound grasp of STLC and defect management
• Clear communicator with sharp analytical instincts
• Nice to have: BDD (Cucumber/SpecFlow), performance testing (JMeter/LoadRunner), TestRail/Zephyr, ML model validation
Qualifications
Bachelor’s in Computer Science, Engineering or a related field
What’s in it for you?
• Hands-on exposure to cutting-edge AI test automation
• Ownership and room to innovate in a collaborative, high-impact environment
• Competitive pay, flexible policies and a fun team
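As a rough sketch of the automation stack, a minimal Playwright smoke test using the Python sync API is shown below; the URL, selectors, and credentials are placeholders.

    from playwright.sync_api import sync_playwright

    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto("https://example.com/login")
        page.fill("#username", "qa_user")       # hypothetical selectors
        page.fill("#password", "secret")
        page.click("button[type=submit]")
        assert "dashboard" in page.url, "login did not land on the dashboard"
        browser.close()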

Job Title: UiPath Developer
Experience: 5 to 8 years
Location: Bangalore
Notice Period: Immediate to 15 days
Key Responsibilities:
1. Develop and implement automation solutions using UiPath.
2. Design, develop, test, and deploy RPA bots for process automation.
3. Write and optimize SQL queries, including joins, to manage and manipulate data effectively.
4. Develop scripts using Python, VB, .NET, or JavaScript to enhance automation capabilities.
5. Work with business stakeholders to analyze and optimize automation workflows.
6. Troubleshoot and resolve issues in RPA processes and scripts.
7. Ensure adherence to best practices in automation development and deployment.
Required Skills & Experience:
1. 5-8 years of experience in RPA development with UiPath.
2. Strong expertise in SQL, including writing complex queries and joins.
3. Hands-on experience with at least one scripting language: Python, VB, .NET, or JavaScript.
4. Understanding of RPA best practices, exception handling, and performance optimization.
5. Experience integrating UiPath with databases and other applications.
6. Strong problem-solving and analytical skills.
Our Client details
🔍 Who We Are:
Join Leading Healthcare, a U.S.-based product company transforming the $1.1T health insurance space. From our growing Bangalore tech hub, we power systems that support 81M+ lives and process millions of health claims daily.
🚨 Now Hiring: Senior Java Technical Support Engineer 🚨
📍 Location: Bangalore | Hybrid | Onsite role
💼 Experience: 4–8 Years | 🕒 Immediate to 30 Days
Are you the Java Support Rockstar we’re looking for? 🎸
Join our team in Bangalore to solve real-world problems in the healthcare domain!
Your Mission:
🛠 Development | 🧠 Troubleshooting | 🔍 Analysis | 🧪 Testing
🌐 Java | J2EE | REST APIs | WebLogic/WebSphere | Oracle RDBMS
🐧 Linux Scripting | 🧵 Multi-threading & debugging | ♻️ Garbage Collection | Solid SQL | Concurrency, GC, serialization | Strong scripting skills in Shell/Unix
You’ll be working on:
🩺 Critical Healthcare Systems
📞 Supporting enterprise customers
🔧 Debugging, testing, enhancing platforms
🌍 Collaborating with global teams
🌐 Work across REST APIs, WebLogic, RDBMS, Linux, Unix, Shell and more
📈 Identify & solve performance bottlenecks
🧠 You’ll Analyze. Troubleshoot. Test. Develop.
👥 Lead teams, support enterprise apps, and build rock-solid systems.
Why You Should Apply:
✅ Leadership opportunities
✅ Cross-functional exposure with US teams
✅ Huge tech learning curve
✅ Hybrid work with global impact

- A bachelor’s degree in Computer Science or a related field.
- 5-7 years of experience working as a hands-on developer in Sybase, DB2, ETL technologies.
- Worked extensively on data integration, designing and developing reusable interfaces. Advanced experience in Python, DB2, Sybase, shell scripting, Unix, Perl scripting, DB platforms, and database design and modeling.
- Expert level understanding of data warehouse, core database concepts and relational database design.
- Experience in writing stored procedures, optimization, and performance tuning. Strong technology acumen and a deep strategic mindset.
- Proven track record of delivering results
- Proven analytical skills and experience making decisions based on hard and soft data
- A desire and openness to learning and continuous improvement, both of yourself and your team members.
- Hands-on experience on development of APIs is a plus
- Good to have experience with Business Intelligence tools and Source-to-Pay applications such as SAP Ariba and Accounts Payable systems.
Skills Required:
- Familiarity with Postgres and Python is a plus.
Job Summary:
Seeking a seasoned SQL + ETL Developer with 4+ years of experience in managing large-scale datasets and cloud-based data pipelines. The ideal candidate is hands-on with MySQL, PySpark, AWS Glue, and ETL workflows, with proven expertise in AWS migration and performance optimization.
Key Responsibilities:
- Develop and optimize complex SQL queries and stored procedures to handle large datasets (100+ million records) – see the sketch at the end of this posting.
- Build and maintain scalable ETL pipelines using AWS Glue and PySpark.
- Work on data migration tasks in AWS environments.
- Monitor and improve database performance; automate key performance indicators and reports.
- Collaborate with cross-functional teams to support data integration and delivery requirements.
- Write shell scripts for automation and manage ETL jobs efficiently.
Required Skills:
- Strong experience with MySQL, complex SQL queries, and stored procedures.
- Hands-on experience with AWS Glue, PySpark, and ETL processes.
- Good understanding of AWS ecosystem and migration strategies.
- Proficiency in shell scripting.
- Strong communication and collaboration skills.
Nice to Have:
- Working knowledge of Python.
- Experience with AWS RDS.
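One hedged illustration of working with 100-million-row tables from Python is to stream the table in chunks rather than load it at once; the DSN, table, and columns below are assumptions for the example, with SQLAlchemy and PyMySQL assumed installed.

    import pandas as pd
    from sqlalchemy import create_engine

    # Placeholder DSN; never hard-code real credentials.
    engine = create_engine("mysql+pymysql://user:***@dbhost/sales")

    # Stream in 500k-row chunks and fold the aggregate incrementally.
    totals = {}
    for chunk in pd.read_sql("SELECT region, amount FROM orders", engine, chunksize=500_000):
        for region, amount in chunk.groupby("region")["amount"].sum().items():
            totals[region] = totals.get(region, 0.0) + amount
    print(totals)

Chunking keeps memory flat regardless of table size; when the database can handle it, the same aggregation is usually better pushed down into SQL with GROUP BY.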

Profile: AWS Data Engineer
Mode - Hybrid
Experience - 5 to 7 years
Locations - Bengaluru, Pune, Chennai, Mumbai, Gurugram
Roles and Responsibilities
- Design and maintain ETL pipelines using AWS Glue and Python/PySpark
- Optimize SQL queries for Redshift and Athena
- Develop Lambda functions for serverless data processing (see the sketch at the end of this posting)
- Configure AWS DMS for database migration and replication
- Implement infrastructure as code with CloudFormation
- Build optimized data models for performance
- Manage RDS databases and AWS service integrations
- Troubleshoot and improve data processing efficiency
- Gather requirements from business stakeholders
- Implement data quality checks and validation
- Document data pipelines and architecture
- Monitor workflows and implement alerting
- Keep current with AWS services and best practices
Required Technical Expertise:
- Python/PySpark for data processing
- AWS Glue for ETL operations
- Redshift and Athena for data querying
- AWS Lambda and serverless architecture
- AWS DMS and RDS management
- CloudFormation for infrastructure
- SQL optimization and performance tuning
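As a minimal, hedged sketch of the Lambda piece: a Python handler reacting to S3 object-created events. The event shape is the standard S3 notification; the bucket/key handling is illustrative only.

    import json

    def lambda_handler(event, context):
        # Each record describes one new S3 object.
        records = event.get("Records", [])
        for record in records:
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            print(f"new object: s3://{bucket}/{key}")  # real jobs would transform/load here
        return {"statusCode": 200, "body": json.dumps({"processed": len(records)})}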

About the Company:
Gruve is an innovative Software Services startup dedicated to empowering Enterprise Customers in managing their Data Life Cycle. We specialize in Cyber Security, Customer Experience, Infrastructure, and advanced technologies such as Machine Learning and Artificial Intelligence. Our mission is to assist our customers in their business strategies utilizing their data to make more intelligent decisions. As a well-funded early-stage startup, Gruve offers a dynamic environment with strong customer and partner networks.
Why Gruve:
At Gruve, we foster a culture of innovation, collaboration, and continuous learning. We are committed to building a diverse and inclusive workplace where everyone can thrive and contribute their best work. If you’re passionate about technology and eager to make an impact, we’d love to hear from you.
Gruve is an equal opportunity employer. We welcome applicants from all backgrounds and thank all who apply; however, only those selected for an interview will be contacted.
Position summary:
We are seeking a Senior Software Development Engineer – Data Engineering with 5-8 years of experience to design, develop, and optimize data pipelines and analytics workflows using Snowflake, Databricks, and Apache Spark. The ideal candidate will have a strong background in big data processing, cloud data platforms, and performance optimization to enable scalable data-driven solutions.
Key Roles & Responsibilities:
- Design, develop, and optimize ETL/ELT pipelines using Apache Spark, PySpark, Databricks, and Snowflake.
- Implement real-time and batch data processing workflows in cloud environments (AWS, Azure, GCP).
- Develop high-performance, scalable data pipelines for structured, semi-structured, and unstructured data.
- Work with Delta Lake and Lakehouse architectures to improve data reliability and efficiency (see the sketch after this list).
- Optimize Snowflake and Databricks performance, including query tuning, caching, partitioning, and cost optimization.
- Implement data governance, security, and compliance best practices.
- Build and maintain data models, transformations, and data marts for analytics and reporting.
- Collaborate with data scientists, analysts, and business teams to define data engineering requirements.
- Automate infrastructure and deployments using Terraform, Airflow, or dbt.
- Monitor and troubleshoot data pipeline failures, performance issues, and bottlenecks.
- Develop and enforce data quality and observability frameworks using Great Expectations, Monte Carlo, or similar tools.
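A hedged sketch of one such batch step targeting Delta Lake follows; the paths and columns are invented, and the Delta Lake package is assumed to be available on the cluster (as it is on Databricks).

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders-etl").getOrCreate()

    # Land raw JSON, de-duplicate, derive a partition column, write Delta.
    raw = spark.read.json("s3://example-bucket/landing/orders/")
    curated = (
        raw.dropDuplicates(["order_id"])
           .withColumn("order_date", F.to_date("order_ts"))
           .filter(F.col("amount") > 0)
    )
    (curated.write.format("delta")
            .mode("overwrite")
            .partitionBy("order_date")
            .save("s3://example-bucket/delta/orders/"))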
Basic Qualifications:
- Bachelor’s or Master’s Degree in Computer Science or Data Science.
- 5–8 years of experience in data engineering, big data processing, and cloud-based data platforms.
- Hands-on expertise in Apache Spark, PySpark, and distributed computing frameworks.
- Strong experience with Snowflake (Warehouses, Streams, Tasks, Snowpipe, Query Optimization).
- Experience in Databricks (Delta Lake, MLflow, SQL Analytics, Photon Engine).
- Proficiency in SQL, Python, or Scala for data transformation and analytics.
- Experience working with data lake architectures and storage formats (Parquet, Avro, ORC, Iceberg).
- Hands-on experience with cloud data services (AWS Redshift, Azure Synapse, Google BigQuery).
- Experience in workflow orchestration tools like Apache Airflow, Prefect, or Dagster.
- Strong understanding of data governance, access control, and encryption strategies.
- Experience with CI/CD for data pipelines using GitOps, Terraform, dbt, or similar technologies.
Preferred Qualifications:
- Knowledge of streaming data processing (Apache Kafka, Flink, Kinesis, Pub/Sub).
- Experience in BI and analytics tools (Tableau, Power BI, Looker).
- Familiarity with data observability tools (Monte Carlo, Great Expectations).
- Experience with machine learning feature engineering pipelines in Databricks.
- Contributions to open-source data engineering projects.

Job Overview:
We are seeking an experienced AWS Data Engineer to join our growing data team. The ideal candidate will have hands-on experience with AWS Glue, Redshift, PySpark, and other AWS services to build robust, scalable data pipelines. This role is perfect for someone passionate about data engineering, automation, and cloud-native development.
Key Responsibilities:
- Design, build, and maintain scalable and efficient ETL pipelines using AWS Glue, PySpark, and related tools.
- Integrate data from diverse sources and ensure its quality, consistency, and reliability.
- Work with large datasets in structured and semi-structured formats across cloud-based data lakes and warehouses.
- Optimize and maintain data infrastructure, including Amazon Redshift, for high performance.
- Collaborate with data analysts, data scientists, and product teams to understand data requirements and deliver solutions.
- Automate data validation, transformation, and loading processes to support real-time and batch data processing.
- Monitor and troubleshoot data pipeline issues and ensure smooth operations in production environments.
Required Skills:
- 5 to 7 years of hands-on experience in data engineering roles.
- Strong proficiency in Python and PySpark for data transformation and scripting.
- Deep understanding and practical experience with AWS Glue, AWS Redshift, S3, and other AWS data services.
- Solid understanding of SQL and database optimization techniques.
- Experience working with large-scale data pipelines and high-volume data environments.
- Good knowledge of data modeling, warehousing, and performance tuning.
Preferred/Good to Have:
- Experience with workflow orchestration tools like Airflow or Step Functions.
- Familiarity with CI/CD for data pipelines.
- Knowledge of data governance and security best practices on AWS.
Role - ETL Developer
Work Mode - Hybrid
Experience- 4+ years
Location - Pune, Gurgaon, Bengaluru, Mumbai
Required Skills - AWS, AWS Glue, PySpark, ETL, SQL
Required Skills:
- 4+ years of hands-on experience in MySQL, including SQL queries and procedure development
- Experience in PySpark, AWS, and AWS Glue
- Experience in AWS migration
- Experience with automated scripting and tracking KPIs/metrics for database performance
- Proficiency in shell scripting and ETL.
- Strong communication skills and a collaborative team player
- Knowledge of Python and AWS RDS is a plus
Role : Java Developer
Location : Bangalore
Key responsibilities
- Experience – 3 to 8 years of experience.
- Experience in Core Java and Spring Boot.
- Extensive experience in developing enterprise-scale applications and systems. Should possess good architectural knowledge and be aware of enterprise application design patterns.
- Should have the ability to analyze, design, develop and test complex, low-latency client-facing applications.
- Good development experience with RDBMS
- Good knowledge of multi-threading and high-performance server-side development.
- Basic working knowledge of Unix/Linux.
- Excellent problem solving and coding skills.
- Strong interpersonal, communication and analytical skills.
- Should have the ability to express their design ideas and thoughts.
About Wissen Technology:
The Wissen Group was founded in the year 2000. Wissen Technology, a part of Wissen Group, was established in the year 2015. Wissen Technology is a specialized technology company that delivers high-end consulting for organizations in the Banking & Finance, Telecom, and Healthcare domains. We help clients build world class products.
Our workforce consists of highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League universities like Wharton, MIT, IITs, IIMs, and NITs and who have rich work experience in some of the biggest companies in the world. Wissen Technology has grown its revenues by 400% in these five years without any external funding or investments. We are globally present with offices in the US, India, UK, Australia, Mexico, and Canada.
We offer an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation.
We have been certified as a Great Place to Work® company for two consecutive years (2020-2022) and voted as the Top 20 AI/ML vendor by CIO Insider.
Over the years, Wissen Group has successfully delivered $650 million worth of projects for more than 20 of the Fortune 500 companies. Wissen Technology provides exceptional value in mission critical projects for its clients, through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’.
The technology and thought leadership that the company commands in the industry is the direct result of the kind of people Wissen has been able to attract. Wissen is committed to providing them the best possible opportunities and careers, which extends to providing the best possible experience and value to our clients. We have served clients across sectors like Banking, Ecommerce, Telecom, Healthcare, Manufacturing, and Energy.
Career Progression:
At Wissen Technology, your career growth is important for us. Therefore, we put in several efforts towards each employee’s career progression – to enable and empower them to grow within the company as well as to instill a sense of responsibility, loyalty, and trust.
There have been instances where a software engineer has grown from being an individual contributor to technical lead and now on the path to becoming a director responsible for growing revenues and profitability. We deeply value Ownership: taking responsibility, making it happen, not letting the ball drop, and being accountable.
Job Title: Java Developer
Job Description
Wissen Technology is now hiring for a Java Developer - Bangalore with hands-on experience in Core Java, algorithms, data structures, multithreading, and SQL. We are solving complex technical problems in the industry and need talented software engineers to join our mission and be a part of a global software development team. A brilliant opportunity to become a part of a highly motivated and expert team which has made a mark in high-end technical consulting.
Required Skills:
• Exp. - 4 to 14 years.
• Experience in Core Java and Spring Boot.
• Extensive experience in developing enterprise-scale applications and systems. Should possess good architectural knowledge and be aware of enterprise application design patterns.
• Should have the ability to analyze, design, develop and test complex, low-latency client-facing applications.
• Good development experience with RDBMS.
• Good knowledge of multi-threading and high-performance server-side development.
• Basic working knowledge of Unix/Linux.
• Excellent problem solving and coding skills.
• Strong interpersonal, communication and analytical skills.
• Should have the ability to express their design ideas and thoughts.
Hello Everyone,
Job Description
Wissen Technology is now hiring for a Java Developer - Bangalore with hands-on experience in Core Java, algorithms, data structures, multithreading, and SQL. We are solving complex technical problems in the industry and need talented software engineers to join our mission and be a part of a global software development team. A brilliant opportunity to become a part of a highly motivated and expert team which has made a mark in high-end technical consulting.
Required Skills:
• Exp. - 5 to 14 years.
• Experience in Core Java and Spring Boot.
• Extensive experience in developing enterprise-scale applications and systems. Should possess good architectural knowledge and be aware of enterprise application design patterns.
• Should have the ability to analyze, design, develop and test complex, low-latency client-facing applications.
• Good development experience with RDBMS.
• Good knowledge of multi-threading and high-performance server-side development.
• Basic working knowledge of Unix/Linux.
• Excellent problem solving and coding skills.
• Strong interpersonal, communication and analytical skills.
• Should have the ability to express their design ideas and thoughts.
About Wissen Technology:
Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom, and Startups. Wissen Technology is a part of Wissen Group and was established in the year 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 Billion for more than 25 of the Fortune 500 companies. The Wissen Group overall includes more than 4000 highly skilled professionals. Wissen Technology provides exceptional value in mission-critical projects for its clients, through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’. Our team consists of 1200+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League universities like Wharton, MIT, IITs, IIMs, and NITs and with rich work experience in some of the biggest companies in the world. Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation. We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted as the Top 20 AI/ML vendor by CIO Insider.

Job Description:
Wissen Technology is looking for a skilled Automation Anywhere Engineer to join our dynamic team in Bangalore. The ideal candidate will have hands-on experience in Automation Anywhere, Document Automation, SQL, and Python, with a strong background in designing and implementing automation solutions.
Key Responsibilities:
- Design, develop, and deploy automation solutions using Automation Anywhere.
- Work on Document Automation to extract, process, and validate structured/unstructured data.
- Develop scripts and automation solutions using Python for enhanced process efficiency (see the sketch after this list).
- Optimize data processing workflows and database queries using SQL.
- Collaborate with cross-functional teams to identify automation opportunities and enhance business processes.
- Perform unit testing, debugging, and troubleshooting of automation scripts.
- Ensure adherence to industry best practices and compliance standards in automation processes.
- Provide support, maintenance, and enhancements to existing automation solutions.
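A minimal illustration of pairing Python with SQL in an automation flow is sketched below; it uses the standard-library sqlite3 module, and the table, fields, and sample values are invented.

    import sqlite3

    # Load values captured by a bot into SQL and flag anomalies for review.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE invoices (invoice_no TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO invoices VALUES (?, ?)",
        [("INV-001", 120.50), ("INV-002", -3.00)],  # sample extracted values
    )
    bad = conn.execute("SELECT invoice_no FROM invoices WHERE amount <= 0").fetchall()
    print("invoices needing manual review:", [row[0] for row in bad])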
Required Skills & Qualifications:
- 4 to 8 years of experience in RPA development using Automation Anywhere.
- Strong expertise in Automation Anywhere A360 (preferred).
- Hands-on experience with Document Automation tools and technologies.
- Proficiency in Python for scripting and automation.
- Strong knowledge of SQL for data processing and querying.
- Experience in troubleshooting, debugging, and optimizing automation workflows.
- Ability to work in an Agile environment and collaborate with cross-functional teams.
- Excellent problem-solving skills and attention to detail.

Responsibilities:
- Develop and maintain high-quality, efficient, and scalable backend applications.
- Participate in all phases of the software development lifecycle (SDLC)
- Write clean, well-documented, and testable code adhering to best practices.
- Collaborate with team members to ensure the successful delivery of projects.
- Debug and troubleshoot complex technical problems.
- Identify and implement performance optimizations.
- Participate in code reviews
- Hands-on experience with Spring Boot and Java 8 and above.
- 5-8 years of experience developing Java applications.
- Knowledge of at least one messaging system, such as Kafka or RabbitMQ.
React developer requirements, qualifications & skills:
- Proficiency in React.js and its core principles
- Strong JavaScript, HTML5, and CSS3 skills
- Experience with popular React.js workflows (such as Redux)
- Strong understanding of object-oriented programming (OOP) principles.
- Experience with design patterns and best practices for Java development.
- Proficient in unit testing frameworks (e.g., JUnit).
- Experience with build automation tools (e.g., Maven, Gradle).
- Experience with version control systems (e.g., Git).
- Experience with one of these databases – Postgres, MongoDB, Cassandra
- Knowledge on Retail or OMS is a plus.
- Experienced in containerized deployments using Docker, Kubernetes and DevOps mindset
- Ability to reverse engineer existing/legacy systems and document findings on Confluence.
- Create automated tests for unit, integration, regression, performance, and functional testing, to meet established expectations and acceptance criteria.
- Document APIs using Lowe’s established tooling.
Job Title: Backend Developer
Location: In-Office, Bangalore, Karnataka, India
Job Summary:
We are seeking a highly skilled and experienced Backend Developer with a minimum of 1 year of experience in product building to join our dynamic and innovative team. In this role, you will be responsible for designing, developing, and maintaining robust backend systems that drive our applications. You will collaborate with cross-functional teams to ensure seamless integration between frontend and backend components, and your expertise will be critical in architecting scalable, secure, and high-performance backend solutions.
Annual Compensation: 6-10 LPA
Responsibilities:
- Design, develop, and maintain scalable and efficient backend systems and APIs using NodeJS.
- Architect and implement complex backend solutions, ensuring high availability and performance.
- Collaborate with product managers, frontend developers, and other stakeholders to deliver comprehensive end-to-end solutions.
- Design and optimize data storage solutions using relational databases (e.g., MySQL) and NoSQL databases (e.g., MongoDB, Redis).
- Promoting a culture of collaboration, knowledge sharing, and continuous improvement.
- Implement and enforce best practices for code quality, security, and performance optimization.
- Develop and maintain CI/CD pipelines to automate build, test, and deployment processes.
- Ensure comprehensive test coverage, including unit testing, and implement various testing methodologies and tools to validate application functionality.
- Utilize cloud services (e.g., AWS, Azure, GCP) for infrastructure deployment, management, and optimization.
- Conduct system design reviews and contribute to architectural discussions.
- Stay updated with industry trends and emerging technologies to drive innovation within the team.
- Implement secure authentication and authorization mechanisms and ensure data encryption for sensitive information.
- Design and develop event-driven applications utilizing serverless computing principles to enhance scalability and efficiency.
Requirements:
- Minimum of 1 year of proven experience as a Backend Developer, with a strong portfolio of product-building projects.
- Extensive experience with JavaScript backend frameworks (e.g., Express, Socket) and a deep understanding of their ecosystems.
- Strong expertise in SQL and NoSQL databases (MySQL and MongoDB) with a focus on data modeling and scalability.
- Practical experience with Redis and caching mechanisms to enhance application performance.
- Proficient in RESTful API design and development, with a strong understanding of API security best practices.
- In-depth knowledge of asynchronous programming and event-driven architecture.
- Familiarity with the entire web stack, including protocols, web server optimization techniques, and performance tuning.
- Experience with containerization and orchestration technologies (e.g., Docker, Kubernetes) is highly desirable.
- Proven experience working with cloud technologies (AWS/GCP/Azure) and understanding of cloud architecture principles.
- Strong understanding of fundamental design principles behind scalable applications and microservices architecture.
- Excellent problem-solving, analytical, and communication skills.
- Ability to work collaboratively in a fast-paced, agile environment and lead projects to successful completion.


Senior Data Engineer
Location: Bangalore, Gurugram (Hybrid)
Experience: 4-8 Years
Type: Full Time | Permanent
Job Summary:
We are looking for a results-driven Senior Data Engineer to join our engineering team. The ideal candidate will have hands-on expertise in data pipeline development, cloud infrastructure, and BI support, with a strong command of modern data stacks. You’ll be responsible for building scalable ETL/ELT workflows, managing data lakes and marts, and enabling seamless data delivery to analytics and business intelligence teams.
This role requires deep technical know-how in PostgreSQL, Python scripting, Apache Airflow, AWS or other cloud environments, and a working knowledge of modern data and BI tools.
Key Responsibilities:
PostgreSQL & Data Modeling
· Design and optimize complex SQL queries, stored procedures, and indexes
· Perform performance tuning and query plan analysis
· Contribute to schema design and data normalization
Data Migration & Transformation
· Migrate data from multiple sources to cloud or ODS platforms
· Design schema mapping and implement transformation logic
· Ensure consistency, integrity, and accuracy in migrated data
Python Scripting for Data Engineering
· Build automation scripts for data ingestion, cleansing, and transformation
· Handle file formats (JSON, CSV, XML), REST APIs, cloud SDKs (e.g., Boto3)
· Maintain reusable script modules for operational pipelines
Data Orchestration with Apache Airflow
· Develop and manage DAGs for batch/stream workflows (see the sketch after this subsection)
· Implement retries, task dependencies, notifications, and failure handling
· Integrate Airflow with cloud services, data lakes, and data warehouses
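As a rough sketch only: a minimal Airflow 2.x DAG with two dependent tasks. The dag_id, schedule, and task bodies are placeholders, not details from this role.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull from source")        # placeholder task body

    def transform():
        print("clean and aggregate")     # placeholder task body

    with DAG(
        dag_id="example_daily_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",               # Airflow 2.4+ spelling of schedule_interval
        catchup=False,
        default_args={"retries": 2},     # retries/notifications hang off default_args
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_extract >> t_transform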
Cloud Platforms (AWS / Azure / GCP)
· Manage data storage (S3, GCS, Blob), compute services, and data pipelines
· Set up permissions, IAM roles, encryption, and logging for security
· Monitor and optimize cost and performance of cloud-based data operations
Data Marts & Analytics Layer
· Design and manage data marts using dimensional models
· Build star/snowflake schemas to support BI and self-serve analytics
· Enable incremental load strategies and partitioning
Modern Data Stack Integration
· Work with tools like DBT, Fivetran, Redshift, Snowflake, BigQuery, or Kafka
· Support modular pipeline design and metadata-driven frameworks
· Ensure high availability and scalability of the stack
BI & Reporting Tools (Power BI / Superset / Supertech)
· Collaborate with BI teams to design datasets and optimize queries
· Support development of dashboards and reporting layers
· Manage access, data refreshes, and performance for BI tools
Required Skills & Qualifications:
· 4–6 years of hands-on experience in data engineering roles
· Strong SQL skills in PostgreSQL (tuning, complex joins, procedures)
· Advanced Python scripting skills for automation and ETL
· Proven experience with Apache Airflow (custom DAGs, error handling)
· Solid understanding of cloud architecture (especially AWS)
· Experience with data marts and dimensional data modeling
· Exposure to modern data stack tools (DBT, Kafka, Snowflake, etc.)
· Familiarity with BI tools like Power BI, Apache Superset, or Supertech BI
· Version control (Git) and CI/CD pipeline knowledge is a plus
· Excellent problem-solving and communication skills
🔥 High Priority – Senior Lead Java Developer (10+ Years) | Bangalore – Onsite
Summary :
We are hiring Senior Lead Java Developers with 10+ years of experience for an onsite role in Bangalore.
If you're a hands-on expert with a strong background in Java, Spring Boot, Microservices, and Kubernetes, this is your opportunity to lead, mentor, and deliver high-quality solutions in a fast-paced environment.
🔹 Position : Senior Lead Java Developer
🔹 Experience : 10+ Years (12+ preferred)
🔹 Location : Bangalore (Onsite)
🔹 Openings : 6+
✅ Must-Have Skills :
- 8+ years of hands-on experience with Core Java & Spring Boot
- Expertise in Multithreading, Dependency Injection, and AOP
- Strong in Microservices Architecture and RESTful services
- Good exposure to SQL & NoSQL databases
- Proficient with Git (GitLab preferred)
- Experience with Kubernetes deployments and APM tools (New Relic preferred)
- Solid understanding of distributed tracing and log analysis
- Proven debugging and performance optimization skills
💼 Responsibilities :
- Design and develop high-quality, scalable microservices
- Act as SME for multiple services or subsystems
- Own service performance, SLAs, and incident resolutions
- Mentor junior developers and conduct technical interviews
- Participate in production war rooms and troubleshooting
- Lead development efforts and drive code quality
🎓 Qualification :
- BE/B.Tech or equivalent degree
- 8-10 years of experience in ETL Testing, Snowflake, DWH Concepts.
- Strong SQL knowledge & debugging skills are a must.
- Experience in Azure and Snowflake testing is a plus
- Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus
- Strong data warehousing concepts and experience with ETL tools like Talend Cloud Data Integration and Pentaho/Kettle
- Experience with JIRA and the Xray defect management tool is good to have
- Exposure to financial domain knowledge is considered a plus
- Testing data readiness (data quality) and addressing code or data issues
- Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions
- Demonstrate strong collaborative experience across regions (APAC, EMEA, and NA) to effectively and efficiently identify the root cause of code/data issues and come up with a permanent solution
- Prior experience with State Street and Charles River Development (CRD) considered a plus
- Experience in tools such as PowerPoint, Excel, SQL
- Exposure to Third party data providers such as Bloomberg, Reuters, MSCI and other Rating agencies is a plus
🚀 We're Hiring: Technical Lead – Java Backend & Integration
📍 Bangalore | Hybrid | Full-Time
👨💻 9+ Years Experience | Enterprise Product Development
🏥 Healthcare Tech | U.S. Health Insurance Domain
Join Leading HealthTech, a U.S.-based product company driving innovation in the $1.1 trillion health insurance industry. We power over 81 million lives, with 130+ customers and 100+ third-party integrations. At our growing Bangalore tech hub, you’ll solve real-world, large-scale problems and help modernize one of the most stable and impactful industries in the world.
🔧 What You'll Work On:
- Architect and build backend & integration solutions using Java, J2EE, WebLogic, Spring, Apache Camel
- Transition monolith systems to microservices-based architecture
- Lead design reviews, customer discussions, code quality, UAT & production readiness
- Work with high-volume transactional systems processing millions of health claims daily
- Coach & mentor engineers, contribute to platform modernization
🧠 What You Bring:
- 9+ years in backend Java development and enterprise system integration
- Hands-on with REST, SOAP, JMS, SQL, stored procedures, XML, ESBs
- Solid understanding of SOA, data structures, system design, and performance tuning
- Experience with Agile, CI/CD, unit testing, and code quality tools
- Healthcare/payor domain experience is a huge plus!
💡 Why this opportunity?
- Global product impact from our India technology center
- Work on mission-critical systems in a stable and recession-resilient sector
- Be part of a journey to modernize healthcare through tech
- Solve complex challenges at scale that few companies offer
🎯 Ready to drive change at the intersection of tech and healthcare?

Job Summary:
We are looking for a motivated and detail-oriented Data Engineer with 1–2 years of experience to join our data engineering team. The ideal candidate should have solid foundational skills in SQL and Python, along with exposure to building or maintaining data pipelines. You’ll play a key role in helping to ingest, process, and transform data to support various business and analytical needs.
Key Responsibilities:
- Assist in the design, development, and maintenance of scalable and efficient data pipelines.
- Write clean, maintainable, and performance-optimized SQL queries.
- Develop data transformation scripts and automation using Python.
- Support data ingestion processes from various internal and external sources.
- Monitor data pipeline performance and help troubleshoot issues.
- Collaborate with data analysts, data scientists, and other engineers to ensure data quality and consistency.
- Work with cloud-based data solutions and tools (e.g., AWS, Azure, GCP – as applicable).
- Document technical processes and pipeline architecture.
Core Skills Required:
- Proficiency in SQL (data querying, joins, aggregations, performance tuning).
- Experience with Python, especially in the context of data manipulation (e.g., pandas, NumPy).
- Exposure to ETL/ELT pipelines and data workflow orchestration tools (e.g., Airflow, Prefect, Luigi – preferred).
- Understanding of relational databases and data warehouse concepts.
- Familiarity with version control systems like Git.
Preferred Qualifications:
- Experience with cloud data services (AWS S3, Redshift, Azure Data Lake, etc.)
- Familiarity with data modeling and data integration concepts.
- Basic knowledge of CI/CD practices for data pipelines.
- Bachelor’s degree in Computer Science, Engineering, or related field.

Work Mode: Hybrid
Need B.Tech, BE, M.Tech, ME candidates - Mandatory
Must-Have Skills:
● Educational Qualification :- B.Tech, BE, M.Tech, ME in any field.
● Minimum of 3 years of proven experience as a Data Engineer.
● Strong proficiency in the Python programming language and SQL.
● Experience in Databricks and in setting up and managing data pipelines and data warehouses/lakes.
● Good comprehension and critical thinking skills.
● Kindly note: the salary bracket will vary according to the candidate's experience -
- Experience from 4 yrs to 6 yrs - salary up to 22 LPA
- Experience from 5 yrs to 8 yrs - salary up to 30 LPA
- Experience more than 8 yrs - salary up to 40 LPA

Job Summary:
We are looking for a talented Full Stack Developer with experience in C# (ASP.NET Core Web API), React, and SQL Server. The successful candidate will be responsible for designing, developing, and maintaining robust web applications and APIs, ensuring seamless integration between the front-end and back-end systems.
Key Responsibilities:
Full Stack Development: Design, develop, and maintain web applications using C#, ASP.NET Core Web API, and React.
API Development: Create and maintain RESTful APIs to support front-end applications and integrations.
Database Management: Design, optimize, and manage SQL Server databases, including writing complex queries, stored procedures, and indexing.
Front-End Development: Implement user interfaces using React, ensuring a smooth and responsive user experience.
Code Quality: Write clean, scalable, and well-documented code following best practices in software development.
Collaboration: Work closely with cross-functional teams, including UI/UX designers, back-end developers, and DevOps, to deliver high-quality software solutions.
Testing & Debugging: Conduct unit testing, integration testing, and debugging to ensure the quality and reliability of applications.
Continuous Improvement: Stay updated on the latest industry trends and technologies and integrate them into development processes where applicable.
Required Qualifications:
Experience: Proven experience as a Full Stack Developer with a strong focus on C#, ASP.NET Core Web API, React, and SQL Server.
Technical Skills:
Proficient in C# and ASP.NET Core Web API development.
Strong experience with React and related front-end technologies (JavaScript, HTML, CSS).
Expertise in SQL Server, including database design, query optimization, and performance tuning.
Familiarity with version control systems like Git.
Understanding of RESTful architecture and Web API design.
Problem-Solving: Excellent analytical and problem-solving skills with the ability to troubleshoot complex issues.
Communication: Strong verbal and written communication skills, with the ability to articulate technical concepts to non-technical stakeholders.
Team Collaboration: Ability to work effectively in a team environment, collaborating with cross-functional teams to achieve project goals.
Preferred Qualifications:
Experience with ASP.NET Core MVC or Blazor.
Knowledge of cloud platforms such as Azure or AWS.
Experience with Agile/Scrum development methodologies.
Education:
Bachelor’s degree in Computer Science, Software Engineering, or a related field (or equivalent experience)


Exp: 4-6 years
Position: Backend Engineer
Job Location: Bangalore (office near Cubbon Park, opposite JW Marriott)
Work Mode : 5 days work from office
Requirements:
● Engineering graduate with 3-5 years of experience in software product development.
● Proficient in Python, Node.js, Go
● Good knowledge of SQL and NoSQL
● Strong Experience in designing and building APIs
● Experience with working on scalable interactive web applications
● A clear understanding of software design constructs and their implementation
● Understanding of the threading limitations of Python and multi-process architecture (illustrated in the sketch after this list)
● Experience implementing Unit and Integration testing
● Exposure to the Finance domain is preferred
● Strong written and oral communication skills
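To make the threading-limitations point above concrete, here is a small, self-contained sketch contrasting threads and processes on a CPU-bound task. Timings will vary by machine, and the workload itself is an arbitrary stand-in.

```python
# A hedged illustration of why CPU-bound Python work scales with processes,
# not threads: the GIL lets only one thread run Python bytecode at a time.
import time
from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

def cpu_bound(n: int) -> int:
    return sum(i * i for i in range(n))

def timed(executor_cls) -> float:
    start = time.perf_counter()
    with executor_cls(max_workers=4) as pool:
        list(pool.map(cpu_bound, [2_000_000] * 4))
    return time.perf_counter() - start

if __name__ == "__main__":  # guard required for process pools on some platforms
    print(f"threads:   {timed(ThreadPoolExecutor):.2f}s")   # serialized by the GIL
    print(f"processes: {timed(ProcessPoolExecutor):.2f}s")  # true parallelism
```
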
Job Title- Senior Full Stack Web Developer
Job location- Bangalore/Hybrid
Availability- Immediate Joiners
Experience Range- 5-8yrs
Desired skills - Java, AWS, SQL/NoSQL, JavaScript, Node.js (good to have)
We are looking for a Senior Full Stack Web Developer (Java) with 8-10 years of experience.
- Working on different aspects of the core product and associated tools (server-side or user interfaces, depending on the team you'll join)
- Expertise as a full stack software engineer on large-scale, complex software systems, with 8+ years of experience in technologies such as Java, relational and non-relational databases, Node.js, and AWS Cloud
- Assisting with in-life maintenance, testing, debugging and documentation of deployed services
- Coding & designing new features
- Creating the supporting functional and technical specifications
- Deep understanding of system architecture and distributed systems
- Stay updated with the latest services, tools, and trends, and implement innovative solutions that contribute to the company's growth

Job Summary:
We are looking for a highly skilled senior .NET Full Stack Developer with 7+ years of experience to join our dynamic team. The ideal candidate should have strong expertise in .NET 8, Microservices Architecture, SQL, and various design patterns. Prior experience in the banking or financial services domain is highly preferred. The role requires excellent communication skills, client interaction abilities, and a strong understanding of agile methodologies. You will be responsible for designing, developing, and maintaining scalable applications while collaborating closely with clients and cross-functional teams.
Key Responsibilities:
- Design, develop, and maintain robust, scalable, and high-performance applications using .NET 8 and related technologies.
- Develop and implement Microservices Architecture to build modular and scalable solutions.
- Work on both frontend and backend development, ensuring seamless integration.
- Apply design patterns such as SOLID principles, Repository Pattern, CQRS, and DDD for optimized application development (a minimal sketch follows this list).
- Develop and optimize complex SQL queries, stored procedures, and database structures to ensure efficient data management.
- Collaborate with business stakeholders and clients to gather requirements and provide technical solutions.
- Ensure security, performance, and scalability of applications, particularly in banking and financial environments.
- Actively participate in Agile/Scrum development cycles, including sprint planning, daily stand-ups, and retrospectives.
- Communicate effectively with clients and internal teams, demonstrating strong problem-solving skills.
- Troubleshoot and debug technical issues, ensuring high availability and smooth performance of applications.
- Mentor and guide junior developers, conducting code reviews and knowledge-sharing sessions.
- Stay updated with the latest technologies, frameworks, and best practices in .NET development and Microservices.
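For the design patterns named above, the following is a minimal, language-agnostic sketch of the Repository Pattern with a CQRS-style command/query split. It is written in Python purely for brevity; the same shape carries over directly to C#/.NET, and all class and field names are illustrative.

```python
# Hedged sketch: Repository Pattern plus a CQRS-style command/query split.
# All names are illustrative, not from any specific codebase.
from dataclasses import dataclass

@dataclass
class Account:
    account_id: str
    balance: float

class AccountRepository:
    """Hides storage details behind a collection-like interface."""
    def __init__(self) -> None:
        self._store: dict[str, Account] = {}
    def get(self, account_id: str) -> Account:
        return self._store[account_id]
    def save(self, account: Account) -> None:
        self._store[account.account_id] = account

class DepositCommandHandler:
    """Write side: mutates state only through the repository."""
    def __init__(self, repo: AccountRepository) -> None:
        self.repo = repo
    def handle(self, account_id: str, amount: float) -> None:
        account = self.repo.get(account_id)
        account.balance += amount
        self.repo.save(account)

class BalanceQueryHandler:
    """Read side: returns data with no side effects."""
    def __init__(self, repo: AccountRepository) -> None:
        self.repo = repo
    def handle(self, account_id: str) -> float:
        return self.repo.get(account_id).balance
```

The point of the split is that write paths (commands) and read paths (queries) depend on the repository abstraction rather than on storage details, which is what keeps services loosely coupled.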
Required Skills & Experience:
- .NET 8 (C#, ASP.NET Core, Web API) – Strong hands-on experience in enterprise-level application development.
- Microservices Architecture – Experience designing and implementing scalable, loosely coupled services.
- Frontend Technologies – Knowledge of Angular, React, or Blazor for UI development.
- Design Patterns & Best Practices – Strong understanding of SOLID principles, Repository Pattern, CQRS, Factory Pattern, etc.
- SQL & Database Management – Expertise in MS SQL Server, query optimization, and stored procedures.
- Agile Methodologies – Solid understanding of Scrum, Kanban, and Agile best practices.
- Banking/Financial Domain (Preferred) – Experience in core banking systems, payment gateways, or financial applications.
- Client Interaction & Communication Skills – Excellent verbal and written communication, with the ability to engage with clients effectively.
- Logical & Analytical Thinking – Strong problem-solving skills with the ability to design efficient solutions.
- Cloud & DevOps – Exposure to Azure (mandatory); AWS, Docker, Kubernetes, and CI/CD pipelines are preferred.
- Version Control & Collaboration – Proficiency in Git, Azure DevOps, and Agile tools (JIRA, Confluence, etc.).
Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Certifications in Microsoft .NET, Azure, or Agile methodologies are a plus.
Minimum 3+ years of core Java programming with the Collections Framework, concurrent programming, and multi-threading (good knowledge of ExecutorService, ForkJoinPool, and other threading concepts)
· Good knowledge of the JVM with an understanding of performance and memory optimization.
· Extensive and expert programming experience in the Java programming language (strong OO skills preferred).
· Excellent knowledge of collections such as ArrayList, Vector, LinkedList, HashMap, Hashtable, and HashSet is mandatory.
· Exercised exemplary development practices including design specification, coding standards, unit testing, and code-reviews.
· Expert level understanding of Object-Oriented Concepts and Data Structures
· Good experience with databases (Sybase, Oracle, or SQL Server), including indexing (clustered and non-clustered), hashing, segmenting, data types such as CLOB/BLOB, views (materialized), replication, constraints, functions, triggers, procedures, etc.

Key Responsibilities:
- Design, develop, and maintain data pipelines on AWS.
- Work with large-scale data processing using SQL, Python or PySpark.
- Implement and optimize ETL processes for structured and unstructured data.
- Develop and manage data models in Snowflake.
- Ensure data security, integrity, and compliance on AWS cloud infrastructure.
- Collaborate with cross-functional teams to support data-driven decision-making.
Required Skills:
- Strong hands-on experience with AWS services
- Proficiency in SQL, Python, or PySpark for data processing and transformation.
- Experience working with Snowflake for data warehousing (a brief sketch follows this list).
- Strong understanding of data modeling, data governance, and performance tuning.
- Knowledge of CI/CD pipelines for data workflows is a plus.
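As one hedged example of the AWS-to-Snowflake ingestion described above: assuming an external stage already points at the S3 landing bucket, a COPY INTO can be driven from Python with the official Snowflake connector. All names and credentials below are placeholders.

```python
# Hedged sketch (hypothetical names/credentials): load staged S3 files into
# Snowflake via the official Python connector and a COPY INTO statement.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # assumption: connection details
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    cur = conn.cursor()
    # Assumes an external stage pointing at the S3 landing bucket.
    cur.execute("""
        COPY INTO raw_orders
        FROM @s3_landing_stage/orders/
        FILE_FORMAT = (TYPE = PARQUET)
    """)
finally:
    conn.close()
```
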


JD:
The Senior Software Engineer works closely with our development team, product manager, DevOps engineers, and business analysts to build our SaaS platform to support efficient, end-to-end business processes across the industry using modern, flexible technologies such as GraphQL, Kubernetes, and React.
Technical Skills: C#, Angular, and Azure, preferably with .NET
Responsibilities
· Develops and maintains back-end and front-end applications and cloud services using C#, Angular, and Azure
· Accountable for delivering high quality results
· Mentors less experienced members of the team
· Thrives in a test-driven development organization with high quality standards
· Contributes to architecture discussions as needed
· Collaborates with Business Analyst to understand user stories and requirements to meet functional needs
· Supports product team’s efforts to produce product roadmap by providing estimates for enhancements
· Supports user acceptance testing and user story approval processes on development items
· Participates in sessions to resolve product issues
· Escalates high priority issues to appropriate internal stakeholders as necessary and appropriate
· Maintains a professional, friendly, open, approachable, positive attitude
Location : Bangalore
Ideal Work Experience and Skills
· 7 - 15 years' experience working in a software development environment
· Bachelor's degree in software development or a related field preferred
· Development experience with Angular and .NET is beneficial but not required
· Highly self-motivated and able to work effectively with virtual teams of diverse backgrounds
· Strong desire to learn and grow professionally
· A track record of following through on commitments; Excellent planning, organizational, and time management skills
Role & Responsibilities
About the Role:
We are seeking a highly skilled Senior Data Engineer with 5-7 years of experience to join our dynamic team. The ideal candidate will have a strong background in data engineering, with expertise in data warehouse architecture, data modeling, ETL processes, and building both batch and streaming pipelines. The candidate should also possess advanced proficiency in Spark, Databricks, Kafka, Python, SQL, and Change Data Capture (CDC) methodologies.
Key responsibilities:
Design, develop, and maintain robust data warehouse solutions to support the organization's analytical and reporting needs.
Implement efficient data modeling techniques to optimize performance and scalability of data systems.
Build and manage data lakehouse infrastructure, ensuring reliability, availability, and security of data assets.
Develop and maintain ETL pipelines to ingest, transform, and load data from various sources into the data warehouse and data lakehouse.
Utilize Spark and Databricks to process large-scale datasets efficiently and in real-time.
Implement Kafka for building real-time streaming pipelines, ensuring data consistency and reliability (a streaming sketch follows this list).
Design and develop batch pipelines for scheduled data processing tasks.
Collaborate with cross-functional teams to gather requirements, understand data needs, and deliver effective data solutions.
Perform data analysis and troubleshooting to identify and resolve data quality issues and performance bottlenecks.
Stay updated with the latest technologies and industry trends in data engineering and contribute to continuous improvement initiatives.
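To illustrate the Kafka streaming responsibility flagged above, here is a compact Spark Structured Streaming sketch that consumes a topic and appends it to a Delta (Bronze) table, as in a CDC-style feed. The broker address, topic, and paths are assumptions, and the checkpoint location is what gives the stream its exactly-once bookkeeping.

```python
# Hedged sketch (hypothetical topic/paths): consume a Kafka topic with Spark
# Structured Streaming and append it to a Delta table.
# Requires the spark-sql-kafka connector on the cluster classpath.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

stream = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")  # assumed broker
         .option("subscribe", "orders")                      # assumed topic
         .load()
)

# Kafka key/value arrive as binary; cast them for downstream processing.
events = stream.select(
    F.col("key").cast("string"),
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp"),
)

query = (
    events.writeStream.format("delta")
          .option("checkpointLocation", "/mnt/checkpoints/orders/")
          .outputMode("append")
          .start("/mnt/bronze/orders/")
)
query.awaitTermination()
```
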

We are looking for a Senior Data Engineer with strong expertise in GCP, Databricks, and Airflow to design and implement a GCP Cloud Native Data Processing Framework. The ideal candidate will work on building scalable data pipelines and help migrate existing workloads to a modern framework.
- Shift: 2 PM to 11 PM
- Work Mode: Hybrid (3 days a week) across Xebia locations
- Notice Period: Immediate joiners or those with a notice period of up to 30 days
Key Responsibilities:
- Design and implement a GCP Native Data Processing Framework leveraging Spark and GCP Cloud Services.
- Develop and maintain data pipelines using Databricks and Airflow for transforming Raw → Silver → Gold data layers (a DAG sketch follows this list).
- Ensure data integrity, consistency, and availability across all systems.
- Collaborate with data engineers, analysts, and stakeholders to optimize performance.
- Document standards and best practices for data engineering workflows.
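As a sketch of the Raw → Silver → Gold orchestration mentioned above, assuming Airflow 2.x: the task bodies, table names, and schedule below are placeholders, and a real task would trigger a Databricks job rather than print.

```python
# Hedged sketch of a Raw -> Silver -> Gold promotion DAG in Airflow 2.x.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def promote(source: str, target: str) -> None:
    # Placeholder: a real implementation would submit a Databricks job here.
    print(f"promoting {source} -> {target}")

with DAG(
    dag_id="raw_silver_gold",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    raw_to_silver = PythonOperator(
        task_id="raw_to_silver",
        python_callable=promote,
        op_kwargs={"source": "raw.orders", "target": "silver.orders"},
    )
    silver_to_gold = PythonOperator(
        task_id="silver_to_gold",
        python_callable=promote,
        op_kwargs={"source": "silver.orders", "target": "gold.orders"},
    )
    raw_to_silver >> silver_to_gold  # enforce layer ordering
```
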
Required Experience:
- 7-8 years of experience in data engineering, architecture, and pipeline development.
- Strong knowledge of GCP, Databricks, PySpark, and BigQuery.
- Experience with Orchestration tools like Airflow, Dagster, or GCP equivalents.
- Understanding of Data Lake table formats (Delta, Iceberg, etc.).
- Proficiency in Python for scripting and automation.
- Strong problem-solving skills and collaborative mindset.
⚠️ Please apply only if you have not applied recently or are not currently in the interview process for any open roles at Xebia.
Looking forward to your response!
Best regards,
Vijay S
Assistant Manager - TAG
Job Description:
We are seeking a Tableau Developer with 5+ years of experience to join our Core Analytics team. The candidate will work on large-scale BI projects using Tableau and related tools.
Must Have:
- Strong expertise in Tableau Desktop and Server, including add-ons like Data Management and Server Management.
- Ability to interpret business requirements, build wireframes, and finalize KPIs, calculations, and designs.
- Participate in design discussions to implement best practices for dashboards and reports.
- Build scalable BI and Analytics products based on feedback while adhering to best practices.
- Propose multiple solutions for a given problem, leveraging toolset functionality.
- Optimize data sources and dashboards while ensuring business requirements are met.
- Collaborate with product, platform, and program teams for timely delivery of dashboards and reports.
- Provide suggestions and take feedback to deliver future-ready dashboards.
- Peer review team members’ dashboards, offering constructive feedback to improve overall design.
- Proficient in SQL, UI/UX practices, and Alation, with an understanding of good data models for reporting.
- Mentor less experienced team members.
What we require
We are recruiting technical experts with the following core skills and hands-on experience.
Mandatory skills: Core Java, Microservices, AWS/Azure/GCP, Spring, Spring Boot
Hands-on experience with: Kafka, Redis, SQL, Docker, Kubernetes
Expert proficiency in designing both producer and consumer types of REST services.
Expert proficiency in Unit testing and Code Quality tools.
Expert proficiency in ensuring code coverage.
Expert proficiency in understanding High-Level Design and translating that to Low-Level design
Hands-on experience working with NoSQL databases.
Experience working in an Agile development process - Scrum.
Experience working closely with engineers and software cultures.
Ability to think at a high level about product strategy and customer journeys.
Ability to produce low level design considering the paradigm that journeys will be extensible in the future and translate that into components that can be easily extended and reused.
Excellent communication skills to clearly articulate design decisions.


JioTesseract, a digital arm of Reliance Industries, is India's leading and largest AR/VR organization with the mission to democratize mixed reality for India and the world. We make products at the intersection of hardware, software, content, and services, with a focus on making India the leader in spatial computing. We specialize in creating solutions in AR, VR, and AI, with notable products such as JioGlass, JioDive, 360 Streaming, Metaverse, and AR/VR headsets for the consumer and enterprise space.
Mon-Fri role, in office, with excellent perks and benefits!
Position Overview
We are seeking a Software Architect to lead the design and development of high-performance robotics and AI software stacks utilizing NVIDIA technologies. This role will focus on defining scalable, modular, and efficient architectures for robot perception, planning, simulation, and embedded AI applications. You will collaborate with cross-functional teams to build next-generation autonomous systems.
Key Responsibilities:
1. System Architecture & Design
● Define scalable software architectures for robotics perception, navigation, and AI-driven decision-making.
● Design modular and reusable frameworks that leverage NVIDIA’s Jetson, Isaac ROS, Omniverse, and CUDA ecosystems.
● Establish best practices for real-time computing, GPU acceleration, and edge AI inference.
2. Perception & AI Integration
● Architect sensor fusion pipelines using LIDAR, cameras, IMUs, and radar with DeepStream, TensorRT, and ROS2.
● Optimize computer vision, SLAM, and deep learning models for edge deployment on Jetson Orin and Xavier.
● Ensure efficient GPU-accelerated AI inference for real-time robotics applications.
3. Embedded & Real-Time Systems
● Design high-performance embedded software stacks for real-time robotic control and autonomy.
● Utilize NVIDIA CUDA, cuDNN, and TensorRT to accelerate AI model execution on Jetson platforms.
● Develop robust middleware frameworks to support real-time robotics applications in ROS2 and Isaac SDK.
4. Robotics Simulation & Digital Twins
● Define architectures for robotic simulation environments using NVIDIA Isaac Sim & Omniverse.
● Leverage synthetic data generation (Omniverse Replicator) for training AI models.
● Optimize sim-to-real transfer learning for AI-driven robotic behaviors.
5. Navigation & Motion Planning
● Architect GPU-accelerated motion planning and SLAM pipelines for autonomous robots.
● Optimize path planning, localization, and multi-agent coordination using Isaac ROS Navigation.
● Implement reinforcement learning-based policies using Isaac Gym.
6. Performance Optimization & Scalability
● Ensure low-latency AI inference and real-time execution of robotics applications.
● Optimize CUDA kernels and parallel processing pipelines for NVIDIA hardware.
● Develop benchmarking and profiling tools to measure software performance on edge AI devices (see the sketch below).
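As a hedged, generic example of the benchmarking bullet above, here is a plain PyTorch latency measurement rather than a specific NVIDIA toolchain; the model is a trivial stand-in and the iteration counts are arbitrary.

```python
# Hedged benchmarking sketch: measure steady-state GPU inference latency
# with proper synchronization (GPU launches are asynchronous).
import time
import torch

model = torch.nn.Sequential(  # trivial stand-in for a real perception model
    torch.nn.Conv2d(3, 16, 3, padding=1), torch.nn.ReLU(),
).eval().cuda()
x = torch.randn(1, 3, 224, 224, device="cuda")

with torch.no_grad():
    for _ in range(10):           # warm-up (caches, clocks, lazy init)
        model(x)
    torch.cuda.synchronize()      # flush pending GPU work before timing
    start = time.perf_counter()
    for _ in range(100):
        model(x)
    torch.cuda.synchronize()
    print(f"mean latency: {(time.perf_counter() - start) / 100 * 1e3:.2f} ms")
```
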
Required Qualifications:
● Master’s or Ph.D. in Computer Science, Robotics, AI, or Embedded Systems.
● Extensive experience (7+ years) in software development, with at least 3-5 years focused on architecture and system design, especially for robotics or embedded systems.
● Expertise in CUDA, TensorRT, DeepStream, PyTorch, TensorFlow, and ROS2.
● Experience in NVIDIA Jetson platforms, Isaac SDK, and GPU-accelerated AI.
● Proficiency in programming languages such as C++, Python, or similar, with deep understanding of low-level and high-level design principles.
● Strong background in robotic perception, planning, and real-time control.
● Experience with cloud-edge AI deployment and scalable architectures.
Preferred Qualifications
● Hands-on experience with NVIDIA DRIVE, NVIDIA Omniverse, and Isaac Gym
● Knowledge of robot kinematics, control systems, and reinforcement learning
● Expertise in distributed computing, containerization (Docker), and cloud robotics
● Familiarity with automotive, industrial automation, or warehouse robotics
● Experience designing architectures for autonomous systems or multi-robot systems.
● Familiarity with cloud-based solutions, edge computing, or distributed computing for robotics
● Experience with microservices or service-oriented architecture (SOA)
● Knowledge of machine learning and AI integration within robotic systems
● Knowledge of testing on edge devices with HIL and simulations (Isaac Sim, Gazebo, V-REP etc.)
The Opportunity:
Working within the Professional Services team as a Senior Implementation Consultant, you will be responsible for ensuring our customers are implemented efficiently and effectively. You will partner with our customers to take them on a journey from understanding their incentive compensation needs, through design and building within the product, testing and training, to the customer using Performio to pay their commissions. You will be able to work independently as well as part of small project teams to create and implement solutions for existing and new customers alike.
About Performio:
We are a small, but mighty company offering incentive compensation management software (ICM) that regularly beats the legacy incumbents in our industry. How? Our people and our product.
Our people are highly-motivated and engaged professionals with a clear set of values and behaviors. We prove these values matter to us by living them each day. This makes Performio both a great place to work and a great company to do business with.
But a great team alone is not sufficient to win. We also have a great product that balances the flexibility that large companies need in a sales commission solution with the great user experience buyers have come to expect from modern software. We are the only company in our industry that can make this claim and the market has responded favorably. We have a global customer base across Australia, Asia, Europe, and the US in 25+ industries that includes many well-known companies like News Corp, Johnson & Johnson and Vodafone.
What will you be doing:
● Effectively organize, lead, and facilitate discussions with Customers and internal Stakeholders
● Elicit detailed requirements from Customers around their sales comp plans, data, and processes
● Deliver thorough, well-documented, detailed system design that ties back to Customer-specific requirements
● Implement system designs in our SaaS ICM Software product: set up data intake from the Customer, data integration / enrichment / transformation, build out the compensation plans in the software tool, and configure reporting & analytics according to customer requirements gleaned in the discovery process
● Conduct thorough functional testing to ensure a high-quality solution is delivered to customers
● Evaluate discrepancies in data and accurately identify root causes for any errors
● Lead the Customer Testing Support & System Handover activities, working closely with Customer Admins
● Train & support Customer Admins on the Performio product and the Customer's specific implementation
● Be actively involved and support Customers through the system go-live process
● Support existing Customers by investigating and resolving issues
● Provide detailed and accurate estimates for potential Customer Projects & Change Requests
● Participate in activities, and provide feedback on internal processes and the standard solution
● Document best practices, define/develop reusable components, and advocate leverage/reuse of these to improve the Delivery Efficiencies for Performio & the Time to Value for Customers
● Participate in activities, and provide Customer-focused feedback on improving and stabilizing the product
● Track work on projects using our PS automation software
What we're looking for:
● 5+ years of relevant working experience in professional services, implementing ICM solutions using products like SAP Commissions/Callidus, Xactly, Varicent, etc.
● Proficient in working with large datasets using Excel, relational database tables, SQL, ETL, or similar tools
● Good understanding of Incentive Compensation Management concepts
● Willing to take on ambiguous & complex challenges and solve them
● Loves and is good at multitasking and juggling multiple workstreams
● Effective and confident in communication - able to interface with senior employees at our Customers
● Ability to lead projects / initiatives and coach & guide junior consultants to help them be successful in their roles
● Highly detail-oriented - takes great notes, can document complex solutions in detail
● Some understanding of accounting, finance, and/or sales comp concepts
● Positive Attitude - optimistic, cares deeply about Company and Customers
● High Emotional IQ - shows empathy, listens when appropriate, creates healthy conversation, dynamic, humble / no ego
● Resourceful - has an "I'll figure it out" attitude if something they need doesn't exist
● Smart and curious to learn
● Programming experience using Python will be a plus
Why us:
We're fast-growing, but still small enough for everyone to make a big impact (and have face time with the CEO). We have genuine care for our customers and the passion to transform our product into one where experience and ease of use is a true differentiator. Led by a strong set of company values, we play to win and are incentivized to do so through our employee equity plan.
We’ve adapted well to the work from home lifestyle, and take advantage of flexible working arrangements.
Our values speak strongly to who we really are. They mean a lot to us, and we use them every day to make decisions, and of course to hire great people!
● Play to win - we focus on results, have a bias to action and finish what we start
● Paint a clear picture - we're clear, concise and communicate appropriately
● Be curious - we surface alternative solutions and consistently expand our knowledge
● Work as one - we all pitch in but also hold each other to account
● Do the right thing - we put what's right for our customers over our own ego
Job Description for QA Engineer:
- 6-10 years of experience in ETL Testing, Snowflake, DWH Concepts.
- Strong SQL knowledge & debugging skills are a must.
- Experience in Azure and Snowflake testing is a plus.
- Experience with Qlik Replicate and Qlik Compose (Change Data Capture) tools is considered a plus.
- Strong data warehousing concepts and experience with ETL tools such as Talend Cloud Data Integration and Pentaho/Kettle.
- Experience with JIRA and the Xray defect management tool is good to have.
- Exposure to the financial domain knowledge is considered a plus.
- Test data readiness (data quality) and address code or data issues (a reconciliation sketch follows this list).
- Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions
- Demonstrate strong collaborative experience across regions (APAC, EMEA and NA) to effectively and efficiently identify root cause of code/data issues and come up with a permanent solution
- Prior experience with State Street and Charles River Development (CRD) considered a plus
- Experience in tools such as PowerPoint, Excel, SQL
- Exposure to Third party data providers such as Bloomberg, Reuters, MSCI and other Rating agencies is a plus
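To illustrate the data-readiness testing above, here is a minimal reconciliation sketch comparing row counts and a column checksum between a source and a target. SQLite stands in for the real source and Snowflake connections, and the table and column names are hypothetical.

```python
# Hedged sketch of a basic ETL reconciliation check (hypothetical tables):
# compare row counts and a column checksum between source and target.
import sqlite3  # stand-in for the real source/target connections

def scalar(conn, sql: str):
    """Run a query expected to return a single value."""
    return conn.execute(sql).fetchone()[0]

source = sqlite3.connect("source.db")
target = sqlite3.connect("target.db")

checks = {
    "row_count": "SELECT COUNT(*) FROM trades",
    "amount_sum": "SELECT ROUND(SUM(amount), 2) FROM trades",
}
for name, sql in checks.items():
    src, tgt = scalar(source, sql), scalar(target, sql)
    status = "PASS" if src == tgt else "FAIL"
    print(f"{name}: source={src} target={tgt} -> {status}")
```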
Key Attributes include:
- Team player with professional and positive approach
- Creative, innovative and able to think outside of the box
- Strong attention to detail during root cause analysis and defect issue resolution
- Self-motivated & self-sufficient
- Effective communicator both written and verbal
- Brings a high level of energy with enthusiasm to generate excitement and motivate the team
- Able to work under pressure with tight deadlines and/or multiple projects
- Experience in negotiation and conflict resolution