50+ SQL Jobs in Bangalore (Bengaluru) | SQL Job openings in Bangalore (Bengaluru)
Apply to 50+ SQL Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest SQL Job opportunities across top companies like Google, Amazon & Adobe.



Role Overview:
We are seeking a highly skilled and experienced Lead Web App Developer - Backend to join our dynamic team in Bengaluru. The ideal candidate will have a strong background in backend development, microservices architecture, and cloud technologies, with a proven ability to deliver robust, scalable solutions. This role involves designing, implementing, and maintaining complex distributed systems, primarily in the Mortgage Finance domain.
Key Responsibilities:
- Cloud-Based Web Applications Development:
- Lead backend development efforts for cloud-based web applications.
- Work on diverse projects within the Mortgage Finance domain.
- Microservices Design & Development:
- Design and implement microservices-based architectures.
- Ensure scalability, availability, and reliability of distributed systems.
- Programming & API Development:
- Write efficient, reusable, and maintainable code in Python, Node.js, and Java.
- Develop and optimize RESTful APIs.
- Infrastructure Management:
- Leverage AWS platform infrastructure to build secure and scalable solutions.
- Utilize tools like Docker for containerization and deployment.
- Database Management:
- Work with RDBMS (MySQL) and NoSQL databases to design efficient schemas and optimize queries.
- Team Collaboration:
- Collaborate with cross-functional teams to ensure seamless integration and delivery of projects.
- Mentor junior developers and contribute to the overall skill development of the team.
Core Requirements:
- Experience: 10+ years in backend development, with at least 3 years designing and delivering large-scale products on microservices architecture.
- Technical Skills:
- Programming Languages: Python, Node.js, Java.
- Frameworks & Tools: AWS (Lambda, RDS, etc.), Docker.
- Database Expertise: Proficiency in RDBMS (MySQL) and NoSQL databases.
- API Development: Hands-on experience in developing REST APIs (a minimal illustrative sketch follows this list).
- System Design: Strong understanding of distributed systems, scalability, and availability.
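
To give a flavour of the REST API work this role describes, here is a minimal sketch in Python using Flask; the /loans resource and the in-memory store are hypothetical placeholders, standing in for a real service backed by MySQL or a NoSQL store rather than describing this team's actual codebase.

```python
# Minimal REST API sketch (Flask). The /loans resource and in-memory store are
# hypothetical placeholders for a real, database-backed service.
from flask import Flask, jsonify, request

app = Flask(__name__)
LOANS = {}   # in-memory stand-in for a MySQL/NoSQL table
NEXT_ID = 1

@app.route("/loans", methods=["POST"])
def create_loan():
    global NEXT_ID
    payload = request.get_json(force=True) or {}
    if "amount" not in payload:
        return jsonify({"error": "amount is required"}), 400
    loan = {"id": NEXT_ID, "amount": payload["amount"], "status": "pending"}
    LOANS[NEXT_ID] = loan
    NEXT_ID += 1
    return jsonify(loan), 201

@app.route("/loans/<int:loan_id>", methods=["GET"])
def get_loan(loan_id):
    loan = LOANS.get(loan_id)
    if loan is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(loan)

if __name__ == "__main__":
    app.run(debug=True)
```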
Additional Skills (Preferred):
- Experience with modern frontend frameworks like React.js or AngularJS.
- Strong design and architecture capabilities.
What We Offer:
- Opportunity to work on cutting-edge technologies in a collaborative environment.
- Competitive salary and benefits package.
- Flexible hybrid working model.
- Chance to contribute to impactful projects in the Mortgage Finance domain.

Role Overview
We are looking for a highly skilled Product Engineer to join our dynamic team. This is an exciting opportunity to work on innovative FinTech solutions and contribute to the future of global payments. If you're passionate about backend development, API design, and scalable architecture, we'd love to hear from you!
Key Responsibilities
- Design, develop, and maintain scalable, high-performance backend systems.
- Write clean, maintainable, and efficient code while following best practices.
- Build and optimize RESTful APIs and database queries.
- Collaborate with cross-functional teams to deliver 0 to 1 products.
- Ensure smooth CI/CD pipeline implementation and deployment automation.
- Contribute to open-source projects and stay updated with industry trends.
- Maintain a strong focus on security, performance, and reliability.
- Work with payment protocols and financial regulations to ensure compliance.
Required Skills & Qualifications
- ✅ 3+ years of professional software development experience.
- ✅ Proficiency in any backend language (with preference for Ruby on Rails).
- ✅ Strong foundation in architecture, design, and database optimization.
- ✅ Experience in building APIs and working with SQL/NoSQL databases.
- ✅ Familiarity with CI/CD practices and automation tools.
- ✅ Excellent problem-solving and analytical skills.
- ✅ Strong track record of open-source contributions (minimum 50 stars on GitHub).
- ✅ Passion for FinTech and payment systems.
- ✅ Strong communication skills and ability to work collaboratively in a team.
Nice to Have
- Prior experience in financial services or payment systems.
- Exposure to microservices architecture and cloud platforms.
- Knowledge of containerization tools like Docker & Kubernetes.

The CRM team is responsible for communications across email, mobile push and web push channels. We focus on our existing customers and manage our interactions and touchpoints to ensure that we optimise revenue generation, drive traffic to the website and app, and extend the active customer lifecycle. We also work closely with the Marketing and Product teams to ensure that any initiatives are integrated with CRM activities.
Our setup is highly data driven and requires the understanding and skill set to work with large datasets, employing data science techniques to create personalised content at a 1:1 level. The candidate for this role will have to demonstrate a strong background working in this environment, and have a proven track record of striving to find technical solutions for the many projects and situations that the business encounters.
Overview of role :
- Setting up automation pipelines in Python and SQL to flow data in and out of the CRM platform for reporting, personalisation and use in data warehousing (Redshift); a minimal illustrative sketch follows this list.
- Writing, managing, and troubleshooting template logic written in Freemarker.
- Building proprietary algorithms for use in CRM campaigns, targeted at improving all areas of customer lifecycle.
- Working with big datasets to segment audiences on a large scale.
- Driving innovation by planning and implementing a range of AB tests.
- Acting as a technical touchpoint for developer and product teams to push projects over the line.
- Integrating product initiatives into CRM, and performing user acceptance testing (UAT)
- Interacting with multiple departments, and presenting to our executive team to help them understand CRM activities and plan new initiatives.
- Working with third party suppliers to optimise and improve their offering.
- Creating alert systems and troubleshooting tools to check in on health of automated jobs running in Jenkins and CRM platform.
- Setting up automated reporting in Amazon Quicksight.
- Assisting other teams with any technical advice/information they may require.
- When necessary, working in JavaScript to set up Marketing and CRM tags in Adobe Launch.
- Training team members and working with them to make processes more efficient.
- Working with REST APIs to integrate CRM System with a range of technologies from third party vendors to in-house services.
- Contributing to discussions on future strategy, interpretation of test results, and helping resolve any major CRM issues
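
As a hedged illustration of the automation-pipeline work described above (flowing data between Redshift and the CRM platform), the sketch below pulls a segment with SQL and pushes it to a CRM REST endpoint. The cluster host, table, API URL and credentials are assumptions for illustration only, not references to any specific vendor or schema.

```python
# Sketch: extract a customer segment from Redshift with SQL and push it to a
# CRM platform over REST. Hostnames, table names, and the API URL are
# hypothetical; credentials would normally come from a secrets manager.
import psycopg2
import requests

SEGMENT_SQL = """
    SELECT customer_id, email, lifetime_value
    FROM analytics.customers
    WHERE last_order_date >= CURRENT_DATE - 30
"""

def extract_segment():
    conn = psycopg2.connect(
        host="example-cluster.redshift.amazonaws.com",
        port=5439, dbname="analytics", user="crm_etl", password="***",
    )
    try:
        with conn.cursor() as cur:
            cur.execute(SEGMENT_SQL)
            cols = [c[0] for c in cur.description]
            return [dict(zip(cols, row)) for row in cur.fetchall()]
    finally:
        conn.close()

def push_to_crm(records):
    # Hypothetical CRM bulk-upsert endpoint
    resp = requests.post(
        "https://crm.example.com/api/v1/contacts/bulk",
        json={"contacts": records},
        headers={"Authorization": "Bearer ***"},
        timeout=30,
    )
    resp.raise_for_status()

if __name__ == "__main__":
    push_to_crm(extract_segment())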
Key skills required :
- Strong background in SQL
- Experience with a programming language (preferably Python or Freemarker)
- Understanding of REST APIs and how to utilise them
- Tech-savvy - you cast a creative eye on all activities of the team and business and suggest new ideas and improvements
- Comfortable presenting and interacting with all levels of the business and able to communicate technical information in a clear and concise manner.
- Ability to work under pressure and meet tight deadlines.
- Strong attention to detail
- Experience working with large datasets, and able to spot and pick up on important trends
- Understanding of key CRM metrics on performance and deliverability


Exp: 4-6 years
Position: Backend Engineer
Job Location: Bangalore (office near Cubbon Park, opposite JW Marriott)
Work Mode : 5 days work from office
Requirements:
● Engineering graduate with 3-5 years of experience in software product development.
● Proficient in Python, Node.js, Go
● Good knowledge of SQL and NoSQL
● Strong Experience in designing and building APIs
● Experience with working on scalable interactive web applications
● A clear understanding of software design constructs and their implementation
● Understanding of the threading limitations of Python and multi-process architecture (a short sketch follows this list)
● Experience implementing Unit and Integration testing
● Exposure to the Finance domain is preferred
● Strong written and oral communication skills
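
The threading point above is worth making concrete: CPython's global interpreter lock (GIL) prevents threads from running Python bytecode in parallel, so CPU-bound work is usually spread across processes instead. Below is a minimal sketch with a made-up workload, not a description of this product's architecture.

```python
# Sketch: CPU-bound work scales poorly with threads under CPython's GIL,
# so it is typically dispatched to a process pool instead.
from concurrent.futures import ProcessPoolExecutor

def cpu_bound(n: int) -> int:
    # Deliberately heavy pure-Python loop (made-up workload)
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    inputs = [2_000_000] * 8
    # Each task runs in its own process, bypassing the GIL
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(cpu_bound, inputs))
    print(len(results), "tasks done")
```
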
We’re looking for an experienced Senior Data Engineer to lead the design and development of scalable data solutions at our company. The ideal candidate will have extensive hands-on experience in data warehousing, ETL/ELT architecture, and cloud platforms like AWS, Azure, or GCP. You will work closely with both technical and business teams, mentoring engineers while driving data quality, security, and performance optimization.
Responsibilities:
- Lead the design of data warehouses, lakes, and ETL workflows.
- Collaborate with teams to gather requirements and build scalable solutions.
- Ensure data governance, security, and optimal performance of systems.
- Mentor junior engineers and drive end-to-end project delivery.
Requirements:
- 6+ years of experience in data engineering, including at least 2 full-cycle data warehouse projects.
- Strong skills in SQL, ETL tools (e.g., Pentaho, dbt), and cloud platforms.
- Expertise in big data tools (e.g., Apache Spark, Kafka).
- Excellent communication skills and leadership abilities.
Preferred: Experience with workflow orchestration tools (e.g., Airflow), real-time data, and DataOps practices.
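
As a hedged illustration of the orchestration tooling mentioned in the preferred skills, here is a minimal Airflow DAG sketch; the DAG id, task logic and schedule are hypothetical placeholders rather than an actual pipeline.

```python
# Minimal Airflow DAG sketch (Airflow 2.x style): a daily extract -> load chain.
# Task logic and schedule are placeholders, not a description of a real pipeline.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from source systems")

def load():
    print("load curated data into the warehouse")

with DAG(
    dag_id="example_daily_warehouse_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```
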
Job Title- Senior Full Stack Web Developer
Job location- Bangalore/Hybrid
Availability- Immediate Joiners
Experience Range- 5-8yrs
Desired skills - Java, AWS, SQL/NoSQL, JavaScript, Node.js (good to have)
We are looking for a Senior Full Stack Web Developer (Java) with 8-10 years of experience
- Working on different aspects of the core product and associated tools (server-side or user interfaces, depending on the team you'll join)
- Expertise as a full stack software engineer of large-scale complex software systems, with 8+ years of experience with technologies such as Java, relational and non-relational databases, Node.js and AWS Cloud
- Assisting with in-life maintenance, testing, debugging and documentation of deployed services
- Coding & designing new features
- Creating the supporting functional and technical specifications
- Deep understanding of system architecture , and distributed systems
- Stay updated with the latest services, tools, and trends, and implement innovative solutions that contribute to the company's growth
Job Title- Senior Java Developer
Exp Range- 8-10 yrs
Location- Bangalore/ Hybrid
Desired skill- Java 8, Microservices (Must), AWS, Kafka, Kubernetes
What you will bring:
● Strong core Java, concurrency and server-side experience
● 8 + Years of experience with hands-on coding.
● Strong Java 8 and Microservices (must).
● Should have a good understanding of AWS/GCP
● Kafka, AWS stack/Kubernetes
● An understanding of Object Oriented Design and standard design patterns.
● Experience of multi-threaded, 3-tier architectures/Distributed architectures, web services and caching.
● A familiarity with SQL databases
● Ability and willingness to work in a global, fast-paced environment.
● Flexible with the ability to adapt working style to meet objectives.
● Excellent communication and analytical skills
● Ability to effectively communicate with team members
● Experience in the following technologies would be beneficial but not essential: Spring Boot, AWS, Kubernetes, Terraform, Redis
Backend- Software Development Engineer II
Experience- 4- 7 yrs
About Wekan Enterprise Solutions
Wekan Enterprise Solutions is a leading Technology Consulting company and a strategic investment partner of MongoDB. We help companies drive innovation in the cloud by adopting modern technology solutions that help them achieve their performance and availability requirements. With strong capabilities around Mobile, IoT and Cloud environments, we have an extensive track record of helping Fortune 500 companies modernize their most critical legacy and on-premises applications, migrating them to the cloud and leveraging the most cutting-edge technologies.
Job Description
We are looking for passionate software engineers eager to be a part of our growth journey. The right candidate needs to be interested in working in high-paced and challenging environments, constantly upskilling, learning new technologies and expanding their domain knowledge to new industries. This candidate needs to be a team player and should be looking to help build a culture of excellence. Do you have what it takes? You will be working on complex data migrations, modernizing legacy applications and building new applications on the cloud for large enterprise and/or growth-stage startups. You will have the opportunity to contribute directly to mission-critical projects, interacting with business stakeholders, customers' technical teams and MongoDB Solutions Architects.
Location- Chennai or Bangalore
General responsibilities of the role:
● Relevant experience of 4-7 years building high-performance back-end applications, with at least 2 projects delivered using the required technologies
● Understand Customer codebase inclusive of all aspects of the technology stack (Code, Database, Deployment).
● Should understand the importance of communication in a team setup and not be limited to working in a silo.
● Follow existing best practices where applicable and be prepared to innovate to fill in gaps with creative solutions to solve challenges on each project
● Follow the basic hygiene when writing code: Modular, Readable and Testable.
● Be a self-motivated team player and actively participate in sprint events such as stand-ups, grooming, planning, retrospective, demo and review.
Main Skills Requirements:
● Java :Strong experience with Enterprise Java (J2EE / JavaEE / Spring) application architectures. Understanding of OOPS concepts and design patterns. Working experience with microservices.
● SQL: Experience on SQL databases, preferably Oracle. Should have a good understanding of database concepts along with experience on PL/SQL procedures. Working knowledge of schema design is preferred.
● Testing Frameworks : Knowledge around any of the testing frameworks (Selenium, Junit, Mockito, etc) and practices. Hands on experience would be preferred.
● Version Control : Working knowledge of Git. Should understand the basics such as resolving conflicts, feature branches etc.
Nice to have skills:
● Experience working with Kafka
● Experience working with any cloud provider AWS / Azure / GCP
Role: Senior Software Engineer - Backend
Location: In-Office, Bangalore, Karnataka, India
Job Summary:
We are seeking a highly skilled and experienced Senior Backend Engineer with a minimum of 3 years of experience in product building to join our dynamic and innovative team. In this role, you will be responsible for designing, developing, and maintaining robust backend systems that power our applications. You will work closely with cross-functional teams to ensure seamless integration between frontend and backend components, leveraging your expertise to architect scalable, secure, and high-performance solutions. As a senior team member, you will mentor junior developers and lead technical initiatives to drive innovation and excellence.
Annual Compensation: 12-18 LPA
Responsibilities:
- Lead the design, development, and maintenance of scalable and efficient backend systems and APIs.
- Architect and implement complex backend solutions, ensuring high availability and performance.
- Collaborate with product managers, frontend developers, and other stakeholders to deliver comprehensive end-to-end solutions.
- Design and optimize data storage solutions using relational databases and NoSQL databases.
- Mentor and guide junior developers, fostering a culture of knowledge sharing and continuous improvement.
- Implement and enforce best practices for code quality, security, and performance optimization.
- Develop and maintain CI/CD pipelines to automate build, test, and deployment processes.
- Ensure comprehensive test coverage, including unit testing, and implement various testing methodologies and tools to validate application functionality.
- Utilize cloud services (e.g., AWS, Azure, GCP) for infrastructure deployment, management, and optimization.
- Conduct system design reviews and provide technical leadership in architectural discussions.
- Stay updated with industry trends and emerging technologies to drive innovation within the team.
- Implement secure authentication and authorization mechanisms and ensure data encryption for sensitive information.
- Design and develop event-driven applications utilizing serverless computing principles to enhance scalability and efficiency.
Requirements:
- Minimum of 3 years of proven experience as a Backend Engineer, with a strong portfolio of product-building projects.
- Strong proficiency in backend development using Java, Python, and JavaScript, with experience in building scalable and high-performance applications.
- Experience with popular backend frameworks and libraries for Java (e.g., Spring Boot) and Python (e.g., Django, Flask).
- Strong expertise in SQL and NoSQL databases (e.g., MySQL, MongoDB) with a focus on data modeling and scalability.
- Practical experience with caching mechanisms (e.g., Redis) to enhance application performance; a short cache-aside sketch follows this list.
- Proficient in RESTful API design and development, with a strong understanding of API security best practices.
- In-depth knowledge of asynchronous programming and event-driven architecture.
- Familiarity with the entire web stack, including protocols, web server optimization techniques, and performance tuning.
- Experience with containerization and orchestration technologies (e.g., Docker, Kubernetes) is highly desirable.
- Proven experience working with cloud technologies (AWS/GCP/Azure) and understanding of cloud architecture principles.
- Strong understanding of fundamental design principles behind scalable applications and microservices architecture.
- Excellent problem-solving, analytical, and communication skills.
- Ability to work collaboratively in a fast-paced, agile environment and lead projects to successful completion.
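
To make the caching requirement above concrete, here is a minimal cache-aside sketch using redis-py; the key scheme, TTL and the fetch_user_from_db helper are hypothetical stand-ins for a real MySQL/MongoDB lookup.

```python
# Cache-aside sketch with redis-py: read through the cache, fall back to the
# database on a miss, then populate the cache with a TTL. The key scheme and
# fetch_user_from_db() are hypothetical placeholders.
import json
import redis

cache = redis.Redis(host="localhost", port=6379, db=0)

def fetch_user_from_db(user_id: int) -> dict:
    # Placeholder for a real MySQL/MongoDB lookup
    return {"id": user_id, "name": "example"}

def get_user(user_id: int) -> dict:
    key = f"user:{user_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)
    user = fetch_user_from_db(user_id)
    cache.setex(key, 300, json.dumps(user))  # cache for 5 minutes
    return user
```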

This is a full-time role. Odoo development experience is a must:
- Hands-on coding (a must) in Python, PostgreSQL and JavaScript;
- Hands-on proficiency in writing complex SQL queries and query optimisation; a short EXPLAIN ANALYZE sketch follows this list.
- Experience with Odoo framework, Odoo deployment, Kubernetes or docker.
- Frontend with Owl.js or any JavaScript Framework;
- Strong foundation in programming.
- API integration and data exchange.
- 4+ years of Odoo development experience.
- Team leadership experience.
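
Since this role stresses complex SQL and query optimisation on PostgreSQL, here is a hedged sketch of inspecting a query plan with EXPLAIN ANALYZE through psycopg2. The connection details are placeholders; the sale_order columns are typical of an Odoo schema but are used here purely for illustration.

```python
# Sketch: inspect a PostgreSQL query plan from Python before optimising it.
# Connection details are hypothetical; the table/columns are illustrative.
import psycopg2

QUERY = """
    SELECT partner_id, SUM(amount_total) AS revenue
    FROM sale_order
    WHERE state = 'sale'
    GROUP BY partner_id
    ORDER BY revenue DESC
    LIMIT 20
"""

with psycopg2.connect(dbname="odoo", user="odoo", password="***",
                      host="localhost") as conn:
    with conn.cursor() as cur:
        cur.execute("EXPLAIN ANALYZE " + QUERY)
        for (line,) in cur.fetchall():
            print(line)
```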

Job Summary:
We are looking for a highly skilled senior .NET Full Stack Developer with 7+ years of experience to join our dynamic team. The ideal candidate should have strong expertise in .NET 8, Microservices Architecture, SQL, and various design patterns. Prior experience in the banking or financial services domain is highly preferred. The role requires excellent communication skills, client interaction abilities, and a strong understanding of agile methodologies. You will be responsible for designing, developing, and maintaining scalable applications while collaborating closely with clients and cross-functional teams.
Key Responsibilities:
- Design, develop, and maintain robust, scalable, and high-performance applications using .NET 8 and related technologies.
- Develop and implement Microservices Architecture to build modular and scalable solutions.
- Work on both frontend and backend development, ensuring seamless integration.
- Apply design patterns such as SOLID principles, Repository Pattern, CQRS, and DDD for optimized application development.
- Develop and optimize complex SQL queries, stored procedures, and database structures to ensure efficient data management.
- Collaborate with business stakeholders and clients to gather requirements and provide technical solutions.
- Ensure security, performance, and scalability of applications, particularly in banking and financial environments.
- Actively participate in Agile/Scrum development cycles, including sprint planning, daily stand-ups, and retrospectives.
- Communicate effectively with clients and internal teams, demonstrating strong problem-solving skills.
- Troubleshoot and debug technical issues, ensuring high availability and smooth performance of applications.
- Mentor and guide junior developers, conducting code reviews and knowledge-sharing sessions.
- Stay updated with the latest technologies, frameworks, and best practices in .NET development and Microservices.
Required Skills & Experience:
- .NET 8 (C#, ASP.NET Core, Web API) – Strong hands-on experience in enterprise-level application development.
- Microservices Architecture – Experience designing and implementing scalable, loosely coupled services.
- Frontend Technologies – Knowledge of Angular, React, or Blazor for UI development.
- Design Patterns & Best Practices – Strong understanding of SOLID principles, Repository Pattern, CQRS, Factory Pattern, etc.
- SQL & Database Management – Expertise in MS SQL Server, query optimization, and stored procedures.
- Agile Methodologies – Solid understanding of Scrum, Kanban, and Agile best practices.
- Banking/Financial Domain (Preferred) – Experience in core banking systems, payment gateways, or financial applications.
- Client Interaction & Communication Skills – Excellent verbal and written communication, with the ability to engage with clients effectively.
- Logical & Analytical Thinking – Strong problem-solving skills with the ability to design efficient solutions.
- Cloud & DevOps (Preferred) – Exposure to Azure (mandatory), AWS, Docker, Kubernetes, and CI/CD pipelines.
- Version Control & Collaboration – Proficiency in Git, Azure DevOps, and Agile tools (JIRA, Confluence, etc.).
Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Certifications in Microsoft .NET, Azure, or Agile methodologies are a plus.
3+ years of Core Java programming with the Collections Framework, concurrent programming and multi-threading (good knowledge of ExecutorService, Fork/Join pool and other threading concepts)
· Good knowledge of the JVM with an understanding of performance and memory optimization.
· Extensive and expert programming experience in the Java programming language (strong OO skills preferred).
· Excellent knowledge of collections such as ArrayList, Vector, LinkedList, HashMap, Hashtable and HashSet is mandatory.
· Exercised exemplary development practices including design specification, coding standards, unit testing, and code reviews.
· Expert-level understanding of Object-Oriented Concepts and Data Structures.
· Good experience with databases (Sybase, Oracle or SQL Server): indexing (clustered, non-clustered), hashing, segmenting, data types such as CLOB/BLOB, views (including materialized views), replication, constraints, functions, triggers, stored procedures, etc.
We are seeking skilled Data Engineers with prior experience on Azure data engineering services and SQL server to join our team. As a Data Engineer, you will be responsible for designing and implementing robust data infrastructure, building scalable data pipelines, and ensuring efficient data integration and storage.
Experience : 6 - 10 years
Notice : Immediate to 30days
Responsibilities :
- Design, develop, and maintain scalable data pipelines using Azure Data Factory and Azure Stream Analytics.
- Collaborate with data scientists and analysts to understand data requirements and implement solutions that support analytics and machine learning initiatives.
- Optimize data storage and retrieval mechanisms to ensure performance, reliability, and cost-effectiveness.
- Implement data governance and security best practices to ensure compliance and data integrity.
- Troubleshoot and debug data pipeline issues, providing timely resolution and proactive monitoring.
- Stay abreast of emerging technologies and industry trends, recommending innovative solutions to enhance data engineering capabilities.
Qualifications :
- Proven experience as a Data Engineer or in a similar role.
- Experience in designing and hands-on development in cloud-based (AWS/Azure) analytics solutions.
- Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, Azure App Service, Azure Databricks, Azure IoT, Azure HDInsight + Spark, and Azure Stream Analytics.
- Knowledge of Dev-Ops processes (including CI/CD) and Infrastructure as code is essential.
- Experience in SQL Server and stored procedures.
- Thorough understanding of Azure Infrastructure offerings.
- Strong experience in common data warehouse modelling principles, including Kimball and Inmon.
- Familiarity with other modern database technologies and terminology.
- Working knowledge of Python or Java or Scala is desirable
- Strong knowledge of data modelling, ETL processes, and database technologies
- Experience with big data processing frameworks (Hadoop, Spark) and data pipeline orchestration tools (Airflow).
- Solid understanding of data governance, data security, and data quality best practices.
- Strong analytical and problem-solving skills, with attention to detail.

Key Responsibilities:
- Design, develop, and maintain data pipelines on AWS.
- Work with large-scale data processing using SQL, Python or PySpark.
- Implement and optimize ETL processes for structured and unstructured data.
- Develop and manage data models in Snowflake.
- Ensure data security, integrity, and compliance on AWS cloud infrastructure.
- Collaborate with cross-functional teams to support data-driven decision-making.
Required Skills:
- Strong hands-on experience with AWS services
- Proficiency in SQL, Python, or PySpark for data processing and transformation.
- Experience working with Snowflake for data warehousing.
- Strong understanding of data modeling, data governance, and performance tuning.
- Knowledge of CI/CD pipelines for data workflows is a plus.


JD:
The Senior Software Engineer works closely with our development team, product manager, dev-ops and business analysts to build our SaaS platform to support efficient, end-to-end business processes across the industry using modern flexible technologies such as GraphQL, Kubernetes and React.
Technical Skills: C#, Angular, Azure, preferably with .NET
Responsibilities
· Develops and maintains back-end and front-end applications and cloud services using C#, Angular and Azure
· Accountable for delivering high quality results
· Mentors less experienced members of the team
· Thrives in a test-driven development organization with high quality standards
· Contributes to architecture discussions as needed
· Collaborates with Business Analyst to understand user stories and requirements to meet functional needs
· Supports product team’s efforts to produce product roadmap by providing estimates for enhancements
· Supports user acceptance testing and user story approval processes on development items
· Participates in sessions to resolve product issues
· Escalates high priority issues to appropriate internal stakeholders as necessary and appropriate
· Maintains a professional, friendly, open, approachable, positive attitude
Location : Bangalore
Ideal Work Experience and Skills
· 7-15 years' experience working in a software development environment
· Prefer Bachelor’s degree in software development or related field
· Development experience with Angular and .NET is beneficial but not required
· Highly self-motivated and able to work effectively with virtual teams of diverse backgrounds
· Strong desire to learn and grow professionally
· A track record of following through on commitments; Excellent planning, organizational, and time management skills
Role & Responsibilities
About the Role:
We are seeking a highly skilled Senior Data Engineer with 5-7 years of experience to join our dynamic team. The ideal candidate will have a strong background in data engineering, with expertise in data warehouse architecture, data modeling, ETL processes, and building both batch and streaming pipelines. The candidate should also possess advanced proficiency in Spark, Databricks, Kafka, Python, SQL, and Change Data Capture (CDC) methodologies.
Key responsibilities:
Design, develop, and maintain robust data warehouse solutions to support the organization's analytical and reporting needs.
Implement efficient data modeling techniques to optimize performance and scalability of data systems.
Build and manage data lakehouse infrastructure, ensuring reliability, availability, and security of data assets.
Develop and maintain ETL pipelines to ingest, transform, and load data from various sources into the data warehouse and data lakehouse.
Utilize Spark and Databricks to process large-scale datasets efficiently and in real-time.
Implement Kafka for building real-time streaming pipelines and ensure data consistency and reliability.
Design and develop batch pipelines for scheduled data processing tasks.
Collaborate with cross-functional teams to gather requirements, understand data needs, and deliver effective data solutions.
Perform data analysis and troubleshooting to identify and resolve data quality issues and performance bottlenecks.
Stay updated with the latest technologies and industry trends in data engineering and contribute to continuous improvement initiatives.

We are looking for a Senior Data Engineer with strong expertise in GCP, Databricks, and Airflow to design and implement a GCP Cloud Native Data Processing Framework. The ideal candidate will work on building scalable data pipelines and help migrate existing workloads to a modern framework.
- Shift: 2 PM to 11 PM
- Work Mode: Hybrid (3 days a week) across Xebia locations
- Notice Period: Immediate joiners or those with a notice period of up to 30 days
Key Responsibilities:
- Design and implement a GCP Native Data Processing Framework leveraging Spark and GCP Cloud Services.
- Develop and maintain data pipelines using Databricks and Airflow for transforming Raw → Silver → Gold data layers; a hedged PySpark sketch follows this list.
- Ensure data integrity, consistency, and availability across all systems.
- Collaborate with data engineers, analysts, and stakeholders to optimize performance.
- Document standards and best practices for data engineering workflows.
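
For the Raw → Silver → Gold flow mentioned above, here is a hedged PySpark sketch of a medallion-style pipeline. The storage paths, columns and the choice of Delta tables are illustrative assumptions, not a description of the actual framework being built.

```python
# Sketch of a Raw -> Silver -> Gold (medallion) flow in PySpark. Paths,
# columns, and the use of Delta tables are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion_example").getOrCreate()

# Raw: land source files as-is
raw = spark.read.json("gs://example-bucket/raw/orders/")

# Silver: cleaned, typed, de-duplicated records
silver = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .filter(F.col("amount") > 0)
)
silver.write.format("delta").mode("overwrite").save("gs://example-bucket/silver/orders/")

# Gold: business-level aggregate for reporting
gold = silver.groupBy("customer_id").agg(
    F.sum("amount").alias("total_spend"),
    F.count("order_id").alias("order_count"),
)
gold.write.format("delta").mode("overwrite").save("gs://example-bucket/gold/customer_spend/")
```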
Required Experience:
- 7-8 years of experience in data engineering, architecture, and pipeline development.
- Strong knowledge of GCP, Databricks, PySpark, and BigQuery.
- Experience with Orchestration tools like Airflow, Dagster, or GCP equivalents.
- Understanding of Data Lake table formats (Delta, Iceberg, etc.).
- Proficiency in Python for scripting and automation.
- Strong problem-solving skills and collaborative mindset.
⚠️ Please apply only if you have not applied recently and are not currently in the interview process for any open roles at Xebia.
Looking forward to your response!
Best regards,
Vijay S
Assistant Manager - TAG

Real-Time Marketing Automation Built on Customer Data Platform (CDP) for Enterprises
Mandatory Skills:
- Java
- Kafka
- Spring Boot
- SQL / MySQL
- Algorithms
- Data Structures
Key Responsibilities:
- Design and Develop large scale sub-systems
- To periodically explore latest technologies (esp Open Source) and prototype sub-systems
- Be a part of the team that develops the next gen Customer Data Platform
- Build components to make the customer data platform more efficient and scalable
Qualifications:
- 2-4 years of relevant experience with Algorithms, Data Structures, & Optimizations in addition to Coding.
- Education: B.E/B-Tech/M-Tech/M.S/MCA in Computer Science or equivalent from premier institutes only
- Candidates with CGPA 9 or above will be preferred.
Skill Set:
- Good Aptitude/Analytical skills (emphasis will be on Algorithms, Data Structures,& Optimizations in addition to Coding)
- Good System design and Class design
- Good knowledge of Databases (Both SQL/NOSQL)
- Good knowledge of Kafka, Streaming Systems
- Good Knowledge of Java, Unit Testing
Soft Skills:
- Has appreciation of technology and its ability to create value in the CDP domain
- Excellent written and verbal communication skills
- Active & contributing team member
- Strong work ethic with demonstrated ability to meet and exceed commitments
- Others: Experience of having worked in a start-up is a plus

Real-Time Marketing Automation Built on Customer Data Platform (CDP) for Enterprises
Mandatory Skills:
- Java
- Kafka
- Spring Boot
- SQL / MySQL
- Algorithms
- Data Structures
Key Responsibilities:
- Design and Develop large scale sub-systems.
- To periodically explore latest technologies (esp. Open Source) and prototype sub-systems.
- Be a part of the team that develops the next-gen Targeting platform.
- Build components to make the customer data platform more efficient and scalable.
Qualifications:
- 0-2 years of relevant experience with Java, Algorithms, Data Structures, & Optimizations in addition to Coding.
- Education: B.E/B-Tech/M-Tech/M.S in Computer Science or IT from premier institutes.
- Candidates with CGPA 9 or above will be preferred.
Skill Set:
- Good Aptitude/Analytical skills (emphasis will be on Algorithms, Data Structures,& Optimizations in addition to Coding).
- Good knowledge of Databases - SQL, NoSQL.
- Knowledge of Unit Testing a plus.
Soft Skills:
- Has an appreciation of technology and its ability to create value in the marketing domain.
- Excellent written and verbal communication skills.
- Active & contributing team member.
- Strong work ethic with demonstrated ability to meet and exceed commitments.
- Others: Experience of having worked in a start-up is a plus.
Qualifications:
- Must have a Bachelor’s degree in computer science or equivalent.
- Must have at least 5 years' experience as an SDET.
- At least 1 year of leadership experience or experience managing a team.
Responsibilities:
- Design, develop and execute automation scripts using open-source tools.
- Troubleshooting any errors and streamlining the testing procedures.
- Writing and executing detailed test plans, test design & test cases covering feature, integration, regression, certification, system level testing as well as release validation in production.
- Identify, analyze and create detailed records of problems that appear during testing, such as software defects, bugs, functionality issues, and output errors, and work directly with software developers to find solutions and develop retesting procedures.
- Good time-management skills and commitment to meet deadlines.
- Stay up-to-date with new testing tools and test strategies.
- Driving technical projects and providing leadership in an innovative and fast-paced environment.
Requirements:
- Experience in Automation (API and UI) as well as Manual Testing of Web Applications; a minimal API-test sketch follows this list.
- Experience in frameworks like Playwright / Selenium Web Driver / Robot Framework / Rest-Assured.
- Must be proficient in Performance Testing tools like K6 / Gatling / JMeter.
- Must be proficient in Core Java / TypeScript and Java 17.
- Experience in JUnit-5.
- Good to have TypeScript experience.
- Good to have RPA Experience using Java or any other tools like Robot Framework / Automation Anywhere.
- Experience in SQL databases (e.g., MySQL, PostgreSQL) & NoSQL databases (e.g., MongoDB).
- Good understanding of software & systems architecture.
- Well acquainted with Agile Methodology, Software Development Life Cycle (SDLC), Software Test Life Cycle (STLC) and Automation Test Life Cycle.
- Strong experience in REST-based component testing, back-end, DB and microservices testing.
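
To give a flavour of the API automation listed above, here is a minimal test sketch using pytest and requests. The posting's frameworks are Java/TypeScript based (Rest-Assured, Playwright), but the pattern is the same; the base URL and the /users contract below are hypothetical.

```python
# Minimal API test sketch with pytest + requests. BASE_URL and the /users
# contract are hypothetical; real suites would add fixtures, auth and data setup.
import requests

BASE_URL = "https://api.example.com"

def test_create_and_fetch_user():
    created = requests.post(
        f"{BASE_URL}/users",
        json={"name": "Asha", "email": "asha@example.com"},
        timeout=10,
    )
    assert created.status_code == 201
    user_id = created.json()["id"]

    fetched = requests.get(f"{BASE_URL}/users/{user_id}", timeout=10)
    assert fetched.status_code == 200
    assert fetched.json()["email"] == "asha@example.com"
```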
Work Location: Jayanagar - Bangalore.


Job Description :
We are seeking a highly skilled and motivated Python Developer with 4 to 6 years of experience to join our dynamic development team. The ideal candidate will have expertise in Python programming and be proficient in building scalable, secure, and efficient applications. The role involves collaborating with cross-functional teams to design, develop, and maintain software solutions.
The core responsibilities for the job include the following :
1.Application Development :
- Write clean, efficient, and reusable Python code.
- Develop scalable backend solutions and RESTful APIs.
- Optimize applications for maximum speed and scalability.
2.Integration and Database Management :
- Integrate data storage solutions such as SQL databases (MySQL, PostgreSQL) or NoSQL databases (e.g., MongoDB); a short SQLAlchemy sketch follows this list.
- Work with third-party APIs and libraries to enhance application functionality.
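
As a hedged illustration of the database-integration point above, here is a small SQLAlchemy Core sketch against PostgreSQL; the connection string and the orders table are hypothetical.

```python
# Sketch: parameterised queries against PostgreSQL via SQLAlchemy Core.
# The connection string and orders table are hypothetical.
from sqlalchemy import create_engine, text

engine = create_engine("postgresql+psycopg2://app:***@localhost:5432/shop")

def orders_for_customer(customer_id: int):
    query = text(
        "SELECT id, total, created_at FROM orders "
        "WHERE customer_id = :cid ORDER BY created_at DESC"
    )
    with engine.connect() as conn:
        return [dict(row._mapping) for row in conn.execute(query, {"cid": customer_id})]

if __name__ == "__main__":
    print(orders_for_customer(42))
```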
3.Collaboration and Problem-Solving :
- Collaborate with front-end developers, designers, and project managers.
- Debug, troubleshoot, and resolve application issues promptly.
4.Code Quality and Documentation :
- Adhere to coding standards and best practices.
- Write comprehensive technical documentation and unit tests.
5.Innovation and Optimization :
- Research and implement new technologies and frameworks to improve software performance.
- Identify bottlenecks and devise solutions to optimize performance.
6.Requirements :
- Strong programming skills in Python with 4-6 years of hands-on experience.
- Proficiency in at least one Python web framework (e.g., Django, Flask, FastAPI).
- Experience with RESTful API development and integration.
- Knowledge of database design and management using SQL (MySQL, PostgreSQL) and NoSQL (MongoDB).
- Familiarity with cloud platforms (e.g., AWS, Azure, or Google Cloud) and containerization tools like Docker.
- Experience with version control systems like Git.
- Strong understanding of software development lifecycle (SDLC) and Agile methodologies.
- Knowledge of front-end technologies (e.g., HTML, CSS, JavaScript) is a plus.
- Experience with testing frameworks like Pytest or Unittest.
- Working knowledge of Java is a plus.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
7.Preferred Skills :
- Knowledge of data processing libraries such as Pandas or NumPy.
- Experience with machine learning frameworks like TensorFlow or PyTorch (optional but a plus).
- Familiarity with CI/CD pipelines and deployment practices.
- Experience in message brokers like RabbitMQ or Kafka.
8.Soft Skills :
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork abilities.
- Ability to manage multiple tasks and meet deadlines in a fast-paced environment.
- Willingness to learn and adapt to new technologies.
Job Description:
We are seeking a Tableau Developer with 5+ years of experience to join our Core Analytics team. The candidate will work on large-scale BI projects using Tableau and related tools.
Must Have:
- Strong expertise in Tableau Desktop and Server, including add-ons like Data and Server Management.
- Ability to interpret business requirements, build wireframes, and finalize KPIs, calculations, and designs.
- Participate in design discussions to implement best practices for dashboards and reports.
- Build scalable BI and Analytics products based on feedback while adhering to best practices.
- Propose multiple solutions for a given problem, leveraging toolset functionality.
- Optimize data sources and dashboards while ensuring business requirements are met.
- Collaborate with product, platform, and program teams for timely delivery of dashboards and reports.
- Provide suggestions and take feedback to deliver future-ready dashboards.
- Peer review team members’ dashboards, offering constructive feedback to improve overall design.
- Proficient in SQL, UI/UX practices, and Alation, with an understanding of good data models for reporting.
- Mentor less experienced team members.
Job Summary
We are seeking a skilled Snowflake Developer to design, develop, migrate, and optimize Snowflake-based data solutions. The ideal candidate will have hands-on experience with Snowflake, SQL, and data integration tools to build scalable and high-performance data pipelines that support business analytics and decision-making.
Key Responsibilities:
Develop and implement Snowflake data warehouse solutions based on business and technical requirements.
Design, develop, and optimize ETL/ELT pipelines for efficient data ingestion, transformation, and processing.
Write and optimize complex SQL queries for data retrieval, performance enhancement, and storage optimization (a short connector sketch follows this list).
Collaborate with data architects and analysts to create and refine efficient data models.
Monitor and fine-tune Snowflake query performance and storage optimization strategies for large-scale data workloads.
Ensure data security, governance, and access control policies are implemented following best practices.
Integrate Snowflake with various cloud platforms (AWS, Azure, GCP) and third-party tools.
Troubleshoot and resolve performance issues within the Snowflake environment to ensure high availability and scalability.
Stay updated on Snowflake best practices, emerging technologies, and industry trends to drive continuous improvement.
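
To make the SQL-optimisation responsibility concrete, here is a hedged sketch using the Snowflake Python connector to run a windowed analytical query; the account, warehouse, database and table names are placeholders rather than real objects.

```python
# Sketch: run an analytical query against Snowflake from Python. Account,
# warehouse, and table names are placeholders, not real objects.
import snowflake.connector

QUERY = """
    SELECT customer_id,
           order_date,
           SUM(amount) OVER (PARTITION BY customer_id ORDER BY order_date) AS running_spend
    FROM analytics.orders
    QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_date DESC) <= 5
"""

conn = snowflake.connector.connect(
    account="example-account", user="etl_user", password="***",
    warehouse="ANALYTICS_WH", database="PROD", schema="ANALYTICS",
)
try:
    cur = conn.cursor()
    for row in cur.execute(QUERY):
        print(row)
finally:
    conn.close()
```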
Qualifications:
Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
Experience:
6+ years of experience in data engineering, ETL development, or similar roles.
3+ years of hands-on experience in Snowflake development.
Technical Skills:
Strong proficiency in SQL, Snowflake Schema Design, and Performance Optimization.
Experience with ETL/ELT tools like dbt, Talend, Matillion, or Informatica.
Proficiency in Python, Java, or Scala for data processing.
Familiarity with cloud platforms (AWS, Azure, GCP) and integration with Snowflake.
Experience with data governance, security, and compliance best practices.
Strong analytical, troubleshooting, and problem-solving skills.
Communication: Excellent communication and teamwork abilities, with a focus on collaboration across teams.
Preferred Skills:
Snowflake Certification (e.g., SnowPro Core or Advanced).
Experience with real-time data streaming using tools like Kafka or Apache Spark.
Hands-on experience with CI/CD pipelines and DevOps practices in data environments.
Familiarity with BI tools like Tableau, Power BI, or Looker for data visualization and reporting.
What we Require
We are recruiting technical experts with the following core skills and hands-on experience:
Mandatory skills: Core Java, Microservices, AWS/Azure/GCP, Spring, Spring Boot
Hands-on experience with: Kafka, Redis, SQL, Docker, Kubernetes
Expert proficiency in designing both producer and consumer types of Rest services.
Expert proficiency in Unit testing and Code Quality tools.
Expert proficiency in ensuring code coverage.
Expert proficiency in understanding High-Level Design and translating that to Low-Level design
Hands-on experience working with NoSQL databases.
Experience working in an Agile development process - Scrum.
Experience working closely with engineers and software cultures.
Ability to think at a high level about product strategy and customer journeys.
Ability to produce low level design considering the paradigm that journeys will be extensible in the future and translate that into components that can be easily extended and reused.
Excellent communication skills to clearly articulate design decisions.


JioTesseract, a digital arm of Reliance Industries, is India's leading and largest AR/VR organization with the mission to democratize mixed reality for India and the world. We make products at the intersection of hardware, software, content and services, with a focus on making India the leader in spatial computing. We specialize in creating solutions in AR, VR and AI, with notable products such as JioGlass, JioDive, 360 Streaming, Metaverse, and AR/VR headsets for consumers and the enterprise space.
Mon-Fri role, in office, with excellent perks and benefits!
Position Overview
We are seeking a Software Architect to lead the design and development of high-performance robotics and AI software stacks utilizing NVIDIA technologies. This role will focus on defining scalable, modular, and efficient architectures for robot perception, planning, simulation, and embedded AI applications. You will collaborate with cross-functional teams to build next-generation autonomous systems.
Key Responsibilities:
1. System Architecture & Design
● Define scalable software architectures for robotics perception, navigation, and AI-driven decision-making.
● Design modular and reusable frameworks that leverage NVIDIA’s Jetson, Isaac ROS, Omniverse, and CUDA ecosystems.
● Establish best practices for real-time computing, GPU acceleration, and edge AI inference.
2. Perception & AI Integration
● Architect sensor fusion pipelines using LIDAR, cameras, IMUs, and radar with DeepStream, TensorRT, and ROS2.
● Optimize computer vision, SLAM, and deep learning models for edge deployment on Jetson Orin and Xavier.
● Ensure efficient GPU-accelerated AI inference for real-time robotics applications.
3. Embedded & Real-Time Systems
● Design high-performance embedded software stacks for real-time robotic control and autonomy.
● Utilize NVIDIA CUDA, cuDNN, and TensorRT to accelerate AI model execution on Jetson platforms.
● Develop robust middleware frameworks to support real-time robotics applications in ROS2 and Isaac SDK.
4. Robotics Simulation & Digital Twins
● Define architectures for robotic simulation environments using NVIDIA Isaac Sim & Omniverse.
● Leverage synthetic data generation (Omniverse Replicator) for training AI models.
● Optimize sim-to-real transfer learning for AI-driven robotic behaviors.
5. Navigation & Motion Planning
● Architect GPU-accelerated motion planning and SLAM pipelines for autonomous robots.
● Optimize path planning, localization, and multi-agent coordination using Isaac ROS Navigation.
● Implement reinforcement learning-based policies using Isaac Gym.
6. Performance Optimization & Scalability
● Ensure low-latency AI inference and real-time execution of robotics applications.
● Optimize CUDA kernels and parallel processing pipelines for NVIDIA hardware.
● Develop benchmarking and profiling tools to measure software performance on edge AI devices.
Required Qualifications:
● Master’s or Ph.D. in Computer Science, Robotics, AI, or Embedded Systems.
● Extensive experience (7+ years) in software development, with at least 3-5 years focused on architecture and system design, especially for robotics or embedded systems.
● Expertise in CUDA, TensorRT, DeepStream, PyTorch, TensorFlow, and ROS2.
● Experience in NVIDIA Jetson platforms, Isaac SDK, and GPU-accelerated AI.
● Proficiency in programming languages such as C++, Python, or similar, with deep understanding of low-level and high-level design principles.
● Strong background in robotic perception, planning, and real-time control.
● Experience with cloud-edge AI deployment and scalable architectures.
Preferred Qualifications
● Hands-on experience with NVIDIA DRIVE, NVIDIA Omniverse, and Isaac Gym
● Knowledge of robot kinematics, control systems, and reinforcement learning
● Expertise in distributed computing, containerization (Docker), and cloud robotics
● Familiarity with automotive, industrial automation, or warehouse robotics
● Experience designing architectures for autonomous systems or multi-robot systems.
● Familiarity with cloud-based solutions, edge computing, or distributed computing for robotics
● Experience with microservices or service-oriented architecture (SOA)
● Knowledge of machine learning and AI integration within robotic systems
● Knowledge of testing on edge devices with HIL and simulations (Isaac Sim, Gazebo, V-REP etc.)
The Opportunity:
Working within the Professional Services team as a Senior Implementation Consultant, you will be responsible for ensuring our customers are implemented efficiently and effectively. You will partner with our customers to take them on a journey from understanding their incentive compensation needs, through design and building within the product, testing and training, to the customer using Performio to pay their commissions. You will be able to work independently as well as part of small project teams to create and implement solutions for existing and new customers alike.
About Performio:
We are a small, but mighty company offering incentive compensation management software (ICM) that regularly beats the legacy incumbents in our industry. How? Our people and our product.
Our people are highly-motivated and engaged professionals with a clear set of values and behaviors. We prove these values matter to us by living them each day. This makes Performio both a great place to work and a great company to do business with.
But a great team alone is not sufficient to win. We also have a great product that balances the flexibility that large companies need in a sales commission solution with the great user experience buyers have come to expect from modern software. We are the only company in our industry that can make this claim and the market has responded favorably. We have a global customer base across Australia, Asia, Europe, and the US in 25+ industries that includes many well-known companies like News Corp, Johnson & Johnson and Vodafone.
What will you be doing:
● Effectively organize, lead and facilitate discussions with Customers and internal Stakeholders
● Elicit detailed requirements from Customers around their sales comp plans, data, and processes
● Deliver thorough, well-documented, detailed system design that ties back to Customer specific requirements
● Implement system designs in our SaaS ICM Software product: Setup data intake from the Customer, data integration / enrichment / transformation, buildout the compensation plans in the software tool, configure reporting & analytics according to customer requirements gleaned in the discovery process
● Conduct thorough functional testing to ensure a high-quality solution is delivered to customers
● Evaluate discrepancies in data and accurately identify root causes for any errors
● Lead the Customer Testing Support & System Handover activities, working closely with Customer Admins
● Train & support Customer Admins on the Performio product and the Customer’s specific implementation
● Be actively involved and support Customers through the system go-live process
● Support existing Customers by investigating and resolving issues
● Provide detailed and accurate estimates for potential Customer Projects & Change Requests
● Participate in activities, and provide feedback on internal processes and the standard solution
● Document best practices, define/develop reusable components; and advocate leverage/reuse of these to improve the Delivery Efficiencies for Performio & the Time to Value for Customers
● Participate in activities, and provide Customer focused feedback on improving and stabilizing the product
● Track work on projects using our PS automation software
What we’re looking for:
● 5+ Years of relevant working experience in professional services on implementation of ICM solutions using products like SAP Commissions/Callidus, Xactly, Varicent etc.
● Proficient in working with large datasets using Excel, Relational Database Tables, SQL, ETL or similar type of tools
● Good understanding of Incentive Compensation Management concepts
● Willing to take on ambiguous & complex challenges and solve them
● Loves and is good at multitasking and juggling multiple workstreams
● Effective and confident in communication - ability to interface with senior employees at our Customers
● Ability to lead projects / initiatives, coach & guide junior consultants to help them be successful in their roles
● Highly detail oriented - takes great notes, can document complex solutions in detail
● Some understanding of accounting, finance, and/or sales comp concepts
● Positive Attitude - optimistic, cares deeply about Company and Customers
● High Emotional IQ - shows empathy, listens when appropriate, creates healthy conversation, dynamic, humble / no ego
● Resourceful - has an "I'll figure it out" attitude if something they need doesn't exist
● Smart and curious to learn
● Programming experience using Python will be a plus
Why us:
We’re fast-growing, but still small enough for everyone to make a big impact (and have face time with the CEO). We have genuine care for our customers and the passion to transform our product into one where experience and ease of use is a true differentiator. Led by a strong set of company values, we play to win and are incentivized to do so through our employee equity plan.
We’ve adapted well to the work from home lifestyle, and take advantage of flexible working arrangements.
Our values speak strongly to who we really are. They mean a lot to us, and we use them every day to make decisions, and of course to hire great people!
● Play to win - we focus on results, have a bias to action and finish what we start
● Paint a clear picture - we’re clear, concise and communicate appropriately
● Be curious - we surface alternative solutions and consistently expand our knowledge
● Work as one - we all pitch in but also hold each other to account
● Do the right thing - we put what’s right for our customers over our own ego
Job Description for QA Engineer:
- 6-10 years of experience in ETL Testing, Snowflake, DWH Concepts.
- Strong SQL knowledge & debugging skills are a must.
- Experience in Azure and Snowflake Testing is plus
- Experience with Qlik Replicate and Compose tools (Change Data Capture) tools is considered a plus
- Strong data warehousing concepts and experience with ETL tools such as Talend Cloud Data Integration and Pentaho/Kettle
- Experience in JIRA and the Xray defect management tool is good to have.
- Exposure to the financial domain knowledge is considered a plus.
- Testing data readiness (data quality) and addressing code or data issues
- Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions
- Demonstrate strong collaborative experience across regions (APAC, EMEA and NA) to effectively and efficiently identify root cause of code/data issues and come up with a permanent solution
- Prior experience with State Street and Charles River Development (CRD) considered a plus
- Experience in tools such as PowerPoint, Excel, SQL
- Exposure to Third party data providers such as Bloomberg, Reuters, MSCI and other Rating agencies is a plus
Key Attributes include:
- Team player with professional and positive approach
- Creative, innovative and able to think outside of the box
- Strong attention to detail during root cause analysis and defect issue resolution
- Self-motivated & self-sufficient
- Effective communicator both written and verbal
- Brings a high level of energy with enthusiasm to generate excitement and motivate the team
- Able to work under pressure with tight deadlines and/or multiple projects
- Experience in negotiation and conflict resolution


Key Responsibilities:
- Design, build, and maintain scalable, real-time data pipelines using Apache Flink (or Apache Spark).
- Work with Apache Kafka (mandatory) for real-time messaging and event-driven data flows; a minimal consumer sketch follows this list.
- Build data infrastructure on Lakehouse architecture, integrating data lakes and data warehouses for efficient storage and processing.
- Implement data versioning and cataloging using Apache Nessie, and optimize datasets for analytics with Apache Iceberg.
- Apply advanced data modeling techniques and performance tuning using Apache Doris or similar OLAP systems.
- Orchestrate complex data workflows using DAG-based tools like Prefect, Airflow, or Mage.
- Collaborate with data scientists, analysts, and engineering teams to develop and deliver scalable data solutions.
- Ensure data quality, consistency, performance, and security across all pipelines and systems.
- Continuously research, evaluate, and adopt new tools and technologies to improve our data platform.
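
To illustrate the Kafka-based ingestion named above, here is a minimal consumer sketch using the kafka-python client. The topic, brokers and message shape are assumptions; a production pipeline would hand records to Flink or Spark jobs rather than a plain polling loop.

```python
# Minimal Kafka consumer sketch (kafka-python). Topic, brokers, and message
# schema are assumptions; real pipelines would feed Flink/Spark jobs instead
# of a simple polling loop.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders.events",
    bootstrap_servers=["localhost:9092"],
    group_id="data-platform-ingest",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # Placeholder transform: route by event type before landing in the lakehouse
    if event.get("type") == "order_created":
        print("ingest", event.get("order_id"))
```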
Skills & Qualifications:
- 3–6 years of experience in data engineering, building scalable data pipelines and systems.
- Strong programming skills in Python, Go, or Java.
- Hands-on experience with stream processing frameworks – Apache Flink (preferred) or Apache Spark.
- Mandatory experience with Apache Kafka for stream data ingestion and message brokering.
- Proficiency with at least one DAG-based orchestration tool like Airflow, Prefect, or Mage.
- Solid understanding and hands-on experience with SQL and NoSQL databases.
- Deep understanding of data lakehouse architectures, including internal workings of data lakes and data warehouses, not just usage.
- Experience working with at least one cloud platform, preferably AWS (GCP or Azure also acceptable).
- Strong knowledge of distributed systems, data modeling, and performance optimization.
Nice to Have:
- Experience with Apache Doris or other MPP/OLAP databases.
- Familiarity with CI/CD pipelines, DevOps practices, and infrastructure-as-code in data workflows.
- Exposure to modern data version control and cataloging tools like Apache Nessie.

Key Responsibilities:
• Design, develop, and maintain scalable and robust full-stack applications using cutting-edge technologies.
• Collaborate with product managers, designers, and other stakeholders to understand requirements and translate them into technical specifications.
• Implement front-end and back-end features with a focus on usability, performance, and security.
• Write clean, efficient, and maintainable code while following best practices and coding standards.
• Conduct code reviews, provide constructive feedback, and mentor junior developers to foster a culture of continuous improvement.
• Troubleshoot and debug issues, identify root causes, and implement solutions to ensure smooth application operation.
• Stay updated on emerging technologies and industry trends and apply them to enhance our software development process and capabilities.
Requirements & Skills:
• Bachelor's degree in Computer Science, Engineering, or a related field.
• 4+ years of professional experience in software development, with a focus on full-stack development.
• Proficiency in programming languages such as JavaScript (Node.js), Python, or Java.
• Experience with front-end frameworks/libraries such as React, Angular, or Vue.js.
• Solid understanding of back-end frameworks/libraries such as Express.js, Django, or Spring Boot.
• Experience with database systems (SQL and NoSQL) and ORMs (e.g., Sequelize, SQLAlchemy, Hibernate).
Job Title: Backend Developer
Location: In-Office, Bangalore, Karnataka, India
Job Summary:
We are seeking a highly skilled and experienced Backend Developer with a minimum of 1 year of experience in product building to join our dynamic and innovative team. In this role, you will be responsible for designing, developing, and maintaining robust backend systems that drive our applications. You will collaborate with cross-functional teams to ensure seamless integration between frontend and backend components, and your expertise will be critical in architecting scalable, secure, and high-performance backend solutions.
Annual Compensation: 6-10 LPA
Responsibilities:
- Design, develop, and maintain scalable and efficient backend systems and APIs using NodeJS.
- Architect and implement complex backend solutions, ensuring high availability and performance.
- Collaborate with product managers, frontend developers, and other stakeholders to deliver comprehensive end-to-end solutions.
- Design and optimize data storage solutions using relational databases (e.g., MySQL) and NoSQL databases (e.g., MongoDB, Redis).
- Promote a culture of collaboration, knowledge sharing, and continuous improvement.
- Implement and enforce best practices for code quality, security, and performance optimization.
- Develop and maintain CI/CD pipelines to automate build, test, and deployment processes.
- Ensure comprehensive test coverage, including unit testing, and implement various testing methodologies and tools to validate application functionality.
- Utilize cloud services (e.g., AWS, Azure, GCP) for infrastructure deployment, management, and optimization.
- Conduct system design reviews and contribute to architectural discussions.
- Stay updated with industry trends and emerging technologies to drive innovation within the team.
- Implement secure authentication and authorization mechanisms and ensure data encryption for sensitive information.
- Design and develop event-driven applications utilizing serverless computing principles to enhance scalability and efficiency.
Requirements:
- Minimum of 1 year of proven experience as a Backend Developer, with a strong portfolio of product-building projects.
- Extensive experience with JavaScript backend frameworks (e.g., Express, Socket) and a deep understanding of their ecosystems.
- Strong expertise in SQL and NoSQL databases (MySQL and MongoDB) with a focus on data modeling and scalability.
- Practical experience with Redis and caching mechanisms to enhance application performance.
- Proficient in RESTful API design and development, with a strong understanding of API security best practices.
- In-depth knowledge of asynchronous programming and event-driven architecture.
- Familiarity with the entire web stack, including protocols, web server optimization techniques, and performance tuning.
- Experience with containerization and orchestration technologies (e.g., Docker, Kubernetes) is highly desirable.
- Proven experience working with cloud technologies (AWS/GCP/Azure) and understanding of cloud architecture principles.
- Strong understanding of fundamental design principles behind scalable applications and microservices architecture.
- Excellent problem-solving, analytical, and communication skills.
- Ability to work collaboratively in a fast-paced, agile environment and lead projects to successful completion.


Level of skills and experience:
5 years of hands-on experience in using Python, Spark, and SQL.
Experienced in AWS Cloud usage and management.
Experience with Databricks (Lakehouse, ML, Unity Catalog, MLflow).
Experience using various ML models and frameworks such as XGBoost, LightGBM, and Torch.
Experience with orchestrators such as Airflow and Kubeflow.
Familiarity with containerization and orchestration technologies (e.g., Docker, Kubernetes).
Fundamental understanding of Parquet, Delta Lake, and other data file formats (a minimal Delta read sketch follows this list).
Proficiency with an IaC tool such as Terraform, CDK, or CloudFormation.
Strong written and verbal English communication skills; proficient in communicating with non-technical stakeholders.
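As a small illustration of the Delta Lake point above, here is a hedged PySpark sketch; it assumes an environment (such as Databricks) where the Delta format is already available, and the table path and column names are invented.

```python
# Minimal PySpark sketch: read a Delta table and compute a daily aggregate.
# The path and the "event_ts" column are placeholders, not real data assets.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("delta-demo").getOrCreate()

events = spark.read.format("delta").load("s3://example-bucket/events")  # hypothetical path

daily = (
    events
    .groupBy(F.to_date("event_ts").alias("event_date"))
    .agg(F.count("*").alias("event_count"))
)
daily.show()
```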

Required Skills and Experience:
- 5-7 years of experience in Software Development Engineering in Test (SDET).
- Proven experience in developing and maintaining automated test suites using Selenium.
- Strong proficiency in .NET programming languages (C#).
- Solid understanding of software testing principles and methodologies.
- Experience with SQL for database testing.
- Ability to analyze and troubleshoot complex software issues.
- Excellent communication and collaboration skills.
Nice-to-Have Skills:
- Experience writing manual test cases and creating comprehensive test documentation.
- Understanding of API testing and experience with API test automation.
- Familiarity with Azure DevOps and GitHub for version control and CI/CD.
- Experience in performance testing.
- Experience with Agile/Scrum development methodologies.

Join an innovative and groundbreaking cybersecurity startup focused on helping customers identify, mitigate, and protect against ever-evolving cyber threats. With the current geopolitical climate, organizations need to stay ahead of malicious threat actors as well as nation-state actors. Cybersecurity teams are getting overwhelmed, and they need intelligent systems to help them focus on addressing the biggest and current risks first.
We help organizations protect their assets and customer data by continuously evaluating the new threats and risks to their cloud environment. This will, in turn, help mitigate the high-priority threats quickly so that the engineers can spend more time innovating and providing value to their customers.
About the Engineering Team:
We have several decades of experience working in the security industry, having worked on some of the most cutting-edge security technology that helped protect millions of customers. We have built technologies from the ground up, partnered with the industry on innovation, and helped customers with some of the most stringent requirements. We leverage industry and academic experts and veterans for their unique insight. Security technology includes all facets of software engineering work from data analytics and visualization, AI/ML processing, highly distributed and available services with real-time monitoring, integration with various other services, including protocol-level work. You will be learning from some of the best engineering talent with multi-cloud expertise.
We are looking for a highly experienced Principal Software Engineer to lead the development and scaling of our backend systems. The ideal candidate will have extensive experience in distributed systems, database management, Kubernetes, and cloud technologies. As a key technical leader, you will design, implement, and optimize critical backend services, working closely with cross-functional teams to ensure system reliability, scalability, and performance.
Key Responsibilities:
- Architect and Develop Distributed Systems: Design and implement scalable, distributed systems using microservices architecture. Expertise in both synchronous (REST/gRPC) and asynchronous communication patterns (message queues, Kafka), with a strong emphasis on building resilient services that can handle large data and maintain high throughput. Craft cloud solutions tailored to specific needs, choosing appropriate AWS services and optimizing resource utilization to ensure performance and high availability.
- Database Architecture & Optimization: Lead efforts to design and manage databases with a focus on scaling, replication, query optimization, and managing large datasets.
- Performance & Reliability: Engage in continuous learning and innovation to improve customer satisfaction. Embrace accountability and respond promptly to service issues to maintain and enhance system health. Ensure the backend systems meet high standards for performance, reliability, and scalability, identifying and solving bottlenecks and architectural challenges by leveraging various observability tools (such as Prometheus and Grafana).
- Leadership & Mentorship: Provide technical leadership and mentorship to other engineers, guiding architecture decisions, reviewing code, and helping to build a strong engineering culture. Stay abreast of the latest industry trends in cloud technology, adopting best practices to continuously improve our services and security measures.
Key Qualifications:
- Experience: 8+ years of experience in backend engineering, with at least 5 years of experience in building distributed systems.
- Technical Expertise:
- Distributed Systems: Extensive experience with microservices architecture, working with both synchronous (REST, gRPC) and asynchronous patterns (SNS, SQS). Strong understanding of service-to-service authentication and authorization, API rate limiting, and other critical aspects of scalable systems.
- Database: Expertise in database technologies, with experience working with large datasets, optimizing queries, handling replication, and creating views for performance. Hands-on experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., DynamoDB, Cassandra), including designing data models that present consistent data views to the customer while data is being transformed, handling data migrations, and ensuring data integrity and high availability.
- Kubernetes: In-depth knowledge of Kubernetes, with experience deploying and managing services in Kubernetes clusters (EKS, AKS). Strong understanding of pods, services, networking, and scaling applications within Kubernetes environments.
- Golang: Proven experience using Golang as the primary programming language for backend development. Deep understanding of concurrency, performance optimization, and scalability in Golang applications.
- Cloud Technologies: Strong hands-on experience with AWS services (EC2, S3, DynamoDB, Lambda, RDS, EKS). Experience in designing and optimizing cloud-based architectures for large-scale distributed systems.
- Problem Solver: Strong problem-solving and debugging skills, with a proven ability to design and optimize complex systems.
- Leadership: Experience in leading engineering teams, guiding architectural decisions, and mentoring junior engineers.
Preferred Skills:
- Experience with infrastructure as code (Terraform, CloudFormation).
- Knowledge of GitHub-based CI/CD tools and best practices.
- Experience with monitoring and logging tools (Prometheus, Grafana, ELK).
- Cybersecurity experience.

Job Summary:
We are seeking a skilled Senior Tableau Developer to join our data team. In this role, you will design and build interactive dashboards, collaborate with data teams to deliver impactful insights, and optimize data pipelines using Airflow. If you are passionate about data visualization, process automation, and driving business decisions through analytics, we want to hear from you.
Key Responsibilities:
- Develop and maintain dynamic Tableau dashboards and visualizations to provide actionable business insights.
- Partner with data teams to gather reporting requirements and translate them into effective data solutions.
- Ensure data accuracy by integrating various data sources and optimizing data pipelines.
- Utilize Airflow for task orchestration, workflow scheduling, and monitoring (a minimal DAG sketch follows this list).
- Enhance dashboard performance by streamlining data processing and improving query efficiency.
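To illustrate the Airflow responsibility above, here is a minimal DAG sketch; the DAG id, schedule, and refresh function are illustrative only and not part of the actual role.

```python
# Minimal Airflow DAG sketch for a daily dashboard-refresh workflow.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def refresh_extract():
    # Placeholder: re-run the SQL extract that feeds the Tableau data source.
    print("extract refreshed")


with DAG(
    dag_id="tableau_refresh_demo",   # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="refresh_extract", python_callable=refresh_extract)
```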
Requirements:
- 5+ years of hands-on experience in Tableau development.
- Proficiency in Airflow for building and automating data pipelines.
- Strong skills in data transformation, ETL processes, and data modeling.
- Solid understanding of SQL and database management.
- Excellent problem-solving skills and the ability to work collaboratively across teams.
Nice to Have:
- Experience with cloud platforms like AWS, GCP, or Azure.
- Familiarity with programming languages such as Python or R.
Why Join Us?
- Work on impactful data projects with a talented and collaborative team.
- Opportunity to innovate and shape data visualization strategies.
- Competitive compensation and professional growth opportunities

Required Skills and Qualifications:
- Proficiency in AWS Cloud services
- Strong experience with SQL for data manipulation and analysis.
- Hands-on programming experience in Python or PySpark.
- Working knowledge of Snowflake, including data modeling, performance tuning, and security configurations (a minimal connection sketch follows this list).
- Familiarity with CI/CD pipelines and version control (e.g., Git) is a plus.
- Excellent problem-solving and communication skills.
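For the Snowflake point above, a minimal sketch using the snowflake-connector-python package is shown below; every connection parameter and the queried table are placeholders.

```python
# Minimal Snowflake query sketch; all identifiers are placeholders.
import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account="xy12345",        # hypothetical account identifier
    user="ANALYTICS_USER",    # hypothetical user
    password="***",           # use a secrets manager in practice
    warehouse="ANALYTICS_WH",
    database="SALES_DB",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT region, COUNT(*) AS order_count FROM orders GROUP BY region")
    for region, order_count in cur.fetchall():
        print(region, order_count)
finally:
    conn.close()
```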
Note: one face-to-face (F2F) round is mandatory, and as per the process, you will need to visit the office for this.

Position: UiPath Developer
Experience: 3-7 years
Key Responsibilities:
1. Develop and implement automation solutions using UiPath.
2. Design, develop, test, and deploy RPA bots for process automation.
3. Write and optimize SQL queries, including joins, to manage and manipulate data effectively.
4. Develop scripts using Python, VB, .NET, or JavaScript to enhance automation capabilities.
5. Work with business stakeholders to analyze and optimize automation workflows.
6. Troubleshoot and resolve issues in RPA processes and scripts.
7. Ensure adherence to best practices in automation development and deployment.
Required Skills & Experience:
1. 3-7 years of experience in RPA development with UiPath.
2. Strong expertise in SQL, including writing complex queries and joins (illustrated in the sketch below).
3. Hands-on experience with at least one scripting language: Python, VB, .NET, or JavaScript.
4. Understanding of RPA best practices, exception handling, and performance optimization.
5. Experience integrating UiPath with databases and other applications.
6. Strong problem-solving and analytical skills.
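As a small, self-contained illustration of the SQL-joins requirement above, the sketch below uses Python's built-in sqlite3 module; the tables and rows are invented for the example.

```python
# Self-contained SQL join example using sqlite3 (ships with Python).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, dept_id INTEGER);
    CREATE TABLE departments (id INTEGER PRIMARY KEY, name TEXT);
    INSERT INTO departments VALUES (1, 'Automation'), (2, 'Finance');
    INSERT INTO employees VALUES (1, 'Asha', 1), (2, 'Ravi', 2), (3, 'Meera', NULL);
    """
)

# LEFT JOIN keeps employees even when they have no matching department.
rows = conn.execute(
    """
    SELECT e.name, COALESCE(d.name, 'Unassigned') AS department
    FROM employees e
    LEFT JOIN departments d ON d.id = e.dept_id
    ORDER BY e.name
    """
).fetchall()

for name, department in rows:
    print(name, department)
```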
Preferred Qualifications:
1. UiPath certification is a plus.
2. Experience working in Agile environments.
3. Knowledge of APIs and web automation (AA).

HIRING FOR OUR CLIENT - PERFORMIO
https://www.linkedin.com/company/performio/
About Performio
Headquartered in Irvine, California, and with offices in San Francisco and Melbourne, Performio continues to offer sales performance management software for businesses looking to automate their sales compensation calculations and provide increased transparency to their sales reps.
Used by large global enterprises such as Veeva, GrubHub, Johnson & Johnson, and Vodafone - as well as growing mid-market companies - Performio is a new breed of sales compensation software that combines the enterprise-grade functionality that you need with the ease of use you’ve come to expect from modern software applications.
What’s the opportunity?
As Senior/Lead Software Engineer, you will play a significant role in turning our product vision into reality, while working within the Product Engineering team to add value across our product. You’ll draw on your experience to help us establish contemporary, web application software that is highly scalable, durable, and based on current best practices.
Our company is built on good practice and high standards. Passion for our customers and a willingness to put their needs first is at the centre of everything we do. We have a long history of going the extra mile to make sure our customers are happy. We’re looking for someone to join our team, ensuring that our systems scale effectively with our growth, and considering not only where we want to be but how we will get there.
Our product is written mainly in Java (Spring, Hibernate) and ReactJS with our Design System, Electric. Our architecture is a combination of microservices, decoupled AWS service architectures, and a well-maintained monolith. The product is deployed on AWS across multiple regions. We use tools like Docker and Buildkite and deploy our systems and monitor our technology using CloudWatch, New Relic and SquadCast. We’re looking for someone to help us evolve how our systems operate together while we grow our team and capability.
What will I be doing?
- Creating change in a complex system. Working within our product stream, making well-considered decisions around patterns, principles, frameworks, languages, and tools, thinking through and mitigating potential cascading impacts of those changes.
- Designing and developing well-architected systems. Understand and contribute to our product source code and cloud infrastructure.
- Designing holistically, delivering iteratively. Work with the team to break down system-wide architecture recommendations into small, intelligently planned increments for delivery.
- Advocate for technology needs. Translate technology risk into opportunity during product and technology roadmap discussions.
- Coach and mentor. Assist with career development of less experienced staff on our teams.
- Putting Customers First. A regular rotation on support for the systems we develop.
What we’re looking for
- Demonstrated experience as a software engineer, with 4-8 years of experience in technology roles
- Experience working on complex systems and cloud architectures
- Significant experience across the full stack:
- The Java programming language and frameworks such as Spring & SpringBoot
- Front-end Javascript frameworks such as ReactJS
- Good knowledge of AWS services, design patterns and practices - ideally certified but if not, we’ll help you get there
- Some knowledge of optimising databases and SQL queries for high performance
- Experience and keen understanding of the value of working in agile teams
- A “quality-first” mindset, with experience working in continuous integration environments and supporting the systems you contribute to
- Highly effective at communicating, and comfortable whiteboarding design ideas with teams of engineers, product managers, and business analysts
- Desire to challenge the status quo and maturity to know when to compromise
- Respect for other team members and a highly collaborative approach to working and learning together

Job Description:
Please find the details below:
Experience - 5+ years
Location - Bangalore
Role Overview
We are seeking a skilled Python Data Engineer with expertise in designing and implementing data solutions using the AWS cloud platform. The ideal candidate will be responsible for building and maintaining scalable, efficient, and secure data pipelines while leveraging Python and AWS services to enable robust data analytics and decision-making processes.
Key Responsibilities
- Design, develop, and optimize data pipelines using Python and AWS services such as Glue, Lambda, S3, EMR, Redshift, Athena, and Kinesis.
- Implement ETL/ELT processes to extract, transform, and load data from various sources into centralized repositories (e.g., data lakes or data warehouses); a minimal sketch follows this list.
- Collaborate with cross-functional teams to understand business requirements and translate them into scalable data solutions.
- Monitor, troubleshoot, and enhance data workflows for performance and cost optimization.
- Ensure data quality and consistency by implementing validation and governance practices.
- Work on data security best practices in compliance with organizational policies and regulations.
- Automate repetitive data engineering tasks using Python scripts and frameworks.
- Leverage CI/CD pipelines for deployment of data workflows on AWS.
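To make the ETL/ELT responsibility above concrete, here is a minimal boto3 + pandas sketch; the bucket names, object keys, and column names are placeholders, and the transformation is deliberately trivial.

```python
# Minimal S3 -> transform -> S3 sketch with boto3 and pandas.
import io

import boto3
import pandas as pd

s3 = boto3.client("s3")

# Extract: read a raw CSV object (hypothetical bucket/key).
raw = s3.get_object(Bucket="example-raw-bucket", Key="orders/2024-01-01.csv")
orders = pd.read_csv(raw["Body"])

# Transform: assumes "order_date" and "amount" columns exist in the raw file.
orders["order_date"] = pd.to_datetime(orders["order_date"])
daily_totals = orders.groupby(orders["order_date"].dt.date)["amount"].sum().reset_index()

# Load: write the curated output to a separate prefix.
buffer = io.StringIO()
daily_totals.to_csv(buffer, index=False)
s3.put_object(
    Bucket="example-curated-bucket",
    Key="orders/daily_totals/2024-01-01.csv",
    Body=buffer.getvalue(),
)
```

At scale the same pattern is usually expressed as a Glue job or a Lambda triggered by S3 events rather than a standalone script.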
Required Skills and Qualifications
- Professional Experience: 5+ years of experience in data engineering or a related field.
- Programming: Strong proficiency in Python, with experience in libraries like pandas, PySpark, or boto3.
- AWS Expertise: Hands-on experience with core AWS services for data engineering, such as:
- AWS Glue for ETL/ELT.
- S3 for storage.
- Redshift or Athena for data warehousing and querying.
- Lambda for serverless compute.
- Kinesis or SNS/SQS for data streaming.
- IAM Roles for security.
- Databases: Proficiency in SQL and experience with relational (e.g., PostgreSQL, MySQL) and NoSQL (e.g., DynamoDB) databases.
- Data Processing: Knowledge of big data frameworks (e.g., Hadoop, Spark) is a plus.
- DevOps: Familiarity with CI/CD pipelines and tools like Jenkins, Git, and CodePipeline.
- Version Control: Proficient with Git-based workflows.
- Problem Solving: Excellent analytical and debugging skills.
Optional Skills
- Knowledge of data modeling and data warehouse design principles.
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Familiarity with containerization (e.g., Docker) and orchestration (e.g., Kubernetes).
- Exposure to other programming languages like Scala or Java.
Role Summary:
Lead and drive development in the BI domain using the Tableau ecosystem, with deep technical and BI ecosystem knowledge. The resource will be responsible for dashboard design, development, and delivery of BI services using the Tableau ecosystem.
Key functions & responsibilities:
· Communication & interaction with Project Manager to understand the requirement
· Dashboard design, development, and deployment using the Tableau ecosystem
· Ensure delivery within given time frame while maintaining quality
· Stay up to date with current tech and bring relevant ideas to the table
· Proactively work with the Management team to identify and resolve issues
· Performs other related duties as assigned or advised
· He/she should be a leader that sets the standard and expectations through example in his/her conduct, work ethic, integrity and character
· Contribute to dashboard design, R&D, and project delivery using Tableau
Experience:
· Overall 3-7 Years of experience in DWBI development projects, having worked on BI and Visualization technologies (Tableau, Qlikview) for at least 3 years.
· At least 3 years of experience covering Tableau implementation lifecycle including hands-on development/programming, managing security, data modeling, data blending, etc.
Technology & Skills:
· Hands-on expertise in Tableau administration and maintenance
· Strong working knowledge and development experience with Tableau Server and Desktop
· Strong knowledge in SQL, PL/SQL and Data modelling
· Knowledge of databases like Microsoft SQL Server, Oracle, etc.
· Exposure to alternative visualization technologies like Qlikview, Spotfire, Pentaho, etc.
· Good communication and analytical skills, with excellent creative and conceptual thinking abilities
· Superior organizational skills, attention to detail and quality, and strong communication skills, both verbal and written


Job Responsibilities :
- Work closely with product managers and other cross functional teams to help define, scope and deliver world-class products and high quality features addressing key user needs.
- Translate requirements into system architecture and implement code while considering performance issues of dealing with billions of rows of data and serving millions of API requests every hour.
- Ability to take full ownership of the software development lifecycle from requirement to release.
- Writing and maintaining clear technical documentation enabling other engineers to step in and deliver efficiently.
- Embrace design and code reviews to deliver quality code.
- Play a key role in taking Trendlyne to the next level as a world-class engineering team.
- Develop and iterate on best practices for the development team, ensuring adherence through code reviews.
- As part of the core team, you will be working on cutting-edge technologies like AI products, online backtesting, data visualization, and machine learning.
- Develop and maintain scalable, robust backend systems using Python and Django framework.
- Proficient understanding of the performance of web and mobile applications.
- Mentor junior developers and foster skill development within the team.
Job Requirements :
- 2+ years of experience with Python and Django.
- Strong understanding of relational databases like PostgreSQL or MySQL and Redis (a minimal ORM sketch follows this list).
- (Optional) : Experience with web front-end technologies such as JavaScript, HTML, and CSS
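As a minimal illustration of the Django + relational-database requirement above, the sketch below defines a model and a query that pushes sorting and limiting into PostgreSQL/MySQL; the model and field names are invented, and it assumes an already configured Django project.

```python
# Minimal Django sketch: a model plus a queryset that keeps the heavy lifting
# in the database. Names are illustrative, not from the actual product.
from django.db import models


class StockPrice(models.Model):
    symbol = models.CharField(max_length=16, db_index=True)
    traded_on = models.DateField()
    close = models.DecimalField(max_digits=12, decimal_places=2)


def latest_close(symbol: str):
    # The ORM turns this into one SQL query with ORDER BY ... LIMIT 1.
    return (
        StockPrice.objects.filter(symbol=symbol)
        .order_by("-traded_on")
        .values_list("close", flat=True)
        .first()
    )
```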
Who are we :
Trendlyne is a Series-A products startup in the financial markets space with cutting-edge analytics products aimed at businesses in stock markets and mutual funds.
Our founders are IIT + IIM graduates, with strong tech, analytics, and marketing experience. We have top finance and management experts on the Board of Directors.
What do we do :
We build powerful analytics products in the stock market space that are best in class. Organic growth in B2B and B2C products has already made the company profitable. We deliver 900 million+ API calls every month to B2B customers. Trendlyne analytics deals with hundreds of millions of rows of data to generate insights, scores, and visualizations that are an industry benchmark.


Job Responsibilities :
- Work closely with product managers and other cross functional teams to help define, scope and deliver world-class products and high quality features addressing key user needs.
- Translate requirements into system architecture and implement code while considering performance issues of dealing with billions of rows of data and serving millions of API requests every hour.
- Ability to take full ownership of the software development lifecycle from requirement to release.
- Writing and maintaining clear technical documentation enabling other engineers to step in and deliver efficiently.
- Embrace design and code reviews to deliver quality code.
- Play a key role in taking Trendlyne to the next level as a world-class engineering team.
- Develop and iterate on best practices for the development team, ensuring adherence through code reviews.
- As part of the core team, you will be working on cutting-edge technologies like AI products, online backtesting, data visualization, and machine learning.
- Develop and maintain scalable, robust backend systems using Python and Django framework.
- Proficient understanding of the performance of web and mobile applications.
- Mentor junior developers and foster skill development within the team.
Job Requirements :
- 1+ years of experience with Python and Django.
- Strong understanding of relational databases like PostgreSQL or MySQL and Redis.
- (Optional) : Experience with web front-end technologies such as JavaScript, HTML, and CSS
Who are we :
Trendlyne is a Series-A products startup in the financial markets space with cutting-edge analytics products aimed at businesses in stock markets and mutual funds.
Our founders are IIT + IIM graduates, with strong tech, analytics, and marketing experience. We have top finance and management experts on the Board of Directors.
What do we do :
We build powerful analytics products in the stock market space that are best in class. Organic growth in B2B and B2C products has already made the company profitable. We deliver 900 million+ API calls every month to B2B customers. Trendlyne analytics deals with hundreds of millions of rows of data to generate insights, scores, and visualizations that are an industry benchmark.
In this role, you will be responsible for the day-to-day maintenance of engineering systems. You will also often act as the first line of support for internal applications while fixing bugs and developing and deploying small components of code.
High Impact production issues often require coordination between multiple Engineering, Infrastructure and Product groups, so you get to experience a breadth of impact with various groups.
What will you do at Fynd?
- Supporting the adoption team during setup, maintenance, and troubleshooting processes.
- Analyzing existing systems operations and developing preventative maintenance strategies.
- Identifying potential problems and notifying the relevant stakeholders in a timely manner.
- Excellent communication skills, both verbal and written, with the ability to convey technical concepts in a clear and understandable manner.
- Problem-solving abilities, customer-focused approach to handling inquiries and resolving issues.
- Ability to multitask, prioritize, and manage time effectively in a fast-paced customer support environment.
- Attention to detail and accuracy in documenting user interactions and resolutions.
- Ability to work independently as well as collaboratively in a team-oriented setting.
- Work closely with business in managing day to day issues, and resolve user queries.
- Knowledge of Ticketing Systems: Freshdesk, Hubspot, Zoho Desk, Jira Service Management.
- High availability for fast response to customers.
- Ability to join the dots around multiple events occurring concurrently and spot patterns.
- Technology experience with retail industry products or custom solutions (retail fashion & lifestyle, both physical and digital commerce channels)
- Experience in SaaS B2B software companies is an advantage.


.NET Lead (2+ years of lead experience required) - Experience: 8+ years
.NET Developer - Experience: 6+ years
Skills: .NET Core, C#, SQL, and Angular.
Notice period: Immediate joiners
Location: Bangalore.

Lead Data Scientist
Job Description
As a Lead Data Scientist, you will be responsible for identifying, scoping, and delivering data science projects with a strong emphasis on causal inference.
The ability to take large, scientifically complex projects and break them down into manageable hypotheses and experiments to inform functional specifications, and then deliver features in a successful and timely manner, is expected. Maturity, high judgment, negotiation skills, ability to influence are essential to success in this role.
We will rely on your experience in successfully delivering projects that significantly, positively, and measurably affect the business. You should also have experience in large-scale data science projects.
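Because the role stresses causal inference, here is a hedged, self-contained sketch of one standard technique: estimating an average treatment effect with inverse-propensity weighting on synthetic data. The data-generating process and numbers are invented purely for illustration.

```python
# Inverse-propensity-weighted (IPW) estimate of an average treatment effect,
# run on synthetic data where the true effect is 2.0.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000
x = rng.normal(size=(n, 2))                                  # observed confounders
true_propensity = 1 / (1 + np.exp(-(x[:, 0] - 0.5 * x[:, 1])))
treated = rng.binomial(1, true_propensity)                   # treatment depends on x
outcome = 2.0 * treated + x[:, 0] + rng.normal(size=n)       # true effect = 2.0

# Model the propensity score, then reweight outcomes by the inverse propensity.
p_hat = LogisticRegression().fit(x, treated).predict_proba(x)[:, 1]
ate = np.mean(treated * outcome / p_hat) - np.mean((1 - treated) * outcome / (1 - p_hat))
print(f"IPW estimate of the average treatment effect: {ate:.2f}")  # close to 2.0
```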
What You'll Do
• Work closely with the Tech team to convert proofs of concept (POCs) into fully scalable products.
• Actively identify existing and new features which could benefit from predictive modelling and productionization of predictive models
• Actively identify and resolve strategic issues that may impair the team’s ability to meet strategic, scientific, and technical goals
• Contribute to research and development of AI/ML techniques and technology that fuels the business innovation and growth of -
• Work closely with engineers to deploy models in production both in real time and in batch process and systematically track model performance
• Encourage team building, best practices sharing especially with more junior team members
Requirements & Skills
• Strong problem solving skills with an emphasis on product development
• Master’s or PhD degree in Statistics, Mathematics, Computer Science, or another quantitative field
• More than 8 years of experience in practicing machine learning and data science in business or a related field, with a focus on statistical analysis
• Strong background in statistics and causal inference.
• Proficient in statistical programming languages such as Python or R, and data visualization tools (e.g., Tableau, Power BI).
• Strong experience with machine learning algorithms and statistical modeling techniques
• Strong computing/programming skills; Proficient in Python, Spark, SQL, Linux shell script.
• Proven ability to work with large datasets and familiarity with big data technologies (e.g., Hadoop, Spark, SQL) is a plus.
• Experience with end-to-end feature development (owning feature definition, roadmap development, and experimentation).
• Effective leadership and communication skills, with the ability to inspire and guide a team.
• Excellent problem solving and critical thinking capabilities.
• Strong experience in Cloud technology.
Functional Consultant
Bangalore / Full-Time
Job Description
We build Enterprise-Scale AI/ML-powered products for Manufacturing, Sustainability and Supply Chain. We are looking for a Functional Consultant to help us gather requirements, prepare detailed analyses, understand data from various data sources as part of data discovery, and assist implementation consultants in building data pipelines and data analysis on the standard product for our esteemed customers. By joining our team, you will take part in various projects that involve working with clients to successfully implement and integrate -'s products, software, or systems into their existing infrastructure or the - cloud.
What You'll Do
• Preparation of detailed analysis of business processes, including client interviews, current flow validation, and development of automated flow charts detailing process steps, hand offs and decision points
• Providing applications support to customers, answering complex questions on function and usage of product
• Serving as primary support liaison between company and customer, conveying customer feedback to application development staff
• Delivering tasks on time and in line with client expectations
• Analyze client requirements and help implementation consultants configure the product or system accordingly. Communicate with the product owner about new functionalities to be developed in the product, if required
• Managing, developing and encouraging junior members of the team
• Conduct training sessions and workshops to educate clients on the effective use of the product or system
• Foster strong relationships with clients, acting as a trusted advisor and maintaining regular communication to ensure client satisfaction. Identify upselling and cross-selling opportunities based on client needs and the company's product portfolio
There are no shortcuts to achieving greatness: It is a lot of work. But once you are at the top, you will enjoy the view.
Requirements & Skills
• Overall 10+ years of experience, with a Bachelor's/Master's degree or equivalent practical experience
• 4-6 years of experience in a supply chain, manufacturing, or logistics department as a domain specialist
• Deep experience with at least one major ERP supply chain solution, preferably SAP
• Strong skills in developing and presenting clear and concise solution briefings
• Exceptionally strong verbal and written communication skills, along with good interpersonal and organizational skills
Looking for an experienced Business Analyst with a strong data analytics background in the Supply Chain domain who is willing to learn new technologies. This role is responsible for developing and enhancing products in supply chain, manufacturing, and unstructured data.
What You'll Do
• Discuss with customers/SME to understand business processes and formulate solutions to intricate requirements.
• Work with large datasets from disparate systems to derive trends, patterns, and correlations that integrate into data-driven solutions.
• Play a key role in enhancing the value of products and services.
• Produce quality deliverables such as high-level and detailed designs, including data models, API specifications, wireframes, program logic, and user requirement documents.
• Collaborate with the development team to ensure development is carried out as per the specifications. Guide the junior development team with creative logic to solve complex requirements.
• Create test scripts and carry out functional testing, generate test results and ensure development is done as per expectations.
• Create the WBS for new projects/activities with task timelines, milestones, and business outcomes. Estimate the effort required for each task.
• Ensure tasks are delivered as per the project timelines. Contribute to the achievement of project KPIs.
• Follow project management process as defined for the project (JIRA / any other application).
• Attend daily or periodic meetings with the clients; present status, discuss requirements and challenges, and manage situations.
• Establish good relations with the clients and look for new business opportunities identified by gaps or pain points faced by them.
• Acting as a single source of contact for the business requirements for the delivery team.
• Prioritize, manage, and deliver multiple tasks to aligned timelines, with optimal quality, and optimize customer satisfaction.
• Work in a distributed team environment across multiple time-zones.
Requirements
• Bachelor's degree or equivalent practical experience
• 5+ years of experience in the role of a Business Analyst.
• A positive and welcoming attitude to undertake multiple complex tasks despite time constraints and effectively managing them.
• Self-directed, self-driven and proactive approach to solve problems leveraging available resources.
• Possess an analytical way of thinking and troubleshooting issues.
• Excellent client-facing, interpersonal and communication skills.
• Hands-on with SQL Data analysis.
• Proficient in Excel, MS Word, and MS PowerPoint.
• Proficient in Jira - decompose business processes into epics, user stories, acceptance criteria and manage product backlog.
• Ability to create good project documentation.
Nice to Have
• Functional experience in SAP/JDE in Supply chain: production/procurement/sales and distribution.
• A good understanding of business processes in cross functional areas.
• Experience and knowledge in analyzing supply chain tables/data of at least one ERP system, SAP or JDE. Knowledge in how tables can be joined to bring out a business reporting requirement.
• E2E ERP implementation experience is an added advantage.
• An understanding of Visualization tools (e.g. Tableau/Power BI/Graph UI)
• Experience on Consumer, Pharma/Life-Sciences industry.
• Basic ability to understand technical code (ABAP, Python, etc.).
Position : Software Engineer (Java Backend Engineer)
Experience : 4+ Years
📍 Location : Bangalore, India (Hybrid)
Mandatory Skills : Java 8+ (Advanced Features), Spring Boot, Apache Spark (Spark Streaming), SQL & Cosmos DB, Git, Maven, CI/CD (Jenkins, GitHub), Azure Cloud, Agile Scrum.
About the Role :
We are seeking a highly skilled Backend Engineer with expertise in Java, Spark, and microservices architecture to join our dynamic team. The ideal candidate will have a strong background in object-oriented programming, experience with Spark Streaming, and a deep understanding of distributed systems and cloud technologies.
Key Responsibilities :
- Design, develop, and maintain highly scalable microservices and optimized RESTful APIs using Spring Boot and Java 8+.
- Implement and optimize Spark Streaming applications for real-time data processing.
- Utilize advanced Java 8 features, including:
- Functional interfaces & Lambda expressions
- Streams and Parallel Streams
- CompletableFuture & Concurrency API improvements
- Enhanced Collections APIs
- Work with relational (SQL) and NoSQL (Cosmos DB) databases, ensuring efficient data modeling and retrieval.
- Develop and manage CI/CD pipelines using Jenkins, GitHub, and related automation tools.
- Collaborate with cross-functional teams, including Product, Business, and Automation, to deliver end-to-end product features.
- Ensure adherence to Agile Scrum practices and participate in code reviews to maintain high-quality standards.
- Deploy and manage applications in Azure Cloud environments.
Minimum Qualifications:
- BS/MS in Computer Science or a related field.
- 4+ Years of experience developing backend applications with Spring Boot and Java 8+.
- 3+ Years of hands-on experience with Git for version control.
- Strong understanding of software design patterns and distributed computing principles.
- Experience with Maven for building and deploying artifacts.
- Proven ability to work in Agile Scrum environments with a collaborative team mindset.
- Prior experience with Azure Cloud Technologies.


Job Description: .NET Core Developer with C# and Angular
We are looking for a skilled .NET Core Developer with experience in C# and Angular to design, develop, and maintain web applications. You will work on both the front-end and back-end, delivering high-quality solutions in a collaborative environment.
Key Responsibilities:
- Develop and maintain web applications using .NET Core, C#, and Angular.
- Design and implement RESTful APIs and integrate with front-end components.
- Ensure application performance, scalability, and security.
- Collaborate with cross-functional teams to define and deliver software features.
- Troubleshoot, debug, and optimize code.
Share CV to Thirega@ vysystems dot com - WhatsApp - 91Five0033Five2Three

Applications Engineer Responsibilities:
Collaborating on software development projects with the engineering, sales, and customer services departments.
Liaising with clients and incorporating user-defined needs and feedback into application designs.
Writing code and scripts for applications, as well as installing, maintaining, and testing applications.
Providing clients with technical support. Documenting development processes, procedures, and application version histories.
Keeping up with advancements in application engineering and new technologies.
Applications Engineer Requirements:
Bachelor's degree in computer science, information technology, information systems, or similar.
2-5 years of experience as an applications engineer.
Previous experience in sales or customer services will be advantageous.
Advanced proficiency in programming languages, such as Java, SQL, .NET, and C.
Extensive experience in deploying, optimizing, and maintaining software.
Excellent analytical and problem-solving skills.
Exceptional customer services and interpersonal skills.
Superb collaboration and communication skills.
Great organizational and time management skills.
About the company
KPMG International Limited, commonly known as KPMG, is one of the largest professional services networks in the world, recognized as one of the "Big Four" accounting firms alongside Deloitte, PricewaterhouseCoopers (PwC), and Ernst & Young (EY). KPMG provides a comprehensive range of professional services primarily focused on three core areas: Audit and Assurance, Tax Services, and Advisory Services. Their Audit and Assurance services include financial statement audits, regulatory audits, and other assurance services. The Tax Services cover various aspects such as corporate tax, indirect tax, international tax, and transfer pricing. Meanwhile, their Advisory Services encompass management consulting, risk consulting, deal advisory, and other related services.
Apply through this link- https://forms.gle/qmX9T7VrjySeWYa37
Job Description
Position: Data Engineer
Experience: 5+ years of relevant experience
Location: WFO (3 days a week) - Pune (Kharadi), NCR (Gurgaon), Bangalore
Employment Type: Contract for 3 months; can be extended based on performance and future requirements
Skills Required:
• Proficiency in SQL, AWS, and data integration tools like Airflow or equivalent. Knowledge of tools like JIRA, GitHub, etc.
• Data Engineer who will be able to work on the data management activities and orchestration processes.