50+ SQL Jobs in Bangalore (Bengaluru) | SQL Job openings in Bangalore (Bengaluru)
Apply to 50+ SQL Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest SQL Job opportunities across top companies like Google, Amazon & Adobe.

L2 Support Engineers are responsible for:
1. Production Issue Debugging
○ Analyzing logs and monitoring application behavior to identify root causes of production issues.
○ Providing temporary resolutions using database-level fixes or configuration changes.
2. Handling Escalations from L1 Support
○ Addressing tickets and issues escalated from L1 by providing immediate workarounds.
○ Ensuring minimal downtime and impact on business operations.
3. Forwarding Issues to L3/Development Team
○ When a permanent code-level fix is required, the issue is escalated to the L3 (Development) team.
○ Prior to escalation, L2 should provide a detailed analysis and a temporary resolution (e.g., a database fix) to minimize user impact.
4. Root Cause Analysis and Documentation
○ Conducting detailed root cause analysis (RCA) for major incidents.
○ Updating the Confluence Playbook with clear, actionable steps for L1 teams to facilitate future self-resolution.
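The log-analysis step above can be sketched in a few lines. This is a hypothetical illustration, not part of any posting: the log format, service names, and regex are invented for the example.

```python
import re
from collections import Counter

# Toy RCA helper: tally error sources from application log lines so an L2
# engineer can see at a glance which component is failing most often.
ERROR_RE = re.compile(r"ERROR\s+(\w[\w.]*):")

def top_error_sources(lines, n=3):
    """Return the n most frequent error sources seen in the log lines."""
    counts = Counter(m.group(1) for line in lines
                     if (m := ERROR_RE.search(line)))
    return counts.most_common(n)

log = [
    "2024-01-02 10:00:01 ERROR PaymentService: timeout calling gateway",
    "2024-01-02 10:00:03 ERROR PaymentService: timeout calling gateway",
    "2024-01-02 10:00:05 INFO  OrderService: order placed",
    "2024-01-02 10:00:09 ERROR OrderService: null order id",
]
print(top_error_sources(log))
# [('PaymentService', 2), ('OrderService', 1)]
```

A real playbook entry would point at the actual log location and known error signatures; the value of scripting this is that the same tally can be rerun verbatim during the next incident.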
AccioJob is conducting a Walk-In Hiring Drive with Pazy for the position of MERN Full Stack Developer.
To apply, register and select your slot here: https://go.acciojob.com/3nSkNh
Required Skills: JavaScript, Git, SQL, Redis
Eligibility:
- Degree: BTech./BE, MTech./ME, BCA, MCA
- Branch: Computer Science/CSE/Other CS related branch, IT
- Graduation Year: 2025, 2026
Work Details:
- Work Location: Bangalore (Onsite)
- CTC: 10 LPA to 14 LPA
Evaluation Process:
Round 1: Offline Assessment at Acciojob Bangalore Centre
Further Rounds (for shortlisted candidates only):
- Resume shortlist
- Technical Interview 1
- Technical Interview 2
Important Note: Bring your laptop & earphones for the test.
Register here: https://go.acciojob.com/3nSkNh
7+ years of experience developing applications using .NET Core, C#, and Web API.
• Angular 16 (TypeScript)
• .NET Core Web API development
• LINQ & Entity Framework
• SQL Server database design & querying
• Unit testing frameworks: NUnit, MSTest
Good to have skills:
• CI/CD pipeline experience (Azure DevOps, GitHub Actions, etc.)
• Team Foundation Server (TFS)
• AWS Cloud Services
• Docker containerization
• Microservices architecture
AccioJob is conducting a Walk-In Hiring Drive with a Leading Healthcare Group for the position of Data Analyst.
To apply, register and select your slot here: https://go.acciojob.com/ctS9M4
Required Skills: SQL, Power BI, Problem Solving, Aptitude, Excel
Eligibility:
- Degree: BTech./BE, MTech./ME, MCA
- Branch: Computer Science/CSE/Other CS related branch, Electrical/Other electrical-related branches, IT
- Graduation Year: 2024, 2025
Work Details:
- Work Location: Bangalore (Onsite)
- CTC: 6 LPA to 8 LPA
Evaluation Process:
Round 1: Offline Assessment at Acciojob Bangalore Centre
Further Rounds (for shortlisted candidates only):
- Resume shortlist
- Technical Interview 1
- Technical Interview 2
Important Note: Bring your laptop & earphones for the test.
Register here: https://go.acciojob.com/ctS9M4
About the Role:
We are looking for a highly skilled Full-Stack Developer with expertise in .NET Core, to develop and maintain scalable web applications and microservices. The ideal candidate will have strong problem-solving skills, experience in modern software development, and a passion for creating robust, high-performance applications.
Key Responsibilities:
Backend Development:
- Design, develop, and maintain microservices and APIs using .NET Core; a good understanding of the .NET Framework is also expected.
- Implement RESTful APIs, ensuring high performance and security.
- Optimize database queries and design schemas for SQL Server / Snowflake / MongoDB.
Software Architecture & DevOps:
- Design and implement scalable microservices architecture.
- Work with Docker, Kubernetes, and CI/CD pipelines for deployment and automation.
- Ensure best practices in security, scalability, and performance.
Collaboration & Agile Development:
- Work closely with UI/UX designers, backend engineers, and product managers.
- Participate in Agile/Scrum ceremonies, code reviews, and knowledge-sharing sessions.
- Write clean, maintainable, and well-documented code.
Required Skills & Qualifications:
- 3+ years of experience as a Full-Stack Developer.
- Strong experience in .NET Core, C#.
- Proficiency in React.js, JavaScript (ES6+), TypeScript.
- Experience with RESTful APIs, Microservices architecture.
- Knowledge of SQL / NoSQL databases (SQL Server, Snowflake, MongoDB).
- Experience with Git, CI/CD pipelines, Docker, and Kubernetes.
- Familiarity with Cloud services (Azure, AWS, or GCP) is a plus.
- Strong debugging and troubleshooting skills.
Nice-to-Have:
- Experience with GraphQL, gRPC, WebSockets.
- Exposure to serverless architecture and cloud-based solutions.
- Knowledge of authentication/authorization frameworks (OAuth, JWT, Identity Server).
- Experience with unit testing and integration testing.
Job Overview:
We are looking for a Senior Analyst who has led teams and managed system operations.
Key Responsibilities:
- Lead and mentor a team of analysts to drive high-quality execution.
- Design, write, and optimize SQL queries to derive actionable insights.
- Manage, monitor, and enhance Payment Governance Systems for accuracy and efficiency.
- Work cross-functionally with Finance, Tech, and Operations teams to maintain data integrity.
- Build and automate dashboards/reports to track key metrics and system performance.
- Identify anomalies and lead root cause analysis for payment-related issues.
- Define and document processes, SOPs, and governance protocols.
- Ensure compliance with internal control frameworks and audit readiness.
Requirements:
We require candidates with the following qualifications:
- 3–5 years of experience in analytics, data systems, or operations.
- Proven track record of leading small to mid-size teams.
- Strong command over SQL and data querying techniques.
- Experience with payment systems, reconciliation, or financial data platforms.
- Analytical mindset with problem-solving abilities.
- Ability to work in a fast-paced, cross-functional environment.
- Excellent communication and stakeholder management skills.
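The "identify anomalies" and "SQL queries for actionable insights" responsibilities above often come down to a reconciliation query. A minimal sketch, assuming invented `ledger`/`settlements` tables: flag transactions that never settled or settled for the wrong amount.

```python
import sqlite3

# In-memory SQLite stands in for whatever warehouse the team actually uses;
# table and column names are illustrative assumptions.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE ledger      (txn_id TEXT PRIMARY KEY, amount REAL);
CREATE TABLE settlements (txn_id TEXT PRIMARY KEY, amount REAL);
INSERT INTO ledger      VALUES ('T1', 100.0), ('T2', 250.0), ('T3', 75.0);
INSERT INTO settlements VALUES ('T1', 100.0), ('T3', 80.0);
""")

# LEFT JOIN anti-join: keep ledger rows with no match, plus amount mismatches.
rows = con.execute("""
SELECT l.txn_id, l.amount, s.amount AS settled
FROM ledger l
LEFT JOIN settlements s ON s.txn_id = l.txn_id
WHERE s.txn_id IS NULL OR s.amount <> l.amount
ORDER BY l.txn_id
""").fetchall()
print(rows)
# [('T2', 250.0, None), ('T3', 75.0, 80.0)]
```

The same anti-join pattern scales to the real payment-governance case; the query result is exactly the worklist an analyst would hand to the RCA step.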
We are looking for a skilled Java Developer to join our growing team. The ideal candidate should have hands-on experience in designing, developing, and maintaining high-performance Java applications. You will be responsible for writing clean, efficient, and scalable code while collaborating with cross-functional teams to deliver robust solutions.
Key Responsibilities:
- Design, develop, test, and deploy Java-based applications.
- Write clean, maintainable, and efficient code.
- Work with databases (SQL/NoSQL) and ensure smooth integration.
- Debug, troubleshoot, and optimize application performance.
- Collaborate with the team to understand requirements and deliver within timelines.
- Participate in code reviews and maintain coding standards.
Requirements:
- 3–6 years of hands-on experience in Java development.
- Strong knowledge of Core Java, OOPs, Collections, Multithreading.
- Experience with Spring / Spring Boot frameworks.
- Familiarity with RESTful APIs, Microservices architecture.
- Knowledge of relational databases (MySQL, PostgreSQL, Oracle, etc.).
- Good understanding of version control (Git).
- Strong problem-solving and debugging skills.
Good to Have Skills:
- Experience with Hibernate/JPA.
- Exposure to cloud platforms (AWS, Azure, GCP).
- Familiarity with CI/CD tools (Jenkins, Docker, Kubernetes).
- Knowledge of Agile/Scrum methodologies.
Why Join Us?
- Opportunity to work on challenging and scalable projects.
- Growth-oriented environment with learning opportunities.
- Collaborative and inclusive work culture.
Experience: 2–6 Years
Location: Bangalore
Employment Type: Full-time / Work From Office
Notice Period: 30 days
Role Overview
We are looking for a skilled and motivated .NET Developer with 2–6 years of experience to join our dynamic team. The ideal candidate should have a strong foundation in .NET technologies, relational databases, and cloud-based application development. You will be responsible for designing, developing, and maintaining scalable, secure, and high-performance applications while collaborating with cross-functional teams in an Agile environment.
Key Responsibilities
- Design, develop, and maintain applications using .NET technologies.
- Work extensively with relational databases and optimize SQL queries for performance.
- Develop and maintain API-driven solutions and integrate with ETL/data warehouse environments handling large volumes of data.
- Apply Object-Oriented Programming (OOP) principles and design patterns to build scalable and maintainable solutions.
- Develop, migrate, and deploy applications in Microsoft Azure Cloud (Managed SQL, VMs, containerized architecture).
- Write and optimize SSAS queries for analytics and reporting.
- Collaborate with Product Owners, Architects, and QA in an Agile development environment (Scrum/Kanban).
- Participate in code reviews, sprint planning, and daily standups to ensure high-quality deliverables.
- Implement best practices in security, performance, and cloud-native development.
- Build and maintain frontend components using Angular frameworks (2–3 years preferred).
Required Skills & Qualifications
- 2–6 years of experience in software development using .NET technologies.
- Strong understanding of OOP concepts and design patterns.
- Hands-on experience with SQL, data warehouses, ETL pipelines, and high-volume data processing.
- Proven experience in API development and integration.
- 3–5 years of experience in Azure Cloud with cloud-native development and application migration.
- Proficiency in Azure services: Managed SQL, VMs, container-based deployments.
- Experience with SSAS query writing and optimization.
- 2–3 years of experience in frontend development with Angular frameworks.
- Familiarity with Agile methodologies and collaboration tools (JIRA, Azure DevOps, Git, etc.).
- Strong problem-solving skills, attention to detail, and ability to work independently as well as in a team.
Good to Have
- Experience in Azure DevOps CI/CD pipelines.
- Knowledge of other cloud providers (AWS, GCP) is a plus.
- Exposure to performance tuning, monitoring, and troubleshooting in cloud-hosted applications.
Education
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or related field.
Proven experience as a Data Scientist or in a similar role, with at least 4 years of relevant experience and 6–8 years of total experience.
- Technical expertise regarding data models, database design and development, data mining, and segmentation techniques
- Strong knowledge of and experience with reporting packages (Business Objects and the like), databases, and programming in ETL frameworks
- Experience with data movement and management in the cloud using a combination of Azure or AWS features
- Hands-on experience with data visualization tools (Power BI preferred)
- Solid understanding of machine learning
- Knowledge of data management and visualization techniques
- A knack for statistical analysis and predictive modeling
- Good knowledge of Python and MATLAB
- Experience with SQL and NoSQL databases, including the ability to write complex queries and procedures
Responsibilities:
- Design and develop scalable, secure, and high-performance applications using Python (Django framework).
- Architect system components, define database schemas, and optimize backend services for speed and efficiency.
- Lead and implement design patterns and software architecture best practices.
- Ensure code quality through comprehensive unit testing, integration testing, and participation in code reviews.
- Collaborate closely with Product, DevOps, QA, and Frontend teams to build seamless end-to-end solutions.
- Drive performance improvements, monitor system health, and troubleshoot production issues.
- Apply domain knowledge in payments and finance, including transaction processing, reconciliation, settlements, wallets, UPI, etc.
- Contribute to technical decision-making and mentor junior developers.
Requirements:
- 6 to 10 years of professional backend development experience with Python and Django.
- Strong background in payments/financial systems or FinTech applications.
- Proven experience in designing software architecture in a microservices or modular monolith environment.
- Experience working in fast-paced startup environments with agile practices.
- Proficiency in RESTful APIs, SQL (PostgreSQL/MySQL), NoSQL (MongoDB/Redis).
- Solid understanding of Docker, CI/CD pipelines, and cloud platforms (AWS/GCP/Azure).
- Hands-on experience with test-driven development (TDD) and frameworks like pytest, unittest, or factory_boy.
- Familiarity with security best practices in financial applications (PCI compliance, data encryption, etc.).
Preferred Skills:
- Exposure to event-driven architecture (Celery, Kafka, RabbitMQ).
- Experience integrating with third-party payment gateways, banking APIs, or financial instruments.
- Understanding of DevOps and monitoring tools (Prometheus, ELK, Grafana).
- Contributions to open-source or personal finance-related projects.
Key Responsibilities:
- Develop, maintain, and optimize data pipelines using DBT and SQL.
- Collaborate with data analysts and business teams to build scalable data models.
- Implement data transformations, testing, and documentation within the DBT framework.
- Work with Snowflake for data warehousing tasks, including data ingestion, query optimization, and performance tuning.
- Use Python (preferred) for automation, scripting, and additional data processing as needed.
Required Skills:
- 4–6 years of experience in data engineering or related roles.
- Strong hands-on expertise with DBT and advanced SQL.
- Experience working with modern data warehouses, preferably Snowflake.
- Knowledge of Python for data manipulation and workflow automation (preferred but not mandatory).
- Good understanding of data modeling concepts, ETL/ELT processes, and best practices.
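The "transformations, testing, and documentation within the DBT framework" workflow above follows a staging → model → test pattern. A toy sketch of that pattern in raw SQL against SQLite (in a real project these would be dbt models and schema tests compiled against Snowflake; every name here is an assumption):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE raw_orders (id INTEGER, status TEXT, amount REAL);
INSERT INTO raw_orders VALUES (1,'completed',10.0),(2,'cancelled',5.0),
                              (1,'completed',10.0),(3,'completed',7.5);

-- staging model: deduplicate and keep completed orders only
CREATE VIEW stg_orders AS
SELECT DISTINCT id, status, amount
FROM raw_orders
WHERE status = 'completed';
""")

# dbt-style "unique" test on the staging model's key: must return no rows.
dupes = con.execute(
    "SELECT id FROM stg_orders GROUP BY id HAVING COUNT(*) > 1"
).fetchall()
assert dupes == [], "unique test failed on stg_orders.id"

total = con.execute("SELECT SUM(amount) FROM stg_orders").fetchone()[0]
print(total)
# 17.5
```

dbt's contribution is to version, document, and schedule exactly these two artifacts (the model SQL and the uniqueness test) rather than leaving them in ad-hoc scripts.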
Required Skill Set:
- Strong experience in Java Spring frameworks
- Working knowledge of Python for scripting
- Familiarity with relational databases like Postgres
- Exposure to document databases like MongoDB
- Understanding of caching mechanisms
- Strong debugging and analytical skills
Java Developer – Job Description
Wissen Technology is now hiring a Java Developer in Bangalore with hands-on experience in Core Java, algorithms, data structures, multithreading, and SQL. We are solving complex technical problems in the industry and need talented software engineers to join our mission and be part of a global software development team. This is a brilliant opportunity to become part of a highly motivated, expert team that has made its mark in high-end technical consulting.
Required Skills:
• Experience: 4 to 7 years.
• Experience in Core Java and Spring Boot.
• Extensive experience in developing enterprise-scale applications and systems; should possess good architectural knowledge and be aware of enterprise application design patterns.
• Ability to analyze, design, develop, and test complex, low-latency, client-facing applications.
• Good development experience with RDBMS.
• Good knowledge of multithreading and high-performance server-side development.
• Basic working knowledge of Unix/Linux.
• Excellent problem-solving and coding skills.
• Strong interpersonal, communication, and analytical skills.
• Ability to express design ideas and thoughts clearly.
About Wissen Technology:
Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom, and Startups. Wissen Technology is part of the Wissen Group and was established in 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 billion for more than 25 Fortune 500 companies. The Wissen Group overall includes more than 4,000 highly skilled professionals. Wissen Technology provides exceptional value in mission-critical projects for its clients through thought leadership, ownership, and assured on-time deliveries that are always 'first time right'.
Our team consists of 1,200+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League universities like Wharton and MIT, as well as IITs, IIMs, and NITs, and who bring rich work experience from some of the biggest companies in the world. Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, and Quality Assurance & Test Automation. We have been certified as a Great Place to Work® for two consecutive years (2020–2022) and voted a Top 20 AI/ML vendor by CIO Insider.
Job Title: Senior Data Engineer
Location: Bangalore | Hybrid
Company: krtrimaIQ Cognitive Solutions
Role Overview:
As a Senior Data Engineer, you will design, build, and optimize robust data foundations and end-to-end solutions to unlock maximum value from data across the organization. You will play a key role in fostering data-driven thinking, not only within the IT function but also among broader business stakeholders. You will serve as a technology and subject matter expert, providing mentorship to junior engineers and translating the company’s vision and Data Strategy into actionable, high-impact IT solutions.
Key Responsibilities:
- Design, develop, and implement scalable data solutions to support business objectives and drive digital transformation.
- Serve as a subject matter expert in data engineering, providing guidance and mentorship to junior team members.
- Enable and promote data-driven culture throughout the organization, engaging both technical and business stakeholders.
- Lead the design and delivery of Data Foundation initiatives, ensuring adoption and value realization across business units.
- Collaborate with business and IT teams to capture requirements, design optimal data models, and deliver high-value insights.
- Manage and drive change management, incident management, and problem management processes related to data platforms.
- Present technical reports and actionable insights to stakeholders and leadership teams, acting as the expert in Data Analysis and Design.
- Continuously improve efficiency and effectiveness of solution delivery, driving down costs and reducing implementation times.
- Contribute to organizational knowledge-sharing and capability building (e.g., Centers of Excellence, Communities of Practice).
- Champion best practices in code quality, DevOps, CI/CD, and data governance throughout the solution lifecycle.
Key Characteristics:
- Technology expert with a passion for continuous learning and exploring multiple perspectives.
- Deep expertise in the data engineering/technology domain, with hands-on experience across the full data stack.
- Excellent communicator, able to bridge the gap between technical teams and business stakeholders.
- Trusted leader, respected across levels for subject matter expertise and collaborative approach.
Mandatory Skills & Experience:
- Mastery in public cloud platforms: AWS, Azure, SAP
- Mastery in ELT (Extract, Load, Transform) operations
- Advanced data modeling expertise for enterprise data platforms
Hands-on skills:
- Data Integration & Ingestion
- Data Manipulation and Processing
- Source/version control and DevOps tools: GitHub, GitHub Actions, Azure DevOps
- Data engineering/data platform tools: Azure Data Factory, Databricks, SQL Database, Synapse Analytics, Stream Analytics, AWS Glue, Apache Airflow, AWS Kinesis, Amazon Redshift, SonarQube, PyTest
- Experience building scalable and reliable data pipelines for analytics and other business applications
Optional/Preferred Skills:
- Project management experience, especially running or contributing to Scrum teams
- Experience working with BPC (Business Planning and Consolidation), Planning tools
- Exposure to working with external partners in the technology ecosystem and vendor management
What We Offer:
- Opportunity to leverage cutting-edge technologies in a high-impact, global business environment
- Collaborative, growth-oriented culture with strong community and knowledge-sharing
- Chance to influence and drive key data initiatives across the organization
Quidcash is seeking a skilled Backend Developer to architect, build, and optimize mission-critical financial systems. You'll leverage your expertise in JavaScript, Python, and OOP to develop scalable backend services that power our fintech/lending solutions. This role offers the chance to solve complex technical challenges, integrate cutting-edge technologies, and directly impact the future of financial services for Indian SMEs.
If you are a leader who thrives on technical challenges, loves building high-performing teams, and is excited by the potential of AI/ML in fintech, we want to hear from you!
What You'll Do:
Design & Development: Build scalable backend services using JavaScript (Node.js) and Python, adhering to OOP principles and microservices architecture.
Fintech Integration: Develop secure APIs (REST/gRPC) for financial workflows (e.g., payments, transactions, data processing) and ensure compliance with regulations (PCI-DSS, GDPR).
System Optimization: Enhance the performance, reliability, and scalability of cloud-native applications on AWS.
Collaboration: Partner with frontend, data, and product teams to deliver end-to-end features in Agile/Scrum cycles.
Quality Assurance: Implement automated testing (unit/integration), CI/CD pipelines, and DevOps practices.
Technical Innovation: Contribute to architectural decisions and explore AI/ML integration opportunities in financial products.
What You'll Bring (Must-Haves):
Experience:
3–5 years of backend development with JavaScript (Node.js) and Python.
Proven experience applying OOP principles, design patterns, and microservices.
Background in fintech, banking, or financial systems (e.g., payment gateways, risk engines, transactional platforms).
Technical Acumen:
Languages/Frameworks:
JavaScript (Node.js + Express.js/Fastify)
Python (Django/Flask/FastAPI)
Databases: SQL (PostgreSQL/MySQL) and/or NoSQL (MongoDB/Redis).
Cloud & DevOps: AWS/GCP/Azure, Docker, Kubernetes, CI/CD tools (Jenkins/GitLab).
Financial Tech: API security (OAuth2/JWT), message queues (Kafka/RabbitMQ), and knowledge of financial protocols (e.g., ISO 20022).
Mindset:
Problem-solver with a passion for clean, testable code and continuous improvement.
Adaptability in fast-paced environments and commitment to deadlines.
Collaborative spirit with strong communication skills.
Why Join Quidcash?
Impact: Play a pivotal role in shaping a product that directly impacts Indian SMEs' business growth.
Innovation: Work with cutting-edge technologies, including AI/ML, in a forward-thinking environment.
Growth: Opportunities for professional development and career advancement in a growing company.
Culture: Be part of a collaborative, supportive, and brilliant team that values every contribution.
Benefits: Competitive salary, comprehensive benefits package, and be a part of the next fintech evolution.
If you are interested, please share your profile to smitha@quidcash.in.
Key Responsibilities
- Data Architecture & Pipeline Development
- Design, implement, and optimize ETL/ELT pipelines using Azure Data Factory, Databricks, and Synapse Analytics.
- Integrate structured, semi-structured, and unstructured data from multiple sources.
- Data Storage & Management
- Develop and maintain Azure SQL Database, Azure Synapse Analytics, and Azure Data Lake solutions.
- Ensure proper indexing, partitioning, and storage optimization for performance.
- Data Governance & Security
- Implement role-based access control, data encryption, and compliance with GDPR/CCPA.
- Ensure metadata management and data lineage tracking with Azure Purview or similar tools.
- Collaboration & Stakeholder Engagement
- Work closely with BI developers, analysts, and business teams to translate requirements into data solutions.
- Provide technical guidance and best practices for data integration and transformation.
- Monitoring & Optimization
- Set up monitoring and alerting for data pipelines.
Job Title: Full Stack Developer – Java + React
Location: Bangalore
Experience: 10 to 14 Years
Job Summary:
We are looking for a skilled Full Stack Developer with strong experience in Java (backend) and React.js (frontend) to join our dynamic engineering team. The ideal candidate will have hands-on experience in building scalable web applications, RESTful services, and responsive UI components.
Key Responsibilities:
- Develop and maintain scalable backend services using Core Java / Spring Boot
- Design and implement responsive, high-quality front-end interfaces using React.js
- Integrate backend APIs with frontend components seamlessly
- Collaborate with product managers, architects, and QA to deliver quality software
- Ensure code quality through unit testing, code reviews, and performance tuning
- Troubleshoot and debug production issues as needed
Key Skills Required:
Backend:
- Strong programming skills in Core Java, Java 8+
- Experience with Spring Boot, RESTful APIs, Microservices architecture
- Good understanding of JPA/Hibernate, and SQL/NoSQL databases
Frontend:
- Proficient in React.js, JavaScript (ES6+), HTML5, CSS3
- Experience with Redux, React Hooks, and component-based architecture
- Familiarity with front-end build tools (Webpack, Babel, NPM)
Responsibilities:
- Develop and maintain code following predefined cost, company, and security standards.
- Work on bug fixes, supporting the maintenance and improvement of existing applications.
- Elaborate interfaces using standards and design principles defined by the team.
- Develop systems with high availability.
- Attend and contribute to development meetings.
- Be well versed in unit testing and PSR standards.
- Master the software development lifecycle, standards, and technologies used by the team.
- Deliver on time with high quality.
- Write automation tests for API calls before implementing them.
- Troubleshooting and debugging skills.
- Perform technical documentation of the implemented tasks.
Job Title: Backend Engineer - NodeJS, NestJS, and Python
Location: Hybrid, 2–3 days WFO per week (Bengaluru, India)
About the role:
We are looking for a skilled and passionate Senior Backend Developer to join our dynamic team. The ideal candidate should have strong experience in Node.js and NestJS, along with a solid understanding of database management, query optimization, and microservices architecture. As a backend developer, you will be responsible for developing and maintaining scalable backend systems, building robust APIs, integrating databases, and working closely with frontend and DevOps teams to deliver high-quality software solutions.
What You'll Do 🛠️
- Design, develop, and maintain server-side logic using Node.js, NestJS, and Python.
- Develop and integrate RESTful APIs and microservices to support scalable systems.
- Work with NoSQL and SQL databases (e.g., MongoDB, PostgreSQL, MySQL) to create and manage schemas, write complex queries, and optimize performance.
- Collaborate with cross-functional teams including frontend, DevOps, and QA.
- Ensure code quality, maintainability, and scalability through code reviews, testing, and documentation.
- Monitor and troubleshoot production systems, ensuring high availability and performance.
- Implement security and data protection best practices.
What You'll Bring 💼
- 4 to 6 years of professional experience as a backend developer.
- Strong proficiency in Node.js and NestJS framework.
- Good hands-on experience with Python (Django/Flask experience is a plus).
- Solid understanding of relational and non-relational databases.
- Proficient in writing complex NoSQL queries and SQL queries
- Experience with microservices architecture and distributed systems.
- Familiarity with version control systems like Git.
- Basic understanding of containerization (e.g., Docker) and cloud services is a plus.
- Excellent problem-solving skills and a collaborative mindset.
Bonus Points ➕
- Experience with CI/CD pipelines.
- Exposure to cloud platforms like AWS, GCP or Azure.
- Familiarity with event-driven architecture or message brokers (MQTT, Kafka, RabbitMQ)
Why this role matters
You will help build the company from the ground up—shaping our culture and having an impact from Day 1 as part of the foundational team.
Job Title: Python Developer
Location: Bangalore
Experience: 5–7 Years
Employment Type: Full-Time
Job Description:
We are seeking an experienced Python Developer with strong proficiency in data analysis tools and PySpark, along with a solid understanding of SQL syntax. The ideal candidate will work on large-scale data processing and analysis tasks within a fast-paced environment.
Key Requirements:
Python: Hands-on experience with Python, specifically in data analysis using libraries such as pandas, numpy, etc.
PySpark: Proficiency in writing efficient PySpark code for distributed data processing.
SQL: Strong knowledge of SQL syntax and experience in writing optimized queries.
Ability to work independently and collaborate effectively with cross-functional teams.
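"Writing optimized queries" in practice means verifying that a filter is served by an index rather than a full scan. A hedged sketch using SQLite's query-plan output as a stand-in for whatever engine the team runs (schema, data, and index name are invented):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE events (user_id INTEGER, kind TEXT);
INSERT INTO events VALUES (1,'click'),(2,'view'),(1,'view'),(3,'click');
CREATE INDEX idx_events_user ON events(user_id);
""")

# EXPLAIN QUERY PLAN reports whether the WHERE clause hits the index
# (SEARCH ... USING INDEX) or falls back to a full table scan (SCAN).
plan = con.execute(
    "EXPLAIN QUERY PLAN SELECT COUNT(*) FROM events WHERE user_id = 1"
).fetchall()
detail = " ".join(row[-1] for row in plan)
print(detail)  # the plan text should mention idx_events_user
```

The same habit transfers directly to Spark: `DataFrame.explain()` plays the role of `EXPLAIN QUERY PLAN` when tuning PySpark jobs.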
Job Title: Informatica MDM Developer
Experience: 7 to 10 Years
Location: Bangalore (3 Days Work From Office – ITPL Main Road, Mahadevapura)
Job Type: Full-time / Contract
Job Overview:
We are hiring an experienced Informatica MDM Developer to join our team in Bangalore. The ideal candidate will play a key role in implementing and customizing Master Data Management (MDM) solutions using Informatica MDM (Multi-Domain Edition), ensuring a trusted, unified view of enterprise data.
Mandatory Skills :
Informatica MDM (Multi-Domain Edition), ActiveVOS workflows, Java (User Exits), Services Integration Framework (SIF) APIs, SQL/PLSQL, Data Modeling, Informatica Data Quality (IDQ), MDM concepts (golden record, survivorship, trust, hierarchy).
Key Responsibilities :
- Configure Informatica MDM Hub : subject area models, base objects, relationships.
- Develop match/merge rules, trust/survivorship logic to create golden records.
- Design workflows using ActiveVOS for data stewardship and exception handling.
- Integrate with source/target systems (ERP, CRM, Data Lakes, APIs).
- Customize user exits (Java), SIF APIs, and business entity services.
- Implement and maintain data quality validations using IDQ.
- Collaborate with cross-functional teams for governance alignment.
- Support MDM jobs, synchronization, batch groups, and performance tuning.
Must-Have Skills :
- 7 to 10 years of experience in Data Engineering or MDM.
- 5+ years hands-on with Informatica MDM (Multi-Domain Edition).
- Strong in MDM concepts : golden record, trust, survivorship, hierarchy.
Proficient in :
- Informatica MDM Hub Console, Provisioning Tool, SIF.
- ActiveVOS workflows, Java-based user exits.
- SQL, PL/SQL, and data modeling.
- Experience with system integration and Informatica Data Quality (IDQ).
Nice-to-Have :
- Knowledge of Informatica EDC, Axon, cloud MDM (AWS/GCP/Azure).
- Understanding of data lineage, GDPR/HIPAA compliance, and DevOps tools.
- 5-10 years of experience in ETL Testing, Snowflake, and DWH concepts.
- Strong SQL knowledge & debugging skills are a must.
- Experience with Azure and Snowflake testing is a plus.
- Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus.
- Strong data warehousing concepts; experience with ETL tools such as Talend Cloud Data Integration or Pentaho/Kettle.
- Experience with JIRA and the Xray defect management tool is good to have.
- Exposure to financial domain knowledge is considered a plus.
- Testing data readiness (data quality) and addressing code or data issues.
- Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions
- Demonstrate strong collaborative experience across regions (APAC, EMEA and NA) to effectively and efficiently identify root cause of code/data issues and come up with a permanent solution
- Prior experience with State Street and Charles River Development (CRD) considered a plus
- Experience in tools such as PowerPoint, Excel, SQL
- Exposure to Third party data providers such as Bloomberg, Reuters, MSCI and other Rating agencies is a plus
Key Attributes include:
- Team player with professional and positive approach
- Creative, innovative and able to think outside of the box
- Strong attention to detail during root cause analysis and defect issue resolution
- Self-motivated & self-sufficient
- Effective communicator both written and verbal
- Brings a high level of energy with enthusiasm to generate excitement and motivate the team
- Able to work under pressure with tight deadlines and/or multiple projects
- Experience in negotiation and conflict resolution
🔍 Job Description:
We are looking for an experienced and highly skilled Technical Lead to guide the development and enhancement of a large-scale Data Observability solution built on AWS. This platform is pivotal in delivering monitoring, reporting, and actionable insights across the client's data landscape.
The Technical Lead will drive end-to-end feature delivery, mentor junior engineers, and uphold engineering best practices. The position reports to the Programme Technical Lead / Architect and involves close collaboration to align on platform vision, technical priorities, and success KPIs.
🎯 Key Responsibilities:
- Lead the design, development, and delivery of features for the data observability solution.
- Mentor and guide junior engineers, promoting technical growth and engineering excellence.
- Collaborate with the architect to align on platform roadmap, vision, and success metrics.
- Ensure high quality, scalability, and performance in data engineering solutions.
- Contribute to code reviews, architecture discussions, and operational readiness.
🔧 Primary Must-Have Skills (Non-Negotiable):
- 5+ years in Data Engineering or Software Engineering roles.
- 3+ years in a technical team or squad leadership capacity.
- Deep expertise in AWS Data Services: Glue, EMR, Kinesis, Lambda, Athena, S3.
- Advanced programming experience with PySpark, Python, and SQL.
- Proven experience in building scalable, production-grade data pipelines on cloud platforms.
Job Title: Salesforce QA Engineer
Experience: 6+ Years
Location: Bangalore
Work mode: Hybrid (2 days work from office) (Manyata Tech Park)
Job description:
6+ years of hands-on experience with both manual and automated testing, with a strong preference for experience using AccelQ on Salesforce and SAP platforms.
Proven expertise in Salesforce particularly within the Sales Cloud module.
Proficient in writing complex SOQL and SQL queries for data validation and backend testing.
Extensive experience in designing and developing robust, reusable automated test scripts for Salesforce environments.
Highly skilled at early issue detection, with a deep understanding of backend configurations, process flows and validation rules.
Should have a strong background in Salesforce testing, with hands-on experience in automation tools such as Selenium, Provar, or TestNG.
You will be responsible for creating and maintaining automated test scripts, executing test cases, identifying bugs, and ensuring the quality and reliability of Salesforce applications.
A solid understanding of Salesforce modules (Sales Cloud, Service Cloud, etc.) and APIs is essential.
Experience with CI/CD tools like Jenkins and version control systems like Git is preferred.
You will work closely with developers, business analysts, and stakeholders to define test strategies and improve the overall QA process.
Minimum 3 years of experience in Sales Cloud is required.
Proficiency in test automation using any tool (AccelQ preferred).
Hands-on experience with API testing.
Strong knowledge of SQL queries.
- Manual testing experience with any third-party applications (SAP, ERP, etc.).
Job Title: Java Developer
Java Developer – Job Description Wissen Technology is now hiring for a Java Developer - Bangalore with hands-on experience in Core Java, algorithms, data structures, multithreading and SQL. We are solving complex technical problems in the industry and need talented software engineers to join our mission and be a part of a global software development team. A brilliant opportunity to become a part of a highly motivated and expert team which has made a mark as a high-end technical consultancy.
Required Skills:
• Experience: 4 to 14 years.
• Experience in Core Java and Spring Boot.
• Extensive experience in developing enterprise-scale applications and systems. Should possess good architectural knowledge and be aware of enterprise application design patterns.
• Should have the ability to analyze, design, develop and test complex, low-latency client-facing applications.
• Good development experience with RDBMS.
• Good knowledge of multi-threading and high-performance server-side development.
• Basic working knowledge of Unix/Linux.
• Excellent problem solving and coding skills.
• Strong interpersonal, communication and analytical skills.
• Should have the ability to express their design ideas and thoughts.
About Wissen Technology:
Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom and Startups. Wissen Technology is a part of Wissen Group and was established in the year 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 Billion for more than 25 of the Fortune 500 companies. The Wissen Group overall includes more than 4000 highly skilled professionals. Wissen Technology provides exceptional value in mission critical projects for its clients, through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’. Our team consists of 1200+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like Wharton, MIT, IITs, IIMs, and NITs and with rich work experience in some of the biggest companies in the world. Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation. We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted as the Top 20 AI/ML vendor by CIO Insider.
- 5+ years of experience
- Flask / REST API development experience
- Proficiency in Python programming.
- Basic knowledge of front-end development.
- Basic knowledge of Data manipulation and analysis libraries
- Code versioning and collaboration. (Git)
- Knowledge of libraries for extracting data from websites (web scraping).
- Knowledge of SQL and NoSQL databases
- Familiarity with RESTful APIs
- Familiarity with Cloud (Azure /AWS) technologies
🚀 Hiring: Manual Tester
⭐ Experience: 5+ Years
📍 Location: Pan India
⭐ Work Mode:- Hybrid
⏱️ Notice Period: Immediate Joiners
(Only immediate joiners & candidates serving notice period)
Must-Have Skills:
✅5+ years of experience in Manual Testing
✅Solid experience in ETL, Database, and Report Testing
✅Strong expertise in SQL queries, RDBMS concepts, and DML/DDL operations
✅Working knowledge of BI tools such as Power BI
✅Ability to write effective Test Cases and Test Scenarios
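As a sketch of the SQL-backed checks such a tester runs, the snippet below uses Python's built-in sqlite3 module to exercise DDL, DML, and a source-vs-target row-count reconciliation; the tables and row counts are hypothetical:

```python
import sqlite3

# Hypothetical source and target tables for an ETL row-count reconciliation.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER);              -- DDL: define structure
    CREATE TABLE tgt (id INTEGER);
    INSERT INTO src VALUES (1), (2), (3);       -- DML: manipulate rows
    INSERT INTO tgt VALUES (1), (2), (3);
""")

src_count = conn.execute("SELECT COUNT(*) FROM src").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt").fetchone()[0]
assert src_count == tgt_count, "row counts diverged between source and target"
print("reconciliation passed:", src_count, "rows")
```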

Product company for financial operations automation platform
Mandatory Criteria (Can't be neglected during screening) :
- Candidate Must have Project management experience.
- Strong hands-on experience with SQL, including the ability to write, optimize, and debug complex queries (joins, CTEs, subqueries).
- Must have experience in Treasury Module.
- Should have a basic understanding of accounting principles and financial workflows
- 3+ years of implementation experience is required.
- Looking for candidates from fintech companies ONLY. (Candidate should have strong knowledge of fintech products, financial workflows, and integrations.)
- Candidate should have hands-on experience with tools such as Jira, Confluence, Excel, and project management platforms.
- Candidate should have experience in managing multi-stakeholder projects from scratch.
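A minimal sketch of the kind of complex query the criteria call for (a CTE feeding a join), using Python's built-in sqlite3 module; the schema and data are hypothetical, chosen only to illustrate the pattern:

```python
import sqlite3

# Hypothetical schema for illustration: accounts and their transactions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE txns (id INTEGER PRIMARY KEY, account_id INTEGER, amount REAL);
    INSERT INTO accounts VALUES (1, 'Ops'), (2, 'Treasury');
    INSERT INTO txns VALUES (1, 1, 100.0), (2, 2, 250.0), (3, 2, -50.0);
""")

# A CTE aggregates per-account totals, then a JOIN attaches account names.
rows = conn.execute("""
    WITH totals AS (
        SELECT account_id, SUM(amount) AS total
        FROM txns
        GROUP BY account_id
    )
    SELECT a.name, t.total
    FROM accounts a
    JOIN totals t ON t.account_id = a.id
    ORDER BY a.name;
""").fetchall()
print(rows)  # [('Ops', 100.0), ('Treasury', 200.0)]
```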
Position Overview
We are looking for an experienced Implementation Lead to drive the onboarding and implementation of our platform for new and existing fintech clients. This role is ideal for someone with a strong understanding of financial systems, implementation methodologies, and client management. You’ll collaborate closely with product, engineering, and customer success teams to ensure timely, accurate, and seamless deployments.
Key Responsibilities
- Lead end-to-end implementation projects for enterprise fintech clients
- Translate client requirements into detailed implementation plans and configure solutions accordingly.
- Write and optimize complex SQL queries for data analysis, validation, and integration
- Oversee ETL processes – extract, transform, and load financial data across systems
- Collaborate with cross-functional teams including Product, Engineering, and Support
- Ensure timely, high-quality delivery across multiple stakeholders and client touchpoints
- Document processes, client requirements, and integration flows in detail.
Required Qualifications
- Bachelor’s degree in Finance, Business Administration, Information Systems, or related field
- 3+ years of hands-on implementation/project management experience
- Proven experience delivering projects in Fintech, SaaS, or ERP environments
- Strong understanding of accounting principles and financial workflows
- Hands-on SQL experience, including the ability to write and debug complex queries (joins, CTEs, subqueries)
- Experience working with ETL pipelines or data migration processes
- Proficiency in tools like Jira, Confluence, Excel, and project tracking systems
- Strong communication and stakeholder management skills
- Ability to manage multiple projects simultaneously and drive client success
Preferred Qualifications
- Prior experience implementing financial automation tools (e.g., SAP, Oracle, Anaplan, Blackline)
- Familiarity with API integrations and basic data mapping
- Experience in agile/scrum-based implementation environments
- Exposure to reconciliation, book closure, AR/AP, and reporting systems
- PMP, CSM, or similar certifications
Skills & Competencies
Functional Skills
- Financial process knowledge (e.g., reconciliation, accounting, reporting)
- Business analysis and solutioning
- Client onboarding and training
- UAT coordination
- Documentation and SOP creation
Project Skills
- Project planning and risk management
- Task prioritization and resource coordination
- KPI tracking and stakeholder reporting
Soft Skills
- Cross-functional collaboration
- Communication with technical and non-technical teams
- Attention to detail and customer empathy
- Conflict resolution and crisis management
What We Offer
- An opportunity to shape fintech implementations across fast-growing companies
- Work in a dynamic environment with cross-functional experts
- Competitive compensation and rapid career growth
- A collaborative and meritocratic culture
What We’re Looking For:
- Strong experience in Python (5+ years).
- Hands-on experience with any database (SQL or NoSQL).
- Experience with frameworks like Flask, FastAPI, or Django.
- Knowledge of ORMs, API development, and unit testing
Job Title: Lead Java Developer
Location: Bangalore
Experience: 8-12 years (only)
Job Overview:
We are looking for a highly experienced Java / Backend Developer with a strong background in investment banking/trading/brokerages to join our team. The ideal candidate will have extensive experience in developing front office and middle office systems which require high availability, high reliability and low latency/high throughput performance profiles.
Skills & Qualifications:
- 8 to 12 years of experience in Java development, with at least 4 years in the investment banking/financial services domain.
- Strong knowledge of Java, Spring Framework, RESTful web services.
- Hands-on experience with AWS Cloud and related services. Knowledge of Kubernetes is a plus.
- Solid experience with Kafka and/or messaging protocols.
- Familiarity with SQL databases (e.g., Postgres/Oracle).
- Strong problem-solving skills and ability to work in a team.
- Ability to understand and work with distributed systems / microservices architecture.
- Solid written and verbal communication skills to interface with other technology and business stakeholders.
Background
Fisdom is a leading digital wealth management platform. The Fisdom platform (mobile apps and web apps) provides consumers access to a wide bouquet of financial solutions – investments, savings and protection (and many more in the pipeline). Fisdom blends cutting-edge technology with conventional financial wisdom, awesome UX and friendly customer service to make financial products simpler and more accessible to millions of Indians. We are growing and constantly looking for high performers to participate in our growth story. We have recently been certified as a Great Place to Work. For more info, visit www.fisdom.com.
Objectives of this Role
Improve, execute, and effectively communicate significant analyses that identify opportunities across the business
Participate in meetings with management, assessing and addressing issues to identify and implement operational improvements
Provide strong and timely financial and business analytic decision support to various organizational stakeholders
Responsibilities
Interpret data, analyze results using analytics, research methodologies, and statistical techniques
Develop and implement data analyses and data collection strategies that optimize statistical efficiency and quality
Prepare and summarize various weekly, monthly, and periodic results for use by key stakeholders
Conduct the full lifecycle of analytics projects, from requirements documentation through design and execution, including pulling, manipulating, and exporting data
Evaluate key performance indicators, provide ongoing reports, and recommend business plan updates
Skills and Qualification
Bachelor’s degree, preferably in computer science, mathematics, or economics
Advanced analytical skills with experience collecting, organizing, analyzing, and disseminating information with accuracy
The ability to present findings in a polished way
Proficiency with statistics and dataset analytics (using SQL, Python, Excel)
Entrepreneurial mindset, with an innovative approach to business planning
Relevant industry experience of 2–6 years; more than 4 years of Python experience is a must
Preferred: product startups, fintech
Why join us and where?
We have an excellent work culture and an opportunity to be a part of a growing organization with immense learning possibilities. You have an opportunity to build a brand from scratch. All of this, along with top-of-the-line remuneration and challenging work. You will be based out of Bangalore.
JOB REQUIREMENT:
Wissen Technology is now hiring an Azure Data Engineer with 7+ years of relevant experience.
We are solving complex technical problems in the financial industry and need talented software engineers to join our mission and be a part of a global software development team. A brilliant opportunity to become a part of a highly motivated and expert team, which has made a mark as a high-end technical consultant.
Required Skills:
· 6+ years as a practitioner in data engineering or a related field.
· Proficiency in programming skills in Python
· Experience with data processing frameworks like Apache Spark or Hadoop.
· Experience working on Snowflake and Databricks.
· Familiarity with cloud platforms (AWS, Azure) and their data services.
· Experience with data warehousing concepts and technologies.
· Experience with message queues and streaming platforms (e.g., Kafka).
· Excellent communication and collaboration skills.
· Ability to work independently and as part of a geographically distributed team.
Job Description :
We are seeking a highly experienced Sr Data Modeler / Solution Architect to join the Data Architecture team at Corporate Office in Bangalore. The ideal candidate will have 4 to 8 years of experience in data modeling and architecture with deep expertise in AWS cloud stack, data warehousing, and enterprise data modeling tools. This individual will be responsible for designing and creating enterprise-grade data models and driving the implementation of Layered Scalable Architecture or Medallion Architecture to support robust, scalable, and high-quality data marts across multiple business units.
This role will involve managing complex datasets from systems like PoS, ERP, CRM, and external sources, while optimizing performance and cost. You will also provide strategic leadership on data modeling standards, governance, and best practices, ensuring the foundation for analytics and reporting is solid and future-ready.
Key Responsibilities:
· Design and deliver conceptual, logical, and physical data models using tools like ERWin.
· Implement Layered Scalable Architecture / Medallion Architecture for building scalable, standardized data marts.
· Optimize performance and cost of AWS-based data infrastructure (Redshift, S3, Glue, Lambda, etc.).
· Collaborate with cross-functional teams (IT, business, analysts) to gather data requirements and ensure model alignment with KPIs and business logic.
· Develop and optimize SQL code, materialized views, stored procedures in AWS Redshift.
· Ensure data governance, lineage, and quality mechanisms are established across systems.
· Lead and mentor technical teams in an Agile project delivery model.
· Manage data layer creation and documentation: data dictionary, ER diagrams, purpose mapping.
· Identify data gaps and availability issues with respect to source systems.
Required Skills & Qualifications:
· Bachelor’s or Master’s degree in Computer Science, IT, or related field (B.E./B.Tech/M.E./M.Tech/MCA).
· Minimum 4 years of experience in data modeling and architecture.
· Proficiency with data modeling tools such as ERWin, with strong knowledge of forward and reverse engineering.
· Deep expertise in SQL (including advanced SQL, stored procedures, performance tuning).
· Strong experience in data warehousing, RDBMS, and ETL tools like AWS Glue, IBM DataStage, or SAP Data Services.
· Hands-on experience with AWS services: Redshift, S3, Glue, RDS, Lambda, Bedrock, and Q.
· Good understanding of reporting tools such as Tableau, Power BI, or AWS QuickSight.
· Exposure to DevOps/CI-CD pipelines, AI/ML, Gen AI, NLP, and polyglot programming is a plus.
· Familiarity with data governance tools (e.g., ORION/EIIG).
· Domain knowledge in Retail, Manufacturing, HR, or Finance preferred.
· Excellent written and verbal communication skills.
Certifications (Preferred):
· AWS Certification (e.g., AWS Certified Solutions Architect or Data Analytics – Specialty)
· Data Governance or Data Modeling Certifications (e.g., CDMP, Databricks, or TOGAF)
Mandatory Skills
aws, Technical Architecture, Aiml, SQL, Data Warehousing, Data Modelling
🚀 Hiring: Postgres DBA at Deqode
⭐ Experience: 6+ Years
📍 Location: Pune & Hyderabad
⭐ Work Mode:- Hybrid
⏱️ Notice Period: Immediate Joiners
(Only immediate joiners & candidates serving notice period)
Looking for an experienced Postgres DBA with:-
✅ 6+ years in Postgres & strong SQL skills
✅ Good understanding of database services & storage management
✅ Performance tuning & monitoring expertise
✅ Knowledge of Dataguard admin, backups, upgrades
✅ Basic Linux admin & shell scripting
Employment type- Contract basis
Key Responsibilities
- Design, develop, and maintain scalable data pipelines using PySpark and distributed computing frameworks.
- Implement ETL processes and integrate data from structured and unstructured sources into cloud data warehouses.
- Work across Azure or AWS cloud ecosystems to deploy and manage big data workflows.
- Optimize performance of SQL queries and develop stored procedures for data transformation and analytics.
- Collaborate with Data Scientists, Analysts, and Business teams to ensure reliable data availability and quality.
- Maintain documentation and implement best practices for data architecture, governance, and security.
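Setting PySpark specifics aside, the extract–transform–load pattern in the first two responsibilities can be sketched with standard-library tools; the records, field names, and table below are illustrative assumptions, not part of any actual pipeline:

```python
import sqlite3

# Extract: raw records from a hypothetical source system.
raw = [
    {"id": 1, "amount": "10.5"},
    {"id": 2, "amount": None},   # dirty record, to be dropped
    {"id": 3, "amount": "7.25"},
]

# Transform: validate and cast, dropping rows with missing amounts.
clean = [(r["id"], float(r["amount"])) for r in raw if r["amount"] is not None]

# Load: write the cleaned rows into a warehouse-style table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE facts (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO facts VALUES (?, ?)", clean)

loaded = conn.execute("SELECT COUNT(*), SUM(amount) FROM facts").fetchone()
print(loaded)  # (2, 17.75)
```

In a PySpark pipeline the same steps would map onto reading a source into a DataFrame, filtering/casting with column expressions, and writing to a warehouse table; the validation-before-load structure is what carries over.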
⚙️ Required Skills
- Programming: Proficient in PySpark, Python, SQL, and MongoDB.
- Cloud Platforms: Hands-on experience with Azure Data Factory, Databricks, or AWS Glue/Redshift.
- Data Engineering Tools: Familiarity with Apache Spark, Kafka, Airflow, or similar tools.
- Data Warehousing: Strong knowledge of designing and working with data warehouses like Snowflake, BigQuery, Synapse, or Redshift.
- Data Modeling: Experience in dimensional modeling, star/snowflake schema, and data lake architecture.
- CI/CD & Version Control: Exposure to Git, Terraform, or other DevOps tools is a plus.
🧰 Preferred Qualifications
- Bachelor's or Master's in Computer Science, Engineering, or related field.
- Certifications in Azure/AWS are highly desirable.
- Knowledge of business intelligence tools (Power BI, Tableau) is a bonus.
1. Software Development Engineer - Salesforce
What we ask for
We are looking for strong engineers to build best-in-class systems for commercial & wholesale banking at Bank, using Salesforce Service Cloud. We seek experienced developers who bring a deep understanding of Salesforce development practices, patterns, anti-patterns, governor limits, and the sharing & security model that will allow us to architect & develop robust applications.
You will work closely with business and product teams to build applications which provide end users with an intuitive, clean, minimalist, easy-to-navigate experience.
Develop systems that are scalable, secure, highly resilient, and low-latency by implementing software development principles and clean-code practices.
Should be open to working in a start-up environment and have the confidence to deal with complex issues while keeping focus on solutions and project objectives as your guiding North Star.
Technical Skills:
● Strong hands-on frontend development using JavaScript and LWC
● Expertise in backend development using Apex, Flows, Async Apex
● Understanding of Database concepts: SOQL, SOSL and SQL
● Hands-on experience in API integration using SOAP, REST API, graphql
● Experience with ETL tools , Data migration, and Data governance
● Experience with Apex Design Patterns, Integration Patterns and Apex testing
framework
● Follow agile, iterative execution model using CI-CD tools like Azure Devops, gitlab,
bitbucket
● Should have worked with at least one programming language - Java, python, c++
and have good understanding of data structures
Preferred qualifications
● Graduate degree in engineering
● Experience developing with India stack
● Experience in fintech or banking domain
- Designing, building, and automating ETL processes using AWS services like Apache Sqoop, AWS S3, AWS CLI, Amazon EMR, Amazon MSK, and Amazon SageMaker.
- Developing and maintaining data pipelines to move and transform data from diverse sources into data warehouses or data lakes.
- Ensuring data quality and integrity through validation, cleansing, and monitoring of ETL processes.
- Optimizing ETL workflows for performance, scalability, and cost efficiency within the AWS environment.
- Troubleshooting and resolving issues related to data processing and ETL workflows.
- Implementing and maintaining security measures and compliance standards for data pipelines and infrastructure.
- Documenting ETL processes, data mappings, and system architecture.
Job Overview
We are looking for a detail-oriented and skilled QA Engineer with expertise in Cypress to join our Quality Assurance team. In this role, you will be responsible for creating and maintaining automated test scripts to ensure the stability and performance of our web applications. You’ll work closely with developers, product managers, and other QA professionals to identify issues early and help deliver a high-quality user experience.
You should have a strong background in test automation, excellent analytical skills, and a passion for improving software quality through efficient testing practices.
Key Responsibilities
- Develop, maintain, and execute automated test cases using Cypress.
- Design robust test strategies and plans based on product requirements and user stories.
- Work with cross-functional teams to identify test requirements and ensure proper coverage.
- Perform regression, integration, smoke, and exploratory testing as needed.
- Report and track defects, and work with developers to resolve issues quickly.
- Collaborate in Agile/Scrum development cycles and contribute to sprint planning and reviews.
- Continuously improve testing tools, processes, and best practices.
- Optimize test scripts for performance, reliability, and maintainability.
Required Skills & Qualifications
- Hands-on experience with Cypress and JavaScript-based test automation.
- Strong understanding of QA methodologies, tools, and processes.
- Experience in testing web applications across multiple browsers and devices.
- Familiarity with REST APIs and tools like Postman or Swagger.
- Experience with version control systems like Git.
- Knowledge of CI/CD pipelines and integrating automated tests (e.g., GitHub Actions, Jenkins).
- Excellent analytical and problem-solving skills.
- Strong written and verbal communication.
Preferred Qualifications
- Experience with other automation tools (e.g., Selenium, Playwright) is a plus.
- Familiarity with performance testing or security testing.
- Background in Agile or Scrum methodologies.
- Basic understanding of DevOps practices.
Hybrid work mode
(Azure) EDW: Experience loading star schema data warehouses using framework architectures, including experience loading type 2 dimensions. Ingesting data from various sources (structured and semi-structured), with hands-on experience ingesting via APIs into lakehouse architectures.
Key Skills: Azure Databricks, Azure Data Factory, Azure Data Lake Gen 2 Storage, SQL (expert), Python (intermediate), Azure Cloud Services knowledge, data analysis (SQL), data warehousing, documentation – BRD, FRD, user story creation.
- A minimum of 4-10 years of experience in data integration/orchestration services, service architecture, and providing data-driven solutions for client requirements
- Experience on Microsoft Azure cloud and Snowflake SQL, database query/performance tuning.
- Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus
- Strong data warehousing concepts; experience with ETL tools such as Talend Cloud Data Integration is a must
- Exposure to financial domain knowledge is considered a plus.
- Experience with cloud managed services such as GitHub source control and MS Azure/DevOps is considered a plus.
- Prior experience with State Street and Charles River Development ( CRD) considered a plus.
- Experience in tools such as Visio, PowerPoint, Excel.
- Exposure to Third party data providers such as Bloomberg, Reuters, MSCI and other Rating agencies is a plus.
- Strong SQL knowledge and debugging skills is a must.
Profile - SQL Database
Experience - 5 Years
Location - Bangalore (5 days working)
Mandatory skills - SQL, Stored Procedures, MySQL
Notice Period - Immediate Joiners
Job Description -
- Strong experience in SQL
- Experience in databases - MySQL & PostgreSQL.
- Experience in writing & adjusting Stored Procedures, T-SQL.
- Experience in query optimization, index creation, SQL joins, and sub-queries.
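As an illustration of query optimization via index creation, the sketch below uses Python's built-in sqlite3 (not MySQL, so stored-procedure syntax is out of scope here); the table and index names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(1000)])

# Without an index, filtering on customer_id requires a full table scan.
before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42").fetchone()[-1]
print(before)  # typically 'SCAN orders' (wording varies by SQLite version)

# DDL: create an index so the same filter becomes an index search.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42").fetchone()[-1]
print(after)  # the plan should now mention idx_orders_customer
```

The same scan-vs-index-seek reasoning applies to MySQL (`EXPLAIN SELECT ...`), which is where most interview-style optimization questions for this profile land.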
Role : Java Developer (2-7 years)
Location : Bangalore
Key responsibilities
- Develop and maintain high-quality, efficient, and scalable backend applications.
- Participate in all phases of the software development lifecycle (SDLC)
- Write clean, well-documented, and testable code adhering to best practices.
- Collaborate with team members to ensure the successful delivery of projects.
- Debug and troubleshoot complex technical problems.
- Identify and implement performance optimizations.
- Participate in code reviews
- Hands-on experience with Spring boot, Java 8 and above.
- 2-7 years of experience developing Java applications.
- Knowledge about at least one messaging system like Kafka, RabbitMQ etc.
- React developer requirements, qualifications & skills:
- Proficiency in React.js and its core principles
- Strong JavaScript, HTML5, and CSS3 skills
- Experience with popular React.js workflows (such as Redux)
- Strong understanding of object-oriented programming (OOP) principles.
- Experience with design patterns and best practices for Java development.
- Proficient in unit testing frameworks (e.g., JUnit).
- Experience with build automation tools (e.g., Maven, Gradle).
- Experience with version control systems (e.g., Git).
- Experience with one of these databases – Postgres, MongoDB, Cassandra
- Knowledge of retail or OMS (Order Management Systems) is a plus.
- Experience with containerized deployments using Docker and Kubernetes, and a DevOps mindset.
- Ability to reverse-engineer existing/legacy systems and document findings on Confluence.
- Create automated tests for unit, integration, regression, performance, and functional testing, to meet established expectations and acceptance criteria.
About the Role
We are looking for a Python Developer with expertise in data synchronization (ETL & Reverse ETL), automation workflows, AI functionality, and connectivity to work directly with a customer in Peliqan. In this role, you will be responsible for building seamless integrations, enabling AI-driven functionality, and ensuring data flows smoothly across various systems.
Key Responsibilities
- Build and maintain data sync pipelines (ETL & Reverse ETL) to ensure seamless data transfer between platforms.
- Develop automation workflows to streamline processes and improve operational efficiency.
- Implement AI-driven functionality, including AI-powered analytics, automation, and decision-making capabilities.
- Build and enhance connectivity between different data sources, APIs, and enterprise applications.
- Work closely with the customer to understand their technical needs and design tailored solutions in Peliqan.
- Optimize performance of data integrations and troubleshoot issues as they arise.
- Ensure security and compliance in data handling and integrations.
Requirements
- Strong experience in Python and related libraries for data processing & automation.
- Expertise in ETL, Reverse ETL, and workflow automation tools.
- Experience working with APIs, data connectors, and integrations across various platforms.
- Familiarity with AI & machine learning concepts and their practical application in automation.
- Hands-on experience with Peliqan or similar integration/data automation platforms is a plus.
- Strong problem-solving skills and the ability to work directly with customers to define and implement solutions.
- Excellent communication and collaboration skills.
Preferred Qualifications
- Experience in SQL, NoSQL databases, and cloud platforms (AWS, GCP, Azure).
- Knowledge of data governance, security best practices, and performance optimization.
- Prior experience in customer-facing engineering roles.
If you’re a Python & Integration Engineer who loves working on cutting-edge AI, automation, and data connectivity projects, we’d love to hear from you.
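The ETL and Reverse ETL responsibilities above can be illustrated with a minimal sketch. The source, warehouse, and CRM here are stand-in dictionaries, and none of this reflects Peliqan's actual API (which is not assumed here):

```python
# Minimal ETL / reverse-ETL sketch; all systems are hypothetical stand-ins.
def extract(source):
    """Pull raw records from a source system."""
    return list(source["records"])

def transform(records):
    """Normalize field names and drop incomplete rows."""
    return [
        {"email": r["Email"].lower(), "plan": r.get("Plan", "free")}
        for r in records
        if r.get("Email")
    ]

def load(warehouse, table, rows):
    """Append transformed rows into a warehouse table (the ETL leg)."""
    warehouse.setdefault(table, []).extend(rows)

def reverse_etl(warehouse, table, crm):
    """Push warehouse rows back into an operational tool (the reverse leg)."""
    for row in warehouse.get(table, []):
        crm[row["email"]] = row["plan"]

source = {"records": [{"Email": "A@x.io", "Plan": "pro"}, {"Email": ""}]}
warehouse, crm = {}, {}
load(warehouse, "users", transform(extract(source)))
reverse_etl(warehouse, "users", crm)
print(warehouse["users"], crm)  # → [{'email': 'a@x.io', 'plan': 'pro'}] {'a@x.io': 'pro'}
```

In a real integration the dictionaries would be API clients or database connections, but the extract → transform → load → push-back flow is the same.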
Job Title : Lead Web Developer / Frontend Engineer
Experience Required : 10+ Years
Location : Bangalore (Hybrid – 3 Days Work From Office)
Work Timings : 11:00 AM to 8:00 PM IST
Notice Period : Immediate or Up to 30 Days (Preferred)
Work Mode : Hybrid
Interview Mode : Face-to-Face mandatory (for Round 2)
Role Overview :
We are hiring a Lead Frontend Engineer with 10+ Years of experience to drive the development of scalable, modern, and high-performance web applications.
This is a hands-on technical leadership role focused on React.js, micro-frontends, and Backend for Frontend (BFF) architecture, requiring both coding expertise and team leadership skills.
Mandatory Skills :
React.js, JavaScript/TypeScript, HTML, CSS, micro-frontend architecture, Backend for Frontend (BFF), Webpack, Jenkins (CI/CD), GCP, RDBMS/SQL, Git, and team leadership.
Core Responsibilities :
- Design and develop cloud-based web applications using React.js, HTML, CSS.
- Collaborate with UX/UI designers and backend engineers to implement seamless user experiences.
- Lead and mentor a team of frontend developers.
- Write clean, well-documented, scalable code using modern JavaScript/TypeScript practices.
- Implement CI/CD pipelines using Jenkins, deploy applications to CDNs.
- Integrate with GCP services, optimize front-end performance.
- Stay updated with modern frontend technologies and design patterns.
- Use Git for version control and collaborative workflows.
- Implement JavaScript libraries for web analytics and performance monitoring.
Key Requirements :
- 10+ Years of experience as a frontend/web developer.
- Strong proficiency in React.js, JavaScript/TypeScript, HTML, CSS.
- Experience with micro-frontend architecture and Backend for Frontend (BFF) patterns.
- Proficiency in frontend design frameworks and libraries (jQuery, Node.js).
- Strong understanding of build tools like Webpack, CI/CD using Jenkins.
- Experience with GCP and deploying to CDNs.
- Solid experience in RDBMS, SQL.
- Familiarity with Git and agile development practices.
- Excellent debugging, problem-solving, and communication skills.
- Bachelor’s/Master’s in Computer Science or a related field.
Nice to Have :
- Experience with Node.js.
- Previous experience working with web analytics frameworks.
- Exposure to JavaScript observability tools.
Interview Process :
1. Round 1 : Online Technical Interview (via Geektrust – 1 Hour)
2. Round 2 : Face-to-Face Interview with the Indian team in Bangalore (3 Hours – Mandatory)
3. Round 3 : Online Interview with CEO (30 Minutes)
Important Notes :
- Face-to-face interview in Bangalore is mandatory for Round 2.
- Preference given to candidates currently in Bangalore or willing to travel for interviews.
- Remote applicants who cannot attend the in-person round will not be considered.
Primary skill set: QA Automation, Python, BDD, SQL
As Senior Data Quality Engineer you will:
- Evaluate product functionality and create test strategies and test cases to assess product quality.
- Work closely with the on-shore and the offshore team.
- Work on multiple reports validation against the databases by running medium to complex SQL queries.
- Develop a strong understanding of automation objects and integrations across various platforms and applications.
- Work as an individual contributor, exploring opportunities to improve performance and articulating the importance and advantages of proposed improvements to management.
- Integrate with SCM infrastructure to establish a continuous build and test cycle using CI/CD tools.
- Comfortable working in Linux/Windows environment(s) and hybrid infrastructure models hosted on cloud platforms.
- Establish processes and toolsets to maintain automation scripts and generate regular test reports.
- Conduct peer reviews to provide feedback and ensure test scripts are flawless.
Core/Must have skills:
- Excellent understanding and hands-on experience in ETL/DWH testing, preferably Databricks, paired with Python experience.
- Hands-on experience with SQL (analytical functions and complex queries), along with knowledge of using SQL client utilities effectively.
- Clear and crisp communication and commitment to deliverables.
- Experience on BigData Testing will be an added advantage.
- Knowledge on Spark and Scala, Hive/Impala, Python will be an added advantage.
Good to have skills:
- Test automation using BDD/Cucumber or TestNG, combined with strong hands-on Java and Selenium experience, especially working experience with WebdriverIO.
- Ability to effectively articulate technical challenges and solutions
- Work experience with qTest, Jira, and WebdriverIO.
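The core responsibility above of validating reports against the database with SQL queries can be sketched as a small data-quality check. The table, figures, and report values below are hypothetical, with SQLite standing in for the real database:

```python
import sqlite3

# Illustrative data-quality check: compare report figures against the database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES ('south', 120.0), ('south', 80.0), ('north', 50.0);
""")

report = {"south": 200.0, "north": 50.0}  # figures as shown in the report

# Recompute the totals straight from the database with an aggregate query.
db_totals = dict(conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region"
))

# Any region where the report disagrees with the database is a defect.
mismatches = {r: (report[r], db_totals.get(r)) for r in report
              if report[r] != db_totals.get(r)}
print(mismatches)  # → {} (empty dict means the report matches the database)
```

In a BDD setup, a check like this would sit behind a Gherkin step such as "Then the report totals match the database", asserted per region.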
Job Title : Senior Software Engineer – Backend
Experience Required : 6 to 12 Years
Location : Bengaluru (Hybrid – 3 Days Work From Office)
Number of Openings : 2
Work Hours : 11:00 AM – 8:00 PM IST
Notice Period : 30 Days Preferred
Work Location : SmartWorks The Cube, Karle Town SEZ, Building No. 5, Nagavara, Bangalore – 560045
Note : Face-to-face interview in Bangalore is mandatory during the second round.
Role Overview :
We are looking for an experienced Senior Backend Developer to join our growing team. This is a hands-on role focused on building cloud-based, scalable applications in the mortgage finance domain.
Key Responsibilities :
- Design, develop, and maintain backend components for cloud-based web applications.
- Contribute to architectural decisions involving microservices and distributed systems.
- Work extensively with Node.js and RESTful APIs.
- Implement scalable solutions using AWS services (e.g., Lambda, SQS, SNS, RDS).
- Utilize both relational and NoSQL databases effectively.
- Collaborate with cross-functional teams to deliver robust and maintainable code.
- Participate in agile development practices and deliver rapid iterations based on feedback.
- Take ownership of system performance, scalability, and reliability.
Core Requirements :
- 5+ Years of total experience in backend development.
- Minimum 3 Years of experience in building scalable microservices or delivering large-scale products.
- Strong expertise in Node.js and REST APIs.
- Solid experience with RDBMS, SQL, and data modeling.
- Good understanding of distributed systems, scalability, and availability.
- Familiarity with AWS infrastructure and services.
- Development experience in Python and/or Java is a plus.
Preferred Skills :
- Experience with frontend frameworks like React.js or AngularJS.
- Working knowledge of Docker and containerized applications.
Interview Process :
- Round 1 : Online technical assessment (1 hour)
- Round 2 : Virtual technical interview
- Round 3 : In-person interview at the Bangalore office (2 hours – mandatory)
Job title - Python developer
Exp – 4 to 6 years
Location – Pune/Mumbai/Bangalore
Please find the JD below.
Requirements:
- Proven experience as a Python Developer
- Strong knowledge of core Python and PySpark concepts
- Experience with web frameworks such as Django or Flask
- Good exposure to any cloud platform (GCP Preferred)
- CI/CD exposure required
- Solid understanding of RESTful APIs and how to build them
- Experience working with databases like Oracle DB and MySQL
- Ability to write efficient SQL queries and optimize database performance
- Strong problem-solving skills and attention to detail
- Strong SQL programming (stored procedures, functions)
- Excellent communication and interpersonal skills
Roles and Responsibilities
- Design, develop, and maintain data pipelines and ETL processes using PySpark
- Work closely with data scientists and analysts to provide them with clean, structured data.
- Optimize data storage and retrieval for performance and scalability.
- Collaborate with cross-functional teams to gather data requirements.
- Ensure data quality and integrity through data validation and cleansing processes.
- Monitor and troubleshoot data-related issues to ensure data pipeline reliability.
- Stay up to date with industry best practices and emerging technologies in data engineering.
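The data validation and cleansing responsibility above can be sketched in plain Python. In the role described this would typically be a PySpark job; a pure-Python version with hypothetical field names is used here so the example stays self-contained:

```python
# Plain-Python sketch of a validation/cleansing step (a PySpark job in
# production); the "id" and "amount" fields are illustrative assumptions.
def cleanse(rows):
    """Split raw rows into clean records and rejects, coercing types."""
    clean, rejected = [], []
    for r in rows:
        try:
            clean.append({"id": int(r["id"]), "amount": float(r["amount"])})
        except (KeyError, TypeError, ValueError):
            rejected.append(r)  # quarantine bad rows instead of failing the pipeline
    return clean, rejected

raw = [{"id": "1", "amount": "9.5"}, {"id": "2"}, {"id": "x", "amount": "3"}]
clean, rejected = cleanse(raw)
print(len(clean), len(rejected))  # → 1 2
```

Keeping a rejected-rows side channel, rather than dropping bad records silently, is what makes downstream data-quality monitoring and troubleshooting possible.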
Role: Data Engineer (14+ years of experience)
Location: Whitefield, Bangalore
Mode of Work: Hybrid (3 days from office)
Notice period: Immediate / serving notice with 30 days left
Location: Candidate should be based out of Bangalore as one round has to be taken F2F
Job Summary:
Role and Responsibilities
● Design and implement scalable data pipelines for ingesting, transforming, and loading data from various tools and sources.
● Design data models to support data analysis and reporting.
● Automate data engineering tasks using scripting languages and tools.
● Collaborate with engineers, process managers, data scientists to understand their needs and design solutions.
● Act as a bridge between the engineering and the business team in all areas related to Data.
● Automate monitoring and alerting mechanisms for data pipelines, products, and dashboards, and troubleshoot any issues. On-call support is required.
● SQL creation and optimization - including modularization and optimization, which may require views, table creation in the sources, etc.
● Define best practices for data validation and automate as much as possible, aligning with enterprise standards.
● QA environment data management - e.g., test data management.
Qualifications
● 14+ years of experience as a Data engineer or related role.
● Experience with Agile engineering practices.
● Strong experience in writing queries for RDBMS, cloud-based data warehousing solutions like Snowflake and Redshift.
● Experience with SQL and NoSQL databases.
● Ability to work independently or as part of a team.
● Experience with cloud platforms, preferably AWS.
● Strong experience with data warehousing and data lake technologies (Snowflake)
● Expertise in data modelling
● Experience with ETL/ELT tools and methodologies.
● 5+ years of experience in application development including Python, SQL, Scala, or Java
● Experience working on real-time Data Streaming and Data Streaming platform.
NOTE: IT IS MANDATORY TO GIVE ONE TECHNICAL ROUND FACE TO FACE.
Role overview
1) Overall 5 to 7 years of experience. Node.js experience is a must.
2) At least 3+ years of experience, or a couple of large-scale products delivered, on microservices.
3) Strong design skills in microservices and AWS platform infrastructure.
4) Excellent programming skills in Python, Node.js, and Java.
5) Hands-on development of REST APIs.
6) Good understanding of nuances of distributed systems, scalability, and availability.