50+ SQL Jobs in Bangalore (Bengaluru) | SQL Job openings in Bangalore (Bengaluru)
Apply to 50+ SQL Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest SQL Job opportunities across top companies like Google, Amazon & Adobe.


Data Engineer
Experience: 4–6 years
Key Responsibilities
- Design, build, and maintain scalable data pipelines and workflows.
- Manage and optimize cloud-native data platforms on Azure with Databricks and Apache Spark (1–2 years).
- Implement CI/CD workflows and monitor data pipelines for performance, reliability, and accuracy.
- Work with relational databases (Sybase, DB2, Snowflake, PostgreSQL, SQL Server) and ensure efficient SQL query performance.
- Apply data warehousing concepts including dimensional modelling, star schema, data vault modelling, Kimball and Inmon methodologies, and data lake design.
- Develop and maintain ETL/ELT pipelines using open-source frameworks such as Apache Spark and Apache Airflow.
- Integrate and process data streams from message queues and streaming platforms (Kafka, RabbitMQ).
- Collaborate with cross-functional teams in a geographically distributed setup.
- Leverage Jupyter notebooks for data exploration, analysis, and visualization.
Required Skills
- 4+ years of experience in data engineering or a related field.
- Strong programming skills in Python with experience in Pandas, NumPy, Flask.
- Hands-on experience with pipeline monitoring and CI/CD workflows.
- Proficiency in SQL and relational databases.
- Familiarity with Git for version control.
- Strong communication and collaboration skills with ability to work independently.
- 8+ years of Data Engineering experience
- Strong SQL and Redshift experience
- CI/CD and orchestration experience using Bitbucket, Jenkins and Control-M
- Reporting experience preferably Tableau
- Location – Pune, Hyderabad, Bengaluru
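For illustration only (not part of the posting): a minimal Apache Airflow DAG of the kind such a data engineering role typically builds and maintains, assuming Airflow 2.x; the task callables are stand-ins for real extract/transform/load logic against Spark or a warehouse.

```python
# Minimal Airflow 2.x DAG sketch: a daily extract -> transform -> load chain.
# The three callables are placeholders; real tasks would call Spark jobs,
# warehouse loads (Snowflake, PostgreSQL, etc.), or external services.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Stand-in for pulling data from a source system (API, queue, database).
    return [{"id": 1, "amount": 42.0}]


def transform(ti, **context):
    # Pull the upstream result via XCom and apply a trivial transformation.
    rows = ti.xcom_pull(task_ids="extract")
    return [{**row, "amount_usd": row["amount"]} for row in rows]


def load(ti, **context):
    # Stand-in for writing to a warehouse table.
    print(ti.xcom_pull(task_ids="transform"))


with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```

The same skeleton extends to the Spark- or warehouse-backed pipelines named in the posting by swapping the placeholder callables for real operators.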

Responsibilities
- Design, develop, and maintain backend systems and RESTful APIs using Python (Django, FastAPI, or Flask)
- Build real-time communication features using WebSockets, SSE, and async IO
- Implement event-driven architectures using messaging systems like Kafka, RabbitMQ, Redis Streams, or NATS
- Develop and maintain microservices that interact over messaging and streaming protocols
- Ensure high scalability and availability of backend services
- Collaborate with frontend developers, DevOps engineers, and product managers to deliver end-to-end solutions
- Write clean, maintainable code with unit/integration tests
- Lead technical discussions, review code, and mentor junior engineers
Requirements
- 8+ years of backend development experience, with at least 8 years in Python
- Strong experience with asynchronous programming in Python (e.g., asyncio, aiohttp, FastAPI)
- Production experience with WebSockets and Server-Sent Events
- Hands-on experience with at least one messaging system: Kafka, RabbitMQ, Redis Pub/Sub, or similar
- Proficient in RESTful API design and microservices architecture
- Solid experience with relational and NoSQL databases
- Familiarity with Docker and container-based deployment
- Strong understanding of API security, authentication, and performance optimization
Nice to Have
- Experience with GraphQL or gRPC
- Familiarity with stream processing frameworks (e.g., Apache Flink, Spark Streaming)
- Cloud experience (AWS, GCP, Azure), particularly with managed messaging or pub/sub services
- Knowledge of CI/CD and infrastructure as code
- Exposure to AI engineering workflows and tools
- Interest or experience in building Agentic AI systems or integrating backends with AI agents
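For illustration only (not part of the posting): a minimal asyncio-based WebSocket echo endpoint with FastAPI, a toy version of the real-time features described above; the route name and payload are invented.

```python
# Minimal FastAPI WebSocket sketch: accepts a connection and echoes messages
# back asynchronously. Run with: uvicorn app:app --reload
from fastapi import FastAPI, WebSocket, WebSocketDisconnect

app = FastAPI()


@app.websocket("/ws/echo")
async def echo(websocket: WebSocket):
    await websocket.accept()
    try:
        while True:
            message = await websocket.receive_text()
            await websocket.send_text(f"echo: {message}")
    except WebSocketDisconnect:
        # Client closed the connection; nothing to clean up in this sketch.
        pass
```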

What We’re Looking For:
- Strong experience in Python (5+ years).
- Hands-on experience with any database (SQL or NoSQL).
- Experience with frameworks like Flask, FastAPI, or Django.
- Knowledge of ORMs, API development, and unit testing.
- Familiarity with Git and Agile methodologies.
- Familiarity with Kafka (added advantage)

What We’re Looking For:
- Strong experience in Python (4+ years).
- Hands-on experience with any database (SQL or NoSQL).
- Experience with frameworks like Flask, FastAPI, or Django.
- Knowledge of ORMs, API development, and unit testing.
- Familiarity with Git and Agile methodologies.
- Familiarity with Kafka (added advantage)
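As a purely illustrative sketch of the Python-plus-unit-testing combination listed above (not part of the posting): a one-route Flask app and a pytest test against it, assuming Flask 2.x; the route and response are made up.

```python
# app.py -- a single JSON endpoint (Flask 2.x style route decorator).
from flask import Flask, jsonify

app = Flask(__name__)


@app.get("/health")
def health():
    return jsonify(status="ok")


# test_app.py -- run with `pytest`; uses Flask's built-in test client.
def test_health():
    client = app.test_client()
    response = client.get("/health")
    assert response.status_code == 200
    assert response.get_json() == {"status": "ok"}
```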
Job Summary:
We are looking for a skilled and motivated Backend Engineer with 2 to 4 years of professional experience to join our dynamic engineering team. You will play a key role in designing, building, and maintaining the backend systems that power our products. You’ll work closely with cross-functional teams to deliver scalable, secure, and high-performance solutions that align with business and user needs.
This role is ideal for engineers ready to take ownership of systems, contribute to architectural decisions, and solve complex backend challenges.
Website: https://www.thealteroffice.com/about
Key Responsibilities:
- Design, build, and maintain robust backend systems and APIs that are scalable and maintainable.
- Collaborate with product, frontend, and DevOps teams to deliver seamless, end-to-end solutions.
- Model and manage data using SQL (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Redis), incorporating caching where needed.
- Implement and manage authentication, authorization, and data security practices.
- Write clean, well-documented, and well-tested code following best practices.
- Work with cloud platforms (AWS, GCP, or Azure) to deploy, monitor, and scale services effectively.
- Use tools like Docker (and optionally Kubernetes) for containerization and orchestration of backend services.
- Maintain and improve CI/CD pipelines for faster and safer deployments.
- Monitor and debug production issues, using observability tools (e.g., Prometheus, Grafana, ELK) for root cause analysis.
- Participate in code reviews, contribute to improving development standards, and provide support to less experienced engineers.
- Work with event-driven or microservices-based architecture, and optionally use technologies like GraphQL, WebSockets, or message brokers such as Kafka or RabbitMQ when suitable for the solution.
Requirements:
- 2 to 4 years of professional experience as a Backend Engineer or similar role.
- Proficiency in at least one backend programming language (e.g., Python, Java, Go, Ruby, etc.).
- Strong understanding of RESTful API design, asynchronous programming, and scalable architecture patterns.
- Solid experience with both relational and NoSQL databases, including designing and optimizing data models.
- Familiarity with Docker, Git, and modern CI/CD workflows.
- Hands-on experience with cloud infrastructure and deployment processes (AWS, GCP, or Azure).
- Exposure to monitoring, logging, and performance profiling practices in production environments.
- A good understanding of security best practices in backend systems.
- Strong problem-solving, debugging, and communication skills.
- Comfortable working in a fast-paced, agile environment with evolving priorities.


Role Overview:
We are seeking a highly skilled and experienced Lead Web App Developer - Backend to join our dynamic team in Bengaluru. The ideal candidate will have a strong background in backend development, microservices architecture, and cloud technologies, with a proven ability to deliver robust, scalable solutions. This role involves designing, implementing, and maintaining complex distributed systems, primarily in the Mortgage Finance domain.
Key Responsibilities:
- Cloud-Based Web Applications Development:
- Lead backend development efforts for cloud-based web applications.
- Work on diverse projects within the Mortgage Finance domain.
- Microservices Design & Development:
- Design and implement microservices-based architectures.
- Ensure scalability, availability, and reliability of distributed systems.
- Programming & API Development:
- Write efficient, reusable, and maintainable code in Python, Node.js, and Java.
- Develop and optimize RESTful APIs.
- Infrastructure Management:
- Leverage AWS platform infrastructure to build secure and scalable solutions.
- Utilize tools like Docker for containerization and deployment.
- Database Management:
- Work with RDBMS (MySQL) and NoSQL databases to design efficient schemas and optimize queries.
- Team Collaboration:
- Collaborate with cross-functional teams to ensure seamless integration and delivery of projects.
- Mentor junior developers and contribute to the overall skill development of the team.
Core Requirements:
- Experience: 10+ years in backend development, with at least 3 years of experience in designing and delivering large-scale products on microservices architecture.
- Technical Skills:
- Programming Languages: Python, Node.js, Java.
- Frameworks & Tools: AWS (Lambda, RDS, etc.), Docker.
- Database Expertise: Proficiency in RDBMS (MySQL) and NoSQL databases.
- API Development: Hands-on experience in developing REST APIs.
- System Design: Strong understanding of distributed systems, scalability, and availability.
Additional Skills (Preferred):
- Experience with modern frontend frameworks like React.js or AngularJS.
- Strong design and architecture capabilities.
What We Offer:
- Opportunity to work on cutting-edge technologies in a collaborative environment.
- Competitive salary and benefits package.
- Flexible hybrid working model.
- Chance to contribute to impactful projects in the Mortgage Finance domain.

Role Overview
We are looking for a highly skilled Product Engineer to join our dynamic team. This is an exciting opportunity to work on innovative FinTech solutions and contribute to the future of global payments. If you're passionate about backend development, API design, and scalable architecture, we'd love to hear from you!
Key Responsibilities
- Design, develop, and maintain scalable, high-performance backend systems.
- Write clean, maintainable, and efficient code while following best practices.
- Build and optimize RESTful APIs and database queries.
- Collaborate with cross-functional teams to deliver 0 to 1 products.
- Ensure smooth CI/CD pipeline implementation and deployment automation.
- Contribute to open-source projects and stay updated with industry trends.
- Maintain a strong focus on security, performance, and reliability.
- Work with payment protocols and financial regulations to ensure compliance.
Required Skills & Qualifications
- ✅ 3+ years of professional software development experience.
- ✅ Proficiency in any backend language (with preference for Ruby on Rails).
- ✅ Strong foundation in architecture, design, and database optimization.
- ✅ Experience in building APIs and working with SQL/NoSQL databases.
- ✅ Familiarity with CI/CD practices and automation tools.
- ✅ Excellent problem-solving and analytical skills.
- ✅ Strong track record of open-source contributions (minimum 50 stars on GitHub).
- ✅ Passion for FinTech and payment systems.
- ✅ Strong communication skills and ability to work collaboratively in a team.
Nice to Have
- Prior experience in financial services or payment systems.
- Exposure to microservices architecture and cloud platforms.
- Knowledge of containerization tools like Docker & Kubernetes.

Role overview
- Overall 5 to 7 years of experience. Node.js experience is a must.
- At least 3 years of experience with microservices, or a couple of large-scale products delivered on a microservices architecture.
- Strong design skills on microservices and AWS platform infrastructure.
- Excellent programming skill in Python, Node.js and Java.
- Hands-on development of REST APIs.
- Good understanding of the nuances of distributed systems, scalability, and availability.
What would you do here
- Work as a Backend Developer building cloud web applications.
- Be part of the team working on various web applications related to Mortgage Finance.
- Solve the real-world problem of designing, implementing, and helping develop a new enterprise-class product from the ground up.
- Apply expertise in AWS cloud infrastructure and microservices architecture around the AWS service stack (Lambda, SQS, SNS, MySQL databases), along with Docker and containerized solutions/applications (see the illustrative sketch after this list).
- Work with relational and NoSQL databases and scalable design.
- Solve challenging problems by developing elegant, maintainable code.
- Deliver rapid iterations of software based on user feedback and metrics.
- Help the team make key decisions on our product and technology direction.
- Actively contribute to the adoption of frameworks, standards, and new technologies.
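Purely illustrative (not from the posting): a minimal Python AWS Lambda handler of the kind that sits behind an API Gateway REST endpoint in the stack described above; the proxy-integration response shape is standard, but the path parameter and payload fields are invented.

```python
# Minimal Lambda handler for an API Gateway (proxy integration) REST endpoint.
# The business logic is a stub; a real handler would query RDS/MySQL or
# another service before responding.
import json


def lambda_handler(event, context):
    loan_id = (event.get("pathParameters") or {}).get("id", "unknown")
    body = {"loanId": loan_id, "status": "ACTIVE"}  # placeholder payload
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body),
    }
```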

AccioJob is conducting a Walk-In Hiring Drive with Infrrd for the position of Java Full Stack Developer.
To apply, register and select your slot here: https://go.acciojob.com/3UTekG
Required Skills: DSA, OOPS, SQL, Java, Python
Eligibility:
- Degree: BTech./BE, MTech./ME
- Branch: Computer Science/CSE/Other CS related branch, IT
- Graduation Year: 2026
Work Details:
- Work Location: Bangalore (Onsite)
- Stipend Range: 30k
- Stipend Duration: 12 Months
- CTC: 6 LPA to 9 LPA
Evaluation Process:
Round 1: Offline Assessment at AccioJob Bangalore Centre
Further Rounds (for shortlisted candidates only):
- Profile Evaluation
- Technical Interview 1
- Technical Interview 2
Important Note: Bring your laptop & earphones for the test.
Register here: https://go.acciojob.com/3UTekG

Roles and responsibilities-
- Act as tech lead in one of the feature teams; the candidate needs to work along with the team lead in handling the team without much guidance
- Good communication and leadership skills
- Nurture and build next level talent within the team
- Work in collaboration with other vendors and client development team(s)
- Flexible to learn new tech areas
- Lead complete lifecycle of feature - from feature inception to solution, story grooming, delivery, and support features in production
- Ensure and build the controls and processes for continuous delivery of applications, considering all stages of the process and its automation
- Interact with teammates from across the business and be comfortable explaining technical concepts to non-technical audiences
- Create robust, scalable, flexible, and relevant solutions that help transform product and businesses
Must haves:
- Spark
- Scala
- Postgres (or any SQL DB)
- Elasticsearch (or any NoSQL DB)
- Azure (if not, any other cloud experience)
- Big data processing
Good to have:
- Golang
- Databricks
- Kubernetes
Position: Tableau Developer
Experience: 5-7 years
Location: Bangalore
Key Responsibilities:
· Design, develop, and maintain interactive dashboards and reports using Tableau, ensuring high-quality visualizations that meet business requirements.
· Write and optimize complex SQL queries to extract, manipulate, and analyse data from various sources, ensuring data integrity and accuracy.
· Stay updated on technologies and trends related to data visualization and analytics, including advanced analytics, big data, and data science. Familiarity with tools such as R, Python, and SAS is a plus.
· Utilize Snowflake for data warehousing solutions, including data modelling, ETL processes, and performance tuning to support Tableau reporting.
· Work effectively in interdisciplinary global teams, influencing stakeholders within a matrix organization to ensure alignment on reporting solutions.
· Incorporate Tableau best practices in reporting solutions and guide team members in their use to enhance overall reporting quality.
· Utilize excellent analytical and problem-solving skills to address data-related challenges and provide actionable insights.
· Communicate effectively with both technical and non-technical stakeholders to understand their reporting needs and deliver tailored solutions.
· Additional Skills: Experience with other visualization tools (e.g., Spotfire, Power BI) and programming languages (e.g., R, Python, JavaScript) is advantageous.
Qualifications:
· Bachelor’s degree in informatics, Information Systems, Data Science, or a related field.
· 5+ years of relevant professional experience in data analytics, performance management, or related fields.
· Strong understanding of clinical development and/or biopharma industry practices is preferred.
· Proven experience in computerized systems validation and testing methodologies.
About the company
Sigmoid is a leading data solutions company that partners with Fortune 500 enterprises to drive digital transformation through AI, big data, and cloud technologies. With a focus on scalability, performance, and innovation, Sigmoid delivers cutting-edge solutions to solve complex business challenges.
About the role
You will be responsible for building a highly scalable, extensible, and robust application. This position reports to the Engineering Manager.
Responsibilities:
- Align Sigmoid with key Client initiatives
- Interface daily with customers across leading Fortune 500 companies to understand strategic requirements
- Ability to understand business requirements and tie them to technology solutions
- Open to work from client location as per the demand of the project / customer
- Facilitate in Technical Aspects
- Develop and evolve highly scalable and fault-tolerant distributed components using Java technologies
- Excellent experience in Application development and support, integration development and quality assurance
- Provide technical leadership and manage it on a day-to-day basis
- Stay up-to-date on the latest technology to ensure the greatest ROI for customer & Sigmoid
- Hands-on coder with a good understanding of enterprise-level code
- Design and implement APIs, abstractions and integration patterns to solve challenging distributed computing problems
- Experience in defining technical requirements, data extraction, data transformation, automating jobs, productionizing jobs, and exploring new big data technologies within a Parallel Processing environment
Culture:
- Must be a strategic thinker with the ability to think unconventionally / out of the box
- Analytical and solution driven orientation
- Raw intellect, talent and energy are critical
- Entrepreneurial and Agile: understands the demands of a private, high growth company
- Ability to be both a leader and hands-on "doer"
Qualifications:
- A track record of relevant work experience and a degree in Computer Science or a related technical discipline is required
- Experience in developing enterprise-scale applications and capability in building frameworks, design patterns, etc. Should be able to understand and tackle technical challenges and propose comprehensive solutions
- Experience with functional and object-oriented programming, Java or Python is a must
- Experience in Spring Boot, APIs, and SQL
- Good to have: Git, Airflow, Node.js, Python, Angular
- Experience with database modeling and development, data mining and warehousing
- Unit, Integration and User Acceptance Testing
- Effective communication skills (both written and verbal)
- Ability to collaborate with a diverse set of engineers, data scientists and product managers
- Comfort in a fast-paced start-up environment
- Experience in Agile methodology
- Proficient with SQL and its variation among popular databases
- Develop and maintain Java applications using Core Java, the Spring framework, JDBC, and threading concepts.
- Strong understanding of the Spring framework and its various modules.
- Experience with JDBC for database connectivity and manipulation
- Utilize database management systems to store and retrieve data efficiently.
- Proficiency in Core Java 8 and thorough understanding of threading concepts and concurrent programming.
- Experience in working with relational and NoSQL databases.
- Basic understanding of cloud platforms such as Azure and GCP; experience with DevOps practices is an added advantage.
- Knowledge of containerization technologies (e.g., Docker, Kubernetes)
- Perform debugging and troubleshooting of applications using log analysis techniques.
- Understand multi-service flow and integration between components.
- Handle large-scale data processing tasks efficiently and effectively.
- Hands on experience using Spark is an added advantage.
- Good problem-solving and analytical abilities.
- Collaborate with cross-functional teams to identify and solve complex technical problems.
- Knowledge of Agile methodologies such as Scrum or Kanban
- Stay updated with the latest technologies and industry trends to continuously improve development processes and methodologies
If interested, please share your resume with the following details:
Total Experience -
Relevant Experience in Java, Spring, Data Structures, Algorithms, SQL -
Relevant Experience in Cloud - AWS/Azure/GCP -
Current CTC -
Expected CTC -
Notice Period -
Reason for change -


Job Title: Backend Engineer – Python / Golang / Rust
Location: Bangalore, India
Experience Required: Minimum 2–3 years
About the Role
We are looking for a passionate Backend Engineer to join our growing engineering team. The ideal candidate should have hands-on experience in building enterprise-grade, scalable backend systems using microservices architecture. You will work closely with product, frontend, and DevOps teams to design, develop, and optimize robust backend solutions that can handle high traffic and ensure system reliability.
Key Responsibilities
• Design, develop, and maintain scalable backend services and APIs.
• Architect and implement microservices-based systems ensuring modularity and resilience.
• Optimize application performance, database queries, and service scalability.
• Collaborate with frontend engineers, product managers, and DevOps teams for seamless delivery.
• Implement security best practices and ensure data protection compliance.
• Write and maintain unit tests, integration tests, and documentation.
• Participate in code reviews, technical discussions, and architecture design sessions.
• Monitor, debug, and improve system performance in production environments.
Required Skills & Experience
• Programming Expertise:
• Advanced proficiency in Python (Django, FastAPI, or Flask), OR
• Strong experience in Golang or Rust for backend development.
• Microservices Architecture: Hands-on experience in designing and maintaining distributed systems.
• Database Management: Expertise in PostgreSQL, MySQL, MongoDB, including schema design and optimization.
• API Development: Strong experience in RESTful APIs and GraphQL.
• Cloud Platforms: Proficiency with AWS, GCP, or Azure for deployment and scaling.
• Containerization & Orchestration: Solid knowledge of Docker and Kubernetes.
• Messaging & Caching: Experience with Redis, RabbitMQ, Kafka, and caching strategies (Redis, Memcached).
• Version Control: Strong Git workflows and collaboration in team environments.
• Familiarity with CI/CD pipelines, DevOps practices, and cloud-native deployments.
• Proven experience working on production-grade, high-traffic applications.
Preferred Qualifications
• Understanding of software architecture patterns (event-driven, CQRS, hexagonal, etc.).
• Experience with Agile/Scrum methodologies.
• Contributions to open-source projects or strong personal backend projects.
• Experience with observability tools (Prometheus, Grafana, ELK, Jaeger).
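A minimal sketch, not part of the posting: the cache-aside pattern with redis-py, one common realization of the caching strategies listed above; the key format, TTL, and `fetch_user_from_db` stand-in are assumptions.

```python
# Cache-aside with redis-py: read through the cache, fall back to the
# database on a miss, then populate the cache with a TTL.
import json

import redis

cache = redis.Redis(host="localhost", port=6379, decode_responses=True)


def fetch_user_from_db(user_id: int) -> dict:
    # Stand-in for a real PostgreSQL/MySQL/MongoDB lookup.
    return {"id": user_id, "name": "example"}


def get_user(user_id: int, ttl_seconds: int = 300) -> dict:
    key = f"user:{user_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)          # cache hit
    user = fetch_user_from_db(user_id)     # cache miss: go to the source
    cache.set(key, json.dumps(user), ex=ttl_seconds)
    return user
```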
Why Join Us?
• Work on cutting-edge backend systems that power enterprise-grade applications.
• Opportunity to learn and grow with a fast-paced engineering team.
• Exposure to cloud-native, microservices-based architectures.
• Collaborative culture that values innovation, ownership, and technical excellence.
Desired Competencies (Technical/Behavioral Competency)
Loan IQ Domain and Practical knowledge
Working knowledge of LoanIQ for 7+ years in development, support, and QA.
In-depth knowledge of LoanIQ Database schema
Mainframe, Java, and .Net, coupled with database skills
Strong knowledge of relational databases like Oracle 12c/19c
Strong business domain / functional knowledge; good understanding of Mainframes, SQL, Unix, .Net, Java, and ServiceNow
Responsibility of / Expectations from the Role
Loan IQ Domain and Practical knowledge.
In-depth knowledge of the LoanIQ database schema. Mainframe, Java, and .Net, coupled with database work. Strong knowledge of relational databases like Oracle.
On-the-ground, value-driven thought leadership (continuous improvement). North American commercial banking domain knowledge. Strong business domain / functional knowledge of batch operations and understanding of Mainframes, SQL, Unix, .Net, Java, and ServiceNow.
ON/OFF coordination on a shift basis. Familiarity with log monitoring applications/software. Understanding of various reports. Review, assess, and document application, infrastructure, and middleware platform service-level requirements; document failure and recovery scenarios and performance against design and expectations.
Flexibility to work in shifts. Excellent communication skills
Coordination with the Application Development, Infrastructure Engineering, Production Support as needed. Support triage for escalated issues and on call rotation. Comfortable with generating reporting and creating summaries. Experience with file transfer mechanisms and software. Ability to maintain and enhance complex job dependency charts
Profile: AWS Data Engineer
Mandatory skills: AWS + Databricks + PySpark + SQL
Location: Bangalore/Pune/Hyderabad/Chennai/Gurgaon
Notice Period: Immediate
Key Requirements :
- Design, build, and maintain scalable data pipelines to collect, process, and store data from multiple datasets.
- Optimize data storage solutions for better performance, scalability, and cost-efficiency.
- Develop and manage ETL/ELT processes to transform data as per schema definitions, apply slicing and dicing, and make it available for downstream jobs and other teams.
- Collaborate closely with cross-functional teams to understand system and product functionalities, pace up feature development, and capture evolving data requirements.
- Engage with stakeholders to gather requirements and create curated datasets for downstream consumption and end-user reporting.
- Automate deployment and CI/CD processes using GitHub workflows, identifying areas to reduce manual, repetitive work.
- Ensure compliance with data governance policies, privacy regulations, and security protocols.
- Utilize cloud platforms like AWS and work on Databricks for data processing with S3 Storage.
- Work with distributed systems and big data technologies such as Spark, SQL, and Delta Lake.
- Integrate with SFTP to push data securely from Databricks to remote locations.
- Analyze and interpret spark query execution plans to fine-tune queries for faster and more efficient processing.
- Strong problem-solving and troubleshooting skills in large-scale distributed systems.
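Illustrative only (not from the posting): a short PySpark sketch of the S3-to-Delta flow described above, roughly as it might appear in a Databricks notebook where the SparkSession and Delta support are preconfigured; bucket paths and column names are placeholders.

```python
# PySpark sketch: read raw data from S3, apply a simple transformation,
# and write the result as a partitioned Delta table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/orders/")  # placeholder path

curated = (
    raw.filter(F.col("status") == "COMPLETED")
       .withColumn("order_date", F.to_date("created_at"))
)

(curated.write
        .format("delta")
        .mode("overwrite")
        .partitionBy("order_date")
        .save("s3://example-bucket/curated/orders/"))      # placeholder path
```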
Location & Work Model:
- Position: Contract to hire
- Bangalore - Marathahalli;
- Work mode - Work from Office; all 5 days
- Looking for immediate joiners
Technical Requirements:
- Strong experience in Java Backend Development
- Proficiency in both SQL & NoSQL databases (e.g., MongoDB, PostgreSQL, MySQL)
- Basic knowledge of DevOps tools (CI/CD pipeline)
- Familiarity with cloud providers (AWS, GCP)
- Ability to quickly learn and adapt to emerging technologies, including AI-driven tools, and automation solutions
- Strong problem-solving mindset with an interest in leveraging AI and data-driven approaches for backend optimizations
Soft Skills & Mindset:
- Strong communication & articulation skills
- Structured thought process and a logical approach to problem-solving
- Interest & willingness to learn and adapt to new technologies
Roles and Responsibilities:
- Develop high-quality code and employ object-oriented design principles while strictly adhering to best coding practices.
- Ability to work independently
- Demonstrate substantial expertise in Core Java, Multithreading, and Spring/Spring Boot frameworks.
- Possess a solid understanding of Spring, Hibernate, Caching Frameworks, and Memory Management.
- Proficient in crafting complex analytical SQL queries to meet project requirements.
- Contribute to the development of Highly Scalable applications.
- Design and implement Rest-based applications with efficiency and precision.
- Create comprehensive unit tests using frameworks such as Junit and Mockito.
- Engage in the Continuous Integration/Continuous Deployment (CI/CD) process and utilize build tools like Git and Maven.
- Familiarity with any Cloud service provider is considered an added advantage.
Required Skills and Experience :
- Experience in cloud platform AWS is necessary.
- Experience in big data processing frameworks like Apache Spark, Flink, Kafka, etc
- In-depth proficiency in Java, Spring Boot, Spring Frameworks, Hibernate, SQL, and Unit Testing frameworks.
- Good knowledge of SQL and experience with complex queries is a must
- Experience in supporting and debugging issues in the production environment is a must.
- Experience in Analytical databases like Redshift, Big Query, Snowflake, Clickhouse, etc is a plus.
Senior Back-end Engineer / Developer
What We Need
Looking for a senior back-end developer who will start working in our Bangalore office and then will be given an opportunity to move to the Netherlands to work closely with our clients.
- A highly motivated and experienced back-end software engineer / developer with a proven track record (at least 5 years of experience).
- A Bachelor’s degree in computer science.
- Someone who loves to work in a multidisciplinary team of engineers and business colleagues in a high-tech environment.
- You are able to work in a dynamic and demanding environment, are a real team player, and have a speak-up mentality to promote your ideas in a concise way.
- You are a problem-solver and see yourself as a hardcore web developer.
- You have knowledge of, and experience with, different web technologies.
- You are skilled with implementing architecture & design patterns.
- You can write modular code that is configurable, extensible and testable.
- You have great analytical skills, conceptual understanding and able to quickly understand new technical concepts.
- You have a strong interest in the latest trends in software development & web technologies.
- You have strong communication skills to explain complex technical concepts.
- You are fluent in English both in verbal and written.
We are looking for a back-end engineer / developer:
Proficiency / experience with following technologies & tools:
- Thorough and deep understanding of Java JDK 11+, our foundational programming language
- Spring Framework & AOP v5.2+
- Proven experience working with, and a deep understanding of Spring Boot 2.5+ and its modules (Web, Data JPA, Security OAuth2) and ability to explain complex use-cases related to persistency and web security
- Experience with Maven v3+
- Experience with containerization and deployments tools (eg. Docker v20+ and Kaniko, Helm (charts) v3+ with Kubernetes deployments)
- Experience working with CI/CD tools like GitLab SCM & pipelines and JFrog Artifactory
- Strong knowledge working with different types of SQL and NoSQL databases such as PostgreSQL v12+, MongoDB v4+ and Neo4J v4+
- Proficient in working with DevOps engineers on Cloud deployments (eg. Azure subscriptions)
- Experience in Agile/Scrum & (pref.) SAFe (Scaled Agile Framework) and enabling tooling – Atlassian Jira Cloud / Jira Align
- Experienced and skilled in full-stack development.
- Leading and solutioning product development of secure and high-performance applications.
- Good understanding of REST APIs and working knowledge of HTTP(S).
- Experienced in testing stack – Junit / Mockito
- Experience with software quality & vulnerability testing – SonarQube and Blackduck
- Proficient in writing software documentation on Atlassian Wiki
- Proficient in implementing data structures, algorithm design and OOPs concepts.
Reports Developer
Description - Data Insights Analyst specializing in dashboard development, data validation, and ETL testing using Tableau, Cognos, and SQL.
Work Experience: 5-9 years
Key Responsibilities
Insights Solution Development:
• Develop, maintain, and enhance dashboards and static reports using Tableau and IBM Cognos.
• Collaborate with Senior Data Insights specialists to design solutions that meet customer needs.
• Utilize data from modeled Business Data sources, Structured DB2 Datamarts, and DB2 Operational Data stores to fulfill business requirements.
• Conduct internal analytics testing on all data products to ensure accuracy and reliability.
• Use SQL to pull and prepare data for use in Dashboards and Reports
Data Management Tasks:
• Test new ETL developments, break-fixes, and enhancements using SQL-based tools to ensure the accuracy, volume, and quality of ETL changes.
• Participate in data projects, delivering larger-scale data solutions as part of a project team.
• Report defects using Jira and work closely with IT team professionals to ensure the timely retesting of data defects
• Utilize Spec. Documentation and Data lineage tools to understand flow of data into Analytics Data sources
• Develop repeatable testing processes using SQL based tools
Technical Experience
• SQL
• Tableau
• Data Visualization
• Report Design
• Cognos Analytics
• Cognos Transformer
• OLAP Modeling (Cognos)
Additional Skills
An Ideal Candidate would have the following additional Skills
• Python
• SAS Programming
• MS Access
• MS Excel
Work Hours: We would like the majority of work hours to align with the U.S. Eastern time zone, with people working until 2 p.m. or 3 p.m. EST, so that work hours align with times when the Senior Analysts and databases are available.


Looking for 5+ years experienced Senior Fullstack Engineer for research-focused, product-based, US-based Startup
AI Assistant for Research using state of the art language models (ChatGPT for Research)
At SciSpace, we're using language models to automate and streamline research workflows from start to finish. And the best part? We're already making waves in the industry, with a whopping 5 million users on board as of November 2025!
Our users love us too, with a 40% MOM retention rate and 10% of them using our app more than once a week! We're growing by more than 50% every month, all thanks to our awesome users spreading the word (see it yourself on Twitter).
And with almost weekly feature launches since our inception, we're constantly pushing the boundaries of what's possible. Our team of experts in design, front-end, full-stack engineering, and machine learning is already in place, but we're always on the lookout for new talent to help us take things to the next level.
Our user base is super engaged and always eager to provide feedback, making Scispace one of the most advanced applications of language models out there.
We are looking for insatiably curious, always learning SDE 2 Fullstack Engineer. You could get a chance to work on the most important and challenging problems at scale.
Responsibilities:
- Work on managing products as part of the SciSpace product suite.
- Partner with product owners in designing software that becomes part of researchers’ lives.
- Model real-world scenarios into code that can build the SciSpace platform.
- Test code that you write and continuously improve practices at SciSpace.
- Arrive at technology decisions after extensive debates with other engineers.
- Manage large projects from conceptualization, all the way through deployments.
- Evolve an ecosystem of tools and libraries that make it possible for SciSpace to provide reliable, always-on, performant services to our users.
- Partner with other engineers in developing an architecture that is resilient to changes in product requirements and usage.
- Work on the user-interface side and deliver a snappy, enjoyable experience to your users.
Our Ideal Candidate would have:
- Strong grasp of one high-level language like Python, Javascript, etc.
- Strong grasp of front-end HTML/CSS, non-trivial browser-side javascript
- General awareness of SQL and database design concepts
- Solid understanding of testing fundamentals
- Strong communication skills
- Prior experience in managing and executing technology products.
Bonus:
- Prior experience working with high-volume, always-available web-applications
- Experience in ElasticSearch.
- Experience in Distributed systems.
- Experience working with Start-up is a plus point.
Integration Developer
ROLE TITLE
Integration Developer
ROLE LOCATION(S)
Bangalore/Hyderabad/Chennai/Coimbatore/Noida/Kolkata/Pune/Indore
ROLE SUMMARY
The Integration Developer is a key member of the operations team, responsible for ensuring the smooth integration and functioning of various systems and software within the organization. This role involves technical support, system troubleshooting, performance monitoring, and assisting with the implementation of integration solutions.
ROLE RESPONSIBILITIES
· Design, develop, and maintain integration solutions using Spring Framework, Apache Camel, and other integration patterns such as RESTful APIs, SOAP services, file-based FTP/SFTP, and OAuth authentication.
· Collaborate with architects and cross-functional teams to design integration solutions that are scalable, secure, and aligned with business requirements.
· Resolve complex integration issues, performance bottlenecks, and data discrepancies across multiple systems. Support Production issues and fixes.
· Document integration processes, technical designs, APIs, and workflows to ensure clarity and ease of use.
· Participate in on-call rotation to provide 24/7 support for critical production issues.
· Use source code / version control management in a collaborative work environment.
TECHNICAL QUALIFICATIONS
· 5+ years of experience in Java development with strong expertise in Spring Framework and Apache Camel for building enterprise-grade integrations.
· Proficient with Azure DevOps (ADO) for version control, CI/CD pipeline implementation, and project management.
· Hands-on experience with RESTful APIs, SOAP services, and file-based integrations using FTP and SFTP protocols.
· Strong analytical and troubleshooting skills for resolving complex integration and system issues.
· Experience in Azure Services, including Azure Service Bus, Azure Kubernetes Service (AKS), Azure Container Apps, and ideally Azure API Management (APIM) is a plus.
· Good understanding of containerization and cloud-native development, with experience in using Docker, Kubernetes, and Azure AKS.
· Experience with OAuth for secure authentication and authorization in integration solutions.
· Strong experience level using GitHub Source Control application.
· Strong background in SQL databases (e.g., T-SQL, Stored Procedures) and working with data in an integration context.
GENERAL QUALIFICATIONS
· Excellent analytical and problem-solving skills, with a keen attention to detail.
· Effective communication skills, with the ability to collaborate with technical and non-technical stakeholders.
· Experience working in a fast paced, production support environment with a focus on incident management and resolution.
· Experience in the insurance domain is considered a plus.
EDUCATION REQUIREMENTS
· Bachelor’s degree in Computer Science, Information Technology, or related field.


We’re Hiring: Fullstack Developer (Node.js/Python + React.js)
Location: Bangalore
Type: Full-Time
Experience: 2 to 5 Years
Key Responsibilities
Develop and maintain full-stack web applications using Node.js/Python backends and React.js frontend.
Build scalable backend APIs (Express.js, Nest.js, Django, FastAPI, Flask) and integrate with frontend components.
Write clean, maintainable JavaScript/TypeScript and Python code following best practices.
Participate in code reviews and contribute to technical discussions.
Collaborate with cross-functional teams to deliver high-quality products.
Debug and troubleshoot across the full stack, ensuring seamless deployments.
Optimize application performance and ensure responsive user experiences.
✅ Required Skills
2–5 years of hands-on full-stack development experience.
Strong proficiency in JavaScript (ES6+) and modern web development concepts.
Backend expertise with Node.js (Express.js, Nest.js) OR Python (Django, FastAPI, Flask).
Frontend expertise in React.js, including hooks and modern component patterns.
Experience with RESTful API development & integration.
Database knowledge in SQL (PostgreSQL, MySQL) and NoSQL (MongoDB).
Familiarity with Git and collaborative workflows.
Strong problem-solving, debugging, and performance optimization skills.
Knowledge of agile methodologies and software development lifecycle.
📌 Skills:
SQL | JavaScript | React.js | Node.js | Python


Job Title: Data Scientist
Location: Bangalore (Hybrid/On-site depending on project needs)
About the Role
We are seeking a highly skilled Data Scientist to join our team in Bangalore. In this role, you will take ownership of data science components across client projects, build production-ready ML and GenAI-powered applications, and mentor junior team members. You will collaborate with engineering teams to design and deploy impactful solutions that leverage cutting-edge machine learning and large language model technologies.
Key Responsibilities
ML & Data Science
- Develop, fine-tune, and evaluate ML models (classification, regression, clustering, recommendation systems).
- Conduct exploratory data analysis, preprocessing, and feature engineering.
- Ensure model reproducibility, scalability, and alignment with business objectives.
GenAI & LLM Applications
- Prototype and design solutions leveraging LLMs (OpenAI, Claude, Mistral, Llama).
- Build RAG (Retrieval-Augmented Generation) pipelines, prompt templates, and evaluation frameworks.
- Integrate LLMs with APIs and vector databases (Pinecone, FAISS, Weaviate).
Product & Engineering Collaboration
- Partner with engineering teams to productionize ML/GenAI models.
- Contribute to API development, data pipelines, technical documentation, and client presentations.
Team & Growth
- Mentor junior data scientists and review technical contributions.
- Stay up to date with the latest ML & GenAI research and tools; share insights across the team.
Required Skills & Qualifications
- 4.5–9 years of applied data science experience.
- Strong proficiency in Python and ML libraries (scikit-learn, XGBoost, LightGBM).
- Hands-on experience with LLM APIs (OpenAI, Cohere, Claude) and frameworks (LangChain, LlamaIndex).
- Strong SQL, data wrangling, and analysis skills (pandas, NumPy).
- Experience working with APIs, Git, and cloud platforms (AWS/GCP).
Good-to-Have
- Deployment experience with FastAPI, Docker, or serverless frameworks.
- Familiarity with MLOps tools (MLflow, DVC).
- Experience working with embeddings, vector databases, and similarity search.
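A deliberately simplified sketch, not the team's actual stack: the retrieval step of a RAG pipeline using plain cosine similarity over embeddings; `embed()` is a stand-in for a real embedding model, and a production pipeline would use a vector database such as Pinecone, FAISS, or Weaviate.

```python
# Toy retrieval step of a RAG pipeline: embed documents and a query,
# rank by cosine similarity, and return the top-k passages to feed an LLM.
import hashlib

import numpy as np


def embed(text: str) -> np.ndarray:
    # Stand-in for a real embedding model; hashes the text into a pseudo-vector
    # so the example runs without any external model or API.
    seed = int.from_bytes(hashlib.sha256(text.encode()).digest()[:4], "big")
    rng = np.random.default_rng(seed)
    return rng.normal(size=384)


def top_k(query: str, documents: list[str], k: int = 3) -> list[str]:
    doc_matrix = np.stack([embed(d) for d in documents])
    q = embed(query)
    sims = doc_matrix @ q / (np.linalg.norm(doc_matrix, axis=1) * np.linalg.norm(q))
    best = np.argsort(-sims)[:k]
    return [documents[i] for i in best]


passages = top_k("refund policy", ["doc one ...", "doc two ...", "doc three ..."])
```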
Responsibilities:
- Design, develop, test, and deploy MicroStrategy Dashboards, Dossiers, Reports and Documents and supporting objects.
- Collaborate with business analysts and stakeholders to translate business needs into technical specifications.
- Liaise with the IT infrastructure team to manage MicroStrategy environments, including upgrade strategies and security considerations.
- Be able to work on feature enhancement pertaining to retail KPIs in existing reports.
- Interact and assist with ad-hoc reporting and access requests from business users.
- Experienced in communicating and applying Best Practices on the use of MicroStrategy.
- Works effectively with members of management.
- Maintains knowledge of the latest trends in BI, business analytics, and other technologies
Requirements
- 4+ years of experience in Administration tools like MicroStrategy Object Manager, MicroStrategy Command Manager, MicroStrategy Enterprise Manager in Version 11.3.
- Ability to create MicroStrategy schemas and application objects such as attributes, metrics, prompts, filters, templates.
- Knowledge of meta data creation (framework models/universe), creating report specifications, integration planning, unit testing planning and testing, UAT and implementation support.
- Knowledge of advanced filters, conditional formatting, transformations, level metrics, and custom groups.
- Ability to define relationships between attributes, facts and other objects within MicroStrategy.
- Ability to define Hierarchies and drill functionality within MicroStrategy.
- Strong experience in writing SQL and good working knowledge of SQL Server databases.
- Ability to design and develop interactive dashboards and visualization in MicroStrategy, leveraging advanced features such as drilling, filtering and dynamic content to enable intuitive data exploration and analysis.
- Optimize MicroStrategy performance by fine tuning queries, optimizing data storage and implementing best practices for data modelling and schema design.
- Provide training and support to end users and business analysts on MicroStrategy functionality, data modelling concepts and best practices for data visualization and analysis.
Shift timings : Afternoon
Job Summary
We are seeking an experienced Senior Java Developer with strong expertise in legacy system migration, server management, and deployment. The candidate will be responsible for maintaining, enhancing, and migrating an existing Java/JSF (PrimeFaces), EJB, REST API, and SQL Server-based application to a modern Spring Boot architecture. The role involves ensuring smooth production deployments, troubleshooting server issues, and optimizing the existing infrastructure.
Key Responsibilities
● Maintain & Enhance the existing Java, JSF (PrimeFaces), EJB, REST API, and SQL Server application.
● Migrate the legacy system to Spring Boot while ensuring minimal downtime.
● Manage deployments using Ansible, GlassFish/Payara, and deployer.sh scripts.
● Optimize and troubleshoot server performance (Apache, Payara, GlassFish).
● Handle XML file generation, email integrations, and REST API maintenance.
● Database management (SQL Server) including query optimization and schema updates.
● Collaborate with teams to ensure smooth transitions during migration.
● Automate CI/CD pipelines using Maven, Ansible, and shell scripts.
● Document migration steps, deployment processes, and system architecture.
Required Skills & Qualifications
● 8+ years of hands-on experience with Java, JSF (PrimeFaces), EJB, and REST APIs.
● Strong expertise in Spring Boot (migration experience from legacy Java is a must).
● Experience with Payara/GlassFish server management and deployment.
● Proficient in Apache, Ansible, and shell scripting (deployer.sh).
● Solid knowledge of SQL Server (queries, stored procedures, optimization).
● Familiarity with XML processing, email integrations, and Maven builds.
● Experience in production deployments, server troubleshooting, and performance tuning.
● Ability to work independently and lead migration efforts.
Preferred Skills
● Knowledge of microservices architecture (helpful for modernization).
● Familiarity with cloud platforms (AWS/Azure) is a plus.

Java Developer – Job Description
Wissen Technology is now hiring for a Java Developer - Bangalore with hands-on experience in Core Java, algorithms, data structures, multithreading and SQL. We are solving complex technical problems in the industry and need talented software engineers to join our mission and be a part of a global software development team. A brilliant opportunity to become a part of a highly motivated and expert team which has made a mark as a high-end technical consulting.
Required Skills:
- Exp. - 4 to 7 years.
- Experience in Core Java and Spring Boot.
- Extensive experience in developing enterprise-scale applications and systems. Should possess good architectural knowledge and be aware of enterprise application design patterns.
- Should have the ability to analyze, design, develop and test complex, low-latency client-facing applications.
- Good development experience with RDBMS.
- Good knowledge of multi-threading and high-performance server-side development.
- Basic working knowledge of Unix/Linux.
- Excellent problem solving and coding skills.
- Strong interpersonal, communication and analytical skills.
- Should have the ability to express their design ideas and thoughts.
About Wissen Technology:
Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom and Startups. Wissen Technology is a part of Wissen Group and was established in the year 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 Billion for more than 25 of the Fortune 500 companies. The Wissen Group overall includes more than 4000 highly skilled professionals.
Wissen Technology provides exceptional value in mission-critical projects for its clients, through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’. Our team consists of 1200+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like Wharton, MIT, IITs, IIMs, and NITs and with rich work experience in some of the biggest companies in the world.
Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation. We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted as the Top 20 AI/ML vendor by CIO Insider.

Role Overview
We're looking for experienced Data Engineers who can independently design, build, and manage scalable data platforms. You'll work directly with clients and internal teams to develop robust data pipelines that support analytics, AI/ML, and operational systems.
You’ll also play a mentorship role and help establish strong engineering practices across our data projects.
Key Responsibilities
- Design and develop large-scale, distributed data pipelines (batch and streaming)
- Implement scalable data models, warehouses/lakehouses, and data lakes
- Translate business requirements into technical data solutions
- Optimize data pipelines for performance and reliability
- Ensure code is clean, modular, tested, and documented
- Contribute to architecture, tooling decisions, and platform setup
- Review code/design and mentor junior engineers
Must-Have Skills
- Strong programming skills in Python and advanced SQL
- Solid grasp of ETL/ELT, data modeling (OLTP & OLAP), and stream processing
- Hands-on experience with frameworks like Apache Spark, Flink, etc.
- Experience with orchestration tools like Airflow
- Familiarity with CI/CD pipelines and Git
- Ability to debug and scale data pipelines in production
Preferred Skills
- Experience with cloud platforms (AWS preferred, GCP or Azure also fine)
- Exposure to Databricks, dbt, or similar tools
- Understanding of data governance, quality frameworks, and observability
- Certifications (e.g., AWS Data Analytics, Solutions Architect, Databricks) are a bonus
What We’re Looking For
- Problem-solver with strong analytical skills and attention to detail
- Fast learner who can adapt across tools, tech stacks, and domains
- Comfortable working in fast-paced, client-facing environments
- Willingness to travel within India when required

Job Summary:
We are looking for a highly skilled and experienced Data Engineer with deep expertise in Airflow, dbt, Python, and Snowflake. The ideal candidate will be responsible for designing, building, and managing scalable data pipelines and transformation frameworks to enable robust data workflows across the organization.
Key Responsibilities:
- Design and implement scalable ETL/ELT pipelines using Apache Airflow for orchestration.
- Develop modular and maintainable data transformation models using dbt.
- Write high-performance data processing scripts and automation using Python.
- Build and maintain data models and pipelines on Snowflake.
- Collaborate with data analysts, data scientists, and business teams to deliver clean, reliable, and timely data.
- Monitor and optimize pipeline performance and troubleshoot issues proactively.
- Follow best practices in version control, testing, and CI/CD for data projects.
Must-Have Skills:
- Strong hands-on experience with Apache Airflow for scheduling and orchestrating data workflows.
- Proficiency in dbt (data build tool) for building scalable and testable data models.
- Expert-level skills in Python for data processing and automation.
- Solid experience with Snowflake, including SQL performance tuning, data modeling, and warehouse management.
- Strong understanding of data engineering best practices including modularity, testing, and deployment.
Good to Have:
- Experience working with cloud platforms (AWS/GCP/Azure).
- Familiarity with CI/CD pipelines for data (e.g., GitHub Actions, GitLab CI).
- Exposure to modern data stack tools (e.g., Fivetran, Stitch, Looker).
- Knowledge of data security and governance best practices.
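Illustrative only (not part of the posting): one common way to wire the stack named above, an Airflow DAG that runs dbt against Snowflake via BashOperator tasks; the project and profiles paths and the schedule are assumptions.

```python
# Airflow DAG sketch: orchestrate dbt transformations on Snowflake.
# Assumes the dbt project and profiles.yml live at the placeholder paths below.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_snowflake_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )

    dbt_run >> dbt_test
```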
Note : One face-to-face (F2F) round is mandatory, and as per the process, you will need to visit the office for this.
Job Title: QA Tester – FinTech (Manual + Automation Testing)
Location: Bangalore, India
Job Type: Full-Time
Experience Required: 3 Years
Industry: FinTech / Financial Services
Function: Quality Assurance / Software Testing
Budget – Up to 6 LPA
About the Role:
We are looking for a skilled QA Tester with 3 years of experience in both manual and automation testing, ideally in the FinTech domain. The candidate will work closely with development and product teams to ensure that our financial applications meet the highest standards of quality, performance, and security.
Key Responsibilities:
- Analyze business and functional requirements for financial products and translate them into test scenarios.
- Design, write, and execute manual test cases for new features, enhancements, and bug fixes.
- Develop and maintain automated test scripts using tools such as Selenium, TestNG, or similar frameworks.
- Conduct API testing using Postman, Rest Assured, or similar tools.
- Perform functional, regression, integration, and system testing across web and mobile platforms.
- Work in an Agile/Scrum environment and actively participate in sprint planning, stand-ups, and retrospectives.
- Log and track defects using JIRA or a similar defect management tool.
- Collaborate with developers, BAs, and DevOps teams to improve quality across the SDLC.
- Ensure test coverage for critical fintech workflows like transactions, KYC, lending, payments, and compliance.
- Assist in setting up CI/CD pipelines for automated test execution using tools like Jenkins, GitLab CI, etc.
Required Skills and Experience:
- 3+ years of hands-on experience in manual and automation testing.
- Solid understanding of QA methodologies, STLC, and SDLC.
- Experience in testing FinTech applications such as digital wallets, online banking, investment platforms, etc.
- Strong experience with Selenium WebDriver, TestNG, Postman, and JIRA.
- Knowledge of API testing, including RESTful services.
- Familiarity with SQL to validate data in databases.
- Understanding of CI/CD processes and basic scripting for automation integration.
- Good problem-solving skills and attention to detail.
- Excellent communication and documentation skills.
Preferred Qualifications:
- Exposure to financial compliance and regulatory testing (e.g., PCI DSS, AML/KYC).
- Experience with mobile app testing (iOS/Android).
- Working knowledge of test management tools like TestRail, Zephyr, or Xray.
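For illustration only (not part of the posting, which names the Java TestNG stack): an analogous UI smoke test written with Selenium's Python bindings and pytest; the URL, locator, and local Chrome/chromedriver setup are assumptions.

```python
# UI smoke-test sketch using Selenium's Python bindings with pytest.
# Assumes a locally available Chrome browser; the target URL and element
# locator are placeholders.
import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By


@pytest.fixture
def driver():
    d = webdriver.Chrome()
    yield d
    d.quit()


def test_login_page_has_submit_button(driver):
    driver.get("https://example-fintech-app.test/login")  # placeholder URL
    button = driver.find_element(By.ID, "submit")          # placeholder locator
    assert button.is_displayed()
```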
Role: Java Backend developer
Location: Bangalore
- 5+ years of industrial experience
- Experience in Core Java 1.8 and above, Data Structures, OOPS, Multithreading, Microservices, Spring, Kafka
- Should have the ability to analyse, design, develop and test complex, low-latency client-facing applications.
- Good development experience with RDBMS
- Good knowledge of multi-threading and high volume server side development
- Excellent problem solving and coding skills in Java
- Strong interpersonal, communication and analytical skills.

What You’ll Do:
- Architect & build our core backend using modern microservices patterns
- Develop intelligent AI/ML-driven systems for financial document processing at scale
- Own database design (SQL + NoSQL) for speed, reliability, and compliance
- Integrate vector search, caching layers, and pipelines to power real-time insights
- Ensure security, compliance, and data privacy at every layer of the stack
- Collaborate directly with founders to translate vision into shippable features
- Set engineering standards & culture for future hires
What You Bring:
Core Skills:
- Deep expertise in Python (Django, FastAPI, or Flask)
- Strong experience in SQL & NoSQL database architecture
- Hands-on with vector databases and caching systems (e.g., Redis)
- Proven track record building scalable microservices
- Strong grounding in security best practices for sensitive data
Experience:
- 1+ years building production-grade backend systems
- History of owning technical decisions that impacted product direction
- Ability to solve complex, high-scale technical problems
Bonus Points For:
- Experience building LLM-powered applications at scale
- Background in enterprise SaaS or financial software
- Early-stage startup experience
- Familiarity with financial reporting/accounting concepts
Why Join Us:
- Founding team equity with significant upside
- Direct influence on product architecture & company direction
- Work with cutting-edge AI/ML tech on real-world financial data
- Backed by top-tier VC
- Join at ground zero and help shape our engineering culture

Job Title : Python Backend Lead / Senior Python Developer
Experience : 6 to 10 Years
Location : Bangalore (CV Raman Nagar)
Openings : 8
Interview Rounds : 1 Virtual + 1 In-Person (Face-to-Face with Client)
Note : Only local Bangalore candidates will be considered
About the Role :
We are seeking an experienced Python Backend Lead / Senior Python Developer to design, develop, and optimize scalable backend solutions.
The role involves working with large-scale data, building efficient APIs, integrating middleware solutions, and ensuring high performance and reliability.
You will lead a team of developers while also contributing hands-on to coding, design, and architecture.
Mandatory Skills : Python (Pandas, NumPy, Matplotlib, Plotly), FastAPI/FlaskAPI, SQL & NoSQL (MongoDB, CRDB, Postgres), Middleware tools (Mulesoft/BizTalk), CI/CD, RESTful APIs, OOP, OOD, DS & Algo, Design Patterns.
Key Responsibilities :
- Lead backend development projects using Python (FastAPI/FlaskAPI).
- Design, build, and maintain scalable APIs and microservices (see the FastAPI sketch after this list).
- Work with SQL and NoSQL databases (MongoDB, CRDB, Postgres).
- Implement and optimize middleware integrations (Mulesoft, BizTalk).
- Ensure smooth deployment using CI/CD pipelines.
- Apply Object-Oriented Programming (OOP), Design Patterns, and Data Structures & Algorithms to deliver high-quality solutions.
- Collaborate with cross-functional teams (frontend, DevOps, product) to deliver business objectives.
- Mentor and guide junior developers, ensuring adherence to best practices and coding standards.
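As a rough, non-authoritative sketch of the FastAPI-style API work referenced above, the snippet below exposes two endpoints backed by an in-memory store; the Item model, its fields, and the store are illustrative assumptions rather than the actual service design.

```python
# Minimal FastAPI microservice sketch; run with: uvicorn main:app --reload
# The Item model and in-memory store are illustrative placeholders.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):
    name: str
    price: float

_items: dict[int, Item] = {}  # stand-in for a MongoDB/Postgres-backed repository

@app.post("/items/{item_id}", status_code=201)
def create_item(item_id: int, item: Item) -> Item:
    _items[item_id] = item
    return item

@app.get("/items/{item_id}")
def read_item(item_id: int) -> Item:
    if item_id not in _items:
        raise HTTPException(status_code=404, detail="Item not found")
    return _items[item_id]
```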
Required Skills :
- Strong proficiency in Python with hands-on experience in Pandas, NumPy, Matplotlib, Plotly.
- Expertise in FastAPI / FlaskAPI frameworks.
- Solid knowledge of SQL & NoSQL databases (MongoDB, CRDB, Postgres).
- Experience with middleware tools such as Mulesoft or BizTalk.
- Proficiency in RESTful APIs, Web Services, and CI/CD pipelines.
- Strong understanding of OOP, OOD, Design Patterns, and DS & Algo.
- Excellent problem-solving, debugging, and optimization skills.
- Prior experience in leading teams is highly desirable.
🚀 Hiring: Tableau Developer
⭐ Experience: 5+ Years
📍 Location: Pune, Gurgaon, Bangalore, Chennai, Hyderabad
⭐ Work Mode: Hybrid
⏱️ Notice Period: Immediate Joiners or 15 Days
(Only immediate joiners & candidates serving notice period)
We are looking for a skilled Tableau Developer to join our team. The ideal candidate will have hands-on experience in creating, maintaining, and optimizing dashboards, reports, and visualizations that enable data-driven decision-making across the organization.
⭐ Key Responsibilities:
✅ Develop and maintain Tableau dashboards & reports
✅ Translate business needs into data visualizations
✅ Work with SQL & multiple data sources for insights
✅ Optimize dashboards for performance & usability
✅ Collaborate with stakeholders for BI solutions

Role Description:
As a Senior Data Science and Modeling Specialist at Incedo, you will be responsible for developing and deploying predictive models and machine learning algorithms to support business decision-making. You will work with data scientists, data engineers, and business analysts to understand business requirements and develop data-driven solutions. You will be skilled in programming languages such as Python or R and have experience in data science tools such as TensorFlow or Keras. You will be responsible for ensuring that models are accurate, efficient, and scalable.
Roles & Responsibilities:
- Developing and implementing machine learning models and algorithms to solve complex business problems
- Conducting data analysis and modeling using statistical and data analysis tools
- Collaborating with other teams to ensure the consistency and integrity of data
- Providing guidance and mentorship to junior data science and modeling specialists
- Presenting findings and recommendations to stakeholders
Technical Skills
Skills Requirements:
- Proficiency in statistical analysis techniques such as regression analysis, hypothesis testing, or time-series analysis.
- Knowledge of machine learning algorithms and techniques such as supervised learning, unsupervised learning, or reinforcement learning (see the sketch after this list).
- Experience with data wrangling and data cleaning techniques using tools such as Python, R, or SQL.
- Understanding of big data technologies such as Hadoop, Spark, or Hive.
- Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner.
- Must understand the company's long-term vision and align with it.
- Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, while taking responsibility for the whole team.
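Purely as a toy illustration of the supervised-learning skills listed above, the following scikit-learn sketch fits and evaluates a simple regression model on synthetic data; the features and data are fabricated for demonstration and do not represent any real business dataset.

```python
# Toy supervised-learning example with scikit-learn on synthetic data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 3))  # three synthetic features
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.1, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model = LinearRegression().fit(X_train, y_train)
print("R^2 on held-out data:", r2_score(y_test, model.predict(X_test)))
```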
Qualifications
- 4-6 years of work experience in relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university. Computer science background is preferred

Primary Skills - Magento, Magento 1.x, SQL, .NET
Should be willing to work in shifts: 02:00 PM – 11:00 PM IST
5+ years of strong experience in Magento 1.x and Magento 2.x development (both front-end and back-end).
Hands-on experience with Magento Extensions – installation, customization, and development.
Proficiency in PHP, MySQL, and OOP principles.
Experience with Magento theme customization, module development, and extension integration.
Knowledge of JavaScript, Knockout.js, RequireJS, jQuery, HTML5, CSS3.
Understanding of Magento REST & GraphQL APIs for integrations.
Experience with version control (Git/Bitbucket) and build tools (Composer).
Exposure to production support environments (incident, problem, change management – ITIL knowledge preferred).
Job description
Note: Applications are welcome from candidates who are immediate joiners or currently serving their notice period.
Key Responsibilities:
- Design, develop, and test low-latency, high-volume client-facing applications.
- Develop and maintain enterprise-scale, n-tier applications for the investment banking/capital markets domain.
- Implement microservices-based solutions using Spring Boot and other modern frameworks.
- Work extensively with Core Java 5.0 and above, Spring Framework, and CXF
- Optimize applications for multi-threading and high-performance server-side development
- Collaborate with cross-functional teams to design and implement scalable, secure, and efficient solutions
- Work with RDBMS (preferably Sybase) to handle large-scale data processing.
- Develop and maintain applications on Unix/Linux environments
- Utilize enterprise application design patterns to build robust and scalable solutions.
- Troubleshoot and resolve complex technical issues while ensuring application stability and performance.
Required Skills & Experience:
- 5 to 10 years of hands-on experience in Java development
- Strong knowledge of microservices architecture and cloud-based deployment.
- Expertise in Spring Framework, Spring Boot, and CXF
- Experience in developing and optimizing applications for high-volume, multi-threaded environments
- Solid understanding of financial domain applications, sales, and trading platforms
- Proficiency in working with relational databases (Sybase preferred)
- Familiarity with Unix/Linux environments and shell scripting.
- Strong analytical, problem-solving, and communication skills.
- Ability to articulate and present design ideas effectively.
Job Overview:
We are looking for a Senior Analyst who has led teams and managed system operations.
Key Responsibilities:
- Lead and mentor a team of analysts to drive high-quality execution.
- Design, write, and optimize SQL queries to derive actionable insights.
- Manage, monitor, and enhance Payment Governance Systems for accuracy and efficiency.
- Work cross-functionally with Finance, Tech, and Operations teams to maintain data integrity.
- Build and automate dashboards/reports to track key metrics and system performance.
- Identify anomalies and lead root cause analysis for payment-related issues.
- Define and document processes, SOPs, and governance protocols.
- Ensure compliance with internal control frameworks and audit readiness.
Requirements:
We require candidates with the following qualifications:
- 3–5 years of experience in analytics, data systems, or operations.
- Proven track record of leading small to mid-size teams.
- Strong command over SQL and data querying techniques.
- Experience with payment systems, reconciliation, or financial data platforms.
- Analytical mindset with problem-solving abilities.
- Ability to work in a fast-paced, cross-functional environment.
- Excellent communication and stakeholder management skills.
We are looking for a skilled Java Developer to join our growing team. The ideal candidate should have hands-on experience in designing, developing, and maintaining high-performance Java applications. You will be responsible for writing clean, efficient, and scalable code while collaborating with cross-functional teams to deliver robust solutions.
Key Responsibilities:
- Design, develop, test, and deploy Java-based applications.
- Write clean, maintainable, and efficient code.
- Work with databases (SQL/NoSQL) and ensure smooth integration.
- Debug, troubleshoot, and optimize application performance.
- Collaborate with the team to understand requirements and deliver within timelines.
- Participate in code reviews and maintain coding standards.
Requirements:
- 3–6 years of hands-on experience in Java development.
- Strong knowledge of Core Java, OOPs, Collections, Multithreading.
- Experience with Spring / Spring Boot frameworks.
- Familiarity with RESTful APIs, Microservices architecture.
- Knowledge of relational databases (MySQL, PostgreSQL, Oracle, etc.).
- Good understanding of version control (Git).
- Strong problem-solving and debugging skills.
Good to Have Skills:
- Experience with Hibernate/JPA.
- Exposure to cloud platforms (AWS, Azure, GCP).
- Familiarity with CI/CD tools (Jenkins, Docker, Kubernetes).
- Knowledge of Agile/Scrum methodologies.
Why Join Us?
- Opportunity to work on challenging and scalable projects.
- Growth-oriented environment with learning opportunities.
- Collaborative and inclusive work culture.

Experience: 2–6 Years
Location: Bangalore
Employment Type: Full-time/ WFO
Notice - 30 days
Role Overview
We are looking for a skilled and motivated .NET Developer with 2–6 years of experience to join our dynamic team. The ideal candidate should have a strong foundation in .NET technologies, relational databases, and cloud-based application development. You will be responsible for designing, developing, and maintaining scalable, secure, and high-performance applications while collaborating with cross-functional teams in an Agile environment.
Key Responsibilities
- Design, develop, and maintain applications using .NET technologies.
- Work extensively with relational databases and optimize SQL queries for performance.
- Develop and maintain API-driven solutions and integrate with ETL/data warehouse environments handling large volumes of data.
- Apply Object-Oriented Programming (OOP) principles and design patterns to build scalable and maintainable solutions.
- Develop, migrate, and deploy applications in Microsoft Azure Cloud (Managed SQL, VMs, containerized architecture).
- Write and optimize SSAS queries for analytics and reporting.
- Collaborate with Product Owners, Architects, and QA in an Agile development environment (Scrum/Kanban).
- Participate in code reviews, sprint planning, and daily standups to ensure high-quality deliverables.
- Implement best practices in security, performance, and cloud-native development.
- Build and maintain frontend components using Angular frameworks (2–3 years preferred).
Required Skills & Qualifications
- 2–6 years of experience in software development using .NET technologies.
- Strong understanding of OOP concepts and design patterns.
- Hands-on experience with SQL, data warehouses, ETL pipelines, and high-volume data processing.
- Proven experience in API development and integration.
- 3–5 years of experience in Azure Cloud with cloud-native development and application migration.
- Proficiency in Azure services: Managed SQL, VMs, container-based deployments.
- Experience with SSAS query writing and optimization.
- 2–3 years of experience in frontend development with Angular frameworks.
- Familiarity with Agile methodologies and collaboration tools (JIRA, Azure DevOps, Git, etc.).
- Strong problem-solving skills, attention to detail, and ability to work independently as well as in a team.
Good to Have
- Experience in Azure DevOps CI/CD pipelines.
- Knowledge of other cloud providers (AWS, GCP) is a plus.
- Exposure to performance tuning, monitoring, and troubleshooting in cloud-hosted applications.
Education
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or related field.



Proven experience as a Data Scientist or in a similar role, with at least 4 years of relevant experience and 6–8 years of total experience.
- Technical expertise regarding data models, database design and development, data mining, and segmentation techniques
- Strong knowledge of and experience with reporting packages (Business Objects and the like), databases, and programming in ETL frameworks
- Experience with data movement and management in the cloud using a combination of Azure or AWS features
- Hands-on experience with data visualization tools – Power BI preferred
- Solid understanding of machine learning
- Knowledge of data management and visualization techniques
- A knack for statistical analysis and predictive modeling
- Good knowledge of Python and MATLAB
- Experience with SQL and NoSQL databases, including the ability to write complex queries and procedures

Responsibilities:
- Design and develop scalable, secure, and high-performance applications using Python (Django framework); see the model sketch after this list.
- Architect system components, define database schemas, and optimize backend services for speed and efficiency.
- Lead and implement design patterns and software architecture best practices.
- Ensure code quality through comprehensive unit testing, integration testing, and participation in code reviews.
- Collaborate closely with Product, DevOps, QA, and Frontend teams to build seamless end-to-end solutions.
- Drive performance improvements, monitor system health, and troubleshoot production issues.
- Apply domain knowledge in payments and finance, including transaction processing, reconciliation, settlements, wallets, UPI, etc.
- Contribute to technical decision-making and mentor junior developers.
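To make the transaction-processing responsibility above concrete, here is a minimal Django model sketch; the Transaction fields, statuses, and index are hypothetical assumptions, and the snippet presumes a configured Django project rather than the company's actual schema.

```python
# Illustrative Django model for a payment transaction (hypothetical schema).
from django.db import models

class Transaction(models.Model):
    class Status(models.TextChoices):
        INITIATED = "INITIATED"
        SETTLED = "SETTLED"
        FAILED = "FAILED"

    reference_id = models.CharField(max_length=64, unique=True)
    amount = models.DecimalField(max_digits=12, decimal_places=2)
    currency = models.CharField(max_length=3, default="INR")
    status = models.CharField(max_length=16, choices=Status.choices, default=Status.INITIATED)
    created_at = models.DateTimeField(auto_now_add=True)

    class Meta:
        # Reconciliation jobs typically query by status and time window.
        indexes = [models.Index(fields=["status", "created_at"])]
```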
Requirements:
- 6 to 10 years of professional backend development experience with Python and Django.
- Strong background in payments/financial systems or FinTech applications.
- Proven experience in designing software architecture in a microservices or modular monolith environment.
- Experience working in fast-paced startup environments with agile practices.
- Proficiency in RESTful APIs, SQL (PostgreSQL/MySQL), NoSQL (MongoDB/Redis).
- Solid understanding of Docker, CI/CD pipelines, and cloud platforms (AWS/GCP/Azure).
- Hands-on experience with test-driven development (TDD) and frameworks like pytest, unittest, or factory_boy.
- Familiarity with security best practices in financial applications (PCI compliance, data encryption, etc.).
Preferred Skills:
- Exposure to event-driven architecture (Celery, Kafka, RabbitMQ).
- Experience integrating with third-party payment gateways, banking APIs, or financial instruments.
- Understanding of DevOps and monitoring tools (Prometheus, ELK, Grafana).
- Contributions to open-source or personal finance-related projects.
Key Responsibilities:
- Develop, maintain, and optimize data pipelines using DBT and SQL.
- Collaborate with data analysts and business teams to build scalable data models.
- Implement data transformations, testing, and documentation within the DBT framework.
- Work with Snowflake for data warehousing tasks, including data ingestion, query optimization, and performance tuning.
- Use Python (preferred) for automation, scripting, and additional data processing as needed.
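As a small illustration of the Python automation mentioned in the last responsibility, the following pandas sketch performs the kind of pre-load validation that often sits alongside DBT models; the file name and column names are assumptions made for the example.

```python
# Illustrative pre-load check in pandas; file and column names are assumptions.
import pandas as pd

df = pd.read_csv("orders_extract.csv")

# Basic data-quality checks before handing the data to DBT/Snowflake.
assert df["order_id"].is_unique, "duplicate order_id values found"
assert df["amount"].ge(0).all(), "negative amounts found"

# Light normalisation: parse dates and derive a load partition column.
df["order_date"] = pd.to_datetime(df["order_date"])
df["load_date"] = pd.Timestamp.now(tz="UTC").date()
df.to_parquet("orders_clean.parquet", index=False)
```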
Required Skills:
- 4–6 years of experience in data engineering or related roles.
- Strong hands-on expertise with DBT and advanced SQL.
- Experience working with modern data warehouses, preferably Snowflake.
- Knowledge of Python for data manipulation and workflow automation (preferred but not mandatory).
- Good understanding of data modeling concepts, ETL/ELT processes, and best practices.
Java Developer – Job Description
Wissen Technology is now hiring a Java Developer in Bangalore with hands-on experience in Core Java, algorithms, data structures, multithreading and SQL. We are solving complex technical problems in the industry and need talented software engineers to join our mission and be part of a global software development team. This is a brilliant opportunity to join a highly motivated and expert team that has made its mark in high-end technical consulting.
Required Skills:
- 4 to 7 years of experience.
- Experience in Core Java and Spring Boot.
- Extensive experience in developing enterprise-scale applications and systems; good architectural knowledge and awareness of enterprise application design patterns.
- Ability to analyze, design, develop and test complex, low-latency client-facing applications.
- Good development experience with RDBMS.
- Good knowledge of multi-threading and high-performance server-side development.
- Basic working knowledge of Unix/Linux.
- Excellent problem-solving and coding skills.
- Strong interpersonal, communication and analytical skills.
- Ability to express design ideas and thoughts.
About Wissen Technology:
Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom and Startups. Wissen Technology is part of the Wissen Group and was established in 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 billion for more than 25 of the Fortune 500 companies. The Wissen Group overall includes more than 4,000 highly skilled professionals.
Wissen Technology provides exceptional value in mission-critical projects for its clients through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’. Our team consists of 1200+ highly skilled professionals, including leadership and senior management executives who have graduated from universities like Wharton, MIT, IITs, IIMs, and NITs and bring rich work experience from some of the biggest companies in the world.
Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, and Quality Assurance & Test Automation.
We have been certified as a Great Place to Work® for two consecutive years (2020–2022) and voted a Top 20 AI/ML vendor by CIO Insider.

Job Title: Senior Data Engineer
Location: Bangalore | Hybrid
Company: krtrimaIQ Cognitive Solutions
Role Overview:
As a Senior Data Engineer, you will design, build, and optimize robust data foundations and end-to-end solutions to unlock maximum value from data across the organization. You will play a key role in fostering data-driven thinking, not only within the IT function but also among broader business stakeholders. You will serve as a technology and subject matter expert, providing mentorship to junior engineers and translating the company’s vision and Data Strategy into actionable, high-impact IT solutions.
Key Responsibilities:
- Design, develop, and implement scalable data solutions to support business objectives and drive digital transformation.
- Serve as a subject matter expert in data engineering, providing guidance and mentorship to junior team members.
- Enable and promote data-driven culture throughout the organization, engaging both technical and business stakeholders.
- Lead the design and delivery of Data Foundation initiatives, ensuring adoption and value realization across business units.
- Collaborate with business and IT teams to capture requirements, design optimal data models, and deliver high-value insights.
- Manage and drive change management, incident management, and problem management processes related to data platforms.
- Present technical reports and actionable insights to stakeholders and leadership teams, acting as the expert in Data Analysis and Design.
- Continuously improve efficiency and effectiveness of solution delivery, driving down costs and reducing implementation times.
- Contribute to organizational knowledge-sharing and capability building (e.g., Centers of Excellence, Communities of Practice).
- Champion best practices in code quality, DevOps, CI/CD, and data governance throughout the solution lifecycle.
Key Characteristics:
- Technology expert with a passion for continuous learning and exploring multiple perspectives.
- Deep expertise in the data engineering/technology domain, with hands-on experience across the full data stack.
- Excellent communicator, able to bridge the gap between technical teams and business stakeholders.
- Trusted leader, respected across levels for subject matter expertise and collaborative approach.
Mandatory Skills & Experience:
- Mastery in public cloud platforms: AWS, Azure, SAP
- Mastery in ELT (Extract, Load, Transform) operations
- Advanced data modeling expertise for enterprise data platforms
Hands-on skills:
- Data Integration & Ingestion
- Data Manipulation and Processing
- Source/version control and DevOps tools: GitHub, GitHub Actions, Azure DevOps
- Data engineering/data platform tools: Azure Data Factory, Databricks, SQL Database, Synapse Analytics, Stream Analytics, AWS Glue, Apache Airflow, AWS Kinesis, Amazon Redshift, SonarQube, PyTest
- Experience building scalable and reliable data pipelines for analytics and other business applications
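As a sketch of the pipeline-building experience listed above, a skeletal Apache Airflow DAG might look like the following; the DAG id, task names, and callables are placeholders rather than an actual production pipeline (the `schedule` argument assumes Airflow 2.4+).

```python
# Skeletal Airflow DAG; task names and callables are illustrative placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from source systems")

def load():
    print("write curated data to the warehouse")

with DAG(
    dag_id="example_daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```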
Optional/Preferred Skills:
- Project management experience, especially running or contributing to Scrum teams
- Experience working with BPC (Business Planning and Consolidation), Planning tools
- Exposure to working with external partners in the technology ecosystem and vendor management
What We Offer:
- Opportunity to leverage cutting-edge technologies in a high-impact, global business environment
- Collaborative, growth-oriented culture with strong community and knowledge-sharing
- Chance to influence and drive key data initiatives across the organization


Quidcash is seeking a skilled Backend Developer to architect, build, and optimize mission-critical financial systems. You’ll leverage your expertise in JavaScript, Python, and OOP to develop scalable backend services that power our fintech/lending solutions. This role offers the chance to solve complex technical challenges, integrate cutting-edge technologies, and directly impact the future of financial services for Indian SMEs.
If you are a leader who thrives on technical challenges, loves building high-performing teams, and is excited by the potential of AI/ML in fintech, we want to hear from you!
What You’ll Do:
Design & Development: Build scalable backend services using JavaScript(Node.js) and Python, adhering to OOP principles and microservices architecture.
Fintech Integration: Develop secure APIs (REST/gRPC) for financial workflows(e.g., payments, transactions, data processing) and ensure compliance with regulations (PCI-DSS, GDPR).
System Optimization: Enhance performance, reliability, and scalability of cloud- native applications on AWS.
Collaboration: Partner with frontend, data, and product teams to deliver end-to-end features in Agile/Scrum cycles.
Quality Assurance: Implement automated testing (unit/integration), CI/CD pipelines, and DevOps practices.
Technical Innovation: Contribute to architectural decisions and explore AI/ML integration opportunities in financial products.
What You'll Bring (Must-Haves):
Experience:
3–5 years of backend development with JavaScript (Node.js) and Python.
Proven experience applying OOP principles, design patterns, and microservices.
Background in fintech, banking, or financial systems (e.g., payment gateways, risk engines, transactional platforms).
Technical Acumen:
Languages/Frameworks:
JavaScript (Node.js + Express.js/Fastify)
Python (Django/Flask/FastAPI)
Databases: SQL (PostgreSQL/MySQL) and/or NoSQL (MongoDB/Redis).
Cloud & DevOps: AWS/GCP/Azure, Docker, Kubernetes, CI/CD tools (Jenkins/GitLab).
Financial Tech: API security (OAuth2/JWT), message queues (Kafka/RabbitMQ), and knowledge of financial protocols (e.g., ISO 20022).
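For the API-security point just above, here is a small sketch of issuing and verifying a JWT with PyJWT; the secret, claims, and expiry are placeholder values, and a production service would load the secret from a vault or KMS rather than hard-coding it.

```python
# Minimal JWT issue/verify sketch with PyJWT; secret and claims are placeholders.
from datetime import datetime, timedelta, timezone
import jwt

SECRET = "replace-with-a-real-secret-from-a-vault"

def issue_token(user_id: str) -> str:
    payload = {
        "sub": user_id,
        "exp": datetime.now(timezone.utc) + timedelta(minutes=15),
    }
    return jwt.encode(payload, SECRET, algorithm="HS256")

def verify_token(token: str) -> dict:
    # Raises jwt.ExpiredSignatureError / jwt.InvalidTokenError on bad tokens.
    return jwt.decode(token, SECRET, algorithms=["HS256"])

if __name__ == "__main__":
    token = issue_token("user-42")
    print(verify_token(token)["sub"])
```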
Mindset:
Problem-solver with a passion for clean, testable code and continuous improvement.
Adaptability in fast-paced environments and commitment to deadlines.
Collaborative spirit with strong communication skills.
Why Join Quidcash?
Impact: Play a pivotal role in shaping a product that directly impacts Indian SMEs' business growth.
Innovation: Work with cutting-edge technologies, including AI/ML, in a forward-thinking environment.
Growth: Opportunities for professional development and career advancement in a growing company.
Culture: Be part of a collaborative, supportive, and brilliant team that values every contribution.
Benefits: Competitive salary, comprehensive benefits package, and be a part of the next fintech evolution.
If you are interested, please share your profile to smitha@quidcash.in

Key Responsibilities
- Data Architecture & Pipeline Development
- Design, implement, and optimize ETL/ELT pipelines using Azure Data Factory, Databricks, and Synapse Analytics.
- Integrate structured, semi-structured, and unstructured data from multiple sources.
- Data Storage & Management
- Develop and maintain Azure SQL Database, Azure Synapse Analytics, and Azure Data Lake solutions.
- Ensure proper indexing, partitioning, and storage optimization for performance.
- Data Governance & Security
- Implement role-based access control, data encryption, and compliance with GDPR/CCPA.
- Ensure metadata management and data lineage tracking with Azure Purview or similar tools.
- Collaboration & Stakeholder Engagement
- Work closely with BI developers, analysts, and business teams to translate requirements into data solutions.
- Provide technical guidance and best practices for data integration and transformation.
- Monitoring & Optimization
- Set up monitoring and alerting for data pipelines.
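As a rough sketch of the Databricks/PySpark-style transformation referenced in the list above, the snippet below filters raw records, derives a partition key, and writes curated Parquet output; the storage paths and column names are hypothetical and not the organization's actual layout.

```python
# Illustrative PySpark transform; input/output paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_curation").getOrCreate()

raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/orders/")

curated = (
    raw.filter(F.col("amount") > 0)                      # drop bad records
       .withColumn("order_date", F.to_date("order_ts"))  # derive partition key
       .select("order_id", "customer_id", "amount", "order_date")
)

(curated.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("abfss://curated@examplelake.dfs.core.windows.net/orders/"))
```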

Job Title: Full Stack Developer – Java + React
Location: Bangalore
Experience: 10 to 14 Years
Job Summary:
We are looking for a skilled Full Stack Developer with strong experience in Java (backend) and React.js (frontend) to join our dynamic engineering team. The ideal candidate will have hands-on experience in building scalable web applications, RESTful services, and responsive UI components.
Key Responsibilities:
- Develop and maintain scalable backend services using Core Java / Spring Boot
- Design and implement responsive, high-quality front-end interfaces using React.js
- Integrate backend APIs with frontend components seamlessly
- Collaborate with product managers, architects, and QA to deliver quality software
- Ensure code quality through unit testing, code reviews, and performance tuning
- Troubleshoot and debug production issues as needed
Key Skills Required:
Backend:
- Strong programming skills in Core Java, Java 8+
- Experience with Spring Boot, RESTful APIs, Microservices architecture
- Good understanding of JPA/Hibernate, and SQL/NoSQL databases
Frontend:
- Proficient in React.js, JavaScript (ES6+), HTML5, CSS3
- Experience with Redux, React Hooks, and component-based architecture
- Familiarity with front-end build tools (Webpack, Babel, NPM)