50+ SQL Jobs in Mumbai | SQL Job openings in Mumbai

Company Name – Wissen Technology
Group of companies in India – Wissen Technology & Wissen Infotech
Work Location - Bangalore/Mumbai
Java Developer – Job Description
Wissen Technology is now hiring for a Java Developer - Mumbai with hands-on experience in Core Java, algorithms, data structures, multithreading and SQL.
We are solving complex technical problems in the industry and need talented software engineers to join our mission and be a part of a global software development team.
A brilliant opportunity to become part of a highly motivated and expert team that has made a mark in high-end technical consulting.
Required Skills:
- Exp. - 5+ years.
- Experience in Core Java and Spring Boot.
- Extensive experience in developing enterprise-scale applications and systems. Should possess good architectural knowledge and be aware of enterprise application design patterns.
- Should have the ability to analyze, design, develop and test complex, low-latency client-facing applications.
- Good development experience with RDBMS.
- Good knowledge of multi-threading and high-performance server-side development.
- Basic working knowledge of Unix/Linux.
- Excellent problem solving and coding skills.
- Strong interpersonal, communication and analytical skills.
- Should be able to express design ideas and thoughts clearly.
About Wissen Technology:
Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom and Startups. Wissen Technology is a part of Wissen Group and was established in the year 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 Billion for more than 25 of the Fortune 500 companies. The Wissen Group overall includes more than 4000 highly skilled professionals.
Wissen Technology provides exceptional value in mission critical projects for its clients, through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’.
Our team consists of 1200+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like Wharton, MIT, IITs, IIMs, and NITs and with rich work experience in some of the biggest companies in the world.
Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation.
We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted as the Top 20 AI/ML vendor by CIO Insider.

What We’re Looking For:
- Strong experience in Python (3+ years).
- Hands-on experience with any database (SQL or NoSQL).
- Experience with frameworks like Flask, FastAPI, or Django.
- Knowledge of ORMs, API development, and unit testing.
- Familiarity with Git and Agile methodologies.
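As a sketch of the kind of unit testing this role calls for, the snippet below tests a thin data-access helper against an in-memory SQLite database. All names here (UserRepo, the users table) are illustrative, not taken from the posting, and the same pattern applies to tests against an ORM such as SQLAlchemy.

```python
import sqlite3
import unittest

class UserRepo:
    """Hypothetical data-access helper used only to illustrate unit testing."""
    def __init__(self, conn):
        self.conn = conn
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, name TEXT NOT NULL)"
        )

    def add(self, name):
        cur = self.conn.execute("INSERT INTO users (name) VALUES (?)", (name,))
        return cur.lastrowid

    def get(self, user_id):
        row = self.conn.execute(
            "SELECT id, name FROM users WHERE id = ?", (user_id,)
        ).fetchone()
        return {"id": row[0], "name": row[1]} if row else None

class UserRepoTest(unittest.TestCase):
    def setUp(self):
        # each test gets a fresh in-memory database, so tests stay independent
        self.repo = UserRepo(sqlite3.connect(":memory:"))

    def test_round_trip(self):
        uid = self.repo.add("asha")
        self.assertEqual(self.repo.get(uid), {"id": uid, "name": "asha"})

    def test_missing_user(self):
        self.assertIsNone(self.repo.get(999))

# Run the suite programmatically (no sys.exit), keeping the module importable.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(UserRepoTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Running each test against a throwaway `:memory:` database keeps tests fast and isolated, which is the same discipline interviewers typically probe for.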
Job Title : Oracle Analytics Cloud (OAC) / Fusion Data Intelligence (FDI) Specialist
Experience : 3 to 8 years
Location : All USI locations – Hyderabad, Bengaluru, Mumbai, Gurugram (preferred) and Pune, Chennai, Kolkata
Work Mode : Hybrid Only (2-3 days from office or all 5 days from office)
Mandatory Skills : Oracle Analytics Cloud (OAC), Fusion Data Intelligence (FDI), RPD, OAC Reports, Data Visualizations, SQL, PL/SQL, Oracle Databases, ODI, Oracle Cloud Infrastructure (OCI), DevOps tools, Agile methodology.
Key Responsibilities :
- Design, develop, and maintain solutions using Oracle Analytics Cloud (OAC).
- Build and optimize complex RPD models, OAC reports, and data visualizations.
- Utilize SQL and PL/SQL for data querying and performance optimization.
- Develop and manage applications hosted on Oracle Cloud Infrastructure (OCI).
- Support Oracle Cloud migrations, OBIEE upgrades, and integration projects.
- Collaborate with teams using the ODI (Oracle Data Integrator) tool for ETL processes.
- Implement cloud scripting using cURL for Oracle Cloud automation.
- Contribute to the design and implementation of Business Continuity and Disaster Recovery strategies for cloud applications.
- Participate in Agile development processes and DevOps practices including CI/CD and deployment orchestration.
Required Skills :
- Strong hands-on expertise in Oracle Analytics Cloud (OAC) and/or Fusion Data Intelligence (FDI).
- Deep understanding of data modeling, reporting, and visualization techniques.
- Proficiency in SQL, PL/SQL, and relational databases on Oracle.
- Familiarity with DevOps tools, version control, and deployment automation.
- Working knowledge of Oracle Cloud services, scripting, and monitoring.
Good to Have :
- Prior experience in OBIEE to OAC migrations.
- Exposure to data security models and cloud performance tuning.
- Certification in Oracle Cloud-related technologies.

Job Title : IBM Sterling Integrator Developer
Experience : 3 to 5 Years
Locations : Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, Pune
Employment Type : Full-Time
Job Description :
We are looking for a skilled IBM Sterling Integrator Developer with 3–5 years of experience to join our team across multiple locations.
The ideal candidate should have strong expertise in IBM Sterling and integration, along with scripting and database proficiency.
Key Responsibilities :
- Develop, configure, and maintain IBM Sterling Integrator solutions.
- Design and implement integration solutions using IBM Sterling.
- Collaborate with cross-functional teams to gather requirements and provide solutions.
- Work with custom languages and scripting to enhance and automate integration processes.
- Ensure optimal performance and security of integration systems.
Must-Have Skills :
- Hands-on experience with IBM Sterling Integrator and associated integration tools.
- Proficiency in at least one custom scripting language.
- Strong command over Shell scripting, Python, and SQL (mandatory).
- Good understanding of EDI standards and protocols is a plus.
Interview Process :
- 2 Rounds of Technical Interviews.
Additional Information :
- Open to candidates from Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, and Pune.

Job Summary:
As an AWS Data Engineer, you will be responsible for designing, developing, and maintaining scalable, high-performance data pipelines using AWS services. With 6+ years of experience, you’ll collaborate closely with data architects, analysts, and business stakeholders to build reliable, secure, and cost-efficient data infrastructure across the organization.
Key Responsibilities:
- Design, develop, and manage scalable data pipelines using AWS Glue, Lambda, and other serverless technologies
- Implement ETL workflows and transformation logic using PySpark and Python on AWS Glue
- Leverage AWS Redshift for warehousing, performance tuning, and large-scale data queries
- Work with AWS DMS and RDS for database integration and migration
- Optimize data flows and system performance for speed and cost-effectiveness
- Deploy and manage infrastructure using AWS CloudFormation templates
- Collaborate with cross-functional teams to gather requirements and build robust data solutions
- Ensure data integrity, quality, and security across all systems and processes
Required Skills & Experience:
- 6+ years of experience in Data Engineering with strong AWS expertise
- Proficient in Python and PySpark for data processing and ETL development
- Hands-on experience with AWS Glue, Lambda, DMS, RDS, and Redshift
- Strong SQL skills for building complex queries and performing data analysis
- Familiarity with AWS CloudFormation and infrastructure as code principles
- Good understanding of serverless architecture and cost-optimized design
- Ability to write clean, modular, and maintainable code
- Strong analytical thinking and problem-solving skills

- Strong Snowflake cloud database development experience.
- Knowledge of Spark and Databricks is desirable.
- Strong technical background in data modelling, database design, and optimization for data warehouses, specifically on column-oriented MPP architectures.
- Familiar with data-lake technologies such as Snowflake.
- Strong ETL and database design/modelling skills.
- Experience creating data pipelines.
- Strong SQL and debugging skills, with performance tuning experience.
- Experience with Databricks/Azure is a plus.
- Experience working with global teams and global application environments.
- Strong understanding of SDLC methodologies, with a track record of high-quality deliverables and data quality, including detailed technical design documentation.
Java Developer – Job Description
Wissen Technology is now hiring for a Java Developer - Bangalore with hands-on experience in Core Java, algorithms, data structures, multithreading and SQL. We are solving complex technical problems in the industry and need talented software engineers to join our mission and be a part of a global software development team. A brilliant opportunity to become part of a highly motivated and expert team that has made a mark in high-end technical consulting.
Required Skills:
- Exp. - 5 to 12 years.
- Experience in Core Java and Spring Boot.
- Extensive experience in developing enterprise-scale applications and systems. Should possess good architectural knowledge and be aware of enterprise application design patterns.
- Should have the ability to analyze, design, develop and test complex, low-latency client-facing applications.
- Good development experience with RDBMS.
- Good knowledge of multi-threading and high-performance server-side development.
- Basic working knowledge of Unix/Linux.
- Excellent problem solving and coding skills.
- Strong interpersonal, communication and analytical skills.
- Should be able to express design ideas and thoughts clearly.
About Wissen Technology: See the company profile above.
Position: QA Automation Engineer
Location: Mumbai, India
Experience: 2+ Years
Type: Full-Time
Company: Deqode
Overview:
Deqode is seeking a skilled QA Automation Engineer with a passion for quality, automation, and robust testing practices. This role is ideal for professionals from product-based software development companies who have worked on e-commerce platforms.
Required Skills:
- Proficiency in Selenium (Grid, parallel execution), TestNG.
- Experience with API testing (Postman, RestAssured).
- Strong SQL and backend data validation.
- Experience using Git, Jira, Asana.
- Familiarity with Jenkins, Confluence.
- Understanding of cross-browser testing.
- Exposure to Mocha/Chai for frontend/backend automation is a plus.
- Eye for design and UX consistency.
- Strong written and verbal communication.
Preferred Background:
- Must be from a product-based software development company.
- Prior experience in e-commerce projects is a major plus.
- Ability to work on time-critical and fast-paced projects.
Key Responsibilities:
- Design and maintain automated test frameworks for web and API applications.
- Perform manual and automated tests to ensure product quality.
- Build and execute test cases using Selenium with TestNG.
- Conduct comprehensive REST API testing.
- Write and optimize complex SQL queries for test data validation.
- Use Jira/Asana for issue tracking and documentation.
- Collaborate using Confluence for test documentation.
- Execute tests across multiple browsers and operating systems.
- Participate in Agile processes like sprint planning and retrospectives.
- Identify and troubleshoot issues during testing.
- Maintain CI pipelines using Jenkins.
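As a minimal sketch of the "SQL and backend data validation" responsibility above, the snippet below checks referential integrity and per-group counts in a throwaway SQLite database. The schema (orders, order_items) is made up for illustration; in the actual role the same queries would run against the product's backend database.

```python
import sqlite3

# Build a tiny fixture database. Table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT);
    CREATE TABLE order_items (id INTEGER PRIMARY KEY, order_id INTEGER, qty INTEGER);
    INSERT INTO orders VALUES (1, 'PAID'), (2, 'PAID');
    INSERT INTO order_items VALUES (10, 1, 2), (11, 1, 1), (12, 2, 3);
""")

def orphan_items(conn):
    """Items that reference a non-existent order -- a passing run returns []."""
    return conn.execute("""
        SELECT oi.id FROM order_items oi
        LEFT JOIN orders o ON o.id = oi.order_id
        WHERE o.id IS NULL
    """).fetchall()

def items_per_order(conn):
    """Count of line items per order, for comparison against UI or API output."""
    return dict(conn.execute(
        "SELECT order_id, COUNT(*) FROM order_items GROUP BY order_id"
    ).fetchall())

assert orphan_items(conn) == []            # no dangling foreign keys
assert items_per_order(conn) == {1: 2, 2: 1}
```

Checks like these are typically wired into the automated suite so that data drift fails a build rather than surfacing as a production bug.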

Required Skills:
- Hands-on experience with Databricks, PySpark
- Proficiency in SQL, Python, and Spark.
- Understanding of data warehousing concepts and data modeling.
- Experience with CI/CD pipelines and version control (e.g., Git).
- Fundamental knowledge of any cloud services, preferably Azure or GCP.
Good to Have:
- Bigquery
- Experience with performance tuning and data governance.
Position Overview
We're seeking a skilled Full Stack Developer to build and maintain scalable web applications using modern technologies. You'll work across the entire development stack, from database design to user interface implementation.
Key Responsibilities
- Develop and maintain full-stack web applications using Node.js and TypeScript
- Design and implement RESTful APIs and microservices
- Build responsive, user-friendly front-end interfaces
- Design and optimize SQL databases and write efficient queries
- Collaborate with cross-functional teams on feature development
- Participate in code reviews and maintain high code quality standards
- Debug and troubleshoot application issues across the stack
Required Skills
- Backend: 3+ years experience with Node.js and TypeScript
- Database: Proficient in SQL (PostgreSQL, MySQL, or similar)
- Frontend: Experience with modern JavaScript frameworks (React, Vue, or Angular)
- Version Control: Git and collaborative development workflows
- API Development: RESTful services and API design principles
Preferred Qualifications
- Experience with cloud platforms (AWS, Azure, or GCP)
- Knowledge of containerization (Docker)
- Familiarity with testing frameworks (Jest, Mocha, or similar)
- Understanding of CI/CD pipelines
What We Offer
- Competitive salary and benefits
- Flexible work arrangements
- Professional development opportunities
- Collaborative team environment
Required Technical Skill Set: Teradata with Marketing Campaign knowledge and SAS
Desired Competencies (Technical/Behavioral Competency)
Must-Have
1. Advanced coding skills in Teradata SQL and SAS are required
2. Experience with customer segmentation, marketing optimization, and marketing automation. Thorough understanding of customer contact management principles
3. Design and execution of campaigns on consumer and business products using Teradata Communication Manager and in-house tools
4. Analyzing the effectiveness of various campaigns to add insights and improve future campaigns
5. Timely resolution of Marketing team queries and other ad-hoc requests
Good-to-Have
1. Awareness of CRM tools & process, automation
2. Knowledge of commercial databases preferable
3. People & team management skills
Experience: 5+ years
Location: Mumbai - Andheri Marol
Company: Wissen Technology
Website: www.wissen.com
Job Description:
Wissen Technology is currently hiring experienced Full Stack Java Developers with strong expertise in Core Java, Spring Boot, front-end technologies (Angular/React), REST APIs, multithreading, data structures, and SQL.
You will be working on enterprise-grade solutions as part of a global development team tackling complex problems in domains like Banking and Finance.
This is an excellent opportunity to be part of a high-caliber, technically strong team delivering impactful solutions for Fortune 500 companies.
Key Responsibilities:
- Develop and maintain scalable, high-performance applications with a focus on both backend (Java/Spring Boot) and frontend (Angular/React).
- Participate in all phases of the development lifecycle – design, coding, testing, deployment, and maintenance.
- Work closely with cross-functional teams to understand requirements and deliver quality solutions.
- Optimize application performance and ensure responsiveness across platforms.
- Write clean, maintainable, and well-documented code.
Required Skills:
- 5+ years of hands-on experience in Java, Spring Boot.
- Strong proficiency in frontend frameworks like Angular or React.
- Experience with RESTful APIs, microservices, and web technologies (HTML, CSS, JavaScript).
- Sound knowledge of data structures, algorithms, and multithreading.
- Experience in developing and deploying enterprise-grade applications.
- Solid understanding of RDBMS (e.g., Oracle, MySQL, PostgreSQL).
- Exposure to DevOps practices, version control (Git), and CI/CD pipelines.
- Familiarity with Unix/Linux environments.
- Strong problem-solving, communication, and analytical skills.
Good to Have:
- Exposure to cloud platforms like AWS, Azure, or GCP.
- Understanding of containerization using Docker/Kubernetes.
- Knowledge of Agile methodologies.
About Wissen Technology: See the company profile above.
About Fundly
Fundly is building a retailer-centric Pharma Digital Supply Chain Finance platform and marketplace for over 10 million pharma retailers in India.
- Founded by experienced industry professionals with a cumulative experience of 30+ years
- Grown to 100+ people across 20 cities in less than 3 years
- AUM of INR 100+ crores
- Raised venture capital of USD 5M so far
- Fast-growing: 3000+ retailers, 36,000 transactions, and ₹200+ crore disbursed in the last 2 years
- Technology-first and customer-first fintech organization
Opportunity at Fundly
- Be an early team member
- Gain visibility and influence the product and technology roadmap
Responsibilities
- Understand business requirements, customer persona, and product/applications
- Plan and execute test strategy for different projects
- Create test documentation, test plans, and test cases
- Reporting, providing feedback, and suggesting improvements
- Collaborate with other stakeholders
Qualifications
- 2+ years of hands-on experience in QA processes, test planning, and execution
- Hands-on experience in SQL and NoSQL databases like MySQL and Postgres
Who You Are
- Love to understand and solve problems—be it technology, business, or people-related
- Like to take responsibility and accountability
- Have worked in fast-paced environments and are willing to break standard benchmarks of performance
- Hands-on experience with STLC and automation testing
About Fundly
- Fundly is building a retailer-centric Pharma Digital Supply Chain Finance platform and Marketplace for over 10 million pharma retailers in India
- Founded by experienced industry professionals with cumulative experience of 30+ years
- Has grown to 60+ people in 12 cities in less than 2 years
- AUM of INR 100+ crores
- Raised venture capital of USD 3M so far
Opportunity at Fundly
- Fast growing – 3000+ retailers, 36,000+ transactions and ₹200+ crore disbursement in the last 2 years
- Technology-first and customer-first fintech organization
- Be an early team member, visible and influence the product and technology roadmap
Responsibilities
- Establish the QA Strategy and Processes for the Engineering team
- Build and Mentor the QA team
- Plan and Execute Test Strategy for different projects
- Review Test Documentation, Test Plans and Test Cases
- Reporting, Providing Feedback and Suggesting Improvements
- Collaboration with other stakeholders
Qualifications
- 6+ years of hands-on experience in QA Processes, Test Planning and Execution
- Hands-on experience in SQL and NoSQL databases like MySQL, Postgres
- Leadership experience of managing a team of 4–5 QA Engineers
Who You Are
- Loves to understand and solve problems — be it technology, business, or people problems
- Likes to lead, take responsibility and accountability
- Has led a team of 4–5 engineers and delivered results in the past
- Excellent understanding and hands-on experience of STLC and Automation Testing
About Fundly
- Fundly is building a retailer centric Pharma Supply Chain platform and Marketplace for over 10 million pharma retailers in India
- Founded by experienced industry professionals with cumulative experience of 30+ years
- Has grown to 60+ people in 12 cities in less than 2 years
- Monthly disbursement of INR 50 Cr
- Raised venture capital of USD 5M so far from Accel Partners, India's biggest VC fund
Opportunity at Fundly
- Building a retailer centric ecosystem in Pharma Supply Chain
- Fast-growing: 3000+ retailers, 36,000+ transactions, and ₹200+ crore disbursed in the last 2 years
- Technology First and Customer first fintech organization
- Be an early team member, visible and influence the product and technology roadmap
- Be a leader and own responsibility and accountability
Responsibilities
- Be hands-on and ship good-quality code fast
- Execute and deploy technical solutions
- Understand existing code, maintain and improve it
- Control Technical Debt
- Ensure healthy software engineering practices like planning, estimation, documentation, code review
Qualifications
- 3+ years of Hands-on experience in Java, Spring Boot, Spring MVC, Hibernate, Play
- Hands on experience in SQL and NoSQL databases like Postgres, MongoDB, ElasticSearch, Redis
About Fundly
- Fundly is building a retailer centric Pharma Supply Chain platform and Marketplace for over 10 million pharma retailers in India
- Founded by experienced industry professionals with cumulative experience of 30+ years
- Has grown to 60+ people in 12 cities in less than 2 years
- Monthly disbursement of INR 50 Cr
- Raised venture capital of USD 5M so far from Accel Partners, India's biggest VC fund
Opportunity at Fundly
- Building a retailer centric ecosystem in Pharma Supply Chain
- Fast-growing: 3000+ retailers, 36,000+ transactions, and ₹200+ crore disbursed in the last 2 years
- Technology First and Customer first fintech organization
- Be an early team member, visible and influence the product and technology roadmap
- Be a leader and own responsibility and accountability
Responsibilities
- Be hands-on and ship good-quality code fast
- Execute and deploy technical solutions
- Understand existing code, maintain and improve it
- Control Technical Debt
- Ensure healthy software engineering practices like planning, estimation, documentation, code review
Qualifications
- 1+ year of Hands-on experience in Java, Spring Boot, Spring MVC, Hibernate, Play
- Hands on experience in SQL and NoSQL databases like Postgres, MongoDB, ElasticSearch, Redis


We are seeking an experienced WordPress Developer with expertise in both frontend and backend development. The ideal candidate will have a deep understanding of headless WordPress architecture, where the backend is managed with WordPress, and the frontend is built using React.js (or Next.js). The developer should follow best coding practices to ensure the website is secure, high-performing, scalable, and fully responsive.
Key Responsibilities:
Backend Development (WordPress):
- Develop and maintain a headless WordPress CMS to serve content via REST API / GraphQL.
- Create custom WordPress plugins and themes to optimize content delivery.
- Ensure secure authentication and role-based access for API endpoints.
- Optimize WordPress database queries for better performance.
Frontend Development (React.js / Next.js):
- Build a decoupled frontend using React.js (or Next.js) that fetches content from WordPress.
- Experience with Figma for translating UI/UX designs to code.
- Ensure seamless integration of frontend with WordPress APIs.
- Implement modern UI/UX principles to create responsive, fast-loading web pages.
Code quality, Performance & Security Optimization:
- Optimize website speed using caching, lazy loading, and CDN integration.
- Ensure the website follows SEO best practices and is mobile-friendly.
- Implement security best practices to prevent vulnerabilities such as SQL injection, XSS, and CSRF.
- Write clean, maintainable, and well-documented code following industry standards.
- Implement version control using Git/GitHub/GitLab.
- Conduct regular code reviews and debugging to ensure a high-quality product.
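To make the SQL-injection point above concrete, here is a minimal sketch of the parameterized-query principle, shown in Python with SQLite for brevity. The same idea applies in a WordPress backend via `$wpdb->prepare()`; the table name and payload below are illustrative only.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT)")
conn.execute("INSERT INTO posts (title) VALUES ('hello')")

user_input = "hello' OR '1'='1"   # a classic injection payload

# UNSAFE: string interpolation lets the payload rewrite the query,
# so the WHERE clause becomes a tautology and matches every row.
unsafe_sql = f"SELECT id FROM posts WHERE title = '{user_input}'"
assert conn.execute(unsafe_sql).fetchall() == [(1,)]

# SAFE: placeholder binding treats the payload as a literal string value,
# and no post actually has that title.
safe_rows = conn.execute(
    "SELECT id FROM posts WHERE title = ?", (user_input,)
).fetchall()
assert safe_rows == []
```

Binding parameters (rather than sanitizing strings by hand) is the standard defense because the driver never parses attacker-controlled text as SQL.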
Collaboration & Deployment:
- Work closely with designers, content teams, and project managers.
- Deploy and manage WordPress and frontend code in staging and production environments.
- Monitor website performance and implement improvements.
Required Skills & Qualifications:
- B.E/B.Tech Degree, Master's Degree required
- Experience: 6 – 8 Years
- Strong experience in React.js / Next.js for building frontend applications.
- Proficiency in JavaScript (ES6+), TypeScript, HTML5, CSS3, and TailwindCSS.
- Familiarity with SSR (Server Side Rendering) and SSG (Static Site Generation).
- Experience in WordPress development (PHP, MySQL, WP REST API, GraphQL).
- Experience with ACF (Advanced Custom Fields), Custom Post Types, WP Headless CMS.
- Strong knowledge of WordPress security, database optimization, and caching techniques.
Why Join Us:
- Competitive salary and benefits package.
- Work in a dynamic, collaborative, and creative environment.
- Opportunity to lead and influence design decisions across various platforms.
- Professional development opportunities and career growth potential.
Job Summary:
Seeking a seasoned SQL + ETL Developer with 4+ years of experience in managing large-scale datasets and cloud-based data pipelines. The ideal candidate is hands-on with MySQL, PySpark, AWS Glue, and ETL workflows, with proven expertise in AWS migration and performance optimization.
Key Responsibilities:
- Develop and optimize complex SQL queries and stored procedures to handle large datasets (100+ million records).
- Build and maintain scalable ETL pipelines using AWS Glue and PySpark.
- Work on data migration tasks in AWS environments.
- Monitor and improve database performance; automate key performance indicators and reports.
- Collaborate with cross-functional teams to support data integration and delivery requirements.
- Write shell scripts for automation and manage ETL jobs efficiently.
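The query-optimization responsibility above often comes down to indexing decisions. Below is a minimal, illustrative sketch (SQLite stand-in for MySQL; table and column names are made up) showing how an index changes the plan from a full table scan to an index lookup:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, ts TEXT)")
# 1000 synthetic rows; each of 100 users appears 10 times
conn.executemany(
    "INSERT INTO events (user_id, ts) VALUES (?, ?)",
    [(i % 100, f"2024-01-{(i % 28) + 1:02d}") for i in range(1000)],
)

query = "SELECT COUNT(*) FROM events WHERE user_id = ?"

def plan(conn, sql):
    # EXPLAIN QUERY PLAN rows carry the human-readable detail in column 3
    return " ".join(r[3] for r in conn.execute("EXPLAIN QUERY PLAN " + sql, (42,)))

before = plan(conn, query)                               # full table scan
conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
after = plan(conn, query)                                # index lookup

assert "SCAN" in before
assert "idx_events_user" in after
assert conn.execute(query, (42,)).fetchone()[0] == 10
```

On MySQL the equivalent workflow uses `EXPLAIN` and `CREATE INDEX`; at the 100-million-row scale the posting mentions, reading the plan before and after each schema change is what keeps such queries fast.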
Required Skills:
- Strong experience with MySQL, complex SQL queries, and stored procedures.
- Hands-on experience with AWS Glue, PySpark, and ETL processes.
- Good understanding of AWS ecosystem and migration strategies.
- Proficiency in shell scripting.
- Strong communication and collaboration skills.
Nice to Have:
- Working knowledge of Python.
- Experience with AWS RDS.

Profile: AWS Data Engineer
Mode- Hybrid
Experience - 5 to 7 years
Locations - Bengaluru, Pune, Chennai, Mumbai, Gurugram
Roles and Responsibilities
- Design and maintain ETL pipelines using AWS Glue and Python/PySpark
- Optimize SQL queries for Redshift and Athena
- Develop Lambda functions for serverless data processing
- Configure AWS DMS for database migration and replication
- Implement infrastructure as code with CloudFormation
- Build optimized data models for performance
- Manage RDS databases and AWS service integrations
- Troubleshoot and improve data processing efficiency
- Gather requirements from business stakeholders
- Implement data quality checks and validation
- Document data pipelines and architecture
- Monitor workflows and implement alerting
- Keep current with AWS services and best practices
Required Technical Expertise:
- Python/PySpark for data processing
- AWS Glue for ETL operations
- Redshift and Athena for data querying
- AWS Lambda and serverless architecture
- AWS DMS and RDS management
- CloudFormation for infrastructure
- SQL optimization and performance tuning

Job Overview:
We are seeking an experienced AWS Data Engineer to join our growing data team. The ideal candidate will have hands-on experience with AWS Glue, Redshift, PySpark, and other AWS services to build robust, scalable data pipelines. This role is perfect for someone passionate about data engineering, automation, and cloud-native development.
Key Responsibilities:
- Design, build, and maintain scalable and efficient ETL pipelines using AWS Glue, PySpark, and related tools.
- Integrate data from diverse sources and ensure its quality, consistency, and reliability.
- Work with large datasets in structured and semi-structured formats across cloud-based data lakes and warehouses.
- Optimize and maintain data infrastructure, including Amazon Redshift, for high performance.
- Collaborate with data analysts, data scientists, and product teams to understand data requirements and deliver solutions.
- Automate data validation, transformation, and loading processes to support real-time and batch data processing.
- Monitor and troubleshoot data pipeline issues and ensure smooth operations in production environments.
Required Skills:
- 5 to 7 years of hands-on experience in data engineering roles.
- Strong proficiency in Python and PySpark for data transformation and scripting.
- Deep understanding and practical experience with AWS Glue, AWS Redshift, S3, and other AWS data services.
- Solid understanding of SQL and database optimization techniques.
- Experience working with large-scale data pipelines and high-volume data environments.
- Good knowledge of data modeling, warehousing, and performance tuning.
Preferred/Good to Have:
- Experience with workflow orchestration tools like Airflow or Step Functions.
- Familiarity with CI/CD for data pipelines.
- Knowledge of data governance and security best practices on AWS.
Role - ETL Developer
Work Mode - Hybrid
Experience- 4+ years
Location - Pune, Gurgaon, Bengaluru, Mumbai
Required Skills - AWS, AWS Glue, PySpark, ETL, SQL
Required Skills:
- 4+ years of hands-on experience in MySQL, including SQL queries and procedure development
- Experience in PySpark, AWS, and AWS Glue
- Experience with AWS migration
- Experience with automated scripting and tracking KPIs/metrics for database performance
- Proficiency in shell scripting and ETL.
- Strong communication skills and a collaborative team player
- Knowledge of Python and AWS RDS is a plus
Job Title: Java Developer
Java Developer – Job Description Wissen Technology is now hiring for a Java Developer - Bangalore with hands-on experience in Core Java, algorithms, data structures, multithreading and SQL. We are solving complex technical problems in the industry and need talented software engineers to join our mission and be part of a global software development team. A brilliant opportunity to become part of a highly motivated and expert team which has made a mark as a high-end technical consultancy.
Required Skills: • Exp. - 4 to 14 years.
• Experience in Core Java and Spring Boot.
• Extensive experience in developing enterprise-scale applications and systems. Should possess good architectural knowledge and be aware of enterprise application design patterns.
• Should have the ability to analyze, design, develop and test complex, low-latency client-facing applications.
• Good development experience with RDBMS.
• Good knowledge of multi-threading and high-performance server-side development.
• Basic working knowledge of Unix/Linux.
• Excellent problem solving and coding skills.
• Strong interpersonal, communication and analytical skills.
• Should have the ability to express their design ideas and thoughts.
Hello Everyone,
Job Description Wissen Technology is now hiring for a Java Developer - Bangalore with hands-on experience in Core Java, algorithms, data structures, multithreading and SQL. We are solving complex technical problems in the industry and need talented software engineers to join our mission and be part of a global software development team. A brilliant opportunity to become part of a highly motivated and expert team which has made a mark as a high-end technical consultancy.
Required Skills: • Exp. - 5 to 14 years.
• Experience in Core Java and Spring Boot.
• Extensive experience in developing enterprise-scale applications and systems. Should possess good architectural knowledge and be aware of enterprise application design patterns.
• Should have the ability to analyze, design, develop and test complex, low-latency client-facing applications.
• Good development experience with RDBMS.
• Good knowledge of multi-threading and high-performance server-side development.
• Basic working knowledge of Unix/Linux.
• Excellent problem solving and coding skills.
• Strong interpersonal, communication and analytical skills.
• Should have the ability to express their design ideas and thoughts.
About Wissen Technology:
Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom and Startups. Wissen Technology is a part of Wissen Group and was established in the year 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 Billion for more than 25 of the Fortune 500 companies. The Wissen Group overall includes more than 4000 highly skilled professionals.

Wissen Technology provides exceptional value in mission-critical projects for its clients, through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’. Our team consists of 1200+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like Wharton, MIT, IITs, IIMs, and NITs and with rich work experience in some of the biggest companies in the world. Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation. We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted as the Top 20 AI/ML vendor by CIO Insider.


Senior Data Engineer
Location: Bangalore, Gurugram (Hybrid)
Experience: 4-8 Years
Type: Full Time | Permanent
Job Summary:
We are looking for a results-driven Senior Data Engineer to join our engineering team. The ideal candidate will have hands-on expertise in data pipeline development, cloud infrastructure, and BI support, with a strong command of modern data stacks. You’ll be responsible for building scalable ETL/ELT workflows, managing data lakes and marts, and enabling seamless data delivery to analytics and business intelligence teams.
This role requires deep technical know-how in PostgreSQL, Python scripting, Apache Airflow, AWS or other cloud environments, and a working knowledge of modern data and BI tools.
Key Responsibilities:
PostgreSQL & Data Modeling
· Design and optimize complex SQL queries, stored procedures, and indexes
· Perform performance tuning and query plan analysis
· Contribute to schema design and data normalization
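The tuning workflow those bullets describe (inspect the query plan, confirm an index is actually used) can be sketched as follows. This uses SQLite purely as a portable stand-in for the same EXPLAIN-based approach in PostgreSQL, and the table/index names are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INT, ts TEXT)")
conn.execute("CREATE INDEX idx_events_user ON events(user_id)")

# Ask the planner how it will execute the filtered query; with the
# index in place it should report an index search, not a full scan.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM events WHERE user_id = ?", (42,)
).fetchall()
print(plan)
```

In PostgreSQL the equivalent step is `EXPLAIN ANALYZE`, which additionally reports actual row counts and timings.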
Data Migration & Transformation
· Migrate data from multiple sources to cloud or ODS platforms
· Design schema mapping and implement transformation logic
· Ensure consistency, integrity, and accuracy in migrated data
Python Scripting for Data Engineering
· Build automation scripts for data ingestion, cleansing, and transformation
· Handle file formats (JSON, CSV, XML), REST APIs, cloud SDKs (e.g., Boto3)
· Maintain reusable script modules for operational pipelines
Data Orchestration with Apache Airflow
· Develop and manage DAGs for batch/stream workflows
· Implement retries, task dependencies, notifications, and failure handling
· Integrate Airflow with cloud services, data lakes, and data warehouses
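The retry and failure-handling behavior those bullets describe can be sketched framework-agnostically (this mirrors what Airflow's `retries` and `on_failure_callback` task arguments provide, but is not the Airflow API itself):

```python
import time

def run_with_retries(task, retries=3, delay=0.0, on_failure=None):
    """Generic retry wrapper: re-run a callable up to `retries` times,
    invoking `on_failure` (e.g. a notification hook) if all attempts fail."""
    for attempt in range(1, retries + 1):
        try:
            return task()
        except Exception as exc:
            if attempt == retries:
                if on_failure:
                    on_failure(exc)   # e.g. send an alert/notification
                raise
            time.sleep(delay)         # back off before retrying

calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

print(run_with_retries(flaky))  # succeeds on the third attempt
```

In Airflow itself this is declarative: `PythonOperator(..., retries=3, retry_delay=timedelta(minutes=5), on_failure_callback=notify)`.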
Cloud Platforms (AWS / Azure / GCP)
· Manage data storage (S3, GCS, Blob), compute services, and data pipelines
· Set up permissions, IAM roles, encryption, and logging for security
· Monitor and optimize cost and performance of cloud-based data operations
Data Marts & Analytics Layer
· Design and manage data marts using dimensional models
· Build star/snowflake schemas to support BI and self-serve analytics
· Enable incremental load strategies and partitioning
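A minimal star-schema and incremental-load sketch of what those bullets describe (SQLite for brevity; the dimension/fact table names are illustrative, not from any specific warehouse):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (sale_id INTEGER PRIMARY KEY,
                             product_id INT, amount REAL, load_batch INT);
    INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
""")

def incremental_load(rows, batch_id):
    # Append only the new batch; earlier batches stay untouched,
    # which is the core of the incremental pattern.
    db.executemany(
        "INSERT INTO fact_sales (product_id, amount, load_batch) VALUES (?, ?, ?)",
        [(p, a, batch_id) for p, a in rows])

incremental_load([(1, 10.0), (2, 5.0)], batch_id=1)
incremental_load([(1, 7.5)], batch_id=2)

# BI-style query: fact table joined to its dimension (the star join)
total = db.execute("""
    SELECT d.name, SUM(f.amount) FROM fact_sales f
    JOIN dim_product d ON d.product_id = f.product_id
    GROUP BY d.name ORDER BY d.name
""").fetchall()
print(total)
```

At warehouse scale the `load_batch` column is typically a partition key or watermark timestamp, so each load touches only one partition.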
Modern Data Stack Integration
· Work with tools like DBT, Fivetran, Redshift, Snowflake, BigQuery, or Kafka
· Support modular pipeline design and metadata-driven frameworks
· Ensure high availability and scalability of the stack
BI & Reporting Tools (Power BI / Superset / Supertech)
· Collaborate with BI teams to design datasets and optimize queries
· Support development of dashboards and reporting layers
· Manage access, data refreshes, and performance for BI tools
Required Skills & Qualifications:
· 4–6 years of hands-on experience in data engineering roles
· Strong SQL skills in PostgreSQL (tuning, complex joins, procedures)
· Advanced Python scripting skills for automation and ETL
· Proven experience with Apache Airflow (custom DAGs, error handling)
· Solid understanding of cloud architecture (especially AWS)
· Experience with data marts and dimensional data modeling
· Exposure to modern data stack tools (DBT, Kafka, Snowflake, etc.)
· Familiarity with BI tools like Power BI, Apache Superset, or Supertech BI
· Version control (Git) and CI/CD pipeline knowledge is a plus
· Excellent problem-solving and communication skills
🚀 We're Hiring: Technical Lead – Java Backend & Integration
📍 Bangalore | Hybrid | Full-Time
👨💻 9+ Years Experience | Enterprise Product Development
🏥 Healthcare Tech | U.S. Health Insurance Domain
Join Leading HealthTech, a U.S.-based product company driving innovation in the $1.1 trillion health insurance industry. We power over 81 million lives, with 130+ customers and 100+ third-party integrations. At our growing Bangalore tech hub, you’ll solve real-world, large-scale problems and help modernize one of the most stable and impactful industries in the world.
🔧 What You'll Work On:
- Architect and build backend & integration solutions using Java, J2EE, WebLogic, Spring, Apache Camel
- Transition monolith systems to microservices-based architecture
- Lead design reviews, customer discussions, code quality, UAT & production readiness
- Work with high-volume transactional systems processing millions of health claims daily
- Coach & mentor engineers, contribute to platform modernization
🧠 What You Bring:
- 9+ years in backend Java development and enterprise system integration
- Hands-on with REST, SOAP, JMS, SQL, stored procedures, XML, ESBs
- Solid understanding of SOA, data structures, system design, and performance tuning
- Experience with Agile, CI/CD, unit testing, and code quality tools
- Healthcare/payor domain experience is a huge plus!
💡 Why this opportunity?
- Global product impact from our India technology center
- Work on mission-critical systems in a stable and recession-resilient sector
- Be part of a journey to modernize healthcare through tech
- Solve complex challenges at scale that few companies offer
🎯 Ready to drive change at the intersection of tech and healthcare?

About the company:
Ketto is Asia's largest tech-enabled crowdfunding platform with a vision - Healthcare for all. We are a profit-making organization with a valuation of more than 100 Million USD. With over 1,100 crores raised from more than 60 lakh donors, we have positively impacted the lives of 2 lakh+ campaigners. Ketto has embarked on a high-growth journey, and we would like you to be part of our family, helping us to create a large-scale impact on a daily basis by taking our product to the next level.
Role Overview:
Ketto, Asia's largest crowdfunding platform, is looking for an innovative Product Analyst to take charge of our data systems, reporting frameworks, and generative AI initiatives. This role is pivotal in ensuring data integrity and reliability, driving key insights that fuel strategic decisions, and implementing automation through AI. This position encompasses the full data and analytics lifecycle—from requirements gathering to design planning—alongside implementing advanced analytics and generative AI solutions to support Ketto’s mission.
Key Responsibilities
● Data Strategy & Automation:
○ Lead data collection, processing, and quality assurance processes to ensure accuracy, completeness, and relevance.
○ Explore opportunities to incorporate generative AI models to automate and optimize processes, enhancing efficiencies in analytics, reporting, and decision-making.
● Data Analysis & Insight Generation:
○ Conduct in-depth analyses of user behaviour, campaign performance, and platform metrics to uncover insights that support crowdfunding success.
○ Translate complex data into clear, actionable insights that drive strategic decisions, providing stakeholders with the necessary information to enhance business outcomes.
● Reporting & Quality Assurance:
○ Design and maintain a robust reporting framework to deliver timely insights, enhancing data reliability and ensuring stakeholders are well-informed.
○ Monitor and improve data accuracy, consistency, and integrity across all data processes, identifying and addressing areas for enhancement.
● Collaboration & Strategic Planning:
○ Work closely with Business, Product, and IT teams to align data initiatives with Ketto’s objectives and growth strategy.
○ Propose data-driven strategies that leverage AI and automation to tackle business challenges and scale impact across the platform.
○ Mentor junior data scientists and analysts, fostering a culture of data-driven decision-making.
Required Skills and Qualifications
● Technical Expertise:
○ Strong background in SQL, Statistics and Maths
● Analytical & Strategic Mindset:
○ Proven ability to derive meaningful, actionable insights from large data sets and translate findings into business strategies.
○ Experience with statistical analysis and advanced analytics
● Communication & Collaboration:
○ Exceptional written and verbal communication skills, capable of explaining complex data insights to non-technical stakeholders.
○ Strong interpersonal skills to work effectively with cross-functional teams, aligning data initiatives with organisational goals.
● Preferred Experience:
○ Proven experience in advanced analytics roles
○ Experience leading data lifecycle management, model deployment, and quality assurance initiatives.
Why Join Ketto?
At Ketto, we’re committed to driving social change through innovative data and AI solutions. As our Sr. Product Analyst, you’ll have the unique opportunity to leverage advanced data science and generative AI to shape the future of crowdfunding in Asia. If you’re passionate about using data and AI for social good, we’d love to hear from you!
A Production Support Engineer ensures the smooth operation of software applications and IT systems in a production environment. Here’s a breakdown of the role:
Key Responsibilities
- Monitoring System Performance: Continuously track application health and resolve performance issues.
- Incident Management: Diagnose and fix software failures, collaborating with developers and system administrators.
- Troubleshooting & Debugging: Analyze logs, use diagnostic tools, and implement solutions to improve system reliability.
- Documentation & Reporting: Maintain records of system issues, resolutions, and process improvements.
- Collaboration: Work with cross-functional teams to enhance system efficiency and reduce downtime.
- Process Optimization: Suggest improvements to reduce production costs and enhance system stability.
Required Skills
- Strong knowledge of SQL, UNIX/Linux, Java, Oracle, and Splunk.
- Experience in incident management and debugging.
- Ability to analyze system failures and optimize performance.
- Good communication and problem-solving skills.

Work Mode: Hybrid
Need B.Tech, BE, M.Tech, ME candidates - Mandatory
Must-Have Skills:
● Educational Qualification :- B.Tech, BE, M.Tech, ME in any field.
● Minimum of 3 years of proven experience as a Data Engineer.
● Strong proficiency in Python programming language and SQL.
● Experience in Databricks and setting up and managing data pipelines and data warehouses/lakes.
● Good comprehension and critical thinking skills.
● Kindly note Salary bracket will vary according to the exp. of the candidate -
- Experience from 4 yrs to 6 yrs - Salary upto 22 LPA
- Experience from 5 yrs to 8 yrs - Salary upto 30 LPA
- Experience more than 8 yrs - Salary upto 40 LPA

Job Description: Data Engineer
Role Overview
We are seeking a skilled Python Data Engineer with expertise in designing and implementing data solutions using the AWS cloud platform. The ideal candidate will be responsible for building and maintaining scalable, efficient, and secure data pipelines while leveraging Python and AWS services to enable robust data analytics and decision-making processes.
Key Responsibilities
· Design, develop, and optimize data pipelines using Python and AWS services such as Glue, Lambda, S3, EMR, Redshift, Athena, and Kinesis.
· Implement ETL/ELT processes to extract, transform, and load data from various sources into centralized repositories (e.g., data lakes or data warehouses).
· Collaborate with cross-functional teams to understand business requirements and translate them into scalable data solutions.
· Monitor, troubleshoot, and enhance data workflows for performance and cost optimization.
· Ensure data quality and consistency by implementing validation and governance practices.
· Work on data security best practices in compliance with organizational policies and regulations.
· Automate repetitive data engineering tasks using Python scripts and frameworks.
· Leverage CI/CD pipelines for deployment of data workflows on AWS.
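The validation-and-governance responsibility above amounts to gating records against an agreed schema before load. A minimal sketch (field names and rules are purely illustrative):

```python
def validate_record(rec: dict, schema: dict) -> list[str]:
    """Tiny validation gate of the kind a pipeline applies before
    loading: report missing fields and type mismatches."""
    errors = []
    for field, typ in schema.items():
        if field not in rec:
            errors.append(f"missing:{field}")
        elif not isinstance(rec[field], typ):
            errors.append(f"type:{field}")
    return errors

schema = {"id": int, "amount": float}
print(validate_record({"id": 1, "amount": 9.99}, schema))  # [] -> clean
print(validate_record({"id": "x"}, schema))                # two errors
```

In production this role is usually played by a dedicated tool (e.g. Glue Data Quality or Great Expectations), but the contract is the same: records that fail the gate are quarantined rather than loaded.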
Required Skills and Qualifications
· Professional Experience: 5+ years of experience in data engineering or a related field.
· Programming: Strong proficiency in Python, with experience in libraries like pandas, PySpark, or boto3.
· AWS Expertise: Hands-on experience with core AWS services for data engineering, such as:
· AWS Glue for ETL/ELT.
· S3 for storage.
· Redshift or Athena for data warehousing and querying.
· Lambda for serverless compute.
· Kinesis or SNS/SQS for data streaming.
· IAM Roles for security.
· Databases: Proficiency in SQL and experience with relational (e.g., PostgreSQL, MySQL) and NoSQL (e.g., DynamoDB) databases.
· Data Processing: Knowledge of big data frameworks (e.g., Hadoop, Spark) is a plus.
· DevOps: Familiarity with CI/CD pipelines and tools like Jenkins, Git, and CodePipeline.
· Version Control: Proficient with Git-based workflows.
· Problem Solving: Excellent analytical and debugging skills.
Optional Skills
· Knowledge of data modeling and data warehouse design principles.
· Experience with data visualization tools (e.g., Tableau, Power BI).
· Familiarity with containerization (e.g., Docker) and orchestration (e.g., Kubernetes).
· Exposure to other programming languages like Scala or Java.
Education
· Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
Why Join Us?
· Opportunity to work on cutting-edge AWS technologies.
· Collaborative and innovative work environment.

Requirements:
- Must have proficiency in Python
- At least 3+ years of professional experience in software application development.
- Good understanding of REST APIs and a solid experience in testing APIs.
- Should have built APIs at some point and practical knowledge on working with them
- Must have experience in API testing tools like Postman and in setting up the prerequisites and post-execution validations using these tools
- Ability to develop applications for test automation
- Should have worked in a distributed micro-service environment
- Hands-on experience with Python packages for testing (preferably pytest).
- Should be able to create fixtures, mock objects and datasets that can be used by tests across different micro-services
- Proficiency in Git
- Strong SQL query-writing skills
- Familiarity with Jira, Asana or a similar bug-tracking tool, Confluence (wiki), and Jenkins (CI)
- Excellent written and oral communication and organisational skills with the ability to work within a growing company with increasing needs
- Proven track record of ability to handle time-critical projects
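A minimal sketch of the mocking approach the requirements above describe, using only the standard library (the client function and payload are hypothetical):

```python
import json
from unittest import mock

def http_get(path):
    # Stand-in for a real HTTP call (hypothetical; would hit the network)
    raise RuntimeError("no network in tests")

def fetch_user(user_id):
    """Thin API client under test; delegates transport to http_get."""
    return json.loads(http_get(f"/users/{user_id}"))

# The test patches the transport with a canned response, so the client's
# parsing logic is exercised without any live service:
with mock.patch(f"{__name__}.http_get",
                return_value=json.dumps({"id": 7, "name": "asha"})) as m:
    user = fetch_user(7)

print(user)
m.assert_called_once_with("/users/7")
```

With pytest, the same canned responses are typically packaged as fixtures so multiple microservice test suites can share them.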
Good to have:
- Good understanding of CI/CD
- Knowledge of queues, especially Kafka
- Ability to independently manage test environment deployments and handle issues around them
- Experience performing load testing of API endpoints
- Should have built an API test automation framework from scratch and maintained it
- Knowledge of cloud platforms like AWS, Azure
- Knowledge of different browsers and cross-platform operating systems
- Knowledge of JavaScript
- Web Programming, Docker & 3-Tier Architecture Knowledge is preferred.
- Should have knowledge of API creation; coding experience would be an add-on.
- 5+ years of experience in test automation using tools like TestNG and Selenium WebDriver (Grid, parallel execution, SauceLabs), plus Mocha/Chai for front-end and backend test automation
- Bachelor's degree in Computer Science / IT / Computer Applications
We're Hiring: Java Developers | Mumbai (Hybrid) 🚀
Are you a passionate Java Developer with 5 to 10 years of experience? Here's your chance to take your career to the next level! 💼
We're looking for talented professionals to join an exciting opportunity with a top-tier BFSI domain Project—a true leader in the market. 🏦💻
🔹 Location: Mumbai
🔹 Work Mode: Hybrid
🔹 Experience: 4 to 10 years
🔹 Domain: BFSI
This is more than just a job—it's a chance to work on impactful projects and grow with some of the best minds in the industry. 🌟
👉 If you're interested, please share your updated resume along with the following details:
Total Experience
Current CTC
Expected CTC
Tag your network or apply now—this could be your next big move! 🔄🚀

Responsibilities:
- Design, develop, and maintain scalable backend systems for live products.
- Build and implement RESTful APIs for seamless communication between systems.
- Collaborate with cross-functional teams to translate business requirements into technical solutions.
- Work with databases (SQL or NoSQL) for efficient data storage and retrieval.
- Troubleshoot, debug, and optimize backend systems for better performance and scalability.
- Demonstrate a willingness to learn and work with Ruby, as it is the primary language used.
Requirements:
- Hands-on experience building RESTful APIs.
- Experience working on live products.
Preferred Skills:
- Experience with backend-heavy development or full-stack development with a backend focus.
- Proficiency in Ruby or RoR
Additional Information:
- Candidates must have hands-on experience with live, production-ready products.
- We are open to candidates with expertise in any backend language but prefer those who can quickly adapt to Ruby.
Join us at Raising Superstars, where your backend skills will contribute to building impactful, scalable solutions!
Profile: Product Support Engineer
🔴 Experience: 1 year as Product Support Engineer.
🔴 Location: Mumbai (Andheri).
🔴 5 days of working from office.
Skills Required:
🔷 Experience in providing support for ETL or data warehousing is preferred.
🔷 Good Understanding on Unix and Databases concepts.
🔷 Experience working with SQL and NoSQL databases and writing simple queries to get data for debugging issues.
🔷 Ability to creatively come up with solutions for various problems and implement them.
🔷 Experience working with REST APIs and debugging requests and responses using tools like Postman.
🔷 Quick troubleshooting and diagnosing skills.
🔷 Knowledge of customer success processes.
🔷 Experience in document creation.
🔷 High availability for fast response to customers.
🔷 Working knowledge of one of Node.js, Python, or Java.
🔷 Background in AWS, Docker, Kubernetes, Networking - an advantage.
🔷 Experience in SAAS B2B software companies - an advantage.
🔷 Ability to join the dots around multiple events occurring concurrently and spot patterns.


JioTesseract, a digital arm of Reliance Industries, is India's leading and largest AR/VR organization with the mission to democratize mixed reality for India and the world. We make products at the cross of hardware, software, content and services with focus on making India the leader in spatial computing. We specialize in creating solutions in AR, VR and AI, with some of our notable products such as JioGlass, JioDive, 360 Streaming, Metaverse, AR/VR headsets for consumers and enterprise space.
Mon-Fri role, in office, with excellent perks and benefits!
Position Overview
We are seeking a Software Architect to lead the design and development of high-performance robotics and AI software stacks utilizing NVIDIA technologies. This role will focus on defining scalable, modular, and efficient architectures for robot perception, planning, simulation, and embedded AI applications. You will collaborate with cross-functional teams to build next-generation autonomous systems.
Key Responsibilities:
1. System Architecture & Design
● Define scalable software architectures for robotics perception, navigation, and AI-driven decision-making.
● Design modular and reusable frameworks that leverage NVIDIA’s Jetson, Isaac ROS, Omniverse, and CUDA ecosystems.
● Establish best practices for real-time computing, GPU acceleration, and edge AI inference.
2. Perception & AI Integration
● Architect sensor fusion pipelines using LIDAR, cameras, IMUs, and radar with DeepStream, TensorRT, and ROS2.
● Optimize computer vision, SLAM, and deep learning models for edge deployment on Jetson Orin and Xavier.
● Ensure efficient GPU-accelerated AI inference for real-time robotics applications.
3. Embedded & Real-Time Systems
● Design high-performance embedded software stacks for real-time robotic control and autonomy.
● Utilize NVIDIA CUDA, cuDNN, and TensorRT to accelerate AI model execution on Jetson platforms.
● Develop robust middleware frameworks to support real-time robotics applications in ROS2 and Isaac SDK.
4. Robotics Simulation & Digital Twins
● Define architectures for robotic simulation environments using NVIDIA Isaac Sim & Omniverse.
● Leverage synthetic data generation (Omniverse Replicator) for training AI models.
● Optimize sim-to-real transfer learning for AI-driven robotic behaviors.
5. Navigation & Motion Planning
● Architect GPU-accelerated motion planning and SLAM pipelines for autonomous robots.
● Optimize path planning, localization, and multi-agent coordination using Isaac ROS Navigation.
● Implement reinforcement learning-based policies using Isaac Gym.
6. Performance Optimization & Scalability
● Ensure low-latency AI inference and real-time execution of robotics applications.
● Optimize CUDA kernels and parallel processing pipelines for NVIDIA hardware.
● Develop benchmarking and profiling tools to measure software performance on edge AI devices.
Required Qualifications:
● Master’s or Ph.D. in Computer Science, Robotics, AI, or Embedded Systems.
● Extensive experience (7+ years) in software development, with at least 3-5 years focused on architecture and system design, especially for robotics or embedded systems.
● Expertise in CUDA, TensorRT, DeepStream, PyTorch, TensorFlow, and ROS2.
● Experience in NVIDIA Jetson platforms, Isaac SDK, and GPU-accelerated AI.
● Proficiency in programming languages such as C++, Python, or similar, with deep understanding of low-level and high-level design principles.
● Strong background in robotic perception, planning, and real-time control.
● Experience with cloud-edge AI deployment and scalable architectures.
Preferred Qualifications
● Hands-on experience with NVIDIA DRIVE, NVIDIA Omniverse, and Isaac Gym
● Knowledge of robot kinematics, control systems, and reinforcement learning
● Expertise in distributed computing, containerization (Docker), and cloud robotics
● Familiarity with automotive, industrial automation, or warehouse robotics
● Experience designing architectures for autonomous systems or multi-robot systems.
● Familiarity with cloud-based solutions, edge computing, or distributed computing for robotics
● Experience with microservices or service-oriented architecture (SOA)
● Knowledge of machine learning and AI integration within robotic systems
● Knowledge of testing on edge devices with HIL and simulations (Isaac Sim, Gazebo, V-REP etc.)

What we want to accomplish and why we need you?
Jio Haptik is an AI leader having pioneered AI-powered innovation since 2013. Reliance Jio Digital Services acquired Haptik in April 2019. Haptik currently leads India’s AI market having become the first to process 15 billion+ two-way conversations across 10+ channels and in 135 languages. Haptik is also a Category Leader across platforms including Gartner, G2, Opus Research & more. Recently Haptik won the award for “Tech Startup of the Year” in the AI category at Entrepreneur India Awards 2023, and gold medal for “Best Chat & Conversational Bot” at Martequity Awards 2023. Haptik has a headcount of 200+ employees with offices in Mumbai, Delhi, and Bangalore.
What will you do everyday?
As a backend engineer you will be responsible for building the Haptik platform which is used by people across the globe. You will be responsible for developing, architecting and scaling the systems that support all the functions of the Haptik platform. While you know how to work hard, you also know how to have fun at work and make friends with your colleagues.
Ok, you're sold, but what are we looking for in the perfect candidate?
Develop and maintain expertise in backend systems and API development, ensuring seamless integrations and scalable solutions, including:
- Strong expertise in backend systems, including design principles and adherence to good coding practices.
- Proven ability to enhance or develop complex tools at scale with a thorough understanding of system architecture.
- Capability to work cross-functionally with all teams, ensuring seamless implementation of APIs and solutioning for various tools.
- Skilled in high-level task estimation, scoping, and breaking down complex projects into actionable tasks.
- Proficiency in modeling and optimizing database architecture for enhanced performance and scalability.
- Experience collaborating with product teams to build innovative Proof of Concepts (POCs).
- Ability to respond to data requests and generate reports to support data-driven decision-making.
- Active participation in code reviews, automated testing, and quality assurance processes.
- Experience working in a scrum-based agile development environment.
- Commitment to staying updated with technology standards, emerging trends, and software development best practices.
- Strong verbal and written communication skills to facilitate collaboration and clarity.
Requirements*:
- A minimum of 5 years of experience in developing scalable products and applications.
- Must have a Bachelor's degree in Computer Engineering or a related field.
- Proficiency in Python and expertise in at least one backend framework, with a preference for Django.
- Hands-on experience designing normalized database schemas for large-scale applications using technologies such as MySQL, MongoDB, or Elasticsearch.
- Practical knowledge of in-memory data stores like Redis or Memcached.
- Familiarity with working in agile environments and exposure to tools like Jira is highly desirable.
- Proficiency in using version control systems like Git.
- Strong communication skills and the ability to collaborate effectively in team settings.
- Self-motivated with a strong sense of ownership and commitment to delivering results.
- Additional knowledge of RabbitMQ, AWS/Azure services, Docker, MQTT, Lambda functions, Cron jobs, Kibana, and Logstash is an added advantage.
- Knowledge of web servers like Nginx/Apache is considered a valuable asset.
* Requirements is such a strong word. We don’t necessarily expect to find a candidate that has done everything listed, but you should be able to make a credible case that you’ve done most of it and are ready for the challenge of adding some new things to your resume.
Tell me more about Haptik
- On a roll: Announced major strategic partnership with Jio.
- Great team: You will be working with great leaders who have been listed in Business World 40 Under 40, Forbes 30 Under 30 and MIT 35 Under 35 Innovators.
- Great culture: The freedom to think and innovate is something that defines the culture of Haptik. Every person is approachable. While we are working hard, it is also important to take breaks to not get too worked up.
- Huge market: Disrupting a massive, growing chatbot market. The global market is projected to attain a valuation of US $0.94 bn by the end of 2024 progressing from US $0.11 bn earned in 2015.
- Great customers: Businesses across industries - Samsung, HDFCLife, Times of India are some that have relied on Haptik's Conversational AI solutions to engage, acquire, service and understand customers.
- Impact: A fun and exciting start-up culture that empowers its people to make a huge impact.
Working hard for things that we don't care about is stress, but working hard for something we love is called passion! At Haptik we passionately solve problems in order to be able to move faster and don't shy away from breaking things!


Required Skills and Experience:
- 5-7 years of experience in full-stack software development.
- Solid proficiency in Angular (latest versions preferred).
- Strong understanding of Angular components, modules, services, and performance optimization.
- Proven experience in C# and .NET development.
- Experience in designing and integrating RESTful APIs using Swagger.
- Solid understanding of front-end and back-end development principles.
- Excellent problem-solving and debugging skills.
- Strong communication and collaboration skills.
- Experience with Git and GitHub for version control.
- Experience with CI/CD pipelines and DevOps practices.
- Experience writing and maintaining integration tests.
- Experience with database technologies (SQL or NoSQL, MongoDB).
Nice-to-Have Skills:
- Understanding of cloud platforms (Azure).

Job Description:
Please find below details:
Experience - 5+ Years
Location - Bangalore
Role Overview
We are seeking a skilled Python Data Engineer with expertise in designing and implementing data solutions using the AWS cloud platform. The ideal candidate will be responsible for building and maintaining scalable, efficient, and secure data pipelines while leveraging Python and AWS services to enable robust data analytics and decision-making processes.
Key Responsibilities
- Design, develop, and optimize data pipelines using Python and AWS services such as Glue, Lambda, S3, EMR, Redshift, Athena, and Kinesis.
- Implement ETL/ELT processes to extract, transform, and load data from various sources into centralized repositories (e.g., data lakes or data warehouses).
- Collaborate with cross-functional teams to understand business requirements and translate them into scalable data solutions.
- Monitor, troubleshoot, and enhance data workflows for performance and cost optimization.
- Ensure data quality and consistency by implementing validation and governance practices.
- Work on data security best practices in compliance with organizational policies and regulations.
- Automate repetitive data engineering tasks using Python scripts and frameworks.
- Leverage CI/CD pipelines for deployment of data workflows on AWS.
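The pipeline responsibilities above can be illustrated in miniature. The sketch below is a hedged, simplified stand-in: it uses Python's built-in sqlite3 in place of Redshift/Glue, and the `raw_orders` and `customer_totals` tables are hypothetical.

```python
import sqlite3

def run_etl(conn):
    """Minimal ETL sketch: extract raw orders, transform, load an aggregate."""
    cur = conn.cursor()
    # Extract: read rows from a hypothetical source table.
    rows = cur.execute("SELECT customer, amount FROM raw_orders").fetchall()
    # Transform: total order amounts per customer.
    totals = {}
    for customer, amount in rows:
        totals[customer] = totals.get(customer, 0.0) + amount
    # Load: upsert the aggregate into a warehouse-style reporting table.
    cur.execute(
        "CREATE TABLE IF NOT EXISTS customer_totals (customer TEXT PRIMARY KEY, total REAL)"
    )
    cur.executemany("INSERT OR REPLACE INTO customer_totals VALUES (?, ?)", totals.items())
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                 [("acme", 10.0), ("acme", 5.0), ("globex", 7.5)])
run_etl(conn)
print(dict(conn.execute("SELECT * FROM customer_totals")))  # {'acme': 15.0, 'globex': 7.5}
```

In a real deployment the extract step would read from S3 or Kinesis, the load step would target Redshift, and the job would be orchestrated by Glue or Lambda.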
Required Skills and Qualifications
- Professional Experience: 5+ years of experience in data engineering or a related field.
- Programming: Strong proficiency in Python, with experience in libraries like pandas, PySpark, or boto3.
- AWS Expertise: Hands-on experience with core AWS services for data engineering, such as:
- AWS Glue for ETL/ELT.
- S3 for storage.
- Redshift or Athena for data warehousing and querying.
- Lambda for serverless compute.
- Kinesis or SNS/SQS for data streaming.
- IAM Roles for security.
- Databases: Proficiency in SQL and experience with relational (e.g., PostgreSQL, MySQL) and NoSQL (e.g., DynamoDB) databases.
- Data Processing: Knowledge of big data frameworks (e.g., Hadoop, Spark) is a plus.
- DevOps: Familiarity with CI/CD pipelines and tools like Jenkins, Git, and CodePipeline.
- Version Control: Proficient with Git-based workflows.
- Problem Solving: Excellent analytical and debugging skills.
Optional Skills
- Knowledge of data modeling and data warehouse design principles.
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Familiarity with containerization (e.g., Docker) and orchestration (e.g., Kubernetes).
- Exposure to other programming languages like Scala or Java.
Role Summary:
Lead and drive development in the BI domain using the Tableau ecosystem, with deep technical and BI-ecosystem knowledge. The resource will be responsible for dashboard design, development, and delivery of BI services using the Tableau ecosystem.
Key functions & responsibilities:
· Communication & interaction with Project Manager to understand the requirement
· Dashboard designing, development and deployment using Tableau eco-system
· Ensure delivery within given time frame while maintaining quality
· Stay up to date with current tech and bring relevant ideas to the table
· Proactively work with the Management team to identify and resolve issues
· Perform other related duties as assigned or advised
· Be a leader who sets the standard and expectations through example in conduct, work ethic, integrity, and character
· Contribute to dashboard design, R&D and project delivery using Tableau
Experience:
· Overall 3-7 years of experience in DWBI development projects, having worked on BI and visualization technologies (Tableau, QlikView) for at least 3 years.
· At least 3 years of experience covering Tableau implementation lifecycle including hands-on development/programming, managing security, data modeling, data blending, etc.
Technology & Skills:
· Hands-on expertise of Tableau administration and maintenance
· Strong working knowledge and development experience with Tableau Server and Desktop
· Strong knowledge in SQL, PL/SQL and Data modelling
· Knowledge of databases like Microsoft SQL Server, Oracle, etc.
· Exposure to alternate visualization technologies like QlikView, Spotfire, Pentaho, etc.
· Good communication and analytical skills, with excellent creative and conceptual thinking abilities
· Superior organizational skills, attention to detail and quality, and strong verbal and written communication skills
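Much of the SQL skill set above reduces to aggregate queries that feed a dashboard extract. Below is a hedged sketch using Python's sqlite3 as a stand-in; the `sales` table and figures are invented, and a real deployment would query SQL Server or Oracle directly from Tableau.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, month TEXT, revenue REAL);
    INSERT INTO sales VALUES
        ('West', '2024-01', 120.0),
        ('West', '2024-02', 90.0),
        ('East', '2024-01', 200.0);
""")

# A dashboard-style aggregate: total revenue per region, ordered for display.
query = """
    SELECT region, SUM(revenue) AS total_revenue
    FROM sales
    GROUP BY region
    ORDER BY total_revenue DESC
"""
print(list(conn.execute(query)))  # [('West', 210.0), ('East', 200.0)]
```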
We are looking for credit analysts who can help our client perform threshold analysis and impact calculations to streamline their credit underwriting process. They are therefore looking for people who have experience in similar or related work. Since this is an urgent requirement, your quick help in filling this role is requested.
Attaching sample profile as well.
Manager and SM - 25-50LPA
Skills: Credit analyst + SAS
Good to have: Credit underwriting, EWS (Early Warning Signals), ECB regulations, IFRS9.
Level: SM or M (4+ years)
NP: Max 30 days.
Budget: max 45 LPA.
Loc: Pan India
Responsibilities:
- Credit Underwriting, credit appraisal process
- EWS analysis, IFRS9 staging, credit analysis, threshold analysis
- Querying and coding in SAS and SQL
- ECB regulations
Key Responsibilities:
- Design, develop, and execute automated test scripts for trading applications.
- Work with product owners and business analysts to understand and write the acceptance test cases.
- Collaborate with developers, product managers, and other stakeholders to understand requirements and create test plans.
- Perform regression, performance, and end-to-end testing to ensure software reliability.
- Identify, document, and track defects using appropriate tools and methodologies.
- Maintain and enhance existing test automation frameworks for both frontend and backend.
- Report coverage, functionality, defect aging, and closure metrics to stakeholders so they can gauge the stability of releases.
- Integrate automation cases into CI/CD pipelines.
Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- 5+ years of proven experience in automation testing for web and backend applications.
- Strong knowledge of testing frameworks (e.g., Selenium, Cypress, JUnit, TestNG, Playwright).
- Experience with API testing tools (e.g., Postman, SoapUI, RestAssured).
- Familiarity with programming languages such as Java, Python, or JavaScript.
- Understanding of basic SQL queries to validate data in the databases
- Understanding of CI/CD processes and tools (e.g., Jenkins, GitLab CI).
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.
- Prior experience with trading applications or core financial services related applications is a big plus
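The automation skills above can be sketched in miniature with Python's built-in unittest. The `validate_order` function here is a hypothetical stand-in for a trading-application endpoint, not a real API; a production suite would exercise the actual service (e.g., via Postman or RestAssured) and run inside the CI/CD pipeline.

```python
import unittest

def validate_order(side, qty, price):
    """Hypothetical stand-in for a trading-app order-validation endpoint."""
    if side not in ("BUY", "SELL"):
        return {"accepted": False, "reason": "bad side"}
    if qty <= 0 or price <= 0:
        return {"accepted": False, "reason": "non-positive qty/price"}
    return {"accepted": True, "reason": None}

class OrderValidationTests(unittest.TestCase):
    def test_accepts_valid_order(self):
        self.assertTrue(validate_order("BUY", 100, 9.5)["accepted"])

    def test_rejects_bad_side(self):
        self.assertFalse(validate_order("HOLD", 100, 9.5)["accepted"])

    def test_rejects_non_positive_quantity(self):
        self.assertFalse(validate_order("SELL", 0, 9.5)["accepted"])

if __name__ == "__main__":
    # exit=False keeps the runner from calling sys.exit after the run.
    unittest.main(argv=["order-tests"], exit=False)
```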
Data Engineer + Integration Engineer + Support Specialist
Exp – 5-8 years
Necessary Skills:
· SQL & Python / PySpark
· AWS Services: Glue, Appflow, Redshift
· Data warehousing
· Data modelling
Job Description:
· Experience in implementing and delivering data solutions and pipelines on the AWS Cloud Platform. Design, implement, and maintain the data architecture for all AWS data services
· A strong understanding of data modelling, data structures, databases (Redshift), and ETL processes
· Work with stakeholders to identify business needs and requirements for data-related projects
· Strong SQL and/or Python or PySpark knowledge
· Creating data models that can be used to extract information from various sources & store it in a usable format
· Optimize data models for performance and efficiency
· Write SQL queries to support data analysis and reporting
· Monitor and troubleshoot data pipelines
· Collaborate with software engineers to design and implement data-driven features
· Perform root cause analysis on data issues
· Maintain documentation of the data architecture and ETL processes
· Identifying opportunities to improve performance by improving database structure or indexing methods
· Maintaining existing applications by updating existing code or adding new features to meet new requirements
· Designing and implementing security measures to protect data from unauthorized access or misuse
· Recommending infrastructure changes to improve capacity or performance
· Experience in the Process industry
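The indexing bullet above is concrete enough to demonstrate. A hedged sketch with sqlite3 follows (table and index names are invented; the same reasoning applies to b-tree indexes in any RDBMS, or to sort/dist keys in Redshift). `EXPLAIN QUERY PLAN` shows the full-table scan turn into an index search.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, tag TEXT, payload TEXT)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)",
                 [(i, f"tag{i % 50}", "x") for i in range(10_000)])

query = "SELECT COUNT(*) FROM events WHERE tag = 'tag7'"

# Without an index, SQLite must scan every row.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[-1]
print(plan_before)  # e.g. "SCAN events" (exact wording varies by SQLite version)

# Adding an index turns the scan into a b-tree lookup.
conn.execute("CREATE INDEX idx_events_tag ON events (tag)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[-1]
print(plan_after)  # e.g. "SEARCH events USING COVERING INDEX idx_events_tag (tag=?)"

print(conn.execute(query).fetchone()[0])  # 200 matching rows either way
```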
Data Engineer + Integration Engineer + Support Specialist
Exp – 3-5 years
Necessary Skills:
· SQL & Python / PySpark
· AWS Services: Glue, Appflow, Redshift
· Data warehousing basics
· Data modelling basics
Job Description:
· Experience in implementing and delivering data solutions and pipelines on the AWS Cloud Platform.
· A strong understanding of data modelling, data structures, databases (Redshift)
· Strong SQL and/or Python or PySpark knowledge
· Design and implement ETL processes to load data into the data warehouse
· Creating data models that can be used to extract information from various sources & store it in a usable format
· Optimize data models for performance and efficiency
· Write SQL queries to support data analysis and reporting
· Collaborate with team to design and implement data-driven features
· Monitor and troubleshoot data pipelines
· Perform root cause analysis on data issues
· Maintain documentation of the data architecture and ETL processes
· Maintaining existing applications by updating existing code or adding new features to meet new requirements
· Designing and implementing security measures to protect data from unauthorized access or misuse
· Identifying opportunities to improve performance by improving database structure or indexing methods
· Recommending infrastructure changes to improve capacity or performance
Job Title: Data Analyst-Fintech
Job Description:
We are seeking a highly motivated and detail-oriented Data Analyst with 2 to 4 years of work experience to join our team. The ideal candidate will have a strong analytical mindset, excellent problem-solving skills, and a passion for transforming data into actionable insights. In this role, you will play a pivotal role in gathering, analyzing, and interpreting data to support informed decision-making and drive business growth.
Key Responsibilities:
1. Data Collection and Extraction:
§ Gather data from various sources, including databases, spreadsheets, and APIs.
§ Perform data cleansing and validation to ensure data accuracy and integrity.
2. Data Analysis:
§ Analyze large datasets to identify trends, patterns, and anomalies.
§ Conduct analysis and data modeling to generate insights and forecasts.
§ Create data visualizations and reports to present findings to stakeholders.
3. Data Interpretation and Insight Generation:
§ Translate data insights into actionable recommendations for business improvements.
§ Collaborate with cross-functional teams to understand data requirements and provide data-driven solutions.
4. Data Quality Assurance:
§ Implement data quality checks and validation processes to ensure data accuracy and consistency.
§ Identify and address data quality issues promptly.
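The cleansing and validation steps above can be sketched with the standard library alone. Everything here is illustrative: the field names and rules are hypothetical, and a real pipeline would more likely use pandas against live sources.

```python
import csv
import io

# Hypothetical raw extract: one missing amount, one negative amount, and a duplicate user.
RAW = """user_id,amount
1,250.0
2,
3,-40.0
2,100.0
"""

def clean(rows):
    """Drop rows failing basic validation; flag issues for follow-up."""
    valid, issues = [], []
    for row in rows:
        if not row["amount"]:
            issues.append((row["user_id"], "missing amount"))
            continue
        amount = float(row["amount"])
        if amount < 0:
            issues.append((row["user_id"], "negative amount"))
            continue
        valid.append({"user_id": row["user_id"], "amount": amount})
    return valid, issues

rows = list(csv.DictReader(io.StringIO(RAW)))
valid, issues = clean(rows)
print(valid)   # two clean rows (users 1 and 2)
print(issues)  # [('2', 'missing amount'), ('3', 'negative amount')]
```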
Qualifications:
1. Bachelor's degree in a relevant field such as Computer Science, Statistics, Mathematics, or a related discipline.
2. Proven work experience as a Data Analyst, with 2 to 4 years of relevant experience.
3. Knowledge of data warehousing concepts and ETL processes is advantageous.
4. Proficiency in data analysis tools and languages (e.g., SQL, Python, R).
5. Experience with data visualization tools (e.g., Tableau, Power BI) is a plus.
6. Strong analytical and problem-solving skills.
7. Excellent communication and presentation skills.
8. Attention to detail and a commitment to data accuracy.
9. Familiarity with machine learning and predictive modeling is a bonus.
If you are a data-driven professional with a passion for uncovering insights from complex datasets and have the qualifications and skills mentioned above, we encourage you to apply for this Data Analyst position. Join our dynamic team and contribute to making data-driven decisions that will shape our company's future.
Fatakpay is an equal-opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
- 4-8 years of experience in Functional testing with good foundation in technical expertise
- Experience in the Capital Markets domain is MUST
- Exposure to API testing tools like SoapUI and Postman
- Well versed with SQL
- Hands on experience in debugging issues using Unix commands
- Basic understanding of XML and JSON structures
- Knowledge of FitNesse is good to have
- Should be able to join early.
Job Description:
Position: Senior Manager- Data Analytics (Fintech Firm)
Experience: 5-8 Years
Location: Mumbai-Andheri
Employment Type: Full-Time
About Us:
We are a dynamic fintech firm dedicated to revolutionizing the financial services industry through innovative data solutions. We believe in leveraging cutting-edge technology to provide superior financial products and services to our clients. Join our team and be a part of this exciting journey.
Job Overview:
We are looking for a skilled Data Engineer with 5-8 years of experience to join our data team. The ideal candidate will have a strong background in ETL processes, data pipeline creation, and database management. As a Data Engineer, you will be responsible for designing, developing, and maintaining scalable data systems and pipelines.
Key Responsibilities:
- Design and develop robust and scalable ETL processes to ingest and process large datasets from various sources.
- Build and maintain efficient data pipelines to support real-time and batch data processing.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Optimize database performance and ensure data integrity and security.
- Troubleshoot and resolve data-related issues and provide support for data operations.
- Implement data quality checks and monitor data pipeline performance.
- Document technical solutions and processes for future reference.
Required Skills and Qualifications:
- Bachelor's degree in Engineering, or a related field.
- 5-8 years of experience in data engineering or a related role.
- Strong proficiency in ETL tools and techniques.
- Experience with SQL and relational databases (e.g., MySQL, PostgreSQL).
- Familiarity with big data technologies
- Proficiency in programming languages such as Python, Java, or Scala.
- Knowledge of data warehousing concepts and tools
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
Preferred Qualifications:
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Knowledge of machine learning and data science principles.
- Experience with real-time data processing and streaming platforms (e.g., Kafka).
What We Offer:
- Competitive compensation package (10-15 LPA) based on experience and qualifications.
- Opportunity to work with a talented and innovative team in the fintech industry.
- Professional development and growth opportunities.
How to Apply:
If you are passionate about data engineering and eager to contribute to a forward-thinking fintech firm, we would love to hear from you.
- Bachelor's degree required, or higher education level, or foreign equivalent, preferably in area wit
- At least 5 years of experience in Duck Creek Data Insights as a Technical Architect/Senior Developer.
- Strong Technical knowledge on SQL databases, MSBI.
- Should have strong hands-on knowledge of the Duck Creek Insight product, SQL Server/DB-level configuration, T-SQL, XSL/XSLT, MSBI, etc.
- Well versed with Duck Creek Extract Mapper Architecture
- Strong understanding of Data Modelling, Data Warehousing, Data Marts, Business Intelligence with ability to solve business problems
- Strong understanding of ETL and EDW toolsets on the Duck Creek Data Insights
- Strong knowledge on Duck Creek Insight product overall architecture flow, Data hub, Extract mapper etc
- Understanding of data related to business application areas policy, billing, and claims business solutions
- Minimum 4 to 7 years of working experience on the Duck Creek Insights product
- Strong Technical knowledge on SQL databases, MSBI
- Experience in the Insurance domain is preferable
- Experience in Duck Creek Data Insights is preferable
- Experience specific to Duck Creek would be an added advantage
- Strong knowledge of database structure systems and data mining
- Excellent organisational and analytical abilities
- Outstanding problem solver
Java Technical Lead
We are solving complex technical problems in the financial industry and need talented software engineers to join our mission and be a part of a global software development team.
A brilliant opportunity to become a part of a highly motivated and expert team which has made a mark as a high-end technical consulting firm.
Experience: 10+ years
Location: Mumbai
Job Description:
• Experience in Core Java, Spring Boot.
• Experience in microservices.
• Extensive experience in developing enterprise-scale systems for global organizations. Should possess good architectural knowledge and be aware of enterprise application design patterns.
• Should be able to analyze, design, develop and test complex, low-latency client-facing applications.
• Good development experience with RDBMS in SQL Server, Postgres, Oracle or DB2
• Good knowledge of multi-threading
• Basic working knowledge of Unix/Linux
• Excellent problem solving and coding skills in Java
• Strong interpersonal, communication and analytical skills.
• Should be able to express their design ideas and thoughts.
About Wissen Technology: Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom and Startups. Wissen Technology is a part of Wissen Group and was established in the year 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 Billion for more than 25 of the Fortune 500 companies. The Wissen Group overall includes more than 4000 highly skilled professionals.
Wissen Technology provides exceptional value in mission critical projects for its clients, through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’.
Our team consists of 1200+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like Wharton, MIT, IITs, IIMs, and NITs and with rich work experience in some of the biggest companies in the world.
Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation.
We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted as the Top 20 AI/ML vendor by CIO Insider.
Job Description: Data Engineer (Fintech Firm)
Position: Data Engineer
Experience: 2-4 Years
Location: Mumbai-Andheri
Employment Type: Full-Time
About Us:
We are a dynamic fintech firm dedicated to revolutionizing the financial services industry through innovative data solutions. We believe in leveraging cutting-edge technology to provide superior financial products and services to our clients. Join our team and be a part of this exciting journey.
Job Overview:
We are looking for a skilled Data Engineer with 2-4 years of experience to join our data team. The ideal candidate will have a strong background in ETL processes, data pipeline creation, and database management. As a Data Engineer, you will be responsible for designing, developing, and maintaining scalable data systems and pipelines.
Key Responsibilities:
- Design and develop robust and scalable ETL processes to ingest and process large datasets from various sources.
- Build and maintain efficient data pipelines to support real-time and batch data processing.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Optimize database performance and ensure data integrity and security.
- Troubleshoot and resolve data-related issues and provide support for data operations.
- Implement data quality checks and monitor data pipeline performance.
- Document technical solutions and processes for future reference.
Required Skills and Qualifications:
- Bachelor's degree in Engineering, or a related field.
- 2-4 years of experience in data engineering or a related role.
- Strong proficiency in ETL tools and techniques.
- Experience with SQL and relational databases (e.g., MySQL, PostgreSQL).
- Familiarity with big data technologies
- Proficiency in programming languages such as Python, Java, or Scala.
- Knowledge of data warehousing concepts and tools
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
Preferred Qualifications:
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Knowledge of machine learning and data science principles.
- Experience with real-time data processing and streaming platforms (e.g., Kafka).
What We Offer:
- Competitive compensation package (12-20 LPA) based on experience and qualifications.
- Opportunity to work with a talented and innovative team in the fintech industry.
- Professional development and growth opportunities.


Company: CorpCare
Title: Lead Engineer (Full stack developer)
Location: Mumbai (work from office)
CTC: Commensurate with experience
About Us:
CorpCare is India’s first all-in-one corporate funds and assets management platform. We offer a single-window solution for corporates, family offices, and HNIs. We assist corporates in formulating and managing treasury management policies and conducting reviews with investment committees and the board.
Job Summary:
The Lead Engineer will be responsible for overseeing the development, implementation, and management of our corporate funds and assets management platform. This role demands a deep understanding of the broking industry/Financial services industry, software engineering, and product management. The ideal candidate will have a robust background in engineering leadership, a proven track record of delivering scalable technology solutions, and strong product knowledge.
Key Responsibilities:
- Engineering Strategy and Vision:
- Develop and communicate a clear engineering vision and strategy aligned with our broking and funds management platform.
- Conduct market research and technical analysis to identify trends, opportunities, and customer needs within the broking industry.
- Define and prioritize the engineering roadmap, ensuring alignment with business goals and customer requirements.
- Lead cross-functional engineering teams (software development, QA, DevOps, etc.) to deliver high-quality products on time and within budget.
- Oversee the entire software development lifecycle, from planning and architecture to development and deployment, ensuring robust and scalable solutions.
- Write detailed technical specifications and guide the engineering teams to ensure clarity and successful execution.
- Leverage your understanding of the broking industry to guide product development and engineering efforts.
- Collaborate with product managers to incorporate industry-specific requirements and ensure the platform meets the needs of brokers, traders, and financial institutions.
- Stay updated with regulatory changes, market trends, and technological advancements within the broking sector.
- Mentor and lead a high-performing engineering team, fostering a culture of innovation, collaboration, and continuous improvement.
- Recruit, train, and retain top engineering talent to build a world-class development team.
- Conduct regular performance reviews and provide constructive feedback to team members.
- Define and track key performance indicators (KPIs) for engineering projects to ensure successful delivery and performance.
- Analyze system performance, user data, and platform metrics to identify areas for improvement and optimization.
- Prepare and present engineering performance reports to senior management and stakeholders.
- Work closely with product managers, sales, marketing, and customer support teams to align engineering efforts with overall business objectives.
- Provide technical guidance and support to sales teams to help them understand the platform's capabilities and competitive advantages.
- Engage with customers, partners, and stakeholders to gather feedback, understand their needs, and validate engineering solutions.
Requirements:
- BE/B.Tech in Computer Science from a top engineering college
- MBA is a plus, but not required
- 5+ years of experience in software engineering, with at least 2+ years in a leadership role.
- Strong understanding of the broking industry and financial services industry.
- Proven track record of successfully managing and delivering complex software products.
- Excellent communication, presentation, and interpersonal skills.
- Strong analytical and problem-solving abilities.
- Experience with Agile/Scrum methodologies.
- Deep understanding of software architecture, cloud computing, and modern development practices.
Technical Expertise:
- Front-End: React, Next.js, JavaScript, HTML5, CSS3
- Back-End: Node.js, Express.js, RESTful APIs
- Database: MySQL, PostgreSQL, MongoDB
- DevOps: Docker, Kubernetes, AWS (EC2, S3, RDS), CI/CD pipelines
- Version Control: Git, GitHub/GitLab
- Other: TypeScript, Webpack, Babel, ESLint, Redux
Preferred Qualifications:
- Experience in the broking or financial services industry.
- Familiarity with data analytics tools and methodologies.
- Knowledge of user experience (UX) design principles.
- Experience with trading platforms or financial technology products.
This role is ideal for someone who combines strong technical expertise with a deep understanding of the broking industry and a passion for delivering high-impact software solutions.
• Experience in Core Java and Spring Boot.
• Extensive experience in developing enterprise-scale applications and systems. Should possess good architectural knowledge and be aware of enterprise application design patterns.
• Should have the ability to analyze, design, develop and test complex, low-latency client facing applications.
• Good development experience with RDBMS.
• Good knowledge of multi-threading and high-performance server-side development.
• Basic working knowledge of Unix/Linux.
• Excellent problem solving and coding skills.
• Strong interpersonal, communication and analytical skills.
• Should have the ability to express their design ideas and thoughts.
Job Description:
- 3+ years of experience in Functional testing with good foundation in technical expertise
- Experience in Capital Markets/Investment Banking domain is MUST
- Exposure to API testing tools like SoapUI and Postman
- Well versed with SQL
- Hands on experience in debugging issues using Unix commands
- Basic understanding of XML and JSON structures
- Knowledge of FitNesse is good to have
Location:
Pune/Mumbai
About Wissen Technology:
· The Wissen Group was founded in the year 2000. Wissen Technology, a part of Wissen Group, was established in the year 2015.
· Wissen Technology is a specialized technology company that delivers high-end consulting for organizations in the Banking & Finance, Telecom, and Healthcare domains. We help clients build world class products.
· Our workforce consists of 550+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like Wharton, MIT, IITs, IIMs, and NITs and with rich work experience in some of the biggest companies in the world.
· Wissen Technology has grown its revenues by 400% in these five years without any external funding or investments.
· Globally present with offices in the US, India, UK, Australia, Mexico, and Canada.
· We offer an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation.
· Wissen Technology has been certified as a Great Place to Work®.
· Wissen Technology has been voted as the Top 20 AI/ML vendor by CIO Insider in 2020.
· Over the years, Wissen Group has successfully delivered $650 million worth of projects for more than 20 of the Fortune 500 companies.
· We have served clients across sectors like Banking, Telecom, Healthcare, Manufacturing, and Energy. They include Morgan Stanley, MSCI, StateStreet, Flipkart, Swiggy, Trafigura, and GE, to name a few.
Website : www.wissen.com