
50+ SQL Jobs in Mumbai | SQL Job openings in Mumbai

Apply to 50+ SQL Jobs in Mumbai on CutShort.io. Explore the latest SQL Job opportunities across top companies like Google, Amazon & Adobe.

Tata Consultancy Services
Agency job
via Risk Resources LLP hyd by Jhansi Padiy
Chennai, Bengaluru (Bangalore), Hyderabad, Mumbai
4 - 10 yrs
₹6L - ₹25L / yr
IBM InfoSphere DataStage
ETL
Datawarehousing
SQL
IBM Datastage

DataStage Developer Job Description

A DataStage Developer is responsible for designing, developing, and implementing data integration solutions using IBM InfoSphere DataStage. Here's a brief overview:


Key Responsibilities

- Data Integration: Design and develop data integration jobs using DataStage to extract, transform, and load (ETL) data from various sources.

- Job Development: Develop and test DataStage jobs to meet business requirements.

- Data Transformation: Use DataStage transformations to cleanse, aggregate, and transform data.

- Performance Optimization: Optimize DataStage jobs for performance and scalability.

- Troubleshooting: Troubleshoot DataStage issues and resolve data integration problems.


Technical Skills

- DataStage: Strong understanding of IBM InfoSphere DataStage and its components.

- ETL: Experience with ETL concepts and data integration best practices.

- Data Transformation: Knowledge of data transformation techniques and DataStage transformations.

- SQL: Familiarity with SQL and database concepts.

- Data Modeling: Understanding of data modeling concepts and data warehousing.


Read more
QAgile Services

at QAgile Services

1 recruiter
Radhika Chotai
Posted by Radhika Chotai
Santacruz, Mumbai
3 - 5 yrs
₹3L - ₹7L / yr
ASP.NET
SQL
React.js
Angular (2+)
Oracle SQL Developer
+4 more

JD-

 

Job Title: Full Stack Developer (ASP.NET & MySQL)

Experience Required: 3 to 5 Years

Location: Kalina, Mumbai (Onsite)

Employment Type: Full-Time


About the Role:

We are looking for a skilled and proactive Full Stack Developer with 3 to 5 years of hands-on experience in ASP.NET and MySQL to join our dynamic team in Kalina, Mumbai. This is a fully onsite role, ideal for professionals who enjoy working in a collaborative office environment and are passionate about building scalable web applications.


Key Responsibilities:

  • Design, develop, test, and maintain web applications using ASP.NET (MVC/Web API).
  • Write clean, scalable, and efficient code for both front-end and back-end development.
  • Develop and manage MySQL databases, including design, optimization, and complex queries.
  • Collaborate with UI/UX designers, front-end developers, and other stakeholders to deliver seamless user experiences.


Required Skills & Qualifications:

  • 3 to 5 years of experience in full stack development.
  • Strong proficiency in ASP.NET (MVC, Web API) and C#.
  • Solid experience with MySQL database design, queries, and optimization.
  • Good understanding of front-end technologies such as HTML, CSS, JavaScript, and jQuery.
  • Familiarity with version control tools like Git.


Good to Have (Preferred):

  • Experience with front-end frameworks (e.g., Angular, React).
  • Knowledge of RESTful API development and integration.
  • Understanding of Agile methodologies.


Location & Work Mode:

  • Location: Kalina, Mumbai
  • Work Mode: Onsite (Work from Office)


Read more
Deqode

at Deqode

1 recruiter
purvisha Bhavsar
Posted by purvisha Bhavsar
Mumbai
2 - 5 yrs
₹4L - ₹10L / yr
Automation
Postman
SQL
API

Some specific Requirements:

  • Must have proficiency in Python
  • At least 3+ years of professional experience in software application development.
  • Good understanding of REST APIs and a solid experience in testing APIs.
  • Should have built APIs at some point, and practical knowledge on working with them
  • Must have experience in API testing tools like Postman and in setting up the prerequisites and post-execution validations using these tools
  • Ability to develop applications for test automation
  • Should be able to create fixtures, mock objects, and datasets that can be used by tests across different micro-services
  • Proficiency in Git; strong in writing SQL queries; experience with tools like Jira, Asana, or a similar bug-tracking tool, Confluence (wiki), and Jenkins (CI). A minimal example of such an API test with SQL validation follows below.
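
A minimal, hedged sketch of the kind of test described above: seed a prerequisite record, call an API, then validate the result in the backend database. The base URL, tables, and column names are illustrative assumptions, not details of the actual project, which would typically drive this from Postman or a Python test framework.

```python
# Sketch only: assumes a hypothetical service at BASE_URL and a seeded test
# database whose schema already contains customers/orders tables.
import requests
import sqlite3

BASE_URL = "https://api.example.com"  # placeholder service under test

def test_create_order_and_validate_in_db():
    # Prerequisite: seed a customer record the API call depends on.
    conn = sqlite3.connect("test.db")
    conn.execute("INSERT INTO customers (id, name) VALUES (1, 'Test User')")
    conn.commit()

    # Exercise the API (what a Postman request plus its tests script would do).
    resp = requests.post(f"{BASE_URL}/orders", json={"customer_id": 1, "amount": 250})
    assert resp.status_code == 201
    order_id = resp.json()["order_id"]

    # Post-execution validation: confirm the backend persisted the order.
    row = conn.execute(
        "SELECT amount FROM orders WHERE id = ?", (order_id,)
    ).fetchone()
    assert row is not None and row[0] == 250
    conn.close()
```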


Good to have:

  • Good understanding of CI/CD
  • Knowledge of queues, especially Kafka
  • Should have built an API test automation framework from scratch and maintained it
  • Knowledge of cloud platforms like AWS, Azure
  • Knowledge of different browsers and cross-platform operating systems
  • Knowledge of web programming, Docker, and 3-tier architecture is preferred.
  • Experience creating APIs and additional coding experience would be an added advantage.
  • Bachelor's degree in Computer Science / IT / Computer Applications


Read more
Envecon IT Systems
RADHIKA WADKAR
Posted by RADHIKA WADKAR
Mumbai
3 - 4 yrs
₹5L - ₹6L / yr
ASP.NET
ASP.NET MVC
C#
SQL
Platform as a Service (PaaS)
+1 more

  • Lead modules of software product development.
  • Development and continuous enhancement of product architecture and framework.
  • Catalyze the personal development plans for team management.
  • Managing quality and timely project deliveries.
  • Collaborate with internal teams to produce strong software design and architecture.
  • Develop project requirement roadmaps that outline the planned features and enhancements of products, prioritizing them based on customer and business needs.
  • Continuous cross-functional collaboration, especially coordination with other teams (Project Management, Implementation, Quality Assurance, etc.).
  • Debugging and fixing software issues in a timely manner.
  • Writing clean, efficient, and well-documented code.

Read more
Deqode

at Deqode

1 recruiter
Roshni Maji
Posted by Roshni Maji
Mumbai
2 - 4 yrs
₹4L - ₹11L / yr
Automation
SQL
API
skill iconPostman
E-Commerce

Job Title: QA Automation Engineer

Location: Andheri, Mumbai

Experience: 2 to 4 Years

Work Mode: 5 Days Work From Office (WFO)

Joining: Immediate Joiners Preferred


Job Overview:

We are looking for a proactive and detail-oriented QA Automation Engineer with 2 to 4 years of experience. The ideal candidate must have solid experience in automation testing, strong command over SQL, and hands-on proficiency in API testing using Postman. This is a full-time on-site role based in Andheri, Mumbai, and we are open to candidates from diverse project backgrounds as long as the core skills are strong.


Key Responsibilities:

  • Develop and maintain automated test cases for APIs and backend systems.
  • Perform end-to-end API validations using Postman including request construction, response validation, and chaining requests.
  • Write and optimize SQL queries to validate business logic and data integrity in backend systems (a sketch follows this list).
  • Collaborate closely with developers, business analysts, and QA team members to define test requirements.
  • Track, report, and follow up on bugs and issues using tools like JIRA or similar.
  • Ensure timely regression, smoke, and sanity testing before every release.
  • Participate in Agile ceremonies and contribute to improving testing processes and coverage.
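
As referenced above, SQL validation of business logic is often a reconciliation query that compares a stored aggregate against its source rows. The sketch below assumes hypothetical orders and order_items tables and uses SQLite only to keep the example self-contained; the real schema and database engine will differ.

```python
# Hedged sketch of a backend data-integrity check: find orders whose stored
# total disagrees with the sum of their line items.
import sqlite3

RECONCILIATION_SQL = """
SELECT o.id, o.total_amount, SUM(i.quantity * i.unit_price) AS computed_total
FROM orders o
JOIN order_items i ON i.order_id = o.id
GROUP BY o.id, o.total_amount
HAVING ABS(o.total_amount - SUM(i.quantity * i.unit_price)) > 0.01;
"""

def find_mismatched_orders(db_path: str):
    """Return orders that fail the total-amount consistency check."""
    with sqlite3.connect(db_path) as conn:
        return conn.execute(RECONCILIATION_SQL).fetchall()

if __name__ == "__main__":
    mismatches = find_mismatched_orders("qa_copy.db")  # placeholder test copy
    print(f"{len(mismatches)} orders fail the total-amount check")
```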


Mandatory Skills:

  • 2–4 years of experience in Automation Testing
  • Strong knowledge of SQL and ability to write complex queries
  • Practical, hands-on experience with Postman for API Testing
  • Solid understanding of software QA methodologies, tools, and processes
  • Familiarity with Agile development and continuous testing workflow


Nice to Have (Not Mandatory):

  • Experience in any programming language (e.g., Python, Java)
  • Exposure to test automation frameworks (e.g., PyTest, Selenium, RestAssured)
  • Familiarity with CI/CD tools like Jenkins or Git
  • Basic knowledge of performance testing tools like JMeter


Other Details:

  • Location: On-site at our Andheri office in Mumbai
  • Work Mode: Full-time | 5 days working (Monday to Friday)
  • Notice Period: Immediate joiners preferred
  • Candidate Preference: Only Mumbai-local candidates with relevant background will be considered
Read more
QAgile Services

at QAgile Services

1 recruiter
Radhika Chotai
Posted by Radhika Chotai
Navi Mumbai
5 - 9 yrs
₹9L - ₹16L / yr
Customer Relationship Management (CRM)
Business Analysis
Java
SQL
Agile management
+1 more

1. Experience with CRMNext or similar CRM platforms

2. Familiarity with SDLC and Agile methodologies

3. Ability to analyze business requirements and deliver effective solutions

4. Proficiency in Java or .NET

5. Experience with API integrations, ESB, and Node.js

6. Strong knowledge of SQL and Oracle databases

7. Understanding of system architecture, operating systems, and data structures

8. Excellent communication skills and a collaborative approach

9. Accountability for end-to-end project delivery


Preferred Profile:

Candidates with a Java development background and experience in CRM applications will be given preference.

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Praffull Shinde
Posted by Praffull Shinde
Pune, Mumbai, Bengaluru (Bangalore)
4 - 8 yrs
₹14L - ₹26L / yr
Python
PySpark
Django
Flask
RESTful APIs
+3 more

Job title - Python developer

Exp – 4 to 6 years

Location – Pune / Mumbai / Bengaluru

 

Please find the JD below.

Requirements:

  • Proven experience as a Python Developer
  • Strong knowledge of core Python and PySpark concepts
  • Experience with web frameworks such as Django or Flask
  • Good exposure to any cloud platform (GCP preferred)
  • CI/CD exposure required
  • Solid understanding of RESTful APIs and how to build them
  • Experience working with databases like Oracle DB and MySQL
  • Ability to write efficient SQL queries and optimize database performance
  • Strong problem-solving skills and attention to detail
  • Strong SQL programming (stored procedures, functions)
  • Excellent communication and interpersonal skills

Roles and Responsibilities

  • Design, develop, and maintain data pipelines and ETL processes using PySpark (see the sketch after this list)
  • Work closely with data scientists and analysts to provide them with clean, structured data.
  • Optimize data storage and retrieval for performance and scalability.
  • Collaborate with cross-functional teams to gather data requirements.
  • Ensure data quality and integrity through data validation and cleansing processes.
  • Monitor and troubleshoot data-related issues to ensure data pipeline reliability.
  • Stay up to date with industry best practices and emerging technologies in data engineering.
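
As a rough illustration of the PySpark responsibilities above, here is a minimal extract-cleanse-aggregate-load job. The input path, column names, and output location are placeholders, not the actual pipeline.

```python
# Minimal PySpark ETL sketch: read raw data, clean it, aggregate, write curated output.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_sales_etl").getOrCreate()

# Extract: raw CSV landed by an upstream system (hypothetical path).
raw = spark.read.option("header", True).csv("/data/raw/sales/")

# Transform: basic cleansing plus a daily aggregate.
clean = (
    raw.dropna(subset=["order_id", "amount"])
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)
)
daily = clean.groupBy("order_date").agg(F.sum("amount").alias("total_amount"))

# Load: partitioned Parquet for analysts and downstream consumers.
daily.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/daily_sales/")
```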


Read more
KreditVenture

KreditVenture

Agency job
via Pluginlive by Harsha Saggi
Chembur, Mumbai
3 - 9 yrs
₹7.5L - ₹8.5L / yr
JIRA
JIRA Agile
HP ALM
Selenium
SQL
+1 more

Job Title: Sr. / Quality Assurance Engineer – LOS/LMS

Location: Mumbai (Chembur)

Company: KreditVenture

CTC: Up to ₹8.5 LPA

Working Mode: Full-Time

Experience Required: 3–5 Years

Preference: Immediate Joiners or Candidates Serving Notice

Company Overview:

KreditVenture is a growing financial technology company focused on delivering innovative lending solutions. The organization develops and manages digital platforms for Loan Origination (LOS) and Loan Management Systems (LMS), targeting products such as Loan Against Property (LAP) and Commercial Vehicle Loans.

Job Summary:

We are looking for a Quality Assurance Engineer with experience in Loan Origination Systems (LOS) and Loan Management Systems (LMS) to test and ensure high-quality software delivery across the lending lifecycle. The ideal candidate will have hands-on experience in functional testing, especially in financial services or lending platforms, and will ensure systems comply with regulatory standards and business needs.

Key Responsibilities:

Functional Testing

  • Design and execute test cases for LOS and LMS functionalities
  • Perform end-to-end testing of loan lifecycle: application → underwriting → disbursement → servicing → closure
  • Validate calculations for interest rates, EMIs, fees, and schedules

Test Case Development & Execution

  • Develop test scripts/scenarios for LAP and CV loans
  • Execute and maintain manual and regression tests

Defect Tracking

  • Log, track, and retest bugs using tools like JIRA or Bugzilla
  • Collaborate with developers to ensure resolution and closure of defects

Stakeholder Collaboration

  • Work with Business Analysts, Developers, and Product Managers
  • Participate in sprint planning and requirement reviews

Compliance Testing

  • Ensure system compliance with financial regulations, disclosures, and reporting standards

Reporting & Documentation

  • Prepare test reports, defect logs, test coverage, and documentation for knowledge sharing

UAT Support

  • Assist in User Acceptance Testing (UAT) and help validate business readiness

Required Skills & Qualifications:

Education:

  • Bachelor’s degree in Computer Science, Information Technology, Finance, or equivalent

Experience:

  • 3–5 years of experience in functional testing, especially in lending/loan systems
  • Strong knowledge of Loan Against Property (LAP) and Commercial Vehicle Loans

Technical Skills:

  • Hands-on experience with testing tools: JIRA, TestRail, Selenium, HP ALM, QTest
  • Working knowledge of SQL for database validation
  • Understanding of LOS/LMS architecture and APIs

Testing Methodologies:

  • Strong experience with Functional, Regression, Integration, and UAT Testing
  • Familiarity with Agile/Scrum and SDLC best practices

Soft Skills:

  • Strong attention to detail, communication, and time management
  • Collaborative team player with problem-solving mindset

Preferred Certifications (Optional):

  • ISTQB or equivalent testing certifications
  • Certifications in finance, banking, or loan platforms are a plus

Why Join KreditVenture?

  • Join a mission-driven company focused on financial inclusion and innovation
  • Work on cutting-edge digital lending platforms with real-world impact
  • Career growth and professional development opportunities


Read more
KreditVenture

KreditVenture

Agency job
via Pluginlive by Harsha Saggi
Mumbai
3 - 6 yrs
₹8L - ₹9L / yr
JIRA
HP ALM
Selenium
SQL
Agile/Scrum
+1 more

Job Title: Sr. / Quality Assurance Engineer – LOS/LMS

Location: Mumbai (Chembur)

Company: KreditVenture

CTC: Up to ₹8.5 LPA

Working Mode: Full-Time

Experience Required: 3–5 Years

Preference: Immediate Joiners or Candidates Serving Notice

Company Overview:

KreditVenture is a growing financial technology company focused on delivering innovative lending solutions. The organization develops and manages digital platforms for Loan Origination (LOS) and Loan Management Systems (LMS), targeting products such as Loan Against Property (LAP) and Commercial Vehicle Loans.

Job Summary:

We are looking for a Quality Assurance Engineer with experience in Loan Origination Systems (LOS) and Loan Management Systems (LMS) to test and ensure high-quality software delivery across the lending lifecycle. The ideal candidate will have hands-on experience in functional testing, especially in financial services or lending platforms, and will ensure systems comply with regulatory standards and business needs.

Key Responsibilities:

Functional Testing

  • Design and execute test cases for LOS and LMS functionalities
  • Perform end-to-end testing of loan lifecycle: application → underwriting → disbursement → servicing → closure
  • Validate calculations for interest rates, EMIs, fees, and schedules

Test Case Development & Execution

  • Develop test scripts/scenarios for LAP and CV loans
  • Execute and maintain manual and regression tests

Defect Tracking

  • Log, track, and retest bugs using tools like JIRA or Bugzilla
  • Collaborate with developers to ensure resolution and closure of defects

Stakeholder Collaboration

  • Work with Business Analysts, Developers, and Product Managers
  • Participate in sprint planning and requirement reviews

Compliance Testing

  • Ensure system compliance with financial regulations, disclosures, and reporting standards

Reporting & Documentation

  • Prepare test reports, defect logs, test coverage, and documentation for knowledge sharing

UAT Support

  • Assist in User Acceptance Testing (UAT) and help validate business readiness

Required Skills & Qualifications:

Education:

  • Bachelor’s degree in Computer Science, Information Technology, Finance, or equivalent

Experience:

  • 3–5 years of experience in functional testing, especially in lending/loan systems
  • Strong knowledge of Loan Against Property (LAP) and Commercial Vehicle Loans

Technical Skills:

  • Hands-on experience with testing tools: JIRA, TestRail, Selenium, HP ALM, QTest
  • Working knowledge of SQL for database validation
  • Understanding of LOS/LMS architecture and APIs

Testing Methodologies:

  • Strong experience with Functional, Regression, Integration, and UAT Testing
  • Familiarity with Agile/Scrum and SDLC best practices

Soft Skills:

  • Strong attention to detail, communication, and time management
  • Collaborative team player with problem-solving mindset

Preferred Certifications (Optional):

  • ISTQB or equivalent testing certifications
  • Certifications in finance, banking, or loan platforms are a plus

Why Join KreditVenture?

  • Join a mission-driven company focused on financial inclusion and innovation
  • Work on cutting-edge digital lending platforms with real-world impact
  • Career growth and professional development opportunities
Read more
Gruve
Nikita Sinha
Posted by Nikita Sinha
Mumbai, Pune
5 - 10 yrs
Up to ₹22L / yr (varies)
React.js
NextJs (Next.js)
WordPress
PHP
HTML/CSS
+4 more

We are seeking an experienced WordPress Developer with expertise in both frontend and backend development. The ideal candidate will have a deep understanding of headless WordPress architecture, where the backend is managed with WordPress, and the frontend is built using React.js (or Next.js). The developer should follow best coding practices to ensure the website is secure, high-performing, scalable, and fully responsive. 


Key Responsibilities: 

Backend Development (WordPress): 

  • Develop and maintain a headless WordPress CMS to serve content via REST API / GraphQL. 
  • Create custom WordPress plugins and themes to optimize content delivery. 
  • Ensure secure authentication and role-based access for API endpoints. 
  • Optimize WordPress database queries for better performance. 

Frontend Development (React.js / Next.js): 

  • Build a decoupled frontend using React.js (or Next.js) that fetches content from WordPress. 
  • Experience with Figma for translating UI/UX designs to code. 
  • Ensure seamless integration of frontend with WordPress APIs. 
  • Implement modern UI/UX principles to create responsive, fast-loading web pages. 

Code quality, Performance & Security Optimization: 

  • Optimize website speed using caching, lazy loading, and CDN integration. 
  • Ensure the website follows SEO best practices and is mobile-friendly. 
  • Implement security best practices to prevent vulnerabilities such as SQL injection, XSS, and CSRF. 
  • Write clean, maintainable, and well-documented code following industry standards. 
  • Implement version control using Git/GitHub/GitLab. 
  • Conduct regular code reviews and debugging to ensure a high-quality product. 

Collaboration & Deployment: 

  • Work closely with designers, content teams, and project managers. 
  • Deploy and manage WordPress and frontend code in staging and production environments. 
  • Monitor website performance and implement improvements. 

Required Skills & Qualifications: 

  • B.E/B. Tech Degree, Master’s Degree required
  • Experience: 6 – 8 Years
  • Strong experience in React.js / Next.js for building frontend applications. 
  • Proficiency in JavaScript (ES6+), TypeScript, HTML5, CSS3, and TailwindCSS.
  • Familiarity with SSR (Server Side Rendering) and SSG (Static Site Generation). 
  • Experience in WordPress development (PHP, MySQL, WP REST API, GraphQL). 
  • Experience with ACF (Advanced Custom Fields), Custom Post Types, WP Headless CMS
  • Strong knowledge of WordPress security, database optimization, and caching techniques. 

Why Join Us:

  • Competitive salary and benefits package.
  • Work in a dynamic, collaborative, and creative environment.
  • Opportunity to lead and influence design decisions across various platforms.
  • Professional development opportunities and career growth potential.


Read more
Intellikart Ventures LLP
ramandeep intellikart
Posted by ramandeep intellikart
Mumbai
3 - 5 yrs
₹18L - ₹19L / yr
SQL
Spark
Data modeling
Windows Azure
Data Analytics
+1 more

Location: Mumbai

Job Type: Full-Time (Hybrid – 3 days in office, 2 days WFH)


Job Overview:

We are looking for a skilled Azure Data Engineer with strong experience in data modeling, pipeline development, and SQL/Spark expertise. The ideal candidate will work closely with the Data Analytics & BI teams to implement robust data solutions on Azure Synapse and ensure seamless data integration with third-party applications.


Key Responsibilities:

  • Design, develop, and maintain Azure data pipelines using Azure Synapse (SQL dedicated pools or Apache Spark pools); a minimal Spark-pool sketch follows this list.
  • Implement data models in collaboration with the Data Analytics and BI teams.
  • Optimize and manage large-scale SQL and Spark-based data processing solutions.
  • Ensure data availability and reliability for third-party application consumption.
  • Collaborate with cross-functional teams to translate business requirements into scalable data solutions.
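
For orientation, the sketch below shows the kind of PySpark transformation that might run in an Azure Synapse Apache Spark pool: read raw data from ADLS, aggregate it, and write a curated output for BI consumption. The storage account, containers, and column names are assumptions for illustration only.

```python
# Hedged sketch of a Synapse Spark pool job against placeholder ADLS paths.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("synapse_orders_pipeline").getOrCreate()

source = "abfss://raw@examplelake.dfs.core.windows.net/orders/"       # placeholder
target = "abfss://curated@examplelake.dfs.core.windows.net/orders_daily/"  # placeholder

orders = spark.read.parquet(source)

daily = (
    orders.filter(F.col("status") == "COMPLETED")
          .groupBy("order_date", "region")
          .agg(F.sum("amount").alias("revenue"), F.count("*").alias("order_count"))
)

# Partitioned output that a dedicated SQL pool or Power BI can consume.
daily.write.mode("overwrite").partitionBy("order_date").parquet(target)
```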


Required Skills & Experience:

3–5 years of hands-on experience in:

  • Azure data services
  • Data Modeling
  • SQL development and tuning
  • Apache Spark
  • Strong knowledge of Azure Synapse Analytics.
  • Experience in designing data pipelines and ETL/ELT processes.
  • Ability to troubleshoot and optimize complex data workflows.


Preferred Qualifications:

  • Experience with data governance, security, and data quality practices.
  • Familiarity with DevOps practices in a data engineering context.
  • Effective communication skills and the ability to work in a collaborative team environment.
Read more
Tata Consultancy Services
Agency job
via Risk Resources LLP hyd by Jhansi Padiy
Anywhere in India; Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, Indore, Kolkata
5 - 11 yrs
₹6L - ₹30L / yr
Snowflake
Python
PySpark
SQL

Role descriptions / Expectations from the Role

  • 6-7 years of IT development experience, with a minimum of 3 years of hands-on experience in Snowflake
  • Strong experience in building/designing data warehouses or data lakes, with end-to-end data mart implementation experience focused on large enterprise-scale Snowflake implementations on any of the hyperscalers
  • Strong experience building productionized data ingestion and data pipelines in Snowflake
  • Good knowledge of Snowflake's architecture and features like Zero-Copy Cloning, Time Travel, and performance tuning capabilities (a brief illustration follows this list)
  • Good experience with Snowflake RBAC and data security
  • Strong experience with Snowflake features, including newly released ones
  • Good experience in Python/PySpark
  • Experience with AWS services (S3, Glue, Lambda, Secrets Manager, DMS) and a few Azure services (Blob Storage, ADLS, ADF)
  • Experience/knowledge of orchestration and scheduling tools like Airflow
  • Good understanding of ETL/ELT processes and ETL tools
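
As referenced above, Zero-Copy Cloning and Time Travel are plain SQL statements in Snowflake; the hedged sketch below runs them through the snowflake-connector-python package. Connection parameters and object names are placeholders, not the actual environment.

```python
# Illustration of two Snowflake features named above: Zero-Copy Cloning and Time Travel.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",   # placeholders
    warehouse="ETL_WH", database="ANALYTICS", schema="PUBLIC",
)
cur = conn.cursor()

# Zero-Copy Clone: instant, storage-free copy of a table for dev/testing.
cur.execute("CREATE OR REPLACE TABLE orders_dev CLONE orders")

# Time Travel: query the table as it looked one hour ago.
cur.execute("SELECT COUNT(*) FROM orders AT(OFFSET => -3600)")
print("row count one hour ago:", cur.fetchone()[0])

cur.close()
conn.close()
```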

Read more
Fundlyai

at Fundlyai

2 candid answers
2 products
Reshika Mendiratta
Posted by Reshika Mendiratta
Mumbai
7+ yrs
Up to ₹40L / yr (varies)
Java
Spring Boot
Spring MVC
Hibernate (Java)
SQL
+1 more

About Fundly

Fundly is building a retailer-centric Pharma Digital Supply Chain Finance platform and Marketplace for over 10 million pharma retailers in India.

  • Founded by seasoned professionals with a combined experience of 30+ years
  • Team of 60+ across 12 cities, built in less than 2 years
  • AUM of ₹100+ crores
  • USD 3M raised in venture capital

Opportunity at Fundly

  • Be part of a fast-growing, technology-first fintech revolutionizing the pharma supply chain
  • Work on a retailer-focused ecosystem with:
  • 3,000+ retailers onboarded
  • 36,000+ transactions
  • ₹200+ crores disbursed in the last 2 years
  • Join a high-ownership culture where your impact is visible
  • Influence the product and technology roadmap
  • Be an early team member with leadership responsibilities

Who You Are

  • Passionate about solving problems – technical, business, or people-related
  • Naturally take ownership and accountability
  • Experienced in leading engineering teams (4-5 developers)
  • Strong grasp of software development lifecycle (SDLC), architecture, and DevOps
  • Enjoy shipping high-quality, scalable products at speed

Responsibilities

  • Be hands-on: ship clean, efficient, and scalable code quickly
  • Plan, design, lead, execute, and deploy technical solutions
  • Drive architectural and design decisions
  • Create and maintain high- and low-level architecture documentation
  • Improve and maintain existing systems; manage technical debt
  • Mentor and coach other developers across technologies and practices
  • Promote and uphold engineering best practices like planning, estimation, documentation, and code reviews

Qualifications

  • 6+ years of hands-on development experience with Java, Spring Boot, Spring MVC, Hibernate, Play
  • Hands on experience in SQL and NoSQL databases: Postgres, MongoDB, ElasticSearch, Redis
  • Strong DevOps exposure and experience with related tools
  • Proven experience leading a team of 4–5 engineers
  • In-depth understanding of Architectural Design Patterns
Read more
Wissen Technology

at Wissen Technology

4 recruiters
Rutuja Patil
Posted by Rutuja Patil
Mumbai
5 - 12 yrs
Best in industry
Java
J2EE
Spring
Hibernate (Java)
Multithreading
+3 more

Company Name – Wissen Technology

Group of companies in India – Wissen Technology & Wissen Infotech

Work Location - Bangalore/Mumbai


Java Developer – Job Description


Wissen Technology is now hiring for a Java Developer - Mumbai with hands-on experience in Core Java, algorithms, data structures, multithreading and SQL.


We are solving complex technical problems in the industry and need talented software engineers to join our mission and be a part of a global software development team.


A brilliant opportunity to become a part of a highly motivated and expert team which has made a mark as a high-end technical consulting firm.


Required Skills:


  • Exp. - 5+ years.
  • Experience in Core Java and Spring Boot.
  • Extensive experience in developing enterprise-scale applications and systems. Should possess good architectural knowledge and be aware of enterprise application design patterns.
  • Should have the ability to analyze, design, develop and test complex, low-latency client-facing applications.
  • Good development experience with RDBMS.
  • Good knowledge of multi-threading and high-performance server-side development.
  • Basic working knowledge of Unix/Linux.
  • Excellent problem solving and coding skills.
  • Strong interpersonal, communication and analytical skills.
  • Should have the ability to express their design ideas and thoughts.


About Wissen Technology:


Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom and Startups. Wissen Technology is a part of Wissen Group and was established in the year 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 Billion for more than 25 of the Fortune 500 companies. The Wissen Group overall includes more than 4000 highly skilled professionals.


Wissen Technology provides exceptional value in mission critical projects for its clients, through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’.


Our team consists of 1200+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like Wharton, MIT, IITs, IIMs, and NITs and with rich work experience in some of the biggest companies in the world.


Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation.


We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted as the Top 20 AI/ML vendor by CIO Insider.



Read more
Wissen Technology

at Wissen Technology

4 recruiters
VenkataRamanan S
Posted by VenkataRamanan S
Mumbai
4 - 8 yrs
₹15L - ₹25L / yr
Python
SQL
ETL

What We’re Looking For:

  • Strong experience in Python (3+ years).
  • Hands-on experience with any database (SQL or NoSQL).
  • Experience with frameworks like Flask, FastAPI, or Django.
  • Knowledge of ORMs, API development, and unit testing (see the sketch below).
  • Familiarity with Git and Agile methodologies.
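
Tying the items above together, here is a minimal sketch of an API endpoint with a unit test, using FastAPI (one of the frameworks listed) and SQLite purely to keep it self-contained. The endpoint, table, and database file are placeholders.

```python
# Hedged sketch: small FastAPI service backed by a SQL query, plus a unit test.
import sqlite3
from fastapi import FastAPI, HTTPException
from fastapi.testclient import TestClient

app = FastAPI()

def init_db():
    with sqlite3.connect("app.db") as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, name TEXT)")

@app.get("/users/{user_id}")
def get_user(user_id: int):
    with sqlite3.connect("app.db") as conn:
        row = conn.execute(
            "SELECT id, name FROM users WHERE id = ?", (user_id,)
        ).fetchone()
    if row is None:
        raise HTTPException(status_code=404, detail="user not found")
    return {"id": row[0], "name": row[1]}

def test_missing_user_returns_404():
    init_db()
    client = TestClient(app)
    assert client.get("/users/999999").status_code == 404
```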


Read more
Bengaluru (Bangalore), Mumbai, Gurugram, Pune, Hyderabad, Chennai, Kolkata
3 - 8 yrs
₹5L - ₹20L / yr
Oracle Analytics Cloud (OAC)
Fusion Data Intelligence (FDI) Specialist
RPD
OAC Reports
Data Visualization
+7 more

Job Title : Oracle Analytics Cloud (OAC) / Fusion Data Intelligence (FDI) Specialist

Experience : 3 to 8 years

Location : All USI locations – Hyderabad, Bengaluru, Mumbai, Gurugram (preferred) and Pune, Chennai, Kolkata

Work Mode : Hybrid Only (2-3 days from office or all 5 days from office)


Mandatory Skills : Oracle Analytics Cloud (OAC), Fusion Data Intelligence (FDI), RPD, OAC Reports, Data Visualizations, SQL, PL/SQL, Oracle Databases, ODI, Oracle Cloud Infrastructure (OCI), DevOps tools, Agile methodology.


Key Responsibilities :

  • Design, develop, and maintain solutions using Oracle Analytics Cloud (OAC).
  • Build and optimize complex RPD models, OAC reports, and data visualizations.
  • Utilize SQL and PL/SQL for data querying and performance optimization.
  • Develop and manage applications hosted on Oracle Cloud Infrastructure (OCI).
  • Support Oracle Cloud migrations, OBIEE upgrades, and integration projects.
  • Collaborate with teams using the ODI (Oracle Data Integrator) tool for ETL processes.
  • Implement cloud scripting using CURL for Oracle Cloud automation.
  • Contribute to the design and implementation of Business Continuity and Disaster Recovery strategies for cloud applications.
  • Participate in Agile development processes and DevOps practices including CI/CD and deployment orchestration.

Required Skills :

  • Strong hands-on expertise in Oracle Analytics Cloud (OAC) and/or Fusion Data Intelligence (FDI).
  • Deep understanding of data modeling, reporting, and visualization techniques.
  • Proficiency in SQL, PL/SQL, and relational databases on Oracle.
  • Familiarity with DevOps tools, version control, and deployment automation.
  • Working knowledge of Oracle Cloud services, scripting, and monitoring.

Good to Have :

  • Prior experience in OBIEE to OAC migrations.
  • Exposure to data security models and cloud performance tuning.
  • Certification in Oracle Cloud-related technologies.
Read more
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Bengaluru (Bangalore), Mumbai, Gurugram, Pune, Hyderabad, Chennai
3 - 6 yrs
₹5L - ₹20L / yr
IBM Sterling Integrator Developer
IBM Sterling B2B Integrator
Shell Scripting
Python
SQL
+1 more

Job Title : IBM Sterling Integrator Developer

Experience : 3 to 5 Years

Locations : Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, Pune

Employment Type : Full-Time


Job Description :

We are looking for a skilled IBM Sterling Integrator Developer with 3–5 years of experience to join our team across multiple locations.

The ideal candidate should have strong expertise in IBM Sterling and integration, along with scripting and database proficiency.

Key Responsibilities :

  • Develop, configure, and maintain IBM Sterling Integrator solutions.
  • Design and implement integration solutions using IBM Sterling.
  • Collaborate with cross-functional teams to gather requirements and provide solutions.
  • Work with custom languages and scripting to enhance and automate integration processes.
  • Ensure optimal performance and security of integration systems.

Must-Have Skills :

  • Hands-on experience with IBM Sterling Integrator and associated integration tools.
  • Proficiency in at least one custom scripting language.
  • Strong command over Shell scripting, Python, and SQL (mandatory).
  • Good understanding of EDI standards and protocols is a plus.

Interview Process :

  • 2 Rounds of Technical Interviews.

Additional Information :

  • Open to candidates from Hyderabad, Bangalore, Mumbai, Gurgaon, Chennai, and Pune.
Read more
Deqode

at Deqode

1 recruiter
Alisha Das
Posted by Alisha Das
Bengaluru (Bangalore), Mumbai, Pune, Chennai, Gurugram
5.6 - 7 yrs
₹10L - ₹28L / yr
Amazon Web Services (AWS)
Python
PySpark
SQL

Job Summary:

As an AWS Data Engineer, you will be responsible for designing, developing, and maintaining scalable, high-performance data pipelines using AWS services. With 6+ years of experience, you’ll collaborate closely with data architects, analysts, and business stakeholders to build reliable, secure, and cost-efficient data infrastructure across the organization.

Key Responsibilities:

  • Design, develop, and manage scalable data pipelines using AWS Glue, Lambda, and other serverless technologies
  • Implement ETL workflows and transformation logic using PySpark and Python on AWS Glue (a minimal job sketch follows this list)
  • Leverage AWS Redshift for warehousing, performance tuning, and large-scale data queries
  • Work with AWS DMS and RDS for database integration and migration
  • Optimize data flows and system performance for speed and cost-effectiveness
  • Deploy and manage infrastructure using AWS CloudFormation templates
  • Collaborate with cross-functional teams to gather requirements and build robust data solutions
  • Ensure data integrity, quality, and security across all systems and processes
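
For context, a Glue ETL job of the kind described above usually follows the standard Glue PySpark skeleton sketched below. The catalog database, table, and S3 bucket are placeholders, not the organization's actual resources.

```python
# Minimal AWS Glue (PySpark) job sketch with placeholder catalog and S3 names.
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract from the Glue Data Catalog (hypothetical database/table).
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="raw_zone", table_name="transactions"
)

# Transform with plain Spark, then load curated Parquet to S3 for Redshift/Athena.
df = dyf.toDF().filter(F.col("amount") > 0)
df.write.mode("overwrite").parquet("s3://example-curated-bucket/transactions/")

job.commit()
```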

Required Skills & Experience:

  • 6+ years of experience in Data Engineering with strong AWS expertise
  • Proficient in Python and PySpark for data processing and ETL development
  • Hands-on experience with AWS Glue, Lambda, DMS, RDS, and Redshift
  • Strong SQL skills for building complex queries and performing data analysis
  • Familiarity with AWS CloudFormation and infrastructure as code principles
  • Good understanding of serverless architecture and cost-optimized design
  • Ability to write clean, modular, and maintainable code
  • Strong analytical thinking and problem-solving skills


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Sruthy VS
Posted by Sruthy VS
Bengaluru (Bangalore), Mumbai
4 - 8 yrs
Best in industry
Snowflake schema
ETL
SQL
Python
  • Strong Snowflake Cloud database development experience.
  • Knowledge of Spark and Databricks is desirable.
  • Strong technical background in data modelling, database design, and optimization for data warehouses, specifically on column-oriented MPP architectures.
  • Familiar with technologies relevant to data lakes, such as Snowflake.
  • Candidate should have strong ETL and database design/modelling skills.
  • Experience creating data pipelines.
  • Strong SQL skills, debugging knowledge, and performance tuning experience.
  • Experience with Databricks / Azure is an add-on (good to have).
  • Experience working with global teams and global application environments.
  • Strong understanding of SDLC methodologies with a track record of high-quality deliverables and data quality, including detailed technical design documentation, is desired.

 

 

 

Read more
Deqode

at Deqode

1 recruiter
Roshni Maji
Posted by Roshni Maji
Mumbai
2 - 4 yrs
₹4L - ₹10L / yr
TestNG
Selenium
RESTful APIs
Automation
SQL
+3 more

Position: QA Automation Engineer

Location: Mumbai, India

Experience: 2+ Years

Type: Full-Time

Company: Deqode

Overview:

Deqode is seeking a skilled QA Automation Engineer with a passion for quality, automation, and robust testing practices. This role is ideal for professionals from product-based software development companies who have worked on e-commerce platforms.

Required Skills:

  • Proficiency in Selenium (Grid, parallel execution), TestNG.
  • Experience with API testing (Postman, RestAssured).
  • Strong SQL and backend data validation.
  • Experience using Git, Jira, Asana.
  • Familiarity with Jenkins, Confluence.
  • Understanding of cross-browser testing.
  • Exposure to Mocha/Chai for frontend/backend automation is a plus.
  • Eye for design and UX consistency.
  • Strong written and verbal communication.

Preferred Background:

  • Must be from a product-based software development company.
  • Prior experience in e-commerce projects is a major plus.
  • Ability to work on time-critical and fast-paced projects.


Key Responsibilities:

  • Design and maintain automated test frameworks for web and API applications.
  • Perform manual and automated tests to ensure product quality.
  • Build and execute test cases using Selenium with TestNG.
  • Conduct comprehensive REST API testing.
  • Write and optimize complex SQL queries for test data validation.
  • Use Jira/Asana for issue tracking and documentation.
  • Collaborate using Confluence for test documentation.
  • Execute tests across multiple browsers and operating systems.
  • Participate in Agile processes like sprint planning and retrospectives.
  • Identify and troubleshoot issues during testing.
  • Maintain CI pipelines using Jenkins.


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Vishakha Walunj
Posted by Vishakha Walunj
Bengaluru (Bangalore), Pune, Mumbai
7 - 12 yrs
Best in industry
PySpark
Databricks
SQL
Python

Required Skills:

  • Hands-on experience with Databricks, PySpark
  • Proficiency in SQL, Python, and Spark.
  • Understanding of data warehousing concepts and data modeling.
  • Experience with CI/CD pipelines and version control (e.g., Git).
  • Fundamental knowledge of any cloud services, preferably Azure or GCP.


Good to Have:

  • BigQuery
  • Experience with performance tuning and data governance.


Read more
Enpointeio
sanath shetty
Posted by sanath shetty
Mumbai
2 - 5 yrs
₹8L - ₹12L / yr
NodeJS (Node.js)
TypeScript
SQL
Amazon Web Services (AWS)

Position Overview

We're seeking a skilled Full Stack Developer to build and maintain scalable web applications using modern technologies. You'll work across the entire development stack, from database design to user interface implementation.


Key Responsibilities

  • Develop and maintain full-stack web applications using Node.js and TypeScript
  • Design and implement RESTful APIs and microservices
  • Build responsive, user-friendly front-end interfaces
  • Design and optimize SQL databases and write efficient queries
  • Collaborate with cross-functional teams on feature development
  • Participate in code reviews and maintain high code quality standards
  • Debug and troubleshoot application issues across the stack

Required Skills

  • Backend: 3+ years experience with Node.js and TypeScript
  • Database: Proficient in SQL (PostgreSQL, MySQL, or similar)
  • Frontend: Experience with modern JavaScript frameworks (React, Vue, or Angular)
  • Version Control: Git and collaborative development workflows
  • API Development: RESTful services and API design principles

Preferred Qualifications

  • Experience with cloud platforms (AWS, Azure, or GCP)
  • Knowledge of containerization (Docker)
  • Familiarity with testing frameworks (Jest, Mocha, or similar)
  • Understanding of CI/CD pipelines

What We Offer

  • Competitive salary and benefits
  • Flexible work arrangements
  • Professional development opportunities
  • Collaborative team environment


Read more
TCS

TCS

Agency job
via Risk Resources LLP hyd by Jhansi Padiy
Mumbai, Pune, Chennai
4 - 8 yrs
₹6L - ₹20L / yr
Marketing Campaign
SAS
Teradata
SQL

 

Required Technical Skill Set: Teradata with Marketing Campaign knowledge and SAS

Desired Competencies (Technical/Behavioral Competency)

Must-Have

1. Advanced coding skills in Teradata SQL and SAS is required

2. Experience with customer segmentation, marketing optimization, and marketing automation. Thorough understanding of customer contact management principles

3. Design and execution of campaign on consumer and business products using Teradata communication manager and inhouse tools

4. Analyzing effectiveness of various campaigns by doing necessary analysis to add insights and improve future campaigns

5. Timely resolution of Marketing team queries and other ad-hoc requests

Good-to-Have

1. Awareness of CRM tools & process, automation

2. Knowledge of commercial databases preferable

3. People & team management skills

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Rutuja Patil
Posted by Rutuja Patil
Mumbai
5 - 12 yrs
Best in industry
Java
Spring Boot
Hibernate
Java Architecture for XML Binding (JAXB)
Problem solving
+5 more

Experience: 5+ years

Location: Mumbai - Andheri Marol

Company: Wissen Technology

Website: www.wissen.com


Job Description:


Wissen Technology is currently hiring experienced Full Stack Java Developers with strong expertise in Core Java, Spring Boot, front-end technologies (Angular/React), REST APIs, multithreading, data structures, and SQL.


You will be working on enterprise-grade solutions as part of a global development team tackling complex problems in domains like Banking and Finance.


This is an excellent opportunity to be part of a high-caliber, technically strong team delivering impactful solutions for Fortune 500 companies.


Key Responsibilities:


  • Develop and maintain scalable, high-performance applications with a focus on both backend (Java/Spring Boot) and frontend (Angular/React).
  • Participate in all phases of the development lifecycle – design, coding, testing, deployment, and maintenance.
  • Work closely with cross-functional teams to understand requirements and deliver quality solutions.
  • Optimize application performance and ensure responsiveness across platforms.
  • Write clean, maintainable, and well-documented code.


Required Skills:


  • 5+ years of hands-on experience in Java, Spring Boot.
  • Strong proficiency in frontend frameworks like Angular or React.
  • Experience with RESTful APIs, microservices, and web technologies (HTML, CSS, JavaScript).
  • Sound knowledge of data structures, algorithms, and multithreading.
  • Experience in developing and deploying enterprise-grade applications.
  • Solid understanding of RDBMS (e.g., Oracle, MySQL, PostgreSQL).
  • Exposure to DevOps practices, version control (Git), and CI/CD pipelines.
  • Familiarity with Unix/Linux environments.
  • Strong problem-solving, communication, and analytical skills.


Good to Have:

  • Exposure to cloud platforms like AWS, Azure, or GCP.
  • Understanding of containerization using Docker/Kubernetes.
  • Knowledge of Agile methodologies.



About Wissen Technology:


Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom and Startups. Wissen Technology is a part of Wissen Group and was established in the year 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 Billion for more than 25 of the Fortune 500 companies. The Wissen Group overall includes more than 4000 highly skilled professionals.


Wissen Technology provides exceptional value in mission critical projects for its clients, through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’.


Our team consists of 1200+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like Wharton, MIT, IITs, IIMs, and NITs and with rich work experience in some of the biggest companies in the world.


Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation.


We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted as the Top 20 AI/ML vendor by CIO Insider.



Read more
Deqode

at Deqode

1 recruiter
Mokshada Solanki
Posted by Mokshada Solanki
Bengaluru (Bangalore), Mumbai, Pune, Gurugram
4 - 5 yrs
₹4L - ₹20L / yr
SQL
Amazon Web Services (AWS)
Migration
PySpark
ETL

Job Summary:

Seeking a seasoned SQL + ETL Developer with 4+ years of experience in managing large-scale datasets and cloud-based data pipelines. The ideal candidate is hands-on with MySQL, PySpark, AWS Glue, and ETL workflows, with proven expertise in AWS migration and performance optimization.


Key Responsibilities:

  • Develop and optimize complex SQL queries and stored procedures to handle large datasets (100+ million records); a batching sketch follows this list.
  • Build and maintain scalable ETL pipelines using AWS Glue and PySpark.
  • Work on data migration tasks in AWS environments.
  • Monitor and improve database performance; automate key performance indicators and reports.
  • Collaborate with cross-functional teams to support data integration and delivery requirements.
  • Write shell scripts for automation and manage ETL jobs efficiently.
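
One common way to keep queries against very large tables manageable, as the first responsibility above implies, is keyset pagination on an indexed key rather than OFFSET scans. The sketch below is a hedged example using mysql-connector-python; connection details, table, and column names are placeholders.

```python
# Hedged sketch: stream a very large MySQL table in batches via keyset pagination.
import mysql.connector

def iter_large_table(batch_size: int = 50_000):
    conn = mysql.connector.connect(
        host="db-host", user="etl_user", password="***", database="sales"  # placeholders
    )
    cur = conn.cursor()
    last_id = 0
    while True:
        # Keyset pagination: cheap on an indexed id, unlike OFFSET on 100M+ rows.
        cur.execute(
            "SELECT id, customer_id, amount FROM orders "
            "WHERE id > %s ORDER BY id LIMIT %s",
            (last_id, batch_size),
        )
        rows = cur.fetchall()
        if not rows:
            break
        yield rows
        last_id = rows[-1][0]
    cur.close()
    conn.close()

for batch in iter_large_table():
    pass  # transform/load each batch here
```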


Required Skills:

  • Strong experience with MySQL, complex SQL queries, and stored procedures.
  • Hands-on experience with AWS Glue, PySpark, and ETL processes.
  • Good understanding of AWS ecosystem and migration strategies.
  • Proficiency in shell scripting.
  • Strong communication and collaboration skills.


Nice to Have:

  • Working knowledge of Python.
  • Experience with AWS RDS.



Read more
Deqode

at Deqode

1 recruiter
Shraddha Katare
Posted by Shraddha Katare
Bengaluru (Bangalore), Pune, Chennai, Mumbai, Gurugram
5 - 7 yrs
₹5L - ₹19L / yr
Amazon Web Services (AWS)
Python
PySpark
SQL
Redshift

Profile: AWS Data Engineer

Mode - Hybrid

Experience - 5 to 7 years

Locations - Bengaluru, Pune, Chennai, Mumbai, Gurugram


Roles and Responsibilities

  • Design and maintain ETL pipelines using AWS Glue and Python/PySpark
  • Optimize SQL queries for Redshift and Athena
  • Develop Lambda functions for serverless data processing (see the sketch after this list)
  • Configure AWS DMS for database migration and replication
  • Implement infrastructure as code with CloudFormation
  • Build optimized data models for performance
  • Manage RDS databases and AWS service integrations
  • Troubleshoot and improve data processing efficiency
  • Gather requirements from business stakeholders
  • Implement data quality checks and validation
  • Document data pipelines and architecture
  • Monitor workflows and implement alerting
  • Keep current with AWS services and best practices
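
As a brief illustration of the serverless piece above, here is a minimal AWS Lambda handler that starts an Athena query through boto3. The database, query, and S3 output location are placeholders.

```python
# Hedged sketch of a Lambda handler that kicks off an Athena query.
import boto3

athena = boto3.client("athena")

def lambda_handler(event, context):
    response = athena.start_query_execution(
        QueryString="SELECT order_date, SUM(amount) FROM orders GROUP BY order_date",
        QueryExecutionContext={"Database": "curated_zone"},          # placeholder
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )
    # Downstream steps (or a Step Function) can poll this execution id.
    return {"query_execution_id": response["QueryExecutionId"]}
```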


Required Technical Expertise:

  • Python/PySpark for data processing
  • AWS Glue for ETL operations
  • Redshift and Athena for data querying
  • AWS Lambda and serverless architecture
  • AWS DMS and RDS management
  • CloudFormation for infrastructure
  • SQL optimization and performance tuning
Read more
Deqode

at Deqode

1 recruiter
Alisha Das
Posted by Alisha Das
Pune, Mumbai, Bengaluru (Bangalore), Chennai
4 - 7 yrs
₹5L - ₹15L / yr
Amazon Web Services (AWS)
Python
PySpark
Glue semantics
Amazon Redshift
+1 more

Job Overview:

We are seeking an experienced AWS Data Engineer to join our growing data team. The ideal candidate will have hands-on experience with AWS Glue, Redshift, PySpark, and other AWS services to build robust, scalable data pipelines. This role is perfect for someone passionate about data engineering, automation, and cloud-native development.

Key Responsibilities:

  • Design, build, and maintain scalable and efficient ETL pipelines using AWS Glue, PySpark, and related tools.
  • Integrate data from diverse sources and ensure its quality, consistency, and reliability.
  • Work with large datasets in structured and semi-structured formats across cloud-based data lakes and warehouses.
  • Optimize and maintain data infrastructure, including Amazon Redshift, for high performance (a bulk-load sketch follows this list).
  • Collaborate with data analysts, data scientists, and product teams to understand data requirements and deliver solutions.
  • Automate data validation, transformation, and loading processes to support real-time and batch data processing.
  • Monitor and troubleshoot data pipeline issues and ensure smooth operations in production environments.
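
One typical Redshift optimization implied above is bulk loading with COPY from S3 rather than row-by-row inserts. The sketch below uses psycopg2 against a placeholder cluster, IAM role, and table; the actual environment will differ.

```python
# Hedged sketch of a Redshift bulk-load step: COPY Parquet from S3 into staging.
import psycopg2

COPY_SQL = """
COPY staging.transactions
FROM 's3://example-curated-bucket/transactions/'
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
FORMAT AS PARQUET;
"""

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",  # placeholder
    port=5439, dbname="analytics", user="etl_user", password="***",
)
with conn, conn.cursor() as cur:
    cur.execute(COPY_SQL)                          # bulk load from S3
    cur.execute("ANALYZE staging.transactions;")   # refresh planner statistics
conn.close()
```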

Required Skills:

  • 5 to 7 years of hands-on experience in data engineering roles.
  • Strong proficiency in Python and PySpark for data transformation and scripting.
  • Deep understanding and practical experience with AWS Glue, AWS Redshift, S3, and other AWS data services.
  • Solid understanding of SQL and database optimization techniques.
  • Experience working with large-scale data pipelines and high-volume data environments.
  • Good knowledge of data modeling, warehousing, and performance tuning.

Preferred/Good to Have:

  • Experience with workflow orchestration tools like Airflow or Step Functions.
  • Familiarity with CI/CD for data pipelines.
  • Knowledge of data governance and security best practices on AWS.
Read more
Deqode

at Deqode

1 recruiter
Shraddha Katare
Posted by Shraddha Katare
Pune, Mumbai, Bengaluru (Bangalore), Gurugram
4 - 6 yrs
₹5L - ₹10L / yr
ETL
SQL
Amazon Web Services (AWS)
PySpark
KPI

Role - ETL Developer

Work Mode - Hybrid

Experience - 4+ years

Location - Pune, Gurgaon, Bengaluru, Mumbai

Required Skills - AWS, AWS Glue, PySpark, ETL, SQL

Required Skills:

  • 4+ years of hands-on experience in MySQL, including SQL queries and procedure development
  • Experience in PySpark, AWS, and AWS Glue
  • Experience in AWS migration
  • Experience with automated scripting and tracking KPIs/metrics for database performance
  • Proficiency in shell scripting and ETL.
  • Strong communication skills and a collaborative team player
  • Knowledge of Python and AWS RDS is a plus


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Khushboo Kumari
Posted by Khushboo Kumari
Bengaluru (Bangalore), Mumbai
4 - 14 yrs
₹6L - ₹29L / yr
Java
Data Structures
Algorithms
Multithreading
SQL
+1 more

Job Title: Java Developer


Java Developer – Job Description

Wissen Technology is now hiring for a Java Developer - Bangalore with hands-on experience in Core Java, algorithms, data structures, multithreading and SQL. We are solving complex technical problems in the industry and need talented software engineers to join our mission and be a part of a global software development team. A brilliant opportunity to become a part of a highly motivated and expert team which has made a mark as a high-end technical consulting firm.

Required Skills:

  • Exp. - 4 to 14 years.
  • Experience in Core Java and Spring Boot.
  • Extensive experience in developing enterprise-scale applications and systems. Should possess good architectural knowledge and be aware of enterprise application design patterns.
  • Should have the ability to analyze, design, develop and test complex, low-latency client-facing applications.
  • Good development experience with RDBMS.
  • Good knowledge of multi-threading and high-performance server-side development.
  • Basic working knowledge of Unix/Linux.
  • Excellent problem solving and coding skills.
  • Strong interpersonal, communication and analytical skills.
  • Should have the ability to express their design ideas and thoughts.

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Khushboo Kumari
Posted by Khushboo Kumari
Bengaluru (Bangalore), Mumbai
3 - 12 yrs
₹6L - ₹29L / yr
Java
Data Structures
Algorithms
Multithreading
SQL

Hello Everyone,


Job Description:

Wissen Technology is now hiring for a Java Developer - Bangalore with hands-on experience in Core Java, algorithms, data structures, multithreading and SQL. We are solving complex technical problems in the industry and need talented software engineers to join our mission and be a part of a global software development team. A brilliant opportunity to become a part of a highly motivated and expert team which has made a mark as a high-end technical consulting firm.

Required Skills:

  • Exp. - 5 to 14 years.
  • Experience in Core Java and Spring Boot.
  • Extensive experience in developing enterprise-scale applications and systems. Should possess good architectural knowledge and be aware of enterprise application design patterns.
  • Should have the ability to analyze, design, develop and test complex, low-latency client-facing applications.
  • Good development experience with RDBMS.
  • Good knowledge of multi-threading and high-performance server-side development.
  • Basic working knowledge of Unix/Linux.
  • Excellent problem solving and coding skills.
  • Strong interpersonal, communication and analytical skills.
  • Should have the ability to express their design ideas and thoughts.


About Wissen Technology:

Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom and Startups. Wissen Technology is a part of Wissen Group and was established in the year 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in-class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 Billion for more than 25 of the Fortune 500 companies. The Wissen Group overall includes more than 4000 highly skilled professionals.

Wissen Technology provides exceptional value in mission critical projects for its clients, through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’.

Our team consists of 1200+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like Wharton, MIT, IITs, IIMs, and NITs and with rich work experience in some of the biggest companies in the world.

Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation.

We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted as the Top 20 AI/ML vendor by CIO Insider.


TechMynd Consulting
Posted by Suraj N
Bengaluru (Bangalore), Gurugram, Mumbai
4 - 8 yrs
₹10L - ₹24L / yr
Data Science
PostgreSQL
Python
Apache
Amazon Web Services (AWS)
+5 more

Senior Data Engineer


Location: Bangalore, Gurugram (Hybrid)


Experience: 4-8 Years


Type: Full Time | Permanent


Job Summary:


We are looking for a results-driven Senior Data Engineer to join our engineering team. The ideal candidate will have hands-on expertise in data pipeline development, cloud infrastructure, and BI support, with a strong command of modern data stacks. You’ll be responsible for building scalable ETL/ELT workflows, managing data lakes and marts, and enabling seamless data delivery to analytics and business intelligence teams.


This role requires deep technical know-how in PostgreSQL, Python scripting, Apache Airflow, AWS or other cloud environments, and a working knowledge of modern data and BI tools.


Key Responsibilities:


PostgreSQL & Data Modeling


· Design and optimize complex SQL queries, stored procedures, and indexes


· Perform performance tuning and query plan analysis


· Contribute to schema design and data normalization


Data Migration & Transformation


· Migrate data from multiple sources to cloud or ODS platforms


· Design schema mapping and implement transformation logic


· Ensure consistency, integrity, and accuracy in migrated data


Python Scripting for Data Engineering


· Build automation scripts for data ingestion, cleansing, and transformation


· Handle file formats (JSON, CSV, XML), REST APIs, cloud SDKs (e.g., Boto3)


· Maintain reusable script modules for operational pipelines
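
To make the scripting expectations above concrete, here is a minimal, purely illustrative sketch of a reusable ingestion helper that pulls JSON from a REST API and lands it in S3 with boto3. The API URL, bucket name, and key are hypothetical placeholders, not details from this posting.

# Illustrative only: reusable helper that pulls JSON from a REST API and lands it in S3.
# The URL, bucket, and key below are hypothetical placeholders.
import json
import requests
import boto3

def ingest_api_to_s3(api_url: str, bucket: str, key: str) -> None:
    response = requests.get(api_url, timeout=30)   # fetch from the source REST API
    response.raise_for_status()                    # fail fast on HTTP errors
    records = response.json()
    s3 = boto3.client("s3")                        # relies on standard AWS credentials
    s3.put_object(Bucket=bucket, Key=key, Body=json.dumps(records).encode("utf-8"))

# Example usage (placeholder values):
# ingest_api_to_s3("https://api.example.com/orders", "example-raw-bucket", "raw/orders/2024-01-01.json")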


Data Orchestration with Apache Airflow


· Develop and manage DAGs for batch/stream workflows


· Implement retries, task dependencies, notifications, and failure handling


· Integrate Airflow with cloud services, data lakes, and data warehouses
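
As an illustration of the orchestration work described above, a minimal Airflow DAG with retries, a task dependency, and a failure callback might look like the following sketch. The DAG id, schedule, and helper functions are hypothetical assumptions, not part of the role description.

# Illustrative Airflow DAG sketch: retries, a task dependency, and a failure notification hook.
from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # Placeholder: pull data from a source system (e.g., an API or S3 object).
    return "raw_data"

def transform(**context):
    # Placeholder: cleanse and transform the extracted data.
    return "clean_data"

def notify_on_failure(context):
    # Placeholder: send an alert (e.g., email or chat message) when a task fails.
    print(f"Task {context['task_instance'].task_id} failed")

default_args = {
    "retries": 3,                                # retry failed tasks
    "retry_delay": timedelta(minutes=5),
    "on_failure_callback": notify_on_failure,    # failure handling hook
}

with DAG(
    dag_id="example_batch_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    default_args=default_args,
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task               # task dependency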


Cloud Platforms (AWS / Azure / GCP)


· Manage data storage (S3, GCS, Blob), compute services, and data pipelines


· Set up permissions, IAM roles, encryption, and logging for security


· Monitor and optimize cost and performance of cloud-based data operations


Data Marts & Analytics Layer


· Design and manage data marts using dimensional models


· Build star/snowflake schemas to support BI and self-serve analytics


· Enable incremental load strategies and partitioning


Modern Data Stack Integration


· Work with tools like DBT, Fivetran, Redshift, Snowflake, BigQuery, or Kafka


· Support modular pipeline design and metadata-driven frameworks


· Ensure high availability and scalability of the stack


BI & Reporting Tools (Power BI / Superset / Supertech)


· Collaborate with BI teams to design datasets and optimize queries


· Support development of dashboards and reporting layers


· Manage access, data refreshes, and performance for BI tools




Required Skills & Qualifications:


· 4–6 years of hands-on experience in data engineering roles


· Strong SQL skills in PostgreSQL (tuning, complex joins, procedures)


· Advanced Python scripting skills for automation and ETL


· Proven experience with Apache Airflow (custom DAGs, error handling)


· Solid understanding of cloud architecture (especially AWS)


· Experience with data marts and dimensional data modeling


· Exposure to modern data stack tools (DBT, Kafka, Snowflake, etc.)


· Familiarity with BI tools like Power BI, Apache Superset, or Supertech BI


· Version control (Git) and CI/CD pipeline knowledge is a plus


· Excellent problem-solving and communication skills

Leading HealthTech, a U.S.-based product company

Agency job
via Recruiting Bond by Pavan Kumar
Bengaluru (Bangalore), Mumbai
9 - 13 yrs
₹35L - ₹45L / yr
Java
J2EE
WebLogic
Spring
Apache Camel
+18 more

🚀 We're Hiring: Technical Lead – Java Backend & Integration

📍 Bangalore | Hybrid | Full-Time

👨‍💻 9+ Years Experience | Enterprise Product Development

🏥 Healthcare Tech | U.S. Health Insurance Domain

Join Leading HealthTech, a U.S.-based product company driving innovation in the $1.1 trillion health insurance industry. We power over 81 million lives, with 130+ customers and 100+ third-party integrations. At our growing Bangalore tech hub, you’ll solve real-world, large-scale problems and help modernize one of the most stable and impactful industries in the world.


🔧 What You'll Work On:

  • Architect and build backend & integration solutions using Java, J2EE, WebLogic, Spring, Apache Camel
  • Transition monolithic systems to a microservices-based architecture
  • Lead design reviews, customer discussions, code quality, UAT & production readiness
  • Work with high-volume transactional systems processing millions of health claims daily
  • Coach & mentor engineers, contribute to platform modernization


🧠 What You Bring:

  • 9+ years in backend Java development and enterprise system integration
  • Hands-on with REST, SOAP, JMS, SQL, stored procedures, XML, ESBs
  • Solid understanding of SOA, data structures, system design, and performance tuning
  • Experience with Agile, CI/CD, unit testing, and code quality tools
  • Healthcare/payor domain experience is a huge plus!


💡 Why this opportunity?

  • Global product impact from our India technology center
  • Work on mission-critical systems in a stable and recession-resilient sector
  • Be part of a journey to modernize healthcare through tech
  • Solve complex challenges at scale that few companies offer

🎯 Ready to drive change at the intersection of tech and healthcare?

Ketto
Posted by Sagar Ganatra
Mumbai
1 - 3 yrs
₹10L - ₹15L / yr
Tableau
PowerBI
SQL
Python
Dashboard
+5 more

About the company:


Ketto is Asia's largest tech-enabled crowdfunding platform with a vision - Healthcare for all. We are a profit-making organization with a valuation of more than 100 Million USD. With over 1,100 crores raised from more than 60 lakh donors, we have positively impacted the lives of 2 lakh+ campaigners. Ketto has embarked on a high-growth journey, and we would like you to be part of our family, helping us create a large-scale impact on a daily basis by taking our product to the next level.



Role Overview:


Ketto, Asia's largest crowdfunding platform, is looking for an innovative Product Analyst to take charge of our data systems, reporting frameworks, and generative AI initiatives. This role is pivotal in ensuring data integrity and reliability, driving key insights that fuel strategic decisions, and implementing automation through AI. This position encompasses the full data and analytics lifecycle—from requirements gathering to design planning—alongside implementing advanced analytics and generative AI solutions to support Ketto’s mission.


Key Responsibilities


●  Data Strategy & Automation:

○ Lead data collection, processing, and quality assurance processes to ensure accuracy, completeness, and relevance.

○ Explore opportunities to incorporate generative AI models to automate and optimize processes, enhancing efficiencies in analytics, reporting, and decision-making.


●  Data Analysis & Insight Generation:

○ Conduct in-depth analyses of user behaviour, campaign performance, and platform metrics to uncover insights that support crowdfunding success.

○ Translate complex data into clear, actionable insights that drive strategic decisions, providing stakeholders with the necessary information to enhance business outcomes.


●  Reporting & Quality Assurance:

○ Design and maintain a robust reporting framework to deliver timely insights, enhancing data reliability and ensuring stakeholders are well-informed.

○ Monitor and improve data accuracy, consistency, and integrity across all data processes, identifying and addressing areas for enhancement.


●  Collaboration & Strategic Planning:

○ Work closely with Business, Product, and IT teams to align data initiatives with Ketto’s objectives and growth strategy.

○ Propose data-driven strategies that leverage AI and automation to tackle business challenges and scale impact across the platform.

○ Mentor junior data scientists and analysts, fostering a culture of data-driven decision-making.


Required Skills and Qualifications


●  Technical Expertise:

○ Strong background in SQL, Statistics and Maths


●  Analytical & Strategic Mindset:

○ Proven ability to derive meaningful, actionable insights from large data sets and translate findings into business strategies.

○ Experience with statistical analysis, advanced analytics


●  Communication & Collaboration:

○ Exceptional written and verbal communication skills, capable of explaining complex data insights to non-technical stakeholders.

○ Strong interpersonal skills to work effectively with cross-functional teams, aligning data initiatives with organisational goals.


●  Preferred Experience:

○ Proven experience in advanced analytics roles

○ Experience leading data lifecycle management, model deployment, and quality assurance initiatives.


Why Join Ketto?

At Ketto, we’re committed to driving social change through innovative data and AI solutions. As our Sr. Product Analyst, you’ll have the unique opportunity to leverage advanced data science and generative AI to shape the future of crowdfunding in Asia. If you’re passionate about using data and AI for social good, we’d love to hear from you!

VyTCDC
Posted by Gobinath Sundaram
Mumbai
1.5 - 1.8 yrs
₹1L - ₹6L / yr
Production support
Linux/Unix
SQL

A Production Support Engineer ensures the smooth operation of software applications and IT systems in a production environment. Here’s a breakdown of the role:

Key Responsibilities

  • Monitoring System Performance: Continuously track application health and resolve performance issues.
  • Incident Management: Diagnose and fix software failures, collaborating with developers and system administrators.
  • Troubleshooting & Debugging: Analyze logs, use diagnostic tools, and implement solutions to improve system reliability.
  • Documentation & Reporting: Maintain records of system issues, resolutions, and process improvements.
  • Collaboration: Work with cross-functional teams to enhance system efficiency and reduce downtime.
  • Process Optimization: Suggest improvements to reduce production costs and enhance system stability.

Required Skills

  • Strong knowledge of SQL, UNIX/Linux, Java, Oracle, and Splunk.
  • Experience in incident management and debugging.
  • Ability to analyze system failures and optimize performance.
  • Good communication and problem-solving skills.
ZeMoSo Technologies
Agency job
via TIGI HR Solution Pvt. Ltd. by Vaidehi Sarkar
Mumbai, Bengaluru (Bangalore), Hyderabad, Chennai, Pune
4 - 8 yrs
₹10L - ₹15L / yr
Data engineering
Python
SQL
Data Warehouse (DWH)
Amazon Web Services (AWS)
+3 more

Work Mode: Hybrid


Need B.Tech, BE, M.Tech, ME candidates - Mandatory



Must-Have Skills:

● Educational Qualification: B.Tech, BE, M.Tech, or ME in any field.

● Minimum of 3 years of proven experience as a Data Engineer.

● Strong proficiency in Python programming language and SQL.

● Experience in Databricks and in setting up and managing data pipelines and data warehouses/lakes.

● Good comprehension and critical thinking skills.


● Kindly note: the salary bracket will vary according to the candidate's experience -

- Experience from 4 yrs to 6 yrs - Salary upto 22 LPA

- Experience from 5 yrs to 8 yrs - Salary upto 30 LPA

- Experience more than 8 yrs - Salary upto 40 LPA

Wissen Technology
Posted by Seema Srivastava
Mumbai
5 - 10 yrs
Best in industry
Python
SQL
Databases
Data engineering
Amazon Web Services (AWS)

Job Description: Data Engineer 

Position Overview:

We are seeking a skilled Python Data Engineer with expertise in designing and implementing data solutions using the AWS cloud platform. The ideal candidate will be responsible for building and maintaining scalable, efficient, and secure data pipelines while leveraging Python and AWS services to enable robust data analytics and decision-making processes.

 

Key Responsibilities

· Design, develop, and optimize data pipelines using Python and AWS services such as Glue, Lambda, S3, EMR, Redshift, Athena, and Kinesis.

· Implement ETL/ELT processes to extract, transform, and load data from various sources into centralized repositories (e.g., data lakes or data warehouses).

· Collaborate with cross-functional teams to understand business requirements and translate them into scalable data solutions.

· Monitor, troubleshoot, and enhance data workflows for performance and cost optimization.

· Ensure data quality and consistency by implementing validation and governance practices.

· Work on data security best practices in compliance with organizational policies and regulations.

· Automate repetitive data engineering tasks using Python scripts and frameworks.

· Leverage CI/CD pipelines for deployment of data workflows on AWS.
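
For a flavour of the AWS-side work described above, a small hedged sketch of triggering and monitoring an AWS Glue job from Python might look like this. The job name and polling interval are illustrative assumptions, not part of this posting.

# Illustrative sketch: trigger an AWS Glue job and wait for it to finish.
# "example-orders-etl" is a hypothetical job name.
import time
import boto3

glue = boto3.client("glue")

def run_glue_job(job_name: str, poll_seconds: int = 30) -> str:
    run_id = glue.start_job_run(JobName=job_name)["JobRunId"]
    while True:
        state = glue.get_job_run(JobName=job_name, RunId=run_id)["JobRun"]["JobRunState"]
        if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
            return state                  # terminal state reached
        time.sleep(poll_seconds)          # keep polling until the run finishes

# Example usage (placeholder job name):
# print(run_glue_job("example-orders-etl"))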

 

Required Skills and Qualifications

· Professional Experience: 5+ years of experience in data engineering or a related field.

· Programming: Strong proficiency in Python, with experience in libraries like pandas, pySpark, or boto3.

· AWS Expertise: Hands-on experience with core AWS services for data engineering, such as:

· AWS Glue for ETL/ELT.

· S3 for storage.

· Redshift or Athena for data warehousing and querying.

· Lambda for serverless compute.

· Kinesis or SNS/SQS for data streaming.

· IAM Roles for security.

· Databases: Proficiency in SQL and experience with relational (e.g., PostgreSQL, MySQL) and NoSQL (e.g., DynamoDB) databases.

· Data Processing: Knowledge of big data frameworks (e.g., Hadoop, Spark) is a plus.

· DevOps: Familiarity with CI/CD pipelines and tools like Jenkins, Git, and CodePipeline.

· Version Control: Proficient with Git-based workflows.

· Problem Solving: Excellent analytical and debugging skills.

 

Optional Skills

· Knowledge of data modeling and data warehouse design principles.

· Experience with data visualization tools (e.g., Tableau, Power BI).

· Familiarity with containerization (e.g., Docker) and orchestration (e.g., Kubernetes).

· Exposure to other programming languages like Scala or Java.

 

Education

· Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.

 

Why Join Us?

· Opportunity to work on cutting-edge AWS technologies.

· Collaborative and innovative work environment.

 

 

Deqode
Posted by Roshni Maji
Mumbai
2 - 4 yrs
₹8L - ₹13L / yr
Python
RESTful APIs
SQL
JIRA

Requirements:

  • Must have proficiency in Python
  • At least 3+ years of professional experience in software application development.
  • Good understanding of REST APIs and a solid experience in testing APIs.
  • Should have built APIs at some point and practical knowledge on working with them
  • Must have experience in API testing tools like Postman and in setting up the prerequisites and post-execution validations using these tools
  • Ability to develop applications for test automation
  • Should have worked in a distributed micro-service environment
  • Hands-on experience with Python packages for testing (preferably pytest).
  • Should be able to create fixtures, mock objects and datasets that can be used by tests across different micro-services
  • Proficiency in Git
  • Strong in writing SQL queries
  • Tools like Jira, Asana or a similar bug-tracking tool, Confluence (wiki), Jenkins (CI)
  • Excellent written and oral communication and organisational skills with the ability to work within a growing company with increasing needs
  • Proven track record of ability to handle time-critical projects
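
By way of illustration, a minimal pytest-style API check with a reusable fixture might look like the sketch below. The base URL and the /health endpoint are hypothetical placeholders, and this is not the team's actual framework.

# Illustrative pytest sketch: a shared fixture plus a simple REST API assertion.
import pytest
import requests

@pytest.fixture(scope="session")
def base_url():
    return "https://api.example.com"    # would normally come from config or environment

def test_health_endpoint_returns_ok(base_url):
    response = requests.get(f"{base_url}/health", timeout=10)
    assert response.status_code == 200
    assert response.json().get("status") == "ok"    # post-execution validation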


Good to have:

  • Good understanding of CI/CD
  • Knowledge of queues, especially Kafka
  • Ability to independently manage test environment deployments and handle issues around it
  • Performed load testing of API endpoints
  • Should have built an API test automation framework from scratch and maintained it
  • Knowledge of cloud platforms like AWS, Azure
  • Knowledge of different browsers and cross-platform operating systems
  • Knowledge of JavaScript
  • Web Programming, Docker & 3-Tier Architecture Knowledge is preferred.
  • Should have knowledge of API creation; coding experience would be an add-on.
  • 5+ years of experience in test automation using tools like TestNG, Selenium WebDriver (Grid, parallel execution, SauceLabs), Mocha/Chai for front-end and backend test automation
  • Bachelor's degree in Computer Science / IT / Computer Applications


Wissen Technology
Posted by Seema Srivastava
Mumbai
5 - 10 yrs
₹15L - ₹30L / yr
SQL
Java
Spring Boot

We're Hiring: Java Developers | Mumbai (Hybrid) 🚀

Are you a passionate Java Developer with 5 to 10 years of experience? Here's your chance to take your career to the next level! 💼

We're looking for talented professionals to join an exciting opportunity with a top-tier BFSI domain Project—a true leader in the market. 🏦💻

🔹 Location: Mumbai

🔹 Work Mode: Hybrid

🔹 Experience: 4 to 10 years

🔹 Domain: BFSI

This is more than just a job—it's a chance to work on impactful projects and grow with some of the best minds in the industry. 🌟

👉 If you're interested, please share your updated resume along with the following details:

Total Experience

Current CTC

Expected CTC

Tag your network or apply now—this could be your next big move! 🔄🚀

Deqode
Posted by Shraddha Katare
Mumbai
1 - 2 yrs
₹6L - ₹8L / yr
ETL
SQL
NOSQL Databases
RESTful APIs
Troubleshooting
+8 more

Profile: Product Support Engineer

🔴 Experience: 1 year as Product Support Engineer.

🔴 Location: Mumbai (Andheri).

🔴 5 days of working from office.


Skills Required:

🔷 Experience in providing support for ETL or data warehousing is preferred.

🔷 Good understanding of Unix and database concepts.

🔷 Experience working with SQL and NoSQL databases and writing simple queries to get data for debugging issues.

🔷 Being able to creatively come up with solutions for various problems and implement them.

🔷 Experience working with REST APIs and debugging requests and responses using tools like Postman.

🔷 Quick troubleshooting and diagnosing skills.

🔷 Knowledge of customer success processes.

🔷 Experience in document creation.

🔷 High availability for fast response to customers.

🔷 Language knowledge required in one of NodeJS, Python, Java.

🔷 Background in AWS, Docker, Kubernetes, Networking - an advantage.

🔷 Experience in SaaS B2B software companies - an advantage.

🔷 Ability to join the dots around multiple events occurring concurrently and spot patterns.


Jio Tesseract
Posted by TARUN MISHRA
Bengaluru (Bangalore), Pune, Hyderabad, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Mumbai, Navi Mumbai, Kolkata, Rajasthan
5 - 24 yrs
₹9L - ₹70L / yr
C
C++
Visual C++
Embedded C++
Artificial Intelligence (AI)
+32 more

JioTesseract, a digital arm of Reliance Industries, is India's leading and largest AR/VR organization with the mission to democratize mixed reality for India and the world. We make products at the intersection of hardware, software, content and services, with a focus on making India the leader in spatial computing. We specialize in creating solutions in AR, VR and AI, with some of our notable products such as JioGlass, JioDive, 360 Streaming, Metaverse, and AR/VR headsets for consumers and the enterprise space.


Monday-Friday role, in office, with excellent perks and benefits!


Position Overview

We are seeking a Software Architect to lead the design and development of high-performance robotics and AI software stacks utilizing NVIDIA technologies. This role will focus on defining scalable, modular, and efficient architectures for robot perception, planning, simulation, and embedded AI applications. You will collaborate with cross-functional teams to build next-generation autonomous systems.


Key Responsibilities:

1. System Architecture & Design

● Define scalable software architectures for robotics perception, navigation, and AI-driven decision-making.

● Design modular and reusable frameworks that leverage NVIDIA’s Jetson, Isaac ROS, Omniverse, and CUDA ecosystems.

● Establish best practices for real-time computing, GPU acceleration, and edge AI inference.


2. Perception & AI Integration

● Architect sensor fusion pipelines using LIDAR, cameras, IMUs, and radar with DeepStream, TensorRT, and ROS2.

● Optimize computer vision, SLAM, and deep learning models for edge deployment on Jetson Orin and Xavier.

● Ensure efficient GPU-accelerated AI inference for real-time robotics applications.
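
As a rough illustration of preparing a model for GPU-accelerated edge inference, one common path is exporting a trained PyTorch model to ONNX before building a TensorRT engine. The sketch below uses a hypothetical torchvision model and input shape; real perception models would differ.

# Illustrative sketch: export a PyTorch model to ONNX as a step toward TensorRT deployment.
# The model choice and input size are hypothetical placeholders.
import torch
import torchvision

model = torchvision.models.resnet18(weights=None).eval()   # placeholder model
dummy_input = torch.randn(1, 3, 224, 224)                  # batch of one RGB image

torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",
    input_names=["input"],
    output_names=["output"],
    opset_version=17,
)
# The resulting model.onnx could then be parsed by TensorRT (for example with trtexec) to build an engine.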


3. Embedded & Real-Time Systems

● Design high-performance embedded software stacks for real-time robotic control and autonomy.

● Utilize NVIDIA CUDA, cuDNN, and TensorRT to accelerate AI model execution on Jetson platforms.

● Develop robust middleware frameworks to support real-time robotics applications in ROS2 and Isaac SDK.


4. Robotics Simulation & Digital Twins

● Define architectures for robotic simulation environments using NVIDIA Isaac Sim & Omniverse.

● Leverage synthetic data generation (Omniverse Replicator) for training AI models.

● Optimize sim-to-real transfer learning for AI-driven robotic behaviors.


5. Navigation & Motion Planning

● Architect GPU-accelerated motion planning and SLAM pipelines for autonomous robots.

● Optimize path planning, localization, and multi-agent coordination using Isaac ROS Navigation.

● Implement reinforcement learning-based policies using Isaac Gym.


6. Performance Optimization & Scalability

● Ensure low-latency AI inference and real-time execution of robotics applications.

● Optimize CUDA kernels and parallel processing pipelines for NVIDIA hardware.

● Develop benchmarking and profiling tools to measure software performance on edge AI devices.


Required Qualifications:

● Master’s or Ph.D. in Computer Science, Robotics, AI, or Embedded Systems.

● Extensive experience (7+ years) in software development, with at least 3-5 years focused on architecture and system design, especially for robotics or embedded systems.

● Expertise in CUDA, TensorRT, DeepStream, PyTorch, TensorFlow, and ROS2.

● Experience in NVIDIA Jetson platforms, Isaac SDK, and GPU-accelerated AI.

● Proficiency in programming languages such as C++, Python, or similar, with deep understanding of low-level and high-level design principles.

● Strong background in robotic perception, planning, and real-time control.

● Experience with cloud-edge AI deployment and scalable architectures.


Preferred Qualifications

● Hands-on experience with NVIDIA DRIVE, NVIDIA Omniverse, and Isaac Gym

● Knowledge of robot kinematics, control systems, and reinforcement learning

● Expertise in distributed computing, containerization (Docker), and cloud robotics

● Familiarity with automotive, industrial automation, or warehouse robotics

● Experience designing architectures for autonomous systems or multi-robot systems.

● Familiarity with cloud-based solutions, edge computing, or distributed computing for robotics

● Experience with microservices or service-oriented architecture (SOA)

● Knowledge of machine learning and AI integration within robotic systems

● Knowledge of testing on edge devices with HIL and simulations (Isaac Sim, Gazebo, V-REP etc.)

Jio Haptik
Posted by Priya Agrawal
Mumbai
4 - 8 yrs
₹20L - ₹32L / yr
Python
Django
Elasticsearch
Systems design
SQL
+3 more

What we want to accomplish and why we need you?


Jio Haptik is an AI leader having pioneered AI-powered innovation since 2013. Reliance Jio Digital Services acquired Haptik in April 2019. Haptik currently leads India’s AI market having become the first to process 15 billion+ two-way conversations across 10+ channels and in 135 languages. Haptik is also a Category Leader across platforms including Gartner, G2, Opus Research & more. Recently Haptik won the award for “Tech Startup of the Year” in the AI category at Entrepreneur India Awards 2023, and gold medal for “Best Chat & Conversational Bot” at Martequity Awards 2023. Haptik has a headcount of 200+ employees with offices in Mumbai, Delhi, and Bangalore.


What will you do everyday?


As a backend engineer you will be responsible for building the Haptik platform which is used by people across the globe. You will be responsible for developing, architecting and scaling the systems that support all the functions of the Haptik platform. While you know how to work hard, you also know how to have fun at work and make friends with your colleagues. 


Ok, you're sold, but what are we looking for in the perfect candidate?


Develop and maintain expertise in backend systems and API development, ensuring seamless integrations and scalable solutions, including:

  • Strong expertise in backend systems, including design principles and adherence to good coding practices.
  • Proven ability to enhance or develop complex tools at scale with a thorough understanding of system architecture.
  • Capability to work cross-functionally with all teams, ensuring seamless implementation of APIs and solutioning for various tools.
  • Skilled in high-level task estimation, scoping, and breaking down complex projects into actionable tasks.
  • Proficiency in modeling and optimizing database architecture for enhanced performance and scalability.
  • Experience collaborating with product teams to build innovative Proof of Concepts (POCs).
  • Ability to respond to data requests and generate reports to support data-driven decision-making.
  • Active participation in code reviews, automated testing, and quality assurance processes.
  • Experience working in a scrum-based agile development environment.
  • Commitment to staying updated with technology standards, emerging trends, and software development best practices.
  • Strong verbal and written communication skills to facilitate collaboration and clarity.
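
To illustrate the database-modelling side of the role described above, a hedged Django sketch of an indexed model and a query that avoids N+1 lookups might look like this. The app, models, and fields are hypothetical examples, not Haptik's actual schema.

# Illustrative Django sketch: an indexed model plus a query optimized with select_related.
from django.db import models

class Bot(models.Model):
    name = models.CharField(max_length=100)

class Conversation(models.Model):
    bot = models.ForeignKey(Bot, on_delete=models.CASCADE, related_name="conversations")
    channel = models.CharField(max_length=50, db_index=True)    # indexed for frequent filters
    created_at = models.DateTimeField(auto_now_add=True)

    class Meta:
        indexes = [models.Index(fields=["bot", "created_at"])]  # composite index for reporting queries

# Fetch recent conversations with their bot in a single SQL join instead of N+1 queries:
# recent = (Conversation.objects.select_related("bot")
#           .filter(channel="whatsapp")
#           .order_by("-created_at")[:100])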


Requirements*:


  • A minimum of 5 years of experience in developing scalable products and applications.
  • Must Have Bachelor's degree in Computer Engineering or related field.
  • Proficiency in Python and expertise in at least one backend framework, with a preference for Django.
  • Hands-on experience designing normalized database schemas for large-scale applications using technologies such as MySQL, MongoDB, or Elasticsearch.
  • Practical knowledge of in-memory data stores like Redis or Memcached.
  • Familiarity with working in agile environments and exposure to tools like Jira is highly desirable.
  • Proficiency in using version control systems like Git.
  • Strong communication skills and the ability to collaborate effectively in team settings.
  • Self-motivated with a strong sense of ownership and commitment to delivering results.
  • Additional knowledge of RabbitMQ, AWS/Azure services, Docker, MQTT, Lambda functions, Cron jobs, Kibana, and Logstash is an added advantage.
  • Knowledge of web servers like Nginx/Apache is considered a valuable asset.

* Requirements is such a strong word. We don’t necessarily expect to find a candidate that has done everything listed, but you should be able to make a credible case that you’ve done most of it and are ready for the challenge of adding some new things to your resume. 


Tell me more about Haptik


  • On a roll: Announced major strategic partnership with Jio. 
  • Great team: You will be working with great leaders who have been listed in Business World 40 Under 40, Forbes 30 Under 30 and MIT 35 Under 35 Innovators. 
  • Great culture: The freedom to think and innovate is something that defines the culture of Haptik. Every person is approachable. While we are working hard, it is also important to take breaks to not get too worked up. 
  • Huge market: Disrupting a massive, growing chatbot market. The global market is projected to attain a valuation of US $0.94 bn by the end of 2024 progressing from US $0.11 bn earned in 2015. 
  • Great customers: Businesses across industries - Samsung, HDFCLife, Times of India are some that have relied on Haptik's Conversational AI solutions to engage, acquire, service and understand customers. 
  • Impact: A fun and exciting start-up culture that empowers its people to make a huge impact. 


Working hard for things that we don't care about is stress, but working hard for something we love is called passion! At Haptik we passionately solve problems in order to be able to move faster and don't shy away from breaking things! 

Wissen Technology
Posted by Vijayalakshmi Selvaraj
Mumbai
5 - 10 yrs
Best in industry
.NET
C#
Angular (2+)
SQL

Required Skills and Experience:

  • 5-7 years of experience in full-stack software development.
  • Solid proficiency in Angular (latest versions preferred).
  • Strong understanding of Angular components, modules, services, and performance optimization.
  • Proven experience in C# and .NET development.
  • Experience in designing and integrating RESTful APIs using Swagger.
  • Solid understanding of front-end and back-end development principles.
  • Excellent problem-solving and debugging skills.
  • Strong communication and collaboration skills.
  • Experience with Git and GitHub for version control.
  • Experience with CI/CD pipelines and DevOps practices.
  • Experience writing and maintaining integration tests.
  • Experience with database technologies (SQL or NoSQL, MongoDB).


Nice-to-Have Skills:

  • Experience with database technologies (SQL or NoSQL, MongoDB).
  • Understanding of cloud platforms (Azure).


Wissen Technology
Posted by Seema Srivastava
Mumbai, Bengaluru (Bangalore)
5 - 14 yrs
Best in industry
Python
Amazon Web Services (AWS)
SQL
pandas
Amazon Redshift

Job Description: 

Please find below details:


Experience - 5+ Years

Location - Bangalore / Mumbai


Role Overview

We are seeking a skilled Python Data Engineer with expertise in designing and implementing data solutions using the AWS cloud platform. The ideal candidate will be responsible for building and maintaining scalable, efficient, and secure data pipelines while leveraging Python and AWS services to enable robust data analytics and decision-making processes.

 

Key Responsibilities

  • Design, develop, and optimize data pipelines using Python and AWS services such as Glue, Lambda, S3, EMR, Redshift, Athena, and Kinesis.
  • Implement ETL/ELT processes to extract, transform, and load data from various sources into centralized repositories (e.g., data lakes or data warehouses).
  • Collaborate with cross-functional teams to understand business requirements and translate them into scalable data solutions.
  • Monitor, troubleshoot, and enhance data workflows for performance and cost optimization.
  • Ensure data quality and consistency by implementing validation and governance practices.
  • Work on data security best practices in compliance with organizational policies and regulations.
  • Automate repetitive data engineering tasks using Python scripts and frameworks.
  • Leverage CI/CD pipelines for deployment of data workflows on AWS.
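
As one hedged example of the serverless piece mentioned above, a minimal Lambda handler that reacts to an S3 object-created event might look like the sketch below. Bucket names and the destination prefix are illustrative; a real pipeline would also transform the data.

# Illustrative AWS Lambda sketch: copy newly created S3 objects into a "processed/" prefix.
import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        s3.copy_object(
            Bucket=bucket,
            Key=f"processed/{key}",
            CopySource={"Bucket": bucket, "Key": key},
        )
    return {"status": "ok"}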

 

Required Skills and Qualifications

  • Professional Experience: 5+ years of experience in data engineering or a related field.
  • Programming: Strong proficiency in Python, with experience in libraries like pandas, PySpark, or boto3.
  • AWS Expertise: Hands-on experience with core AWS services for data engineering, such as:
  • AWS Glue for ETL/ELT.
  • S3 for storage.
  • Redshift or Athena for data warehousing and querying.
  • Lambda for serverless compute.
  • Kinesis or SNS/SQS for data streaming.
  • IAM Roles for security.
  • Databases: Proficiency in SQL and experience with relational (e.g., PostgreSQL, MySQL) and NoSQL (e.g., DynamoDB) databases.
  • Data Processing: Knowledge of big data frameworks (e.g., Hadoop, Spark) is a plus.
  • DevOps: Familiarity with CI/CD pipelines and tools like Jenkins, Git, and CodePipeline.
  • Version Control: Proficient with Git-based workflows.
  • Problem Solving: Excellent analytical and debugging skills.

 

Optional Skills

  • Knowledge of data modeling and data warehouse design principles.
  • Experience with data visualization tools (e.g., Tableau, Power BI).
  • Familiarity with containerization (e.g., Docker) and orchestration (e.g., Kubernetes).
  • Exposure to other programming languages like Scala or Java.






ScatterPie Analytics Pvt Ltd
Bengaluru (Bangalore), Mumbai, Delhi
3 - 7 yrs
₹5L - ₹15L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+2 more

Role Summary:

Lead and drive development in the BI domain using the Tableau ecosystem, with deep technical and BI-ecosystem knowledge. The resource will be responsible for dashboard design, development, and delivery of BI services using the Tableau ecosystem.


Key functions & responsibilities:

· Communication & interaction with Project Manager to understand the requirement

· Dashboard designing, development and deployment using Tableau eco-system

· Ensure delivery within given time frame while maintaining quality

· Stay up to date with current tech and bring relevant ideas to the table

· Proactively work with the Management team to identify and resolve issues

· Performs other related duties as assigned or advised

· He/she should be a leader that sets the standard and expectations through example in his/her conduct, work ethic, integrity and character

· Contribute in dashboard designing, R&D and project delivery using Tableau



Experience:

· Overall 3-7 Years of experience in DWBI development projects, having worked on BI and Visualization technologies (Tableau, Qlikview) for at least 3 years.

· At least 3 years of experience covering Tableau implementation lifecycle including hands-on development/programming, managing security, data modeling, data blending, etc.


Technology & Skills:

· Hands-on expertise of Tableau administration and maintenance

· Strong working knowledge and development experience with Tableau Server and Desktop

· Strong knowledge in SQL, PL/SQL and Data modelling

· Knowledge of databases like Microsoft SQL Server, Oracle, etc.

· Exposure to alternate Visualization technologies like Qlikview, Spotfire, Pentaho etc.

· Good communication & Analytical skills with Excellent creative and conceptual thinking abilities

· Superior organizational skills, attention to detail/level of quality, Strong communication skills, both verbal and written

Big4

Agency job
via Xcelyst tech Solutions by Divya Verma
Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
4 - 12 yrs
₹20L - ₹45L / yr
Underwriting
SAP Credit Management
SAS
SQL
IFRS
+1 more

We are looking for credit analysts who can help perform threshold analysis and impact calculations to streamline the credit underwriting process. We are therefore looking for people who have experience in doing similar or related work. Since this is an urgent requirement, your quick help in filling this role is requested.

 

Attaching sample profile as well.

 

Manager and SM - 25-50LPA

Skills: Credit analyst + SAS

Good to have: Credit underwriting, EWS (Early Warning Signals), ECB regulations, IFRS 9.

Level: SM or M (4+ years)

NP: Max 30 days.

Budget: max 45 LPA.

Loc: Pan India

 

Responsibilities:

  • Credit Underwriting, credit appraisal process
  • EWS analysis, IFRS9 staging, credit analysis, threshold analysis
  • Querying and coding in SAS and SQL
  • ECB regulations

 

Wissen Technology
Posted by Sukanya Mohan
Bengaluru (Bangalore), Mumbai, Hyderabad
6 - 10 yrs
Best in industry
Software Testing (QA)
Test Automation (QA)
Appium
Selenium
SDET
+7 more

Key Responsibilities:

 

- Design, develop, and execute automated test scripts for trading applications.

- Work with product owners and business analysts to understand and write the acceptance test cases.

- Collaborate with developers, product managers, and other stakeholders to understand requirements and create test plans.

- Perform regression, performance, and end-to-end testing to ensure software reliability.

- Identify, document, and track defects using appropriate tools and methodologies.

- Maintain and enhance existing test automation frameworks for both frontend and backend.

- Report on coverage, functionality, defect aging, closure reports to the stakeholders so that they know the stability of releases.

- Integrate automation cases into CI/CD pipelines
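
For illustration, a bare-bones browser check in the Selenium-plus-pytest style could look like the sketch below. The URL and expected title are hypothetical, and real trading-application suites would be far richer.

# Illustrative Selenium/pytest sketch: open a page and assert on its title.
import pytest
from selenium import webdriver

@pytest.fixture
def driver():
    drv = webdriver.Chrome()        # assumes a local Chrome/driver setup
    yield drv
    drv.quit()

def test_login_page_title(driver):
    driver.get("https://trading.example.com/login")
    assert "Login" in driver.title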

 

Qualifications:

 

- Bachelor’s degree in Computer Science, Engineering, or a related field.

- Proven 5+ years of experience in automation testing for web and backend applications.

- Strong knowledge of testing frameworks (e.g., Selenium, Cypress, JUnit, TestNG, Playwright).

- Experience with API testing tools (e.g., Postman, SoapUI, RestAssured).

- Familiarity with programming languages such as Java, Python, or JavaScript.

- Understanding of basic SQL queries to validate data in the databases

- Understanding of CI/CD processes and tools (e.g., Jenkins, GitLab CI).

- Strong analytical and problem-solving skills.

- Excellent communication and teamwork abilities.

- Prior experience with trading applications or core financial services related applications is a big plus

ProtoGene Consulting Private Limited
Mumbai
3 - 8 yrs
₹7L - ₹18L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+4 more

Data Engineer + Integration Engineer + Support Specialist
Exp: 5-8 years

Necessary Skills:

· SQL & Python / PySpark

· AWS Services: Glue, Appflow, Redshift

· Data warehousing

· Data modelling

Job Description:

· Experience of implementing and delivering data solutions and pipelines on the AWS Cloud Platform. Design, implement, and maintain the data architecture for all AWS data services

· A strong understanding of data modelling, data structures, databases (Redshift), and ETL processes

· Work with stakeholders to identify business needs and requirements for data-related projects

Strong SQL and/or Python or PySpark knowledge

· Creating data models that can be used to extract information from various sources & store it in a usable format

· Optimize data models for performance and efficiency

· Write SQL queries to support data analysis and reporting

· Monitor and troubleshoot data pipelines

· Collaborate with software engineers to design and implement data-driven features

· Perform root cause analysis on data issues

· Maintain documentation of the data architecture and ETL processes

· Identifying opportunities to improve performance by improving database structure or indexing methods

· Maintaining existing applications by updating existing code or adding new features to meet new requirements

· Designing and implementing security measures to protect data from unauthorized access or misuse

· Recommending infrastructure changes to improve capacity or performance

· Experience in the process industry
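
To give a flavour of the PySpark side of this role, a minimal aggregation job over raw files in S3 might look like the sketch below. The S3 paths and column names are hypothetical placeholders.

# Illustrative PySpark sketch: read raw CSVs, aggregate, and write curated Parquet.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example_orders_etl").getOrCreate()

orders = (spark.read
          .option("header", True)
          .option("inferSchema", True)
          .csv("s3://example-bucket/raw/orders/"))

daily_totals = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))      # derive a date column
    .groupBy("order_date")
    .agg(F.sum("amount").alias("total_amount"))            # daily total per date
)

daily_totals.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_orders/")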

Data Engineer + Integration Engineer + Support Specialist
Exp: 3-5 years

Necessary Skills:

· SQL & Python / PySpark

· AWS Services: Glue, Appflow, Redshift

· Data warehousing basics

· Data modelling basics

Job Description:

· Experience of implementing and delivering data solutions and pipelines on the AWS Cloud Platform.

· A strong understanding of data modelling, data structures, databases (Redshift)

Strong SQL and/or Python or PySpark knowledge

· Design and implement ETL processes to load data into the data warehouse

· Creating data models that can be used to extract information from various sources & store it in a usable format

· Optimize data models for performance and efficiency

· Write SQL queries to support data analysis and reporting

· Collaborate with team to design and implement data-driven features

· Monitor and troubleshoot data pipelines

· Perform root cause analysis on data issues

· Maintain documentation of the data architecture and ETL processes

· Maintaining existing applications by updating existing code or adding new features to meet new requirements

· Designing and implementing security measures to protect data from unauthorized access or misuse

· Identifying opportunities to improve performance by improving database structure or indexing methods

· Recommending infrastructure changes to improve capacity or performance


Fatakpay
Posted by Disha Gajra
Andheri East, Mumbai
2 - 4 yrs
₹8L - ₹15L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+4 more

Job Title: Data Analyst-Fintech

Job Description:

We are seeking a highly motivated and detail-oriented Data Analyst with 2 to 4 years of work experience to join our team. The ideal candidate will have a strong analytical mindset, excellent problem-solving skills, and a passion for transforming data into actionable insights. In this role, you will play a pivotal role in gathering, analyzing, and interpreting data to support informed decision-making and drive business growth.

Key Responsibilities:

1.      Data Collection and Extraction:

§ Gather data from various sources, including databases, spreadsheets and APIs.

§ Perform data cleansing and validation to ensure data accuracy and integrity.

2.      Data Analysis:

§ Analyze large datasets to identify trends, patterns, and anomalies.

§ Conduct analysis and data modeling to generate insights and forecasts.

§ Create data visualizations and reports to present findings to stakeholders.

3.      Data Interpretation and Insight Generation:

§ Translate data insights into actionable recommendations for business improvements.

§ Collaborate with cross-functional teams to understand data requirements and provide data-driven solutions.

4.      Data Quality Assurance:

§ Implement data quality checks and validation processes to ensure data accuracy and consistency.

§ Identify and address data quality issues promptly.
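
As an illustration of routine data-quality checks of this kind, a small pandas sketch along the following lines might be used. The file name and column expectations are hypothetical placeholders.

# Illustrative pandas sketch: basic data-quality checks before analysis.
import pandas as pd

df = pd.read_csv("transactions.csv")            # hypothetical input file

quality_report = {
    "row_count": len(df),
    "duplicate_rows": int(df.duplicated().sum()),
    "missing_by_column": df.isna().sum().to_dict(),
    "negative_amounts": int((df["amount"] < 0).sum()),   # assumes an "amount" column
}

print(quality_report)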

Qualifications:

1.      Bachelor's degree in a relevant field such as Computer Science, Statistics, Mathematics, or a related discipline.

2.      Proven work experience as a Data Analyst, with 2 to 4 years of relevant experience.

3.      Knowledge of data warehousing concepts and ETL processes is advantageous.

4.      Proficiency in data analysis tools and languages (e.g., SQL, Python, R).

5.      Experience with data visualization tools (e.g., Tableau, Power BI) is a plus.

6.      Strong analytical and problem-solving skills.

7.      Excellent communication and presentation skills.

8.      Attention to detail and a commitment to data accuracy.

9.      Familiarity with machine learning and predictive modeling is a bonus.


If you are a data-driven professional with a passion for uncovering insights from complex datasets and have the qualifications and skills mentioned above, we encourage you to apply for this Data Analyst position. Join our dynamic team and contribute to making data-driven decisions that will shape our company's future.

Fatakpay is an equal-opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.


Wissen Technology
Posted by Vijayalakshmi Selvaraj
Mumbai, Pune
4 - 8 yrs
Best in industry
Market Research
SQL
Equity derivatives
SoapUI
Postman
+4 more
  • 4-8 years of experience in functional testing with a good foundation in technical expertise
  • Experience in the Capital Markets domain is a MUST
  • Exposure to API testing tools like SoapUI and Postman
  • Well versed with SQL
  • Hands-on experience in debugging issues using Unix commands
  • Basic understanding of XML and JSON structures
  • Knowledge of FitNesse is good to have
  • Should be an early joiner.


Fatakpay
Posted by Disha Gajra
Mumbai, Andheri East
5 - 8 yrs
₹10L - ₹20L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+5 more

Job Description:

Position: Senior Manager- Data Analytics (Fintech Firm)

Experience: 5-8 Years

Location: Mumbai-Andheri

Employment Type: Full-Time

About Us:

We are a dynamic fintech firm dedicated to revolutionizing the financial services industry through innovative data solutions. We believe in leveraging cutting-edge technology to provide superior financial products and services to our clients. Join our team and be a part of this exciting journey.

Job Overview:

We are looking for a skilled Data Engineer with 3-5 years of experience to join our data team. The ideal candidate will have a strong background in ETL processes, data pipeline creation, and database management. As a Data Engineer, you will be responsible for designing, developing, and maintaining scalable data systems and pipelines.

Key Responsibilities:

  • Design and develop robust and scalable ETL processes to ingest and process large datasets from various sources.
  • Build and maintain efficient data pipelines to support real-time and batch data processing.
  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
  • Optimize database performance and ensure data integrity and security.
  • Troubleshoot and resolve data-related issues and provide support for data operations.
  • Implement data quality checks and monitor data pipeline performance.
  • Document technical solutions and processes for future reference.

Required Skills and Qualifications:

  • Bachelor's degree in Engineering, or a related field.
  • 3-5 years of experience in data engineering or a related role.
  • Strong proficiency in ETL tools and techniques.
  • Experience with SQL and relational databases (e.g., MySQL, PostgreSQL).
  • Familiarity with big data technologies
  • Proficiency in programming languages such as Python, Java, or Scala.
  • Knowledge of data warehousing concepts and tools 
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and collaboration skills.

Preferred Qualifications:

  • Experience with data visualization tools (e.g., Tableau, Power BI).
  • Knowledge of machine learning and data science principles.
  • Experience with real-time data processing and streaming platforms (e.g., Kafka).

What We Offer:

  • Competitive compensation package (10-15 LPA) based on experience and qualifications.
  • Opportunity to work with a talented and innovative team in the fintech industry.
  • Professional development and growth opportunities.

How to Apply:

If you are passionate about data engineering and eager to contribute to a forward-thinking fintech firm, we would love to hear from you.

