
50+ SQL Jobs in Pune | SQL Job openings in Pune

Apply to 50+ SQL Jobs in Pune on CutShort.io. Explore the latest SQL Job opportunities across top companies like Google, Amazon & Adobe.

DeepIntent

at DeepIntent

2 candid answers
17 recruiters
Amruta Mundale
Posted by Amruta Mundale
Pune
8 - 10 yrs
Best in industry
Technical support
SQL
Apache
Google Cloud Platform (GCP)
Amazon Web Services (AWS)
+2 more

What You’ll Do:

We are looking for a Staff Operations Engineer based in Pune, India who can master both DeepIntent’s data architectures and pharma research and analytics methodologies to make significant contributions to how health media is analyzed by our clients. This role requires an Engineer who not only understands DBA functions but also how they impact research objectives and can work with researchers and data scientists to achieve impactful results.  

This role will be in the Engineering Operations team and will require integration and partnership with the Engineering Organization. The ideal candidate is an inquisitive self-starter who is not afraid to take on and learn from challenges and constantly seeks to improve the facets of the business they manage. The ideal candidate will also need to demonstrate the ability to collaborate and partner with others.

  • Serve as the Engineering interface between Analytics and Engineering teams.
  • Develop and standardize all interface points for analysts to retrieve and analyze data with a focus on research methodologies and data-based decision-making.
  • Optimize queries and data access efficiency, serving as an expert in how to most efficiently attain desired data points.
  • Build “mastered” versions of the data for Analytics-specific querying use cases.
  • Establish a formal data practice for the Analytics practice in conjunction with the rest of DeepIntent.
  • Interpret analytics methodology requirements and apply them to data architecture to create standardized queries and operations for use by analytics teams.
  • Implement DataOps practices.
  • Master existing and new Data Pipelines and develop appropriate queries to meet analytics-specific objectives.
  • Collaborate with various business stakeholders, software engineers, machine learning engineers, and analysts.
  • Operate between Engineers and Analysts to unify both practices for analytics insight creation.
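The query-optimization responsibilities above can be sketched concretely. The following is a minimal, hypothetical illustration using Python's stdlib `sqlite3` as a stand-in for a production warehouse; the `impressions` table and `idx_imp` index are invented for the example and are not DeepIntent's actual schema.

```python
import sqlite3

# Toy campaign-delivery table standing in for a real analytics dataset.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE impressions (campaign_id INTEGER, day TEXT, cost REAL)")
conn.executemany(
    "INSERT INTO impressions VALUES (?, ?, ?)",
    [(i % 50, f"2024-01-{(i % 28) + 1:02d}", 0.01 * i) for i in range(10_000)],
)

query = "SELECT day, SUM(cost) FROM impressions WHERE campaign_id = ? GROUP BY day"

# Without an index, SQLite must scan the whole table for this analyst query.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (7,)).fetchall()

# A covering index on the filter and grouping columns turns it into an index search.
conn.execute("CREATE INDEX idx_imp ON impressions (campaign_id, day, cost)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (7,)).fetchall()

print(plan_before[0][-1])  # a full-table SCAN step
print(plan_after[0][-1])   # an index SEARCH step
```

The same explain-first habit carries over to PostgreSQL or BigQuery, where plan output guides which "mastered" tables or indexes are worth building for analysts.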

Who You Are:

  • 8+ years of experience in technical support, specializing in monitoring and maintaining data pipelines.
  • Adept in market research methodologies and using data to deliver representative insights.
  • Inquisitive and curious; understands how to query complicated data sets and how to move and combine data between databases.
  • Deep SQL experience is a must.
  • Exceptional communication skills with the ability to collaborate and translate between technical and non-technical needs.
  • English Language Fluency and proven success working with teams in the U.S.
  • Experience in designing, developing and operating configurable Data pipelines serving high-volume and velocity data.
  • Experience working with public clouds like GCP/AWS.
  • Good understanding of software engineering, DataOps, data architecture, and Agile and DevOps methodologies.
  • Proficient with SQL, Python or JVM-based language, Bash.
  • Experience with any of Apache open-source projects such as Spark, Druid, Beam, Airflow etc. and big data databases like BigQuery, Clickhouse, etc. 
  • Ability to think big, take bets and innovate, dive deep, hire and develop the best talent, learn and be curious.
  • Experience debugging UI and back-end issues is an added advantage.


The Blue Owls Solutions

at The Blue Owls Solutions

2 candid answers
Apoorvo Chakraborty
Posted by Apoorvo Chakraborty
Pune
6 - 10 yrs
₹20L - ₹30L / yr
Data governance
Data engineering
Team leadership
Data modeling
Synapse
+3 more

The Role


We are looking for a Senior/Lead Azure Data Engineer to join our team in Pune. You will be responsible for the end-to-end lifecycle of data solutions, from initial client requirement gathering and solution architecture design to leading the data engineering team through implementation. You will be the technical anchor for the project, ensuring that our data estates are scalable, governed, and high-performing.


Key Responsibilities

  • Architecture & Design: Design robust data architectures using Microsoft Fabric and Azure Synapse, focusing on Medallion architecture and metadata-driven frameworks.
  • End-to-End Delivery: Translate complex client business requirements into technical roadmaps and lead the team to deliver them on time.
  • Data Governance: Implement and manage enterprise-grade governance, data discovery, and lineage using Microsoft Purview.
  • Team Leadership: Act as the technical lead for the team, performing code reviews, mentoring junior engineers, and ensuring best practices in PySpark and SQL.
  • Client Management: Interface directly with stakeholders to define project scope and provide technical consultancy.


What We’re Looking For

  • 6+ years in Data Engineering, with at least 3 years leading technical teams or designing architectures.
  • Expertise in Microsoft Fabric/Synapse: Deep experience with Lakehouses, Warehouses, and Spark-based processing.
  • Governance Specialist: Proven experience implementing Microsoft Purview for data cataloging, sensitivity labeling, and lineage.
  • Technical Breadth: Strong proficiency in PySpark, SQL, and Data Factory. Familiarity with Infrastructure as Code (Bicep/Terraform) is a major plus.

Why Work with Us?

  • Competitive Pay
  • Flexible Hours
  • Work on Microsoft’s latest (Fabric, Purview, Foundry) as a Designated Solutions Partner.
  • High-Stakes Impact: Solve complex, client-facing problems for enterprise leaders
  • Structured learning paths to help you master AI automation and Agentic AI.


GeniWay Technologies

at GeniWay Technologies

1 candid answer
GeniWay Hiring
Posted by GeniWay Hiring
Pune
2 - 3 yrs
₹8L - ₹10L / yr
Python
FastAPI
SQL
NodeJS (Node.js)
Database modeling
+5 more

About Company (GeniWay)

GeniWay Technologies is pioneering India’s first AI-native platform for personalized learning and career guidance, transforming the way students learn, grow, and determine their future path. Addressing challenges in the K-12 system such as one-size-fits-all teaching and limited career awareness, GeniWay leverages cutting-edge AI to create a tailored educational experience for every student. The core technology includes an AI-powered learning engine, a 24x7 multilingual virtual tutor and Clario, a psychometrics-backed career guidance system. Aligned with NEP 2020 policies, GeniWay is on a mission to make high-quality learning accessible to every student in India, regardless of their background or region.


What you’ll do

  • Build the career assessment backbone: attempt lifecycle (create/resume/submit), timing metadata, partial attempts, idempotent APIs.
  • Implement deterministic scoring pipelines with versioning and audit trails (what changed, when, why).
  • Own Postgres data modeling: schemas, constraints, migrations, indexes, query performance.
  • Create safe, structured GenAI context payloads (controlled vocabulary, safety flags, eval datasets) to power parent/student narratives.
  • Raise reliability: tests for edge cases, monitoring, reprocessing/recalculation jobs, safe logging (no PII leakage).
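The attempt-lifecycle and audit-trail bullets above can be sketched as follows. This is an illustrative, in-memory toy (the function names, record fields, and scoring rule are invented, not GeniWay's actual design), showing why an idempotency key makes retried submissions safe and how a versioned, append-only audit log records what changed and why.

```python
import hashlib
import json

SCORING_VERSION = "v1"

attempts = {}   # attempt_id -> stored scoring record
audit_log = []  # append-only trail: what changed and why

def submit_attempt(attempt_id: str, answers: dict, idempotency_key: str) -> dict:
    """Idempotent submit: replaying the same key returns the stored result."""
    existing = attempts.get(attempt_id)
    if existing and existing["idempotency_key"] == idempotency_key:
        return existing  # safe retry: no double scoring, no duplicate audit entry

    # Deterministic scoring: same answers + same version always give the same score.
    score = sum(1 for v in answers.values() if v == "correct")
    record = {
        "attempt_id": attempt_id,
        "idempotency_key": idempotency_key,
        "score": score,
        "scoring_version": SCORING_VERSION,
        "answers_hash": hashlib.sha256(
            json.dumps(answers, sort_keys=True).encode()
        ).hexdigest(),
    }
    attempts[attempt_id] = record
    audit_log.append({"event": "scored", "attempt_id": attempt_id,
                      "why": f"submit under {SCORING_VERSION}"})
    return record

first = submit_attempt("a1", {"q1": "correct", "q2": "wrong"}, "key-123")
retry = submit_attempt("a1", {"q1": "correct", "q2": "wrong"}, "key-123")
print(first["score"], len(audit_log))
```

Storing the answers hash and scoring version alongside the score is what later makes reprocessing/recalculation jobs auditable.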


Must-have skills

  • Backend development in Python (FastAPI/Django/Flask) or Node (NestJS) with production API experience.
  • Strong SQL + PostgreSQL fundamentals (transactions, indexes, schema design, migrations).
  • Testing discipline: unit + integration tests for logic-heavy code; systematic debugging approach.
  • Comfort using AI coding copilots to speed up scaffolding/tests/refactors — while validating correctness.
  • Ownership mindset: cares about correctness, data integrity, and reliability.
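As a small illustration of the transaction fundamentals listed above (using the stdlib `sqlite3` module in place of PostgreSQL; the `scores` table is invented for the example): a failing statement inside a transaction rolls back every statement in it, which is what keeps scoring data consistent.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE scores (student TEXT PRIMARY KEY,"
    " score INTEGER NOT NULL CHECK (score >= 0))"
)
conn.execute("INSERT INTO scores VALUES ('asha', 70)")
conn.commit()

try:
    with conn:  # opens a transaction; rolls back automatically on error
        conn.execute("UPDATE scores SET score = score + 10 WHERE student = 'asha'")
        # This violates the CHECK constraint and raises IntegrityError.
        conn.execute("UPDATE scores SET score = -5 WHERE student = 'asha'")
except sqlite3.IntegrityError:
    pass  # the whole transaction is rolled back, including the first UPDATE

score = conn.execute("SELECT score FROM scores WHERE student = 'asha'").fetchone()[0]
print(score)
```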


Good to have

  • Experience with rule engines, scoring systems, or audit-heavy domains (fintech, healthcare, compliance).
  • Event schemas/telemetry pipelines and observability basics.
  • Exposure to RAG/embeddings/vector DBs or prompt evaluation harnesses.


Location: Pune (on-site for first 3 months; hybrid/WFH flexibility thereafter)

Employment Type: Full-time

Experience: 2–3 years (correctness-first; strong learning velocity)

Compensation: Competitive (₹8–10 LPA fixed cash) + ESOP (equity ownership, founding-early employee level)

Joining Timeline: 2–3 weeks / Immediate


Why join (founding team)

  • You’ll build core IP: scoring integrity and data foundations that everything else depends on.
  • Rare skill-building: reliable systems + GenAI-safe context/evals (not just API calls).
  • Meaningful ESOP upside at an early stage.
  • High trust, high ownership, fast learning.
  • High-impact mission: reduce confusion and conflict in student career decisions; help families make better choices, transform student lives by making great learning personal.


Hiring process (fast)

1. 20-min intro call (fit + expectations).

2. 45–60 min SQL & data modeling, API deep dive.

3. Practical exercise (2–3 hours max) implementing a small scoring service with tests.

4. Final conversation + offer.


How to apply

Reply with your resume/LinkedIn profile plus one example of a system/feature where you owned data modeling and backend integration (a short paragraph is fine).

NonStop io Technologies Pvt Ltd
Kalyani Wadnere
Posted by Kalyani Wadnere
Pune
6 - 10 yrs
Best in industry
Java
Spring Boot
Microservices
Hibernate (Java)
Amazon Web Services (AWS)
+6 more

Company Description

NonStop io Technologies, founded in August 2015, is a Bespoke Engineering Studio specializing in Product Development. With over 80 satisfied clients worldwide, we serve startups and enterprises across prominent technology hubs, including San Francisco, New York, Houston, Seattle, London, Pune, and Tokyo. Our experienced team brings over 10 years of expertise in building web and mobile products across multiple industries. Our work is grounded in empathy, creativity, collaboration, and clean code, striving to build products that matter and foster an environment of accountability and collaboration.


Role Description

This is a full-time hybrid role for a Java Software Engineer, based in Pune. The Java Software Engineer will be responsible for designing, developing, and maintaining software applications. Key responsibilities include working with microservices architecture, implementing and managing the Spring Framework, and programming in Java. Collaboration with cross-functional teams to define, design, and ship new features is also a key aspect of this role.


Responsibilities:

● Develop and Maintain: Write clean, efficient, and maintainable code for Java-based applications 

● Collaborate: Work with cross-functional teams to gather requirements and translate them into technical solutions 

● Code Reviews: Participate in code reviews to maintain high-quality standards 

● Troubleshooting: Debug and resolve application issues in a timely manner 

● Testing: Develop and execute unit and integration tests to ensure software reliability

● Optimize: Identify and address performance bottlenecks to enhance application performance 


Qualifications & Skills:

● Strong knowledge of Java, Spring Framework (Spring Boot, Spring MVC), and Hibernate/JPA 

● Familiarity with RESTful APIs and web services 

● Proficiency in working with relational databases like MySQL or PostgreSQL 

● Practical experience with AWS cloud services and building scalable, microservices-based architectures

● Experience with build tools like Maven or Gradle 

● Understanding of version control systems, especially Git 

● Strong understanding of object-oriented programming principles and design patterns 

● Familiarity with automated testing frameworks and methodologies 

● Excellent problem-solving skills and attention to detail 

● Strong communication skills and ability to work effectively in a collaborative team environment 


Why Join Us? 

● Opportunity to work on cutting-edge technology products 

● A collaborative and learning-driven environment 

● Exposure to AI and software engineering innovations 

● Excellent work ethic and culture 


If you're passionate about technology and want to work on impactful projects, we'd love to hear from you.

Talent Pro
Mayank choudhary
Posted by Mayank choudhary
Mumbai, Pune, Bengaluru (Bangalore)
3 - 5 yrs
₹8L - ₹12L / yr
Data Analytics
SQL

Must have strong SQL skills (queries, optimization, procedures, triggers), with hands-on experience building and automating work through SQL.

Looking for candidates with 2+ years of experience working on large datasets (1 crore records or more), including handling the challenges of breaking them down.

Must have advanced Excel skills.

Should have 3+ years of relevant experience.

Should have reporting and dashboard creation experience.

Should have database development and maintenance experience.

Must have strong communication skills for client interactions.

Should have the ability to work independently.

Willingness to work from client locations.

ByteFoundry AI

at ByteFoundry AI

4 candid answers
Bisman Gill
Posted by Bisman Gill
Remote only
3 - 8 yrs
Up to ₹40L / yr (varies)
React.js
NodeJS (Node.js)
Python
SQL
Amazon Web Services (AWS)
+3 more

About the Role

We are looking for a motivated Full Stack Developer with 2–5 years of hands-on experience in building scalable web applications. You will work closely with senior engineers and product teams to develop new features, improve system performance, and ensure high-quality code delivery.

Responsibilities

- Develop and maintain full-stack applications.

- Implement clean, maintainable, and efficient code.

- Collaborate with designers, product managers, and backend engineers.

- Participate in code reviews and debugging.

- Work with REST APIs/GraphQL.

- Contribute to CI/CD pipelines.

- Ability to work independently as well as within a collaborative team environment.


Required Technical Skills

- Strong knowledge of JavaScript/TypeScript.

- Experience with React.js, Next.js.

- Backend experience with Node.js, Express, NestJS.

- Understanding of SQL/NoSQL databases.

- Experience with Git, APIs, debugging tools.

- Cloud familiarity (AWS/GCP/Azure).

AI and System Mindset

Experience working with AI-powered systems is a strong plus. Candidates should be comfortable integrating AI agents, third-party APIs, and automation workflows into applications, and should demonstrate curiosity and adaptability toward emerging AI technologies.

Soft Skills

- Strong problem-solving ability.

- Good communication and teamwork.

- Fast learner and adaptable.

Education

Bachelor's degree in Computer Science / Engineering or equivalent.

Phi Commerce

at Phi Commerce

2 candid answers
Nikita Deshmuk
Posted by Nikita Deshmuk
Pune
10 - 20 yrs
₹1L - ₹30L / yr
Production support
Linux/Unix
SQL
People Management
Stakeholder management
+1 more

Experience - 10-20  Yrs

Job Location - CommerZone, Yerwada, Pune

Work Mode - Work from Office

Shifts - General Shift

Work days - 5 days

Qualification - Full-time graduation mandatory

Domain - Payment/Card/Banking/BFSI/ Retail Payments

Job Type - Full Time

Notice period - Immediate or 30 days


Interview Process -


1) Screening

2) Virtual L1 interview

3) Managerial Round Face to Face at Pune Office

4) HR Discussion


Job Description 


Job Summary:


The Production/L2 Application Support Manager will be responsible for managing the banking applications that support our payment gateway systems in a production environment. You will oversee the deployment, monitoring, optimization, and maintenance of all application components. You will ensure that our systems run smoothly, meet business and regulatory requirements, and provide high availability for our customers.


Key Responsibilities:

  • Manage and optimize the application for the payment gateway systems to ensure high availability, reliability, and scalability.
  • Oversee the day-to-day operations of production environments, including managing cloud services (AWS), load balancing, database systems, and monitoring tools.
  • Lead a team of application support engineers and administrators, providing technical guidance and support to ensure applications and infrastructure solutions are implemented efficiently and effectively.
  • Collaborate with development, security, and product teams to ensure applications support the needs of the business and comply with relevant regulations.
  • Monitor application performance and system health using monitoring tools and ensure quick resolution of any performance bottlenecks or system failures.
  • Develop and maintain capacity planning, monitoring, and backup strategies to ensure scalability and minimal downtime during peak transaction periods.
  • Drive continuous improvement of processes and tools for efficient production/application management.
  • Ensure robust security practices are in place across production systems, including compliance with industry standards
  • Conduct incident response, root cause analysis, and post-mortem analysis to prevent recurring issues and improve system performance.
  • Oversee regular patching, updates, and version control of production systems to minimize vulnerabilities.
  • Develop and maintain application support documentation, including architecture diagrams, processes, and disaster recovery plans.
  • Manage and execute on-call duties, ensuring timely resolution of application-related issues and ensuring proper support coverage.


Skills and Qualifications:

  • Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).
  • 8+ years of experience managing L2 application support in high-availability, mission-critical environments, ideally within a payment gateway or fintech organization.
  • Experience working in L2 production support for Java-based applications.
  • Experience with database systems (SQL, NoSQL) and database management, including high availability and disaster recovery strategies.
  • Excellent communication and leadership skills, with the ability to collaborate effectively across teams and drive initiatives forward.
  • Ability to work well under pressure and in high-stakes situations, ensuring uptime and service continuity.
Phi Commerce

at Phi Commerce

2 candid answers
Nikita Sinha
Posted by Nikita Sinha
Pune
2 - 5 yrs
Up to ₹10L / yr (varies)
Integration
System integration
SQL
Linux/Unix

We are looking for a skilled and motivated Integration Engineer to join our dynamic team in the payment domain. This role involves the seamless integration of payment systems, APIs, and third-party services into our platform, ensuring smooth and secure payment processing. The ideal candidate will bring experience with payment technologies, integration methodologies, and a strong grasp of industry standards.

Key Responsibilities:

  • System Integration:
  • Design, develop, and maintain integrations between various payment processors, gateways, and internal platforms using RESTful APIs, SOAP, and related technologies.
  • Payment Gateway Integration:
  • Integrate third-party payment solutions such as Visa, MasterCard, PayPal, Stripe, and others into the platform.
  • Troubleshooting & Support:
  • Identify and resolve integration issues including transactional failures, connectivity issues, and third-party service disruptions.
  • Testing & Validation:
  • Conduct end-to-end integration testing to ensure payment system functionality across development, staging, and production environments.

Qualifications:

  • Education:
  • Bachelor’s degree in Computer Science, Engineering, Information Technology, or a related field. Equivalent work experience is also acceptable.
  • Experience:
  • 3+ years of hands-on experience in integrating payment systems and third-party services.
  • Proven experience with payment gateways (e.g., Stripe, Square, PayPal, Adyen) and protocols (e.g., ISO 20022, EMV).
  • Familiarity with payment processing systems and industry standards.

Desirable Skills:

  • Strong understanding of API security, OAuth, and tokenization practices.
  • Experience with PCI-DSS compliance.
  • Excellent problem-solving and debugging skills.
  • Effective communication and cross-functional collaboration capabilities.
Phi Commerce

at Phi Commerce

2 candid answers
Ariba Khan
Posted by Ariba Khan
Pune
3 - 5 yrs
Up to ₹10L / yr (varies)
Java
Spring Boot
Microservices
SQL
React.js

We are seeking an experienced and highly skilled Java (Fullstack) Engineer to join our team.


The ideal candidate will have a strong background in both back-end Java, Spring Boot, and the Spring Framework, and front-end JavaScript with React or Angular, with the ability to build scalable, high-performance applications.


Responsibilities

  • Develop, test, and deploy scalable and robust back-end services using Java and Spring Boot
  • Build responsive and user-friendly front-end applications using modern JavaScript frameworks such as React or Angular
  • Collaborate with architects and team members to design scalable, maintainable, and efficient systems
  • Contribute to architectural decisions for microservices, APIs, and cloud solutions
  • Implement and maintain RESTful APIs for seamless integration
  • Write clean, efficient, and reusable code adhering to best practices
  • Conduct code reviews, performance optimizations, and debugging
  • Work with cross-functional teams, including UX/UI designers, product managers, and QA
  • Mentor junior developers and provide technical guidance

Skills & Requirements

  • Minimum 3 years of experience in backend/full-stack development
  • Back-end - Core Java/Java 8, Spring Boot, Spring Framework, Microservices, REST APIs, Kafka
  • Front-end - JavaScript, HTML, CSS, TypeScript, Angular
  • Database - MySQL

Preferred

  • Experience with Batch writing, Application performance, Caches security, Web Security
  • Experience working in fintech, payments, or high-scale production environments
Phi Commerce

at Phi Commerce

2 candid answers
Ariba Khan
Posted by Ariba Khan
Pune
8 - 12 yrs
Up to ₹25L / yr (varies)
Java
Spring Boot
Microservices
SQL

We are seeking an experienced and highly skilled Java Lead to join our team. The ideal candidate will have a strong background in both front-end and back-end technologies, with expertise in Java and Spring.

As a Lead, you will be responsible for overseeing the development team, architecting scalable applications, and ensuring best practices in software development. This role requires a hands-on leader with excellent problem-solving abilities and a passion for mentoring junior team members.


Responsibilities

  • Lead and mentor a team of developers, providing guidance on coding standards, architecture, and best practices
  • Architect, design, and develop end-to-end Java-based web applications, ensuring high performance, security, and scalability
  • Work closely with cross-functional teams, including product managers, designers, and other developers, to ensure alignment on project requirements and deliverables
  • Conduct code reviews and provide constructive feedback to team members to improve code quality and maintain a consistent codebase
  • Participate in Agile/Scrum ceremonies such as stand-ups, sprint planning, and retrospectives to contribute to the development process
  • Troubleshoot and resolve complex technical issues and ensure timely resolution of bugs and improvements
  • Stay up to date with emerging technologies and industry trends, recommending and implementing improvements to keep our stack modern and effective

Skills & Requirements

  • Minimum 8 years of experience in Java development, with at least 2 years in a lead developer role
  • Back-end - Core Java/Java 8, Spring Boot, Spring Framework, Microservices, REST APIs, Kafka
  • Database - MySQL
  • Must be working in the fintech/payments domain

Preferred

  • Experience with Batch writing, Application performance, Caches security, Web Security
Phi Commerce

at Phi Commerce

2 candid answers
Ariba Khan
Posted by Ariba Khan
Pune
5 - 8 yrs
Up to ₹20L / yr (varies)
Java
Spring Boot
Microservices
SQL
Linux/Unix
+1 more

We are seeking an experienced and highly skilled Java (Fullstack) Engineer to join our team.


The ideal candidate will have a strong background in both back-end Java, Spring Boot, and the Spring Framework, and front-end JavaScript with React or Angular, with the ability to build scalable, high-performance applications.


Responsibilities

  • Develop, test, and deploy scalable and robust back-end services using Java and Spring Boot
  • Build responsive and user-friendly front-end applications using modern JavaScript frameworks such as React or Angular
  • Collaborate with architects and team members to design scalable, maintainable, and efficient systems
  • Contribute to architectural decisions for microservices, APIs, and cloud solutions
  • Implement and maintain RESTful APIs for seamless integration
  • Write clean, efficient, and reusable code adhering to best practices
  • Conduct code reviews, performance optimizations, and debugging
  • Work with cross-functional teams, including UX/UI designers, product managers, and QA
  • Mentor junior developers and provide technical guidance

Skills & Requirements

  • Minimum 5 years of experience in backend/full-stack development
  • Back-end - Core Java/Java 8, Spring Boot, Spring Framework, Microservices, REST APIs, Kafka
  • Front-end - JavaScript, HTML, CSS, TypeScript, Angular
  • Database - MySQL

Preferred

  • Experience with Batch writing, Application performance, Caches security, Web Security
  • Experience working in fintech, payments, or high-scale production environments
NeoGenCode Technologies Pvt Ltd
Akshay Patil
Posted by Akshay Patil
Pune
2 - 10 yrs
₹8L - ₹25L / yr
Java
Spring Boot
RESTful APIs
Hibernate (Java)
JPA
+8 more

Job Title : Java Developer

Experience : 2 to 10 Years

Location : Pune (Must be currently in Pune)

Notice Period : Immediate to 15 Days (Serving NP acceptable)

Budget :

  • 2 to 3.5 yrs → up to 13 LPA
  • 3.5 to 5 yrs → up to 18 LPA
  • 5+ yrs → up to 25 LPA

Mandatory Skills : Java 8/17, Spring Boot, REST APIs, Hibernate/JPA, SQL/RDBMS, OOPs, Design Patterns, Git/GitHub, Unit Testing, Microservices (Good Coding Skills Mandatory)


Role Overview :

Hiring multiple Java Developers to build scalable and performance-driven applications. Strong hands-on coding and problem-solving skills required.


Key Responsibilities :

  • Develop and maintain Java-based applications & REST services
  • Write clean, testable code with JUnit & unit tests
  • Participate in code reviews, debugging & optimization
  • Work with SQL databases, CI/CD & version control tools
  • Collaborate with cross-functional teams in Agile setups

Good to Have :

  • MongoDB, AWS, Docker, Jenkins/GitHub Actions, Prometheus, Grafana, Spring Actuators, Tomcat/JBoss
DeepIntent

at DeepIntent

2 candid answers
17 recruiters
Amruta Mundale
Posted by Amruta Mundale
Pune
5 - 8 yrs
Best in industry
Python
SQL
Spark
airflow
pandas
+6 more

What You’ll Do:

As a Sr. Data Scientist, you will work closely across DeepIntent Data Science teams located in New York, India, and Bosnia. The role will focus on building predictive models and implementing data-driven solutions to maximize ad effectiveness. You will also lead efforts in generating analyses and insights related to the measurement of campaign outcomes, Rx, patient journey, and supporting the evolution of the DeepIntent product suite. Activities in this position include developing and deploying models in production, reading campaign results, analyzing medical claims, clinical, demographic, and clickstream data, performing analysis and creating actionable insights, and summarizing and presenting results and recommended actions to internal stakeholders and external clients, as needed.

  • Explore ways to create better predictive models.
  • Analyze medical claims, clinical, demographic and clickstream data to produce and present actionable insights.
  • Explore ways of using inference, statistical, and machine learning techniques to improve the performance of existing algorithms and decision heuristics.
  • Design and deploy new iterations of production-level code.
  • Contribute posts to our upcoming technical blog.
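As a minimal sketch of what "building predictive models" involves at its core, here is a purely illustrative, stdlib-only example on synthetic data (production work at this level would use dedicated ML tooling): a one-feature logistic regression fit by batch gradient descent.

```python
import math
import random

random.seed(0)

# Synthetic data: positives cluster around x = +1, negatives around x = -1.
data = [(random.gauss(1.0, 0.5), 1) for _ in range(200)] + \
       [(random.gauss(-1.0, 0.5), 0) for _ in range(200)]

w, b, lr = 0.0, 0.0, 0.1
for _ in range(300):
    grad_w = grad_b = 0.0
    for x, y in data:
        p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # sigmoid probability
        grad_w += (p - y) * x                     # log-loss gradient w.r.t. w
        grad_b += (p - y)                         # log-loss gradient w.r.t. b
    w -= lr * grad_w / len(data)
    b -= lr * grad_b / len(data)

def predict(x):
    """Classify as positive when the model probability reaches 0.5."""
    return 1.0 / (1.0 + math.exp(-(w * x + b))) >= 0.5

accuracy = sum(predict(x) == (y == 1) for x, y in data) / len(data)
print(round(accuracy, 2))
```

The same gradient-descent idea underlies the boosting and deep learning methods listed below in the requirements; the difference is model capacity and tooling, not the fitting principle.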

Who You Are:

  • Bachelor’s degree in a STEM field, such as Statistics, Mathematics, Engineering, Biostatistics, Econometrics, Economics, Finance, or Data Science.
  • 5+ years of working experience as a Data Scientist or Researcher in digital marketing, consumer advertisement, telecom, or other areas requiring customer-level predictive analytics.
  • Advanced proficiency in performing statistical analysis in Python, including relevant libraries, is required.
  • Experience working with data processing, transformation and building model pipelines using tools such as Spark, Airflow, and Docker.
  • You have an understanding of the ad-tech ecosystem, digital marketing and advertising data and campaigns or familiarity with the US healthcare patient and provider systems (e.g. medical claims, medications).
  • You have varied and hands-on predictive machine learning experience (deep learning, boosting algorithms, inference…).
  • You are interested in translating complex quantitative results into meaningful findings and interpretable deliverables, and communicating with less technical audiences orally and in writing.
  • You can write production level code, work with Git repositories.
  • Active Kaggle participant.
  • Working experience with SQL.
  • Familiar with medical and healthcare data (medical claims, Rx, preferred).
  • Conversant with cloud technologies such as AWS or Google Cloud.
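Much of the customer-level predictive work described above begins with stabilizing noisy campaign metrics. As a purely illustrative sketch (not DeepIntent's actual methodology; function and variable names are invented), empirical-Bayes smoothing shrinks a campaign's raw conversion rate toward a portfolio-wide prior when impression counts are small:

```python
def smoothed_rate(conversions, impressions, prior_rate, prior_strength=100):
    """Shrink a campaign's raw conversion rate toward a global prior.

    Equivalent to placing a Beta prior on the rate with prior_strength
    pseudo-impressions, which stabilizes estimates for small campaigns.
    """
    return (conversions + prior_rate * prior_strength) / (impressions + prior_strength)

# Toy campaign data: (conversions, impressions)
campaigns = {"A": (3, 50), "B": (300, 5000)}
global_rate = 0.05  # assumed portfolio-wide conversion rate

estimates = {name: smoothed_rate(c, n, global_rate)
             for name, (c, n) in campaigns.items()}
```

The `prior_strength` parameter plays the role of pseudo-impressions: larger values pull low-volume campaigns harder toward the global rate, while high-volume campaigns stay close to their observed rate.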
AdElement

at AdElement

2 recruiters
Ritisha Nigam
Posted by Ritisha Nigam
Pune
2 - 5 yrs
₹3L - ₹7L / yr
adtech
SQL
Java
JavaScript
Python

Company Description


AdElement is a leading digital advertising technology company that has been helping app publishers increase their ad revenue and reach untapped demand since 2011. With our expertise in connecting brands to app audiences on evolving screens, such as VR headsets and vehicle consoles, we enable our clients to be first to market. We have been recognized as the Google Agency of the Year and have offices globally, with our headquarters located in New Brunswick, New Jersey.


Job Description


Work alongside a highly skilled engineering team to design, develop, and maintain large-scale, highly performant, real-time applications.

Own building features, driving directly with product and other engineering teams.

Demonstrate excellent communication skills in working with technical and non-technical audiences.

Be an evangelist for best practices across all functions - developers, QA, and infrastructure/ops.

Be an evangelist for platform innovation and reuse.


Requirements:


2+ years of experience building large-scale and low-latency distributed systems.

Command of Java or C++.

Solid understanding of algorithms, data structures, performance optimization techniques, object-oriented programming, multi-threading, and real-time programming.

Experience with distributed caching, SQL/NoSQL, and other databases is a plus.

Experience with Big Data and cloud services such as AWS/GCP is a plus.

Experience in the advertising domain is a big plus.

B.S. or M.S. degree in Computer Science, Engineering, or equivalent.


Location: Pune, Maharashtra.





AdElement

at AdElement

2 recruiters
Ritisha Nigam
Posted by Ritisha Nigam
Pune
3 - 10 yrs
₹5L - ₹10L / yr
Java
Amazon Web Services (AWS)
Spark
Hadoop
Amazon Redshift
+3 more

We are looking for Senior Software Engineers responsible for designing, developing, and maintaining large-scale distributed ad technology systems. This entails working on several different systems, platforms, and technologies, and collaborating with various engineering teams to meet a range of technological challenges. You will work with our product team to contribute to and influence the roadmap of our products and technologies, and also influence and inspire team members.


Experience

  • 3 - 10 Years


Required Skills

  • 3+ years of work experience and a degree in computer science or a similar field
  • Knowledgeable about computer science fundamentals including data structures, algorithms, and coding
  • Enjoy owning projects from creation to completion and wearing multiple hats
  • Product focused mindset
  • Experience building distributed systems capable of handling large volumes of traffic
  • Fluency with Java, Vertex, Redis, Relational Databases
  • Possess good communication skills
  • Enjoy working in a team-oriented environment that values excellence
  • Have a knack for solving very challenging problems
  • (Preferred) Previous experience in advertising technology or gaming apps
  • (Preferred) Hands-on experience with Spark, Kafka or similar open-source software

Responsibilities

  • Creating design and architecture documents
  • Conducting code reviews
  • Collaborate with others in the engineering teams to meet a range of technological challenges
  • Build, design, and develop large-scale advertising technology systems capable of handling tens of billions of events daily

Education

  • UG - B.Tech/B.E. - Computers; PG - M.Tech - Computer


What We Offer:

  • Competitive salary and benefits package.
  • Opportunities for professional growth and development.
  • A collaborative and inclusive work environment.


Salary budget: up to ₹50 LPA, or a 20% hike on current CTC.

You can message me on LinkedIn for a quick response.


Wissen Technology

at Wissen Technology

4 recruiters
Janane Mohanasankaran
Posted by Janane Mohanasankaran
Bengaluru (Bangalore), Mumbai, Pune
4 - 7 yrs
Best in industry
Python
pandas
NumPy
SQL
HTML/CSS
+4 more

Specific Knowledge/Skills


  1. 4-6 years of experience
  2. Proficiency in Python programming
  3. Basic knowledge of front-end development
  4. Basic knowledge of data manipulation and analysis libraries (e.g., pandas, NumPy)
  5. Code versioning and collaboration (Git)
  6. Knowledge of libraries for extracting data from websites (web scraping)
  7. Knowledge of SQL and NoSQL databases
  8. Familiarity with RESTful APIs
  9. Familiarity with cloud (Azure/AWS) technologies
Deqode

at Deqode

1 recruiter
purvisha Bhavsar
Posted by purvisha Bhavsar
Indore, Pune, Bhopal, Mumbai, Nagpur, Kolkata, Bengaluru (Bangalore), Chennai
4 - 6 yrs
₹4.5L - ₹18L / yr
Java
Spring Boot
Microservices
SQL

🚀 Hiring: Java Developer at Deqode

⭐ Experience: 4+ Years

📍 Location: Indore, Pune, Mumbai, Nagpur, Noida, Kolkata, Bangalore, Chennai

⭐ Work Mode:- Hybrid

⏱️ Notice Period: Immediate Joiners

(Only immediate joiners & candidates serving notice period)


Requirements

✅ Strong proficiency in Java (Java 8/11/17)

✅ Experience with Spring / Spring Boot

✅ Knowledge of REST APIs, Microservices architecture

✅ Familiarity with SQL/NoSQL databases

✅ Understanding of Git, CI/CD pipelines

✅ Problem-solving skills and attention to detail


Global digital transformation solutions provider

Global digital transformation solutions provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Pune
6 - 9 yrs
₹15L - ₹25L / yr
Data engineering
Apache Kafka
Python
Amazon Web Services (AWS)
AWS Lambda
+11 more

Job Details

- Job Title: Lead I - Data Engineering 

- Industry: Global digital transformation solutions provider

- Domain - Information technology (IT)

- Experience Required: 6-9 years

- Employment Type: Full Time

- Job Location: Pune

- CTC Range: Best in Industry


Job Description

Job Title: Senior Data Engineer (Kafka & AWS)

Responsibilities:

  • Develop and maintain real-time data pipelines using Apache Kafka (MSK or Confluent) and AWS services.
  • Configure and manage Kafka connectors, ensuring seamless data flow and integration across systems.
  • Demonstrate strong expertise in the Kafka ecosystem, including producers, consumers, brokers, topics, and schema registry.
  • Design and implement scalable ETL/ELT workflows to efficiently process large volumes of data.
  • Optimize data lake and data warehouse solutions using AWS services such as Lambda, S3, and Glue.
  • Implement robust monitoring, testing, and observability practices to ensure reliability and performance of data platforms.
  • Uphold data security, governance, and compliance standards across all data operations.
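The producer/consumer/topic/schema-registry concepts listed above can be exercised without a broker. The sketch below is an in-memory stand-in, not the MSK or Confluent client API; the class, method, and field names are invented for illustration only:

```python
import json
from collections import defaultdict, deque

class InMemoryBroker:
    """Broker-free stand-in for Kafka topics: ordered, append-only per topic."""
    def __init__(self):
        self.topics = defaultdict(deque)

    def produce(self, topic, record, schema):
        # Mimic a schema-registry check: reject records missing required fields
        missing = [field for field in schema if field not in record]
        if missing:
            raise ValueError(f"record missing fields: {missing}")
        self.topics[topic].append(json.dumps(record).encode())

    def consume(self, topic):
        # Drain the topic in order, deserializing each message
        while self.topics[topic]:
            yield json.loads(self.topics[topic].popleft())

broker = InMemoryBroker()
schema = ("event_id", "user_id", "ts")
broker.produce("clicks", {"event_id": 1, "user_id": "u9", "ts": 1700000000}, schema)
events = list(broker.consume("clicks"))
```

The point of the sketch is the contract, not the transport: producers serialize against an agreed schema, and consumers can rely on every message in the topic having passed that check.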

 

Requirements:

  • Minimum of 5 years of experience in Data Engineering or related roles.
  • Proven expertise with Apache Kafka and the AWS data stack (MSK, Glue, Lambda, S3, etc.).
  • Proficient in coding with Python, SQL, and Java — with Java strongly preferred.
  • Experience with Infrastructure-as-Code (IaC) tools (e.g., CloudFormation) and CI/CD pipelines.
  • Excellent problem-solving, communication, and collaboration skills.
  • Flexibility to write production-quality code in both Python and Java as required.

 

Skills: AWS, Kafka, Python



Notice period: 0 to 15 days only

Highfly Sourcing

at Highfly Sourcing

2 candid answers
Highfly Hr
Posted by Highfly Hr
Singapore, Switzerland, New Zealand, Dubai, Dublin, Ireland, Augsburg, Germany, Manchester (United Kingdom), Qatar, Kuwait, Malaysia, Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, Goa
3 - 5 yrs
₹15L - ₹25L / yr
SQL
PHP
Python
Data Visualization
Data Structures
+5 more

We are seeking a motivated Data Analyst to support business operations by analyzing data, preparing reports, and delivering meaningful insights. The ideal candidate should be comfortable working with data, identifying patterns, and presenting findings in a clear and actionable way.

Key Responsibilities:

  • Collect, clean, and organize data from internal and external sources
  • Analyze large datasets to identify trends, patterns, and opportunities
  • Prepare regular and ad-hoc reports for business stakeholders
  • Create dashboards and visualizations using tools like Power BI or Tableau
  • Work closely with cross-functional teams to understand data requirements
  • Ensure data accuracy, consistency, and quality across reports
  • Document data processes and analysis methods


Wissen Technology

at Wissen Technology

4 recruiters
Praffull Shinde
Posted by Praffull Shinde
Pune, Mumbai, Bengaluru (Bangalore)
5 - 10 yrs
Best in industry
.NET
C#
SQL

Job Description

Wissen Technology is seeking an experienced C# .NET Developer to build and maintain applications related to streaming market data. This role involves developing message-based C#/.NET applications to process, normalize, and summarize large volumes of market data efficiently. The candidate should have a strong foundation in Microsoft .NET technologies and experience working with message-driven, event-based architecture. Knowledge of capital markets and equity market data is highly desirable.


Responsibilities

  • Design, develop, and maintain message-based C#/.NET applications for processing real-time and batch market data feeds.
  • Build robust routines to download and process data from AWS S3 buckets on a frequent schedule.
  • Implement daily data summarization and data normalization routines.
  • Collaborate with business analysts, data providers, and other developers to deliver high-quality, scalable market data solutions.
  • Troubleshoot and optimize market data pipelines to ensure low latency and high reliability.
  • Contribute to documentation, code reviews, and team knowledge sharing.

Required Skills and Experience

  • 5+ years of professional experience programming in C# and Microsoft .NET framework.
  • Strong understanding of message-based and real-time programming architectures.
  • Experience working with AWS services, specifically S3, for data retrieval and processing.
  • Experience with SQL and Microsoft SQL Server.
  • Familiarity with Equity market data, FX, Futures & Options, and capital markets concepts.
  • Excellent interpersonal and communication skills.
  • Highly motivated, curious, and analytical mindset with the ability to work well both independently and in a team environment.


Techno Wise
Ishita Panwar
Posted by Ishita Panwar
Pune
6 - 10 yrs
₹30L - ₹35L / yr
Microsoft Windows Azure
SQL
Informatica MDM
Amazon Web Services (AWS)
Informatica PowerCenter
+2 more

Profile: Senior Data Engineer (Informatica MDM)


Primary Purpose:

The Senior Data Engineer will be responsible for building new segments in a Customer Data Platform (CDP), maintaining those segments, and understanding the data requirements, data integrity, data quality, and data sources involved in building specific use cases. The candidate should also have an understanding of ETL processes. This position requires an understanding of integrations with cloud service providers such as Microsoft Azure, Azure Data Lake Services, and Azure Data Factory, and with cloud data warehouse platforms, in addition to enterprise data warehouse environments. The ideal candidate will also have proven experience in data analysis and management, with excellent analytical and problem-solving abilities.


Major Functions/Responsibilities

• Design, develop and implement robust and extensible solutions to build segmentations using Customer Data Platform.

• Work closely with subject matter experts to identify and document business requirements and functional specs, and translate them into appropriate technical solutions.

• Responsible for estimating, planning, and managing the user stories, tasks and reports on Agile Projects.

• Develop advanced SQL Procedures, Functions and SQL jobs.

• Performance tuning and optimization of ETL Jobs, SQL Queries and Scripts.

• Configure and maintain scheduled ETL jobs, data segments and refresh.

• Support exploratory data analysis, statistical analysis, and predictive analytics.

• Support production issues and maintain existing data systems by researching and troubleshooting any issues/problems in a timely manner.

• Proactive, great attention to detail, results-oriented problem solver.
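SQL tuning of the kind listed above usually starts by checking whether the engine can satisfy a predicate from an index rather than a full scan. Below is a self-contained illustration using SQLite's EXPLAIN QUERY PLAN; the actual stack here is an enterprise warehouse, so treat the engine and the table names as stand-ins:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE segment_members (segment_id INTEGER, customer_id INTEGER)")
conn.executemany(
    "INSERT INTO segment_members VALUES (?, ?)",
    [(i % 10, i) for i in range(1000)],
)

def plan(sql):
    # EXPLAIN QUERY PLAN reports whether SQLite scans the table or uses an index
    return " ".join(detail for *_ignored, detail
                    in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT COUNT(*) FROM segment_members WHERE segment_id = 3"
before = plan(query)   # full table scan: every row is examined
conn.execute("CREATE INDEX idx_seg ON segment_members(segment_id)")
after = plan(query)    # the same query now resolves via the index
```

The same habit carries over to any RDBMS: inspect the plan, add or adjust an index, and confirm the plan actually changed before declaring the query tuned.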


Preferred Experience

• 6+ years of experience in writing SQL queries and stored procedures to extract, manipulate and load data.

• 6+ years of experience designing, building, testing, and maintaining data integrations for data marts and data warehouses.

• 3+ years of experience integrating with Azure/AWS Data Lakes, Azure Data Factory, and IDMC (Informatica Cloud Services).

• In-depth understanding of database management systems, online analytical processing (OLAP), and ETL (extract, transform, load) frameworks.

• Excellent verbal and written communication skills

• Collaboration with both onshore and offshore development teams.

• A good understanding of marketing tools such as Salesforce Marketing Cloud, Adobe Marketing, or Microsoft Customer Insights Journey, and of Customer Data Platforms, will be important to this role.

Communication

• Facilitate project team meetings effectively.

• Effectively communicate relevant project information to superiors

• Deliver engaging, informative, well-organized presentations that are effectively tailored to the intended audience.

• Serve as a technical liaison with development partner.

• Serve as a communication bridge between applications team, developers and infrastructure team members to facilitate understanding of current systems

• Resolve and/or escalate issues in a timely fashion.

• Understand how to communicate difficult/sensitive information tactfully.

• Works under the direction of the Technical Data Lead / Data Architect.

Education

Bachelor’s degree or higher in Engineering, Technology, or a related field required.

DeepIntent

at DeepIntent

2 candid answers
17 recruiters
Amruta Mundale
Posted by Amruta Mundale
Pune
4 - 8 yrs
Best in industry
Java
SQL
Spring Boot
Apache
Amazon Web Services (AWS)
+1 more

What You’ll Do:

  • Setting up formal data practices for the company.
  • Building and running super stable and scalable data architectures.
  • Making it easy for folks to add and use new data with self-service pipelines.
  • Getting DataOps practices in place.
  • Designing, developing, and running data pipelines to help out Products, Analytics, data scientists and machine learning engineers.
  • Creating simple, reliable data storage, ingestion, and transformation solutions that are a breeze to deploy and manage.
  • Writing and Managing reporting API for different products.
  • Implementing different methodologies for different reporting needs.
  • Teaming up with all sorts of people – business folks, other software engineers, machine learning engineers, and analysts.

Who You Are:

  • Bachelor’s degree in engineering (CS / IT) or equivalent degree from a well-known Institute / University.
  • 3.5+ years of experience in building and running data pipelines for tons of data.
  • Experience with public clouds like GCP or AWS.
  • Experience with Apache open-source projects like Spark, Druid, Airflow, and big data databases like BigQuery, Clickhouse.
  • Experience making data architectures that are optimised for both performance and cost.
  • Good grasp of software engineering, DataOps, data architecture, Agile, and DevOps.
  • Proficient in SQL, Java, Spring Boot, Python, and Bash.
  • Good communication skills for working with technical and non-technical people.
  • Someone who thinks big, takes chances, innovates, dives deep, gets things done, hires and develops the best, and is always learning and curious.


Global digital transformation solutions provider.

Global digital transformation solutions provider.

Agency job
via Peak Hire Solutions by Dhara Thakkar
Thiruvananthapuram, Chennai, Pune
4 - 7 yrs
₹10L - ₹20L / yr
C#
Test Automation (QA)
Manual testing
Play Framework
SQL
+6 more

Role Proficiency:

Performs tests in strict compliance, independently guides other testers, and assists test leads.


Additional Comments:

Position Title: Automation + Manual Tester

Primary Skills: Playwright, xUnit, Allure Report, Page Object Model, .NET, C#, Database Queries

Secondary Skills: Git, JIRA, Manual Testing

Experience: 4 to 5 years

ESSENTIAL FUNCTIONS AND BASIC DUTIES

1. Leadership in Automation Strategy:
  • Assess the feasibility and scope of automation efforts to ensure they align with project timelines and requirements.
  • Identify opportunities for process improvements and automation within the software development life cycle (SDLC).

2. Automation Test Framework Development:
  • Design, develop, and implement reusable test automation frameworks for various testing phases (unit, integration, functional, performance, etc.).
  • Ensure the automation frameworks integrate well with CI/CD pipelines and other development tools.
  • Maintain and optimize test automation scripts and frameworks for continuous improvements.

3. Team Management:
  • Lead and mentor a team of automation engineers, ensuring they follow best practices, write efficient test scripts, and develop scalable automation solutions.
  • Conduct regular performance evaluations and provide constructive feedback.
  • Facilitate knowledge-sharing sessions within the team.

4. Collaboration with Cross-functional Teams:
  • Work closely with development, QA, and operations teams to ensure proper implementation of automated testing and automation practices.
  • Collaborate with business analysts, product owners, and project managers to understand business requirements and translate them into automated test cases.

5. Continuous Integration & Delivery (CI/CD):
  • Ensure that automated tests are integrated into the CI/CD pipelines to facilitate continuous testing.
  • Identify and resolve issues related to the automation processes within the CI/CD pipeline.

6. Test Planning and Estimation:
  • Contribute to the test planning phase by identifying key automation opportunities.
  • Estimate effort and time required for automating test cases and other automation tasks.

7. Test Reporting and Metrics:
  • Monitor automation test results and generate detailed reports on test coverage, defects, and progress.
  • Analyze test results to identify trends, bottlenecks, or issues in the automation process and make necessary improvements.

8. Automation Tools Management:
  • Evaluate, select, and manage automation tools and technologies that best meet the needs of the project.
  • Ensure that the automation tools used align with the overall project requirements and help to achieve optimal efficiency.

9. Test Environment and Data Management:
  • Work on setting up and maintaining the test environments needed for automation.
  • Ensure automation scripts work across multiple environments, including staging, testing, and production environments.

10. Risk Management & Issue Resolution:

• Proactively identify risks associated with the automation efforts and provide solutions or mitigation strategies.

• Troubleshoot issues in the automation scripts, framework, and infrastructure to ensure minimal downtime and quick issue resolution.

11. Develop and Maintain Automated Tests: Write and maintain automated scripts for different testing levels, including regression, functional, and integration tests.

12. Bug Identification and Tracking: Report, track, and manage defects identified through automation testing to ensure quick resolution.

13. Improve Test Coverage: Identify gaps in test coverage and develop additional test scripts to improve test comprehensiveness.

14. Automation Documentation: Create and maintain detailed documentation for test automation processes, scripts, and frameworks.

15. Quality Assurance: Ensure that all automated testing activities meet the quality standards, contributing to delivering a high-quality software product.

16. Stakeholder Communication: Regularly update project stakeholders about automation progress, risks, and areas for improvement.


REQUIRED KNOWLEDGE

1. Automation Tools Expertise: Proficiency in tools like Playwright and Allure Reports, and their integration with CI/CD pipelines.

2. Programming Languages: Strong knowledge of languages such as .NET and test frameworks like xUnit.

3. Version Control: Experience using Git for script management and collaboration.

4. Test Automation Frameworks: Ability to design scalable, reusable frameworks for different types of tests (functional, integration, etc.).

5. Leadership and Mentoring: Lead and mentor automation teams, ensuring adherence to best practices and continuous improvement.

6. Problem-Solving: Strong troubleshooting and analytical skills to identify and resolve automation issues quickly.

7. Collaboration and Communication: Excellent communication skills for working with cross-functional teams and presenting test results.

8. Time Management: Ability to estimate, prioritize, and manage automation tasks to meet project deadlines.

9. Quality Focus: Strong commitment to improving software quality, test coverage, and automation efficiency.


Skills: xUnit, Allure report, Playwright, C#

Banking Industry

Banking Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Mangalore, Pune, Mumbai
3 - 5 yrs
₹8L - ₹11L / yr
Data Analytics
SQL
Relational Database (RDBMS)
Java
Python
+1 more

Required Skills: Strong SQL Expertise, Data Reporting & Analytics, Database Development, Stakeholder & Client Communication, Independent Problem-Solving & Automation Skills

 

Review Criteria

· Must have Strong SQL skills (queries, optimization, procedures, triggers)

· Must have Advanced Excel skills

· Should have 3+ years of relevant experience

· Should have Reporting + dashboard creation experience

· Should have Database development & maintenance experience

· Must have Strong communication for client interactions

· Should have Ability to work independently

· Willingness to work from client locations.

 

Description

Who is an ideal fit for us?

We seek professionals who are analytical, demonstrate self-motivation, exhibit a proactive mindset, and possess a strong sense of responsibility and ownership in their work.

 

What will you get to work on?

As a member of the Implementation & Analytics team, you will:

● Design, develop, and optimize complex SQL queries to extract, transform, and analyze data

● Create advanced reports and dashboards using SQL, stored procedures, and other reporting tools

● Develop and maintain database structures, stored procedures, functions, and triggers

● Optimize database performance by tuning SQL queries, and indexing to handle large datasets efficiently

● Collaborate with business stakeholders and analysts to understand analytics requirements

● Automate data extraction, transformation, and reporting processes to improve efficiency


What do we expect from you?

For the SQL/Oracle Developer role, we are seeking candidates with the following skills and Expertise:

● Proficiency in SQL (Window functions, stored procedures) and MS Excel (advanced Excel skills)

● More than 3 years of relevant experience

● Java / Python experience is a plus but not mandatory

● Strong communication skills to interact with customers to understand their requirements

● Capable of working independently with minimal guidance, showcasing self-reliance and initiative

● Previous experience in automation projects is preferred

● Work From Office: Bangalore/Navi Mumbai/Pune/Client locations
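Window functions, one of the core SQL skills requested above, partition a result set without collapsing it the way GROUP BY does. A minimal self-contained example follows (SQLite via Python, purely to keep the sketch runnable; the role itself targets SQL/Oracle, and the table is invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (rep TEXT, region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("asha", "west", 500.0),
    ("ben", "west", 300.0),
    ("chen", "east", 700.0),
    ("dev", "east", 200.0),
])

# RANK() OVER a per-region partition: find the top rep in each region,
# a typical reporting/dashboard query shape
top_reps = conn.execute("""
    SELECT rep, region FROM (
        SELECT rep, region,
               RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
        FROM sales
    ) WHERE rnk = 1
    ORDER BY region
""").fetchall()
```

The same PARTITION BY / ORDER BY pattern underlies running totals, deduplication by latest timestamp, and most "top N per group" reports.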

 

ImmersiveDataAI
Ishan  Agrawal
Posted by Ishan Agrawal
Pune
0 - 1 yrs
₹10000 - ₹20000 / mo
Python
SQL
Large Language Models (LLM) tuning
Data engineering

Entry Level | On-Site | Pune

Internship Opportunity: Data + AI Intern

Location: Pune, India (In-office)

Duration: 2 Months

Start Date: Between 11th July 2025 and 15th August 2025

Work Days: Monday to Friday

Stipend: As per company policy

About ImmersiveData.AI

Smarter Data. Smarter Decisions. Smarter Enterprises.™

At ImmersiveData.AI, we don’t just transform data—we challenge and redefine business models. By leveraging cutting-edge AI, intelligent automation, and modern data platforms, we empower enterprises to unlock new value and drive strategic transformation.

About the Internship

As a Data + AI Intern, you will gain hands-on experience at the intersection of data engineering and AI. You’ll be part of a collaborative team working on real-world data challenges using modern tools like Snowflake, DBT, Airflow, and LLM frameworks. This internship is a launchpad for students looking to enter the rapidly evolving field of Data & AI.

Key Responsibilities

  • Assist in designing, building, and optimizing data pipelines and ETL workflows
  • Work with structured and unstructured datasets across various sources
  • Contribute to AI-driven automation and analytics use cases
  • Support backend integration of large language models (LLMs)
  • Collaborate in building data platforms using tools like Snowflake, DBT, and Airflow

Required Skills

  • Proficiency in Python
  • Strong understanding of SQL and relational databases
  • Basic knowledge of Data Engineering and Data Analysis concepts
  • Familiarity with cloud data platforms or willingness to learn (e.g., Snowflake)

Preferred Learning Certifications (Optional but Recommended)

  • Python Programming
  • SQL & MySQL/PostgreSQL
  • Statistical Modeling
  • Tableau / Power BI
  • Voice App Development (Bonus)

Who Can Apply

Only candidates who:

  • Are available full-time (in-office, Pune)
  • Can start between 11th July and 15th August 2025
  • Are available for a minimum of 2 months
  • Have relevant skills and interest in data and AI

Perks

  • Internship Certificate
  • Letter of Recommendation
  • Work with cutting-edge tools and technologies
  • Informal dress code
  • Exposure to real industry use cases and mentorship


Inteliment Technologies

at Inteliment Technologies

2 candid answers
Ariba Khan
Posted by Ariba Khan
Pune
4 - 7 yrs
Up to ₹20L / yr (varies)
SQL
Data modeling
Data Vault
ERwin
Star schema
+1 more

About the company:

Inteliment is a niche business analytics company with an almost two-decade proven track record of partnering with hundreds of Fortune 500 global companies. Inteliment operates its ISO-certified development centre in Pune, India, and has business operations in multiple countries through subsidiaries in Singapore and Europe, with its headquarters in India.


About the Role:

We are seeking an experienced Technical Data Professional with hands-on expertise in designing and implementing dimensional data models using Erwin or any dimensional model tool and building SQL-based solutions adhering to Data Vault 2.0 and Information Mart standards. The ideal candidate will have strong data analysis capabilities, exceptional SQL skills, and a deep understanding of data relationships, metrics, and granularity of the data structures.


Qualifications:

  • Bachelor’s degree in computer science, Information Technology, or a related field.
  • Certifications with related field will be an added advantage.


Key Competencies:

1. Technical Expertise:

  • Proficiency in Erwin for data modeling.
  • Advanced SQL skills with experience in writing and optimizing performance driven queries.
  • Hands-on experience with Data Vault 2.0 and Information Mart standards is highly preferred.
  • Solid understanding of Star Schema, Facts & Dimensions, and Business Unit (BU) architecture.

2. Analytical Skills:

  • Strong data analysis skills to evaluate data relationships, metrics, and granularities.
  • Capability to troubleshoot and resolve complex data modeling and performance issues.

3. Soft Skills:

  • Strong problem-solving and decision-making skills.
  • Excellent communication and stakeholder management abilities.
  • Proactive and detail-oriented with a focus on delivering high-quality results.


Key Responsibilities:

1. Dimensional Data Modeling:

  • Design and develop dimensional data models using Erwin with a focus on Star Schema and BUS architecture (Fact and Dimension tables).
  • Ensure models align with business requirements and provide scalability, performance, and maintainability.

2. SQL Development:

  • Implement data models in SQL using best practices for view creation, ensuring high performance.
  • Write, optimize, and refactor complex SQL queries for efficiency and performance in large-scale databases.
  • Develop solutions adhering to Information Mart and Data Vault 2.0 standards (a dimensional model built from Raw Data Vault tables: Hubs, Links, Satellites, Effectivity Satellites, Bridge, and PIT tables).
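In miniature, the Raw Vault-to-Information Mart flow described above looks like this: a Hub carries the business key, a Satellite carries descriptive history, and the mart view exposes the current satellite row per key (a crude stand-in for a PIT table). All table and column names below are illustrative, not a prescribed standard; SQLite keeps the sketch self-contained:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Hub: one row per business key
    CREATE TABLE hub_customer (customer_hk TEXT PRIMARY KEY, customer_id TEXT);
    -- Satellite: full descriptive history, one row per load
    CREATE TABLE sat_customer (customer_hk TEXT, load_ts TEXT, city TEXT);

    INSERT INTO hub_customer VALUES ('hk1', 'C-100');
    INSERT INTO sat_customer VALUES
        ('hk1', '2024-01-01', 'Pune'),
        ('hk1', '2024-06-01', 'Mumbai');

    -- Information Mart view: latest satellite row per hub key
    CREATE VIEW dim_customer AS
    SELECT h.customer_id, s.city
    FROM hub_customer h
    JOIN sat_customer s ON s.customer_hk = h.customer_hk
    WHERE s.load_ts = (SELECT MAX(load_ts) FROM sat_customer
                       WHERE customer_hk = h.customer_hk);
""")
current = conn.execute("SELECT customer_id, city FROM dim_customer").fetchall()
```

The vault tables are insert-only and keep every version; the mart layer is where "current state" and business-friendly dimensional shapes are derived as views.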

3. Data Analysis & Relationship Metrics:

  • Perform in-depth data analysis to identify patterns, relationships, and metrics at different levels of granularity.
  • Ensure data integrity and quality by validating data models against business expectations.

4. Performance Optimization:

  • Conduct performance tuning of existing data structures, queries, and ETL processes.
  • Provide guidance on database indexing, partitioning, and query optimization techniques.

5. Collaboration:

  • Work closely with business stakeholders, data engineers, and analysts to understand and translate business needs into effective data solutions.
  • Support cross-functional teams to ensure seamless integration and delivery of data solutions
Inteliment Technologies

at Inteliment Technologies

2 candid answers
Ariba Khan
Posted by Ariba Khan
Pune
3 - 5 yrs
Up to ₹16L / yr (varies)
SQL
skill iconPython
ETL
skill iconAmazon Web Services (AWS)
Azure
+1 more

About the company:

Inteliment is a niche business analytics company with an almost two-decade proven track record of partnering with hundreds of Fortune 500 global companies. Inteliment operates its ISO-certified development centre in Pune, India, and has business operations in multiple countries through subsidiaries in Singapore and Europe, with its headquarters in India.


About the Role:

As a Data Engineer, you will contribute to cutting-edge global projects and innovative product initiatives, delivering impactful solutions for our Fortune clients. In this role, you will take ownership of the entire data pipeline and infrastructure development lifecycle—from ideation and design to implementation and ongoing optimization. Your efforts will ensure the delivery of high-performance, scalable, and reliable data solutions. Join us to become a driving force in shaping the future of data infrastructure and innovation, paving the way for transformative advancements in the data ecosystem.


Qualifications:

  • Bachelor’s or master’s degree in computer science, Information Technology, or a related field.
  • Certifications with related field will be an added advantage.


Key Competencies:

  • Must have experience with SQL, Python and Hadoop
  • Good to have experience with Cloud Computing Platforms (AWS, Azure, GCP, etc.), DevOps Practices, Agile Development Methodologies
  • Experience with ETL or other similar technologies will be an advantage.
  • Core Skills: Proficiency in SQL, Python, or Scala for data processing and manipulation
  • Data Platforms: Experience with cloud platforms such as AWS, Azure, or Google Cloud.
  • Tools: Familiarity with tools like Apache Spark, Kafka, and modern data warehouses (e.g., Snowflake, BigQuery, Redshift).
  • Soft Skills: Strong problem-solving abilities, collaboration, and communication skills to work effectively with technical and non-technical teams.
  • Additional: Knowledge of SAP would be an advantage 


Key Responsibilities:

  • Data Pipeline Development: Build, maintain, and optimize ETL/ELT pipelines for seamless data flow.
  • Data Integration: Consolidate data from various sources into unified systems.
  • Database Management: Design and optimize scalable data storage solutions.
  • Data Quality Assurance: Ensure data accuracy, consistency, and completeness.
  • Collaboration: Work with analysts, scientists, and stakeholders to meet data needs.
  • Performance Optimization: Enhance pipeline efficiency and database performance.
  • Data Security: Implement and maintain robust data security and governance policies
  • Innovation: Adopt new tools and design scalable solutions for future growth.
  • Monitoring: Continuously monitor and maintain data systems for reliability.
  • Data Engineers ensure reliable, high-quality data infrastructure for analytics and decision-making.
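The pipeline-development and data-quality responsibilities above follow the familiar extract-transform-load pattern, which can be sketched as follows (a minimal in-memory example; the `raw_sales` and `sales` tables and their columns are hypothetical):

```python
import sqlite3

# Hypothetical source and target illustrating a tiny ETL step:
# extract raw rows, transform (parse amounts, drop invalid rows), load a clean table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_sales (region TEXT, amount TEXT)")
conn.executemany("INSERT INTO raw_sales VALUES (?, ?)",
                 [("west", "10.50"), ("east", "7.25"), ("west", "not-a-number")])

# Extract
rows = conn.execute("SELECT region, amount FROM raw_sales").fetchall()

# Transform: keep only rows whose amount parses as a number.
clean = []
for region, amount in rows:
    try:
        clean.append((region, float(amount)))
    except ValueError:
        pass  # a production pipeline would quarantine bad records, not drop them silently

# Load
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", clean)

total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 17.75
```

Real pipelines add orchestration, monitoring, and security around this core loop, but the extract / transform / load stages stay recognizable.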
Read more
Global Digital Transformation Solutions Provider

Global Digital Transformation Solutions Provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Pune
6 - 12 yrs
₹15L - ₹30L / yr
Machine Learning (ML)
Amazon Web Services (AWS)
Kubernetes
ECS
Amazon Redshift
+14 more

Core Responsibilities:

  • The MLE will design, build, test, and deploy scalable machine learning systems, optimizing model accuracy and efficiency
  • Model Development: Algorithms and architectures span traditional statistical methods to deep learning along with employing LLMs in modern frameworks.
  • Data Preparation: Prepare, cleanse, and transform data for model training and evaluation.
  • Algorithm Implementation: Implement and optimize machine learning algorithms and statistical models.
  • System Integration: Integrate models into existing systems and workflows.
  • Model Deployment: Deploy models to production environments and monitor performance.
  • Collaboration: Work closely with data scientists, software engineers, and other stakeholders.
  • Continuous Improvement: Identify areas for improvement in model performance and systems.

 

Skills:

  • Programming and Software Engineering: Knowledge of software engineering best practices (version control, testing, CI/CD).
  • Data Engineering: Ability to handle data pipelines, data cleaning, and feature engineering. Proficiency in SQL for data manipulation, plus Kafka, ChaosSearch logs, etc. for troubleshooting; other tech touch points are ScyllaDB (similar to Bigtable), OpenSearch, and Neo4j graph.
  • Model Deployment and Monitoring: MLOps Experience in deploying ML models to production environments.
  • Knowledge of model monitoring and performance evaluation.

 

Required experience:

  • Amazon SageMaker: Deep understanding of SageMaker's capabilities for building, training, and deploying ML models; understanding of the Sagemaker pipeline with ability to analyze gaps and recommend/implement improvements
  • AWS Cloud Infrastructure: Familiarity with S3, EC2, Lambda and using these services in ML workflows
  • AWS data: Redshift, Glue
  • Containerization and Orchestration: Understanding of Docker and Kubernetes, and their implementation within AWS (EKS, ECS)

 

Skills: AWS, AWS Cloud, Amazon Redshift, EKS

 

Must-Haves

Machine Learning + AWS + (EKS OR ECS OR Kubernetes) + (Redshift AND Glue) + SageMaker

Notice period - 0 to 15 days only

Hybrid work mode- 3 days office, 2 days at home

Read more
Deqode

at Deqode

1 recruiter
Samiksha Agrawal
Posted by Samiksha Agrawal
Mumbai, Pune, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Indore, Bengaluru (Bangalore)
4 - 7 yrs
₹4L - ₹10L / yr
Java
Spring Boot
Microservices
SQL
Hibernate (Java)

Job Description

Role: Java Developer

Location: PAN India

Experience: 4+ Years

Required Skills -

  1. 3+ years Java development experience
  2. Spring Boot framework expertise (MANDATORY)
  3. Microservices architecture design & implementation (MANDATORY)
  4. Hibernate/JPA for database operations (MANDATORY)
  5. RESTful API development (MANDATORY)
  6. Database design and optimization (MANDATORY)
  7. Container technologies (Docker/Kubernetes)
  8. Cloud platforms experience (AWS/Azure)
  9. CI/CD pipeline implementation
  10. Code review and quality assurance
  11. Problem-solving and debugging skills
  12. Agile/Scrum methodology
  13. Version control systems (Git)


Read more
Banking Industry

Banking Industry

Agency job
via Jobdost by Saida Pathan
Mangalore, Mumbai, Pune, Bengaluru (Bangalore)
3 - 5 yrs
₹8L - ₹10L / yr
SQL
Dashboard
Data Analytics
Database Development

Who is an ideal fit for us?

We seek professionals who are analytical, demonstrate self-motivation, exhibit a proactive mindset, and possess a strong sense of responsibility and ownership in their work.

 

What will you get to work on?


As a member of the Implementation & Analytics team, you will:

● Design, develop, and optimize complex SQL queries to extract, transform, and analyze data

● Create advanced reports and dashboards using SQL, stored procedures, and other reporting tools

● Develop and maintain database structures, stored procedures, functions, and triggers

● Optimize database performance by tuning SQL queries, and indexing to handle large datasets efficiently

● Collaborate with business stakeholders and analysts to understand analytics requirements

● Automate data extraction, transformation, and reporting processes to improve efficiency


What do we expect from you?


For the SQL/Oracle Developer role, we are seeking candidates with the following skills and Expertise:

● Proficiency in SQL (Window functions, stored procedures) and MS Excel (advanced Excel skills)

● More than 3 years of relevant experience

● Java / Python experience is a plus but not mandatory

● Strong communication skills to interact with customers to understand their requirements

● Capable of working independently with minimal guidance, showcasing self-reliance and initiative

● Previous experience in automation projects is preferred

● Work From Office: Bangalore/Navi Mumbai/Pune/Client locations
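The window functions called out above (ranking, partitioned aggregates) can be illustrated with a small, hypothetical example run through SQLite from Python (requires SQLite 3.25+; the `payments` table and names are made up):

```python
import sqlite3

# Hypothetical payments table: rank each customer's payments by amount
# and compute a per-customer running total using window functions.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO payments VALUES (?, ?)",
                 [("asha", 100.0), ("asha", 250.0), ("ravi", 80.0), ("ravi", 120.0)])

rows = conn.execute("""
    SELECT customer, amount,
           RANK() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk,
           SUM(amount)  OVER (PARTITION BY customer)                AS customer_total
    FROM payments
    ORDER BY customer, rnk
""").fetchall()

for row in rows:
    print(row)  # e.g. ('asha', 250.0, 1, 350.0)
```

Unlike GROUP BY, window functions keep every input row while attaching the aggregate, which is what makes them useful for report and dashboard queries.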


Read more
Financial Services Company

Financial Services Company

Agency job
via Peak Hire Solutions by Dhara Thakkar
Pune
4 - 8 yrs
₹10L - ₹13L / yr
SQL
Databricks
PowerBI
Windows Azure
Data engineering
+9 more

Review Criteria

  • Strong Senior Data Engineer profile
  • 4+ years of hands-on Data Engineering experience
  • Must have experience owning end-to-end data architecture and complex pipelines
  • Must have advanced SQL capability (complex queries, large datasets, optimization)
  • Must have strong Databricks hands-on experience
  • Must be able to architect solutions, troubleshoot complex data issues, and work independently
  • Must have Power BI integration experience
  • CTC structure is 80% fixed and 20% variable


Preferred

  • Worked on Call center data, understand nuances of data generated in call centers
  • Experience implementing data governance, quality checks, or lineage frameworks
  • Experience with orchestration tools (Airflow, ADF, Glue Workflows), Python, Delta Lake, Lakehouse architecture


Job Specific Criteria

  • CV Attachment is mandatory
  • Are you Comfortable integrating with Power BI datasets?
  • We have alternate Saturdays working. Are you comfortable with WFH on the 1st and 4th Saturday?


Role & Responsibilities

We are seeking a highly experienced Senior Data Engineer with strong architectural capability, excellent optimisation skills, and deep hands-on experience in modern data platforms. The ideal candidate will have advanced SQL skills, strong expertise in Databricks, and practical experience working across cloud environments such as AWS and Azure. This role requires end-to-end ownership of complex data engineering initiatives, including architecture design, data governance implementation, and performance optimisation. You will collaborate with cross-functional teams to build scalable, secure, and high-quality data solutions.

 

Key Responsibilities-

  • Lead the design and implementation of scalable data architectures, pipelines, and integration frameworks.
  • Develop, optimise, and maintain complex SQL queries, transformations, and Databricks-based data workflows.
  • Architect and deliver high-performance ETL/ELT processes across cloud platforms.
  • Implement and enforce data governance standards, including data quality, lineage, and access control.
  • Partner with analytics, BI (Power BI), and business teams to enable reliable, governed, and high-value data delivery.
  • Optimise large-scale data processing, ensuring efficiency, reliability, and cost-effectiveness.
  • Monitor, troubleshoot, and continuously improve data pipelines and platform performance.
  • Mentor junior engineers and contribute to engineering best practices, standards, and documentation.


Ideal Candidate

  • Proven industry experience as a Senior Data Engineer, with ownership of high-complexity projects.
  • Advanced SQL skills with experience handling large, complex datasets.
  • Strong expertise with Databricks for data engineering workloads.
  • Hands-on experience with major cloud platforms — AWS and Azure.
  • Deep understanding of data architecture, data modelling, and optimisation techniques.
  • Familiarity with BI and reporting environments such as Power BI.
  • Strong analytical and problem-solving abilities with a focus on data quality and governance.
  • Proficiency in Python or another programming language is a plus.
Read more
Non-Banking Financial Company

Non-Banking Financial Company

Agency job
via Peak Hire Solutions by Dhara Thakkar
Pune
4 - 8 yrs
₹8L - ₹13L / yr
SQL
Databricks
PowerBI
Data engineering
Data architecture
+7 more

ROLES AND RESPONSIBILITIES:

We are seeking a highly experienced Senior Data Engineer with strong architectural capability, excellent optimisation skills, and deep hands-on experience in modern data platforms. The ideal candidate will have advanced SQL skills, strong expertise in Databricks, and practical experience working across cloud environments such as AWS and Azure. This role requires end-to-end ownership of complex data engineering initiatives, including architecture design, data governance implementation, and performance optimisation. You will collaborate with cross-functional teams to build scalable, secure, and high-quality data solutions.


Key Responsibilities-

  • Lead the design and implementation of scalable data architectures, pipelines, and integration frameworks.
  • Develop, optimise, and maintain complex SQL queries, transformations, and Databricks-based data workflows.
  • Architect and deliver high-performance ETL/ELT processes across cloud platforms.
  • Implement and enforce data governance standards, including data quality, lineage, and access control.
  • Partner with analytics, BI (Power BI), and business teams to enable reliable, governed, and high-value data delivery.
  • Optimise large-scale data processing, ensuring efficiency, reliability, and cost-effectiveness.
  • Monitor, troubleshoot, and continuously improve data pipelines and platform performance.
  • Mentor junior engineers and contribute to engineering best practices, standards, and documentation.


IDEAL CANDIDATE:

  • Proven industry experience as a Senior Data Engineer, with ownership of high-complexity projects.
  • Advanced SQL skills with experience handling large, complex datasets.
  • Strong expertise with Databricks for data engineering workloads.
  • Hands-on experience with major cloud platforms — AWS and Azure.
  • Deep understanding of data architecture, data modelling, and optimisation techniques.
  • Familiarity with BI and reporting environments such as Power BI.
  • Strong analytical and problem-solving abilities with a focus on data quality and governance.
  • Proficiency in Python or another programming language is a plus.


PERKS, BENEFITS AND WORK CULTURE:

Our people define our passion and our audacious, incredibly rewarding achievements. The company is one of India’s most diversified Non-banking financial companies, and among Asia’s top 10 Large workplaces. If you have the drive to get ahead, we can help find you an opportunity at any of the 500+ locations we’re present in India.

Read more
Non-Banking Financial Company

Non-Banking Financial Company

Agency job
via Peak Hire Solutions by Dhara Thakkar
Pune
1 - 2 yrs
₹5L - ₹6.1L / yr
SQL
Databricks
PowerBI
Data engineering
ETL
+6 more

ROLES AND RESPONSIBILITIES:

We are looking for a Junior Data Engineer who will work under guidance to support data engineering tasks, perform basic coding, and actively learn modern data platforms and tools. The ideal candidate should have foundational SQL knowledge, basic exposure to Databricks. This role is designed for early-career professionals who are eager to grow into full data engineering responsibilities while contributing to data pipeline operations and analytical support.


Key Responsibilities-

  • Support the development and maintenance of data pipelines and ETL/ELT workflows under mentorship.
  • Write basic SQL queries, transformations, and assist with Databricks notebook tasks.
  • Help troubleshoot data issues and contribute to ensuring pipeline reliability.
  • Work with senior engineers and analysts to understand data requirements and deliver small tasks.
  • Assist in maintaining documentation, data dictionaries, and process notes.
  • Learn and apply data engineering best practices, coding standards, and cloud fundamentals.
  • Support basic tasks related to Power BI data preparation or integrations as needed.


IDEAL CANDIDATE:

  • Foundational SQL skills with the ability to write and understand basic queries.
  • Basic exposure to Databricks, data transformation concepts, or similar data tools.
  • Understanding of ETL/ELT concepts, data structures, and analytical workflows.
  • Eagerness to learn modern data engineering tools, technologies, and best practices.
  • Strong problem-solving attitude and willingness to work under guidance.
  • Good communication and collaboration skills to work with senior engineers and analysts.


PERKS, BENEFITS AND WORK CULTURE:

Our people define our passion and our audacious, incredibly rewarding achievements. Bajaj Finance Limited is one of India’s most diversified Non-banking financial companies, and among Asia’s top 10 Large workplaces. If you have the drive to get ahead, we can help find you an opportunity at any of the 500+ locations we’re present in India.

Read more
Financial Services Company

Financial Services Company

Agency job
via Peak Hire Solutions by Dhara Thakkar
Pune
2 - 5 yrs
₹8L - ₹10.7L / yr
SQL Azure
Databricks
ETL
SQL
Data modeling
+4 more

ROLES AND RESPONSIBILITIES:

We are seeking a skilled Data Engineer who can work independently on data pipeline development, troubleshooting, and optimisation tasks. The ideal candidate will have strong SQL skills, hands-on experience with Databricks, and familiarity with cloud platforms such as AWS and Azure. You will be responsible for building and maintaining reliable data workflows, supporting analytical teams, and ensuring high-quality, secure, and accessible data across the organisation.


KEY RESPONSIBILITIES:

  • Design, develop, and maintain scalable data pipelines and ETL/ELT workflows.
  • Build, optimise, and troubleshoot SQL queries, transformations, and Databricks data processes.
  • Work with large datasets to deliver efficient, reliable, and high-performing data solutions.
  • Collaborate closely with analysts, data scientists, and business teams to support data requirements.
  • Ensure data quality, availability, and security across systems and workflows.
  • Monitor pipeline performance, diagnose issues, and implement improvements.
  • Contribute to documentation, standards, and best practices for data engineering processes.


IDEAL CANDIDATE:

  • Proven experience as a Data Engineer or in a similar data-focused role (3+ years).
  • Strong SQL skills with experience writing and optimising complex queries.
  • Hands-on experience with Databricks for data engineering tasks.
  • Experience with cloud platforms such as AWS and Azure.
  • Understanding of ETL/ELT concepts, data modelling, and pipeline orchestration.
  • Familiarity with Power BI and data integration with BI tools.
  • Strong analytical and troubleshooting skills, with the ability to work independently.
  • Experience working end-to-end on data engineering workflows and solutions.


PERKS, BENEFITS AND WORK CULTURE:

Our people define our passion and our audacious, incredibly rewarding achievements. The company is one of India’s most diversified Non-banking financial companies, and among Asia’s top 10 Large workplaces. If you have the drive to get ahead, we can help find you an opportunity at any of the 500+ locations we’re present in India.

Read more
Verinite Technologies

at Verinite Technologies

3 candid answers
2 recruiters
Bisman Gill
Posted by Bisman Gill
Pune
8yrs+
Upto ₹25L / yr (Varies)
Acquiring
Switch
MS-Excel
VLOOKUP
JIRA
+5 more

About the Company:


Verinite is a global technology consulting and services company laser-focused on the banking & financial services sector, especially in cards, payments, lending, trade, and treasury.


They partner with banks, fintechs, payment processors, and other financial institutions to modernize their systems, improve operational resilience, and accelerate digital transformation. Their services include consulting, digital strategy, data, application modernization, quality engineering (testing), cloud & infrastructure, and application maintenance.


Skill – Authorization, Clearing and Settlement

1. Individual should have worked on schemes (Visa, Amex, Discover, RuPay & Mastercard), on either the authorization or clearing section.

2. Should be able to read scheme specifications and create business requirements/mapping for authorization and clearing

3. Should have Hands on experience in implementing scheme related changes

4. Should be able to validate and certify the change post-development, based on the mapping created

5. Should be able to work with the Dev team, explaining and guiding on a time-to-time basis.

6. Able to communicate with various teams & senior stakeholders

7. Go-getter and great Googler

8. Schemes – VISA/MC/AMEX/JCB/CUP/Mercury – Discover and Diners, CBUAE, Jaywan (local scheme from UAE)

9. Experience with the issuing side is a plus (good to have).

Read more
Deqode

at Deqode

1 recruiter
purvisha Bhavsar
Posted by purvisha Bhavsar
Pune, Gurugram, Bhopal, Jaipur, Bengaluru (Bangalore)
2 - 4 yrs
₹5L - ₹12L / yr
Windows Azure
SQL
Data Structures
Databricks

 Hiring: Azure Data Engineer

⭐ Experience: 2+ Years

📍 Location: Pune, Bhopal, Jaipur, Gurgaon, Bangalore

⭐ Work Mode:- Hybrid

⏱️ Notice Period: Immediate Joiners

Passport: Mandatory & Valid

(Only immediate joiners & candidates serving notice period)


Mandatory Skills:

Azure Synapse, Azure Databricks, Azure Data Factory (ADF), SQL, Delta Lake, ADLS, ETL/ELT, PySpark.


Responsibilities:

  • Build and maintain data pipelines using ADF, Databricks, and Synapse.
  • Develop ETL/ELT workflows and optimize SQL queries.
  • Implement Delta Lake for scalable lakehouse architecture.
  • Create Synapse data models and Spark/Databricks notebooks.
  • Ensure data quality, performance, and security.
  • Collaborate with cross-functional teams on data requirements.


Nice to Have:

Azure DevOps, Python, Streaming (Event Hub/Kafka), Power BI, Azure certifications (DP-203).


Read more
Global Digital Transformation Solutions Provider

Global Digital Transformation Solutions Provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Pune
6 - 12 yrs
₹25L - ₹30L / yr
Machine Learning (ML)
AWS CloudFormation
Online machine learning
Amazon Web Services (AWS)
ECS
+20 more

MUST-HAVES: 

  • Machine Learning + AWS + (EKS OR ECS OR Kubernetes) + (Redshift AND Glue) + SageMaker
  • Notice period - 0 to 15 days only 
  • Hybrid work mode- 3 days office, 2 days at home


SKILLS: AWS, AWS CLOUD, AMAZON REDSHIFT, EKS


ADDITIONAL GUIDELINES:

  • Interview process: - 2 Technical round + 1 Client round
  • 3 days in office, Hybrid model. 


CORE RESPONSIBILITIES:

  • The MLE will design, build, test, and deploy scalable machine learning systems, optimizing model accuracy and efficiency
  • Model Development: Algorithms and architectures span traditional statistical methods to deep learning along with employing LLMs in modern frameworks.
  • Data Preparation: Prepare, cleanse, and transform data for model training and evaluation.
  • Algorithm Implementation: Implement and optimize machine learning algorithms and statistical models.
  • System Integration: Integrate models into existing systems and workflows.
  • Model Deployment: Deploy models to production environments and monitor performance.
  • Collaboration: Work closely with data scientists, software engineers, and other stakeholders.
  • Continuous Improvement: Identify areas for improvement in model performance and systems.


SKILLS:

  • Programming and Software Engineering: Knowledge of software engineering best practices (version control, testing, CI/CD).
  • Data Engineering: Ability to handle data pipelines, data cleaning, and feature engineering. Proficiency in SQL for data manipulation, plus Kafka, ChaosSearch logs, etc. for troubleshooting; other tech touch points are ScyllaDB (similar to Bigtable), OpenSearch, and Neo4j graph.
  • Model Deployment and Monitoring: MLOps Experience in deploying ML models to production environments.
  • Knowledge of model monitoring and performance evaluation.


REQUIRED EXPERIENCE:

  • Amazon SageMaker: Deep understanding of SageMaker’s capabilities for building, training, and deploying ML models; understanding of the SageMaker pipeline with the ability to analyze gaps and recommend/implement improvements
  • AWS Cloud Infrastructure: Familiarity with S3, EC2, Lambda and using these services in ML workflows
  • AWS data: Redshift, Glue
  • Containerization and Orchestration: Understanding of Docker and Kubernetes, and their implementation within AWS (EKS, ECS)
Read more
Technology Industry
Pune
10 - 14 yrs
₹15L - ₹40L / yr
User Research
Google Analytics
Data Analytics
Mixpanel
CleverTap
+10 more

Review Criteria

  • Strong Lead – User Research & Analyst profile (behavioural/user/product/ux analytics)
  • 10+ years of experience in Behavioral Data Analytics, User Research, or Product Insights, driving data-informed decision-making for B2C digital products (web and app).
  • 6 months+ experience in analyzing user journeys, clickstream, and behavioral data using tools such as Google Analytics, Mixpanel, CleverTap, Firebase, or Amplitude.
  • Experience in leading cross-functional user research and analytics initiatives in collaboration with Product, Design, Engineering, and Business teams to translate behavioral insights into actionable strategies.
  • Strong expertise in A/B testing and experimentation, including hypothesis design, execution, statistical validation, and impact interpretation.
  • Ability to identify behavioral patterns, funnel drop-offs, engagement trends, and user journey anomalies using large datasets and mixed-method analysis.
  • Hands-on proficiency in SQL, Excel, and data visualization/storytelling tools such as Tableau, Power BI, or Looker for executive reporting and dashboard creation.
  • Deep understanding of UX principles, customer journey mapping, and product experience design, with experience integrating qualitative and quantitative insights.
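The A/B-testing expertise described above hinges on statistical validation of an observed lift. One common sketch is a two-proportion z-test, shown here with made-up conversion numbers (stdlib only; this is an illustration, not a prescribed methodology):

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates between two variants."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled proportion under the null hypothesis that both variants convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, Phi(x) = 0.5*(1 + erf(x/sqrt(2))).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: variant B converts 460/4000 vs. variant A's 400/4000.
z, p = two_proportion_z_test(400, 4000, 460, 4000)
print(round(z, 2), round(p, 4))  # z near 2.17, p near 0.03 -> significant at 5%
```

In practice the hypothesis, sample size, and significance threshold are fixed before the experiment runs; the test only validates the design agreed up front.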

 

Preferred

  • Ability to build insightful dashboards and executive reports highlighting user engagement, retention, and behavioral metrics; familiarity with mixed-method research, AI-assisted insight tools (Dovetail, EnjoyHQ, Qualtrics, UserZoom), and mentoring junior researchers


Job Specific Criteria

  • CV Attachment is mandatory
  • We have alternate Saturdays working. Are you comfortable to WFH on the 1st and 4th Saturday?

 

Role & Responsibilities

Product Conceptualization & UX Strategy Development:

  • Conceptualize customer experience strategies
  • Collaborate with product managers to conceptualize new products & align UX with product roadmaps.
  • Develop and implement UX strategies that align with business objectives.
  • Stay up-to-date with industry trends and best practices in UX & UI for AI.
  • Assist in defining product requirements and features.
  • Use data analytics to inform product strategy and prioritize features.
  • Ensure product alignment with customer needs and business goals.
  • Develop platform blueprints that include a features and functionalities map, ecosystem map, and information architecture.
  • Create wireframes, prototypes, and mock-ups using tools like Figma
  • Conduct usability testing and iterate designs based on feedback
  • Employ tools like X-Mind for brainstorming and mind mapping


Customer Journey Analysis:

  • Understand and map out customer journeys and scenarios.
  • Identify pain points and opportunities for improvement.
  • Develop customer personas and empathy maps.


Cross-Functional Collaboration:

  • Work closely with internal units such as UX Research, Design, UX Content, and UX QA to ensure seamless delivery of CX initiatives.
  • Coordinate with development teams to ensure UX designs are implemented accurately.


Data Analytics and Tools:

  • Utilize clickstream and analytics tools like Google Analytics, CleverTap, and Medallia to gather and analyse user data.
  • Leverage data to drive decisions and optimize customer experiences.
  • Strong background in data analytics, including proficiency in interpreting complex datasets to inform UX decisions.

 

Ideal Candidate

  • Bachelor’s or Master’s degree in a relevant field (e.g., UX Design, Human-Computer Interaction, Computer Science, Marketing).
  • 5+ years of experience in CX/UX roles, preferably in a B2C environment.
  • Proficiency in analytics tools (Google Analytics, CleverTap, Medallia, Hotjar, etc).
  • Strong understanding of wireframing and prototyping tools (Figma, XMind, etc).
  • Excellent communication and collaboration skills.
  • Proven experience in managing cross-functional teams and projects.
  • Strong background in data analytics and data-driven decision-making.
  • Expert understanding of user experience and user-centered design approaches
  • Detail-oriented, with the experience and will to continuously learn, adapt, and evolve
  • Creating and measuring the success and impact of your CX designs
  • Knowledge of testing tools like Maze, UsabilityHub, UserZoom would be a plus
  • Experienced in designing responsive websites as well as mobile apps
  • Understanding of iOS and Android design guidelines
  • Passion for great customer-focused design, a purposeful aesthetic sense, and generating simple solutions to complex problems.
  • Excellent communication skills to be able to present their work and ideas to the leadership team.


Read more
NeoGenCode Technologies Pvt Ltd
Ritika Verma
Posted by Ritika Verma
Pune, Noida, Gurugram
5 - 8 yrs
₹13L - ₹17L / yr
ETL Tester
Azure Data factory
Azure SQL database
Azure ecosystem
SQL

Job Title: Sr. ETL Test Engineer

Experience: 7+ Years

Location: Gurgaon / Noida / Pune (Work From Office)

Joining: Immediate joiners only (≤15 days notice)

About the Role

We are seeking an experienced ETL Test Engineer with strong expertise in cloud-based ETL tools, Azure ecosystem, and advanced SQL skills. The ideal candidate will have a proven track record in validating complex data pipelines, ensuring data integrity, and collaborating with cross-functional teams in an Agile environment.

Key Responsibilities

  • Design, develop, and execute ETL test plans, test cases, and test scripts for cloud-based data pipelines.
  • Perform data validation, transformation, and reconciliation between source and target systems.
  • Work extensively with Azure Data Factory, Azure Synapse Analytics, Azure SQL Database, and related Azure services.
  • Develop and run complex SQL queries for data extraction, analysis, and validation.
  • Collaborate with developers, business analysts, and product owners to clarify requirements and ensure comprehensive test coverage.
  • Perform regression, functional, and performance testing of ETL processes.
  • Identify defects, log them, and work with development teams to ensure timely resolution.
  • Participate in Agile ceremonies (daily stand-ups, sprint planning, retrospectives) and contribute to continuous improvement.
  • Ensure adherence to data quality and compliance standards.
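The data validation and reconciliation work listed above typically boils down to comparing source and target systems on agreed checks. A minimal sketch, using SQLite as a stand-in for the Azure SQL / Synapse systems the role mentions (table and column names are hypothetical):

```python
import sqlite3

# Stand-in source and target tables; the same checks apply against Azure SQL or Synapse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER, amount REAL)")
conn.execute("CREATE TABLE tgt (id INTEGER, amount REAL)")
data = [(1, 10.0), (2, 20.0), (3, 30.5)]
conn.executemany("INSERT INTO src VALUES (?, ?)", data)
conn.executemany("INSERT INTO tgt VALUES (?, ?)", data)

def reconcile(conn, source, target, column):
    """Compare row count and a column SUM between two tables; return any mismatches."""
    checks = {
        "row_count": "SELECT COUNT(*) FROM {}",
        "sum": "SELECT SUM(" + column + ") FROM {}",
    }
    issues = []
    for name, sql in checks.items():
        s = conn.execute(sql.format(source)).fetchone()[0]
        t = conn.execute(sql.format(target)).fetchone()[0]
        if s != t:
            issues.append((name, s, t))
    return issues

print(reconcile(conn, "src", "tgt", "amount"))  # [] when source and target agree
```

Production test suites extend this with per-key diffs, null and duplicate checks, and transformation-rule validation, but count-and-checksum comparison is the usual first gate.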

Required Skills & Experience

  • 5+ years of experience in ETL testing, preferably with cloud-based ETL tools.
  • Strong hands-on experience with Azure Data Factory, Azure Synapse Analytics, and Azure SQL.
  • Advanced SQL query writing and performance tuning skills.
  • Strong understanding of data warehousing concepts, data models, and data governance.
  • Experience with Agile methodologies and working in a Scrum team.
  • Excellent communication and stakeholder management skills.
  • Strong problem-solving skills and attention to detail.

Preferred Skills

  • Experience with Python, PySpark, or automation frameworks for ETL testing.
  • Exposure to CI/CD pipelines in Azure DevOps or similar tools.
  • Knowledge of data security, compliance, and privacy regulations.


Read more
Cambridge Wealth (Baker Street Fintech)
Pune
2 - 6 yrs
₹9L - ₹16L / yr
Python
Wealth management
fintech
Django
Flask
+13 more

   

About Us: The Next Generation of WealthTech  

We're Cambridge Wealth, an award-winning force in mutual fund distribution and Fintech. We're not just moving money; we're redefining wealth management for everyone from retail investors to ultra-HNIs (including the NRI segment). Our brand is synonymous with excellence, backed by accolades from the BSE and top Mutual Fund houses.


If you thrive on building high-performance, scalable systems that drive real-world financial impact, you'll feel right at home. Join us in Pune to build the future of finance.

[Learn more: www.cambridgewealth.in]


The Role: Engineering Meets Team Meets Customer

We're looking for an experienced, hands-on Tech Catalyst to accelerate our product innovation. This isn't just a coding job; it's a chance to blend deep backend expertise with product strategy. You will be the engine driving rapid, data-driven product experiments, leveraging AI and Machine Learning to create smart, personalized financial solutions. You'll lead by example, mentoring a small, dedicated team and ensuring technical excellence and rapid deployment in the high-stakes financial domain.


Key Impact Areas: Ship Fast, Break Ground  

1. Backend & AI/ML Innovation  

  • Rapid Prototyping: Design and execute quick, iterative experiments to validate new features and market hypotheses, moving from concept to production in days, not months.
  • AI-Powered Features: Build scalable Python-based backend services that integrate AI/ML models to enhance customer profiling, portfolio recommendation, and risk analysis.
  • System Architecture: Own the performance, stability, and scalability of our core fintech platform, implementing best practices in modern backend development.

2. Product Leadership & Execution  

  • Agile Catalyst: Drive and optimize Agile sprints, ensuring clear technical milestones, efficient resource allocation, backlog grooming and maintaining a laser focus on preventing scope creep.
  • Mentorship & Management: Provide technical guidance and mentorship to a team of developers, fostering a culture of high performance, code quality, and continuous learning.
  • Domain Alignment: Translate complex financial requirements and market insights into precise, actionable technical specifications and seamless user stories.
  • Problem Solver: Proactively identify and resolve technical and process bottlenecks, acting as the ultimate problem solver for the engineering and product teams.

3. Financial Domain Expertise  

  • High-Value Delivery: Apply deep knowledge of the mutual fund and broader fintech landscape to inform product decisions, ensuring our solutions are compliant, competitive, and truly valuable to our clients.
  • Risk & Security: Proactively architect solutions with security and financial risk management baked in from the ground up, protecting client data and assets.


Your Tech Stack & Experience  

The Must-Haves  

  • Mindset: A verifiable track record as a proactive First Principle Problem Solver with an intense Passion to Ship production-ready features frequently.
  • Customer Empathy: Keeps the customer's experience in mind at all times.
  • Team Leadership: Experience in leading, mentoring, or managing a small development team, driving technical excellence and project delivery.
  • Systems Thinker: Diagnoses and solves problems by viewing the organization as an interconnected system to anticipate broad impacts and develop holistic, strategic solutions.
  • Backend Powerhouse: 2+ years of professional experience with a strong focus on backend development.
  • Python Guru: Expert proficiency in Python and related frameworks (e.g., Django, Flask) for building robust, scalable APIs and services.
  • AI/ML Integration: Proven ability to leverage and integrate AI/ML models into production-level applications.
  • Data Driven: Expert in SQL for complex data querying, analysis, and ETL processes.
  • Financial Domain Acumen: Strong, demonstrable knowledge of financial products, especially mutual funds, wealth management, and key fintech metrics.
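To give a flavour of the "SQL for querying, analysis, and ETL" requirement above, here is a minimal sketch of a transform-and-load step using Python's built-in sqlite3 module; the table and column names (raw_orders, client_summary) are invented for the illustration, not part of the actual role.

```python
import sqlite3

# Minimal ETL sketch: aggregate raw mutual-fund orders into a per-client
# summary table. Table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (client_id TEXT, fund TEXT, amount REAL);
    CREATE TABLE client_summary (client_id TEXT PRIMARY KEY,
                                 total_invested REAL, n_orders INTEGER);
""")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [("C1", "Equity-A", 5000.0), ("C1", "Debt-B", 2500.0),
     ("C2", "Equity-A", 10000.0)],
)

# The transform and load happen in a single SQL statement.
conn.execute("""
    INSERT INTO client_summary
    SELECT client_id, SUM(amount), COUNT(*)
    FROM raw_orders
    GROUP BY client_id
""")
summary = {
    cid: (total, n)
    for cid, total, n in conn.execute(
        "SELECT client_id, total_invested, n_orders FROM client_summary"
    )
}
print(summary)  # {'C1': (7500.0, 2), 'C2': (10000.0, 1)}
```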

 

 

Nice-to-Haves  

  • Experience with cloud platforms (AWS, Azure, GCP) and containerization (Docker, Kubernetes).
  • Familiarity with Zoho Analytics, Zoho CRM, and Zoho Deluge
  • Familiarity with modern data analysis tools and visualization platforms (e.g., Mixpanel, Tableau, or custom dashboard tools).
  • Understanding of Mutual Fund, AIF, and PMS operations

 

Ready to Own the Backend and Shape Finance?  

This is where your code meets the capital market. If you’re a Fintech-savvy Python expert ready to lead a team and build a scalable platform in Pune, we want to talk.

Apply now to join our award-winning, forward-thinking team.

 

Our High-Velocity Hiring Process:  

  • You Apply & Engage: Quick application and a few insightful questions. (5 min)
  • Online Tech Challenge: Prove your tech mettle. (90 min)
  • People Sync: A focused call to understand if there is cultural and value alignment. (30 min)
  • Deep Dive Technical Interview: Discuss architecture and projects with our senior engineers. (1 hour)
  • Founder's Vision Interview: Meet the leadership and discuss your impact. (1 hour)
  • Offer & Onboarding: Reference and BGV check follow the successful offer.

 

What are you building right now that you're most proud of?

 

Read more
 Global Digital Transformation Solutions Provider


Agency job
via Peak Hire Solutions by Dhara Thakkar
Pune
6 - 12 yrs
₹10L - ₹30L / yr
skill iconAmazon Web Services (AWS)
AWS CloudFormation
Amazon Redshift
skill iconElastic Search
ECS
+11 more

Job Details

Job Title: ML Engineer II - AWS, AWS Cloud

Industry: Technology

Domain - Information technology (IT)

Experience Required: 6-12 years

Employment Type: Full Time

Job Location: Pune

CTC Range: Best in Industry


Job Description:

Core Responsibilities:

• The MLE will design, build, test, and deploy scalable machine learning systems, optimizing model accuracy and efficiency.

• Model Development: Algorithms and architectures span traditional statistical methods to deep learning, along with employing LLMs in modern frameworks.

• Data Preparation: Prepare, cleanse, and transform data for model training and evaluation.

• Algorithm Implementation: Implement and optimize machine learning algorithms and statistical models.

• System Integration: Integrate models into existing systems and workflows.

• Model Deployment: Deploy models to production environments and monitor performance.

• Collaboration: Work closely with data scientists, software engineers, and other stakeholders.

• Continuous Improvement: Identify areas for improvement in model performance and systems.


Skills:

• Programming and Software Engineering: Knowledge of software engineering best practices (version control, testing, CI/CD).

• Data Engineering: Ability to handle data pipelines, data cleaning, and feature engineering. Proficiency in SQL for data manipulation, plus Kafka, ChaosSearch logs, etc. for troubleshooting; other tech touch points are ScyllaDB (similar to Bigtable), OpenSearch, and the Neo4j graph database.

• Model Deployment and Monitoring: MLOps experience in deploying ML models to production environments; knowledge of model monitoring and performance evaluation.


Required experience:

• Amazon SageMaker: Deep understanding of SageMaker's capabilities for building, training, and deploying ML models; understanding of the SageMaker pipeline, with the ability to analyze gaps and recommend/implement improvements.

• AWS Cloud Infrastructure: Familiarity with S3, EC2, and Lambda, and with using these services in ML workflows.

• AWS data: Redshift, Glue.

• Containerization and Orchestration: Understanding of Docker and Kubernetes, and their implementation within AWS (EKS, ECS).


Skills: AWS, AWS Cloud, Amazon Redshift, EKS


Must-Haves

AWS, AWS Cloud, Amazon Redshift, EKS

NP: Immediate – 30 Days

 

Read more
Deqode

at Deqode

1 recruiter
Shraddha Katare
Posted by Shraddha Katare
Pune
3 - 6 yrs
₹3.5L - ₹12L / yr
skill icon.NET
skill iconC#
ASP.NET
Web API
Object Oriented Programming (OOPs)
+2 more

Job Title: .NET Developer

Experience: 3-5 Years

Location: Pune

Employment Type: Full-Time

About the Role:

We're looking for a .NET Developer with 3-5 years of experience to join our team in Pune. You'll be working on building scalable web applications using C#, .NET Core, and modern development practices.


What You'll Do:

  • Develop and maintain web applications using C# and .NET Core (.NET 5+)
  • Build RESTful APIs with ASP.NET Core Web API
  • Design and optimize SQL Server databases with complex queries and stored procedures
  • Implement OOP principles, Design Patterns, and SOLID principles
  • Work with Entity Framework Core for data access
  • Collaborate with teams using Agile/Scrum methodologies
  • Perform code reviews and ensure code quality
  • Manage code using Git version control


Required Skills:

Must-Have:

  • Strong proficiency in C# and .NET Core (or .NET 5+) - 3-5 years
  • Hands-on experience with ASP.NET Core Web API and RESTful services
  • Expertise in SQL (T-SQL), stored procedures, and database design
  • Experience with SQL Server is highly preferred
  • Hands-on experience with Entity Framework Core
  • Solid understanding of OOP, Design Patterns, and SOLID principles
  • Experience with Git and Agile/Scrum methodologies

Good to Have:

  • Front-end technologies (Angular, React)
  • Microservices architecture
  • Cloud platforms (Azure/AWS)
  • CI/CD pipelines
  • Unit testing frameworks

Qualifications:

  • Bachelor's degree in Computer Science, IT, or related field
  • 3-5 years of professional .NET development experience


Read more
Wissen Technology

at Wissen Technology

4 recruiters
Praffull Shinde
Posted by Praffull Shinde
Pune, Mumbai, Bengaluru (Bangalore)
5 - 10 yrs
Best in industry
skill icon.NET
skill iconC#
ASP.NET
SQL
skill iconAmazon Web Services (AWS)

Company Name – Wissen Technology

Location :  Pune / Bangalore / Mumbai (Based on candidate preference)

Work mode: Hybrid 

Experience: 5+ years


Job Description

Wissen Technology is seeking an experienced C# .NET Developer to build and maintain applications related to streaming market data. This role involves developing message-based C#/.NET applications to process, normalize, and summarize large volumes of market data efficiently. The candidate should have a strong foundation in Microsoft .NET technologies and experience working with message-driven, event-based architecture. Knowledge of capital markets and equity market data is highly desirable.


Responsibilities

  • Design, develop, and maintain message-based C#/.NET applications for processing real-time and batch market data feeds.
  • Build robust routines to download and process data from AWS S3 buckets on a frequent schedule.
  • Implement daily data summarization and data normalization routines.
  • Collaborate with business analysts, data providers, and other developers to deliver high-quality, scalable market data solutions.
  • Troubleshoot and optimize market data pipelines to ensure low latency and high reliability.
  • Contribute to documentation, code reviews, and team knowledge sharing.
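A scheduled S3 download routine like the one described above typically needs to remember which objects it has already handled so that frequent re-runs stay idempotent. The sketch below (in Python rather than the role's C#, with invented key names, and deliberately independent of any AWS SDK call) shows just that bookkeeping:

```python
def select_new_keys(listed_keys, processed_keys):
    """Return the object keys not yet processed, sorted so that
    batches are applied deterministically."""
    return sorted(set(listed_keys) - set(processed_keys))

def run_poll_cycle(listed_keys, processed_keys, handle):
    """One poll cycle: process only unseen keys, then record them."""
    new_keys = select_new_keys(listed_keys, processed_keys)
    for key in new_keys:
        handle(key)              # e.g. download + normalize the feed file
        processed_keys.add(key)
    return new_keys

# Two simulated cycles: the second sees one genuinely new object.
seen = set()
handled = []
run_poll_cycle(["feeds/2024-01-02.csv", "feeds/2024-01-01.csv"], seen, handled.append)
run_poll_cycle(["feeds/2024-01-02.csv", "feeds/2024-01-03.csv"], seen, handled.append)
print(handled)
# ['feeds/2024-01-01.csv', 'feeds/2024-01-02.csv', 'feeds/2024-01-03.csv']
```

In a real implementation the listing would come from the bucket and the processed set would live in durable storage, but the dedupe logic is the part that keeps a "frequent schedule" safe.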


Required Skills and Experience

  • 5+ years of professional experience programming in C# and Microsoft .NET framework.
  • Strong understanding of message-based and real-time programming architectures.
  • Experience working with AWS services, specifically S3, for data retrieval and processing.
  • Experience with SQL and Microsoft SQL Server.
  • Familiarity with Equity market data, FX, Futures & Options, and capital markets concepts.
  • Excellent interpersonal and communication skills.
  • Highly motivated, curious, and analytical mindset with the ability to work well both independently and in a team environment.


Education

  • Bachelor’s degree in Computer Science, Engineering, or a related technical field.


Read more
Navi Mumbai, Pune, Bengaluru (Bangalore), Hyderabad, Mohali, Panchkula, Dehradun, Gurugram, Chennai
5 - 9 yrs
₹8L - ₹14L / yr
skill icon.NET
skill iconMongoDB
Entity Framework
skill iconC#
SQL
+4 more

Job Title: Mid-Level .NET Developer (Agile/SCRUM)


Location: Mohali, Bangalore, Pune, Navi Mumbai, Chennai, Hyderabad, Panchkula, Gurugram (Delhi NCR), Dehradun


Night Shift from 6:30 pm to 3:30 am IST


Experience: 5+ Years


Job Summary:

We are seeking a proactive and detail-oriented Mid-Level .NET Developer to join our dynamic team. The ideal candidate will be responsible for designing, developing, and maintaining high-quality applications using Microsoft technologies with a strong emphasis on .NET Core, C#, Web API, and modern front-end frameworks. You will collaborate with cross-functional teams in an Agile/SCRUM environment and participate in the full software development lifecycle—from requirements gathering to deployment—while ensuring adherence to best coding and delivery practices.


Key Responsibilities:

  • Design, develop, and maintain applications using C#, .NET, .NET Core, MVC, and databases such as SQL Server, PostgreSQL, and MongoDB.
  • Create responsive and interactive user interfaces using JavaScript, TypeScript, Angular, HTML, and CSS.
  • Develop and integrate RESTful APIs for multi-tier, distributed systems.
  • Participate actively in Agile/SCRUM ceremonies, including sprint planning, daily stand-ups, and retrospectives.
  • Write clean, efficient, and maintainable code following industry best practices.
  • Conduct code reviews to ensure high-quality and consistent deliverables.
  • Assist in configuring and maintaining CI/CD pipelines (Jenkins or similar tools).
  • Troubleshoot, debug, and resolve application issues effectively.
  • Collaborate with QA and product teams to validate requirements and ensure smooth delivery.
  • Support release planning and deployment activities.


Required Skills & Qualifications:

  • 4–6 years of professional experience in .NET development.
  • Strong proficiency in C#, .NET Core, MVC, and relational databases such as SQL Server.
  • Working knowledge of NoSQL databases like MongoDB.
  • Solid understanding of JavaScript/TypeScript and the Angular framework.
  • Experience in developing and integrating RESTful APIs.
  • Familiarity with Agile/SCRUM methodologies.
  • Basic knowledge of CI/CD pipelines and Git version control.
  • Hands-on experience with AWS cloud services.
  • Strong analytical, problem-solving, and debugging skills.
  • Excellent communication and collaboration skills.


Preferred / Nice-to-Have Skills:

  • Advanced experience with AWS services.
  • Knowledge of Kubernetes or other container orchestration platforms.
  • Familiarity with IIS web server configuration and management.
  • Experience in the healthcare domain.
  • Exposure to AI-assisted code development tools (e.g., GitHub Copilot, ChatGPT).
  • Experience with application security and code quality tools such as Snyk or SonarQube.
  • Strong understanding of SOLID principles and clean architecture patterns.


Technical Proficiencies:

  • ASP.NET Core, ASP.NET MVC
  • C#, Entity Framework, Razor Pages
  • SQL Server, MongoDB
  • REST API, jQuery, AJAX
  • HTML, CSS, JavaScript, TypeScript, Angular
  • Azure Services, Azure Functions, AWS
  • Visual Studio
  • CI/CD, Git


Read more
Contribute to all software-development life-cycle phases inc


Agency job
via FZ Talent Solution by Priya Kumari
Ahmedabad, Pune
5 - 8 yrs
₹5L - ₹14L / yr
skill iconJava
skill iconAngular (2+)
SQL
skill iconSpring Boot
skill iconNodeJS (Node.js)
+3 more

Mandatory Skills

  • Backend: Java, Spring Boot
  • Frontend: Angular
  • Database: Oracle / SQL
  • Node.js

Job Description

Contribute to all software-development life-cycle phases including: domain and non-domain problem analysis, solution requirement gathering and analysis, solution design, implementation, code review, source-code control, source building deployment, validation, QA support, and production support.

Essential Duties and Responsibilities

• Maintain and enhance multi-tier messaging application suites (Java EE, Spring Framework, WAS, Oracle, DB2, MQ)

• Build and maintain IRIS4Health middle-tier message applications (IRIS Interop/Caché; Java, Drools, Kafka, RESTful, MLLP, SQL)

• Build and maintain the multi-tier Clinical Toxicology application (Angular, Java EE, Spring Framework, WAS, RHOS, Caché, SQL)

• Maintain the stat-tracking application (two-tier Delphi, MySQL)

• Maintain and enhance the Cytogenetics three-tier application (Java EE, WAS, DB2, Oracle, SQL)

• Maintain and enhance the Fibrosure application (Java EE, WAS, Derby)

• Define, develop, validate, and release software products via agile processes for small and large projects

• Support applications and people via Kanban processes

• Collaborate with laboratory users to analyze problems and to design and implement solutions for enterprise systems

• Provide support and troubleshooting of production systems according to an on-call schedule

• Document problem analysis, solution design, implementations, and system-support guidelines

• Coach and train team members across lab system organizations to support and develop Java applications

• Communicate effectively and constructively with developers, QA, business analysts, and system users

• Design and depict via UML: relational DB table models, object-oriented class models, messaging models, and configuration models

• Understand, document, support, and improve inherited code and processes

• Help document knowledge and discoveries with peer developers

Minimum Requirements

• Solid Java EE (Servlets, JMS, JSP, EJB, JCA, and JPA) development and support experience

• Solid InterSystems Caché/IRIS for Health development and support experience

• A minimum of 1 year of JPA/ORM (Hibernate), JUnit, XML/XSD, and JSON experience, or equivalent

• Solid SQL (and optionally PL/SQL) experience

• Experience with Oracle DB, including explain plans and other query-optimization techniques/tools

• Excellent verbal and written communication skills

• Strong UML modeling, ER and OO design, and data-normalization techniques

• Strong code-refactoring philosophies and techniques

• Eclipse or NetBeans (or an equivalent) IDE

• Strong understanding of client/server design, and smart recognition of separation of concerns, such as functional behavior versus non-functional performance

Desired Requirements

• Java EE, Angular

• InterSystems Caché and/or IRIS for Health

• Spring Framework

• Modern deployment architectures using containers, API gateways, load balancers, and AWS cloud-based environments

• WebSphere or WebLogic, RHOS

• RESTful web services

• JMS interfacing, Apache Kafka, and IBM MQ

• Node.js/NPM, Bootstrap, or similar frameworks

• Git/Bitbucket (git flow), Maven, Nexus, UCD, Jira (Kanban and SCRUM), agile workflow

• Unix shell scripting, DOS scripting

• SQL (optionally PL/SQL)

• Design patterns

• HTML5, CSS3, and TypeScript development

• Ability to transform specific domain requirements into generalized technical requirements, and to design and implement abstract solutions that are understandable and scalable in performance and reuse

• HL7 and/or healthcare and/or Clinical Toxicology experience

• Oracle, MySQL, Derby DB

Read more
Wissen Technology
Mumbai, Pune
5 - 9 yrs
₹10L - ₹20L / yr
Functional testing
Integration testing
Oracle Fusion
SQL
E2E

Key Responsibilities:

  • Perform comprehensive Functional and Integration Testing across Oracle modules and connected systems.
  • Conduct detailed End-to-End (E2E) Testing to ensure business processes function seamlessly across applications.
  • Collaborate with cross-functional teams, including Business Analysts, Developers, and Automation teams, to validate business requirements and deliver high-quality releases.
  • Identify, document, and track functional defects, ensuring timely closure and root cause analysis.
  • Execute and validate SQL queries for backend data verification and cross-system data consistency checks.
  • Participate in regression cycles and support continuous improvement initiatives through data-driven analysis.
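The "SQL queries for backend data verification and cross-system data consistency checks" in the list above usually start with cheap fingerprints (row counts and checksums) run against both systems before diffing individual rows. A minimal sketch using sqlite3, with an invented invoices table standing in for the Oracle and peripheral systems:

```python
import sqlite3

def table_fingerprint(conn, table):
    """Row count plus a column checksum: the quick probes a tester runs
    on both sides of an interface before diffing individual rows."""
    (count,) = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()
    (total,) = conn.execute(
        f"SELECT COALESCE(SUM(amount), 0) FROM {table}").fetchone()
    return count, round(total, 2)

# Two hypothetical systems: the source module and a downstream copy.
source, target = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
for conn in (source, target):
    conn.execute("CREATE TABLE invoices (invoice_id TEXT, amount REAL)")
source.executemany("INSERT INTO invoices VALUES (?, ?)",
                   [("INV-1", 120.50), ("INV-2", 80.00)])
target.executemany("INSERT INTO invoices VALUES (?, ?)",
                   [("INV-1", 120.50)])  # INV-2 failed to replicate

src_fp = table_fingerprint(source, "invoices")
tgt_fp = table_fingerprint(target, "invoices")
mismatch = src_fp != tgt_fp
print(src_fp, tgt_fp, mismatch)  # (2, 200.5) (1, 120.5) True
```

A mismatched fingerprint tells the tester which table to drill into with key-level queries.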

Required Skills & Competencies:

  • Strong knowledge of Functional Testing processes and methodologies.
  • Good to have: Oracle Fusion knowledge
  • Solid understanding of Integration Flows between Oracle and peripheral systems.
  • Proven ability in E2E Testing, including scenario design, execution, and defect management.
  • Excellent Analytical and Logical Reasoning skills with attention to detail.
  • Hands-on experience with SQL for data validation and analysis.
  • Effective communication, documentation, and coordination skills.

Preferred Qualifications:

  • Exposure to automation-assisted functional testing and cross-platform data validation.
  • Experience in identifying test optimization opportunities and improving testing efficiency.


Read more
Bajaj Finserv Health

at Bajaj Finserv Health

1 recruiter
Agency job
via TIGI HR Solution Pvt. Ltd. by Vaidehi Sarkar
Pune
5 - 8 yrs
₹14L - ₹15L / yr
skill iconData Analytics
Behavioral modeling
clickstream
skill iconGoogle Analytics
CleverTap
+8 more

CTC: 15 LPA to 21 LPA

Exp: 5 to 8 Years



Mandatory

  • Strong Behavioral Data Analyst Profiles
  • Mandatory (Experience 1): Minimum 4+ years of experience in user analytics or behavioural data analysis, focusing on user app and web journeys
  • Mandatory (Experience 2): Experience in analyzing clickstream and behavioral data using tools such as Google Analytics, Mixpanel, CleverTap, or Firebase
  • Mandatory (Skills 1): Hands-on experience in A/B testing, including hypothesis design, experimentation, and result interpretation.
  • Mandatory (Skills 2): Strong analytical ability to identify behavioral patterns, anomalies, funnel drop-offs, and engagement trends from large datasets.
  • Mandatory (Skills 3): Hands-on proficiency in SQL, Excel, and data visualization tools such as Tableau or Power BI for dashboard creation and data storytelling.
  • Mandatory (Skills 4): Basic understanding of UX principles and customer journey mapping, collaborating effectively with UX/CX teams
  • Mandatory (Company): B2C product Companies (fintech, or e-commerce organizations with large user behavior dataset is a plus)
  • Mandatory (Note): Looking for business/product/user analysts rather than pure data analysts
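For the A/B-testing requirement above, "result interpretation" usually boils down to a two-proportion significance test on conversion rates. A minimal, stdlib-only sketch (the conversion numbers are invented for the illustration):

```python
import math

def ab_test_pvalue(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test (normal approximation): the
    standard back-of-the-envelope read-out of an A/B experiment."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided tail probability
    return z, p_value

# Invented numbers: 10.0% vs 13.0% conversion on 1,000 users per arm.
z, p = ab_test_pvalue(100, 1000, 130, 1000)
print(round(z, 2), round(p, 3))  # positive lift, p below 0.05
```

In practice an analyst would pair this with the funnel and segment breakdowns the posting describes, since a significant aggregate lift can hide divergent behaviour in sub-populations.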



Ideal Candidate:

  • Bachelor’s or Master’s degree in a relevant field (e.g., UX Design, Human-Computer Interaction, Computer Science, Marketing).
  • 5+ years of experience in CX/UX roles, preferably in a B2C environment.
  • Proficiency in analytics tools (Google Analytics, CleverTap, Medallia, Hotjar, etc.).
  • Strong understanding of wireframing and prototyping tools (Figma, XMind, etc).
  • Excellent communication and collaboration skills.
  • Proven experience in managing cross-functional teams and projects.
  • Strong background in data analytics and data-driven decision-making.
  • Expert understanding of user experience and user-centered design approaches
  • Detail-oriented, with the drive to continuously learn, adapt, and evolve
  • Creating and measuring the success and impact of your CX designs
  • Knowledge of testing tools like Maze, UsabilityHub, or UserZoom would be a plus
  • Experienced in designing responsive websites as well as mobile apps
  • Understanding of iOS and Android design guidelines
  • Passion for great customer-focused design, a purposeful aesthetic sense, and generating simple solutions to complex problems.
  • Excellent communication skills to be able to present their work and ideas to the leadership team.


If interested, kindly share your updated resume on 82008 31681.

Read more
Deqode

at Deqode

1 recruiter
purvisha Bhavsar
Posted by purvisha Bhavsar
Pune
4.5 - 6 yrs
₹6L - ₹20L / yr
PL/SQL
SQL
Linux/Unix
Shell Scripting
skill iconAmazon Web Services (AWS)
+1 more

🚀 Hiring: PL/SQL Developer

⭐ Experience: 5+ Years

📍 Location: Pune

⭐ Work Mode:- Hybrid

⏱️ Notice Period: Immediate Joiners

(Only immediate joiners & candidates serving notice period)


What We’re Looking For:

☑️ Hands-on PL/SQL developer with strong database and scripting skills, ready to work onsite and collaborate with cross-functional financial domain teams.


Key Skills:

✅ Must Have: PL/SQL, SQL, Databases, Unix/Linux & Shell Scripting

✅ Nice to Have: DevOps tools (Jenkins, Artifactory, Docker, Kubernetes), AWS/Cloud, basic Python, AML/Fraud/Financial domain, Actimize (AIS/RCM/UDM)

Read more
This is a full-time role with our client.


Agency job
via eTalent Services by JaiPrakash Bharti
Pune, Chennai
8 - 10 yrs
₹20L - ₹25L / yr
skill iconSpring Boot
skill iconNodeJS (Node.js)
Microservices
SQL
skill iconJava
+3 more

Role: Lead Java Developer

Work Location: Chennai, Pune

No of years’ experience: 8+ years

Hybrid (3 days office and 2 days home)

Type: Fulltime

 

Skill Set: Java + Spring Boot + SQL + Microservices + DevOps

 

Job Responsibilities:

Design, develop, and maintain high-quality software applications using Java and Spring Boot.

Develop and maintain RESTful APIs to support various business requirements.

Write and execute unit tests using TestNG to ensure code quality and reliability.

Work with NoSQL databases to design and implement data storage solutions.

Collaborate with cross-functional teams in an Agile environment to deliver high-quality software solutions.

Utilize Git for version control and collaborate with team members on code reviews and merge requests.

Troubleshoot and resolve software defects and issues in a timely manner.

Continuously improve software development processes and practices.

 

Description:

8 years of professional experience in backend development using Java and leading a team.

Strong expertise in Spring Boot, Apache Camel, Hibernate, JPA, and REST API design

Hands-on experience with PostgreSQL, MySQL, or other SQL-based databases

Working knowledge of AWS cloud services (EC2, S3, RDS, etc.)

Experience in DevOps activities.

Proficiency in using Docker for containerization and deployment.

Strong understanding of object-oriented programming, multithreading, and performance tuning

Self-driven and capable of working independently with minimal supervision

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Manasi Mankas
Posted by Manasi Mankas
Pune, Mumbai, Bengaluru (Bangalore)
3 - 7 yrs
Best in industry
snowflake
airflow
DBT
Stored Procedures
SQL

Key Responsibilities:

● Analyze and translate legacy MSSQL stored procedures into Snowflake Scripting (SQL) or JavaScript-based stored procedures.

● Rebuild and optimize data pipelines and transformation logic in Snowflake.

● Implement performance-tuning techniques such as query pruning, clustering keys, appropriate warehouse sizing, and materialized views.

● Monitor query performance using the Snowflake Query Profile and resolve bottlenecks.

● Ensure procedures are idempotent, efficient, and scalable for high-volume workloads.

● Collaborate with architects and data teams to ensure accurate and performant data migration.

● Write test cases to validate functional correctness and performance.

● Document changes and follow version control best practices (e.g., Git, CI/CD).
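The idempotency requirement in the list above usually comes down to writing upsert (MERGE-style) logic so that a retried or re-run procedure leaves the warehouse in the same state. A minimal stand-in sketch using SQLite's upsert syntax in place of Snowflake's MERGE, with an invented daily_totals table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE daily_totals (day TEXT PRIMARY KEY, total REAL)")

def load_daily_total(conn, day, total):
    """Upsert one summarized row. Re-running with the same inputs leaves
    the table unchanged -- the property an idempotent procedure needs."""
    conn.execute(
        """INSERT INTO daily_totals (day, total) VALUES (?, ?)
           ON CONFLICT(day) DO UPDATE SET total = excluded.total""",
        (day, total),
    )

# Running the load twice (e.g. after a retried pipeline task) is harmless:
for _ in range(2):
    load_daily_total(conn, "2024-01-01", 1250.0)
rows = conn.execute("SELECT day, total FROM daily_totals").fetchall()
print(rows)  # [('2024-01-01', 1250.0)]
```

In Snowflake itself the same shape is typically expressed as a MERGE inside a Snowflake Scripting procedure, keyed on the natural key of the target table.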


Required Skills:


● 4+ years of SQL development experience, including strong T-SQL proficiency.

● 2+ years of hands-on experience with Snowflake, including stored procedure development.

● Deep knowledge of query optimization and performance tuning in Snowflake.

● Familiarity with Snowflake internals: automatic clustering, micro-partitioning, result caching, and warehouse scaling.

● Solid understanding of ETL/ELT processes, preferably with tools like DBT, Informatica, or Airflow.

● Experience with CI/CD pipelines and Git-based version control


Note : One face-to-face (F2F) round is mandatory, and as per the process, you will need to visit the office for this.



Read more