50+ SQL Jobs in Pune | SQL Job openings in Pune
Apply to 50+ SQL Jobs in Pune on CutShort.io. Explore the latest SQL Job opportunities across top companies like Google, Amazon & Adobe.
* Python (3 to 6 years): Strong expertise in data workflows and automation
* Spark (PySpark): Hands-on experience with large-scale data processing
* Pandas: For detailed data analysis and validation
* Delta Lake: Managing structured and semi-structured datasets at scale
* SQL: Querying and performing operations on Delta tables
* Azure Cloud: Compute and storage services
* Orchestrator: Good experience with either ADF or Airflow
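To make the stack concrete, here is a minimal, hedged sketch of the workflow these skills imply: ingest with PySpark, validate, persist to Delta Lake, and query with SQL. It assumes pyspark and delta-spark are installed; the file paths and column names are illustrative, not from the posting.

```python
# Hedged sketch: PySpark ingest -> validation -> Delta Lake -> SQL query.
# Assumes pyspark and delta-spark are installed; paths/columns are illustrative.
from delta import configure_spark_with_delta_pip
from pyspark.sql import SparkSession, functions as F

builder = (
    SparkSession.builder.appName("delta-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

# Ingest raw CSV and drop rows that fail a basic validation rule.
raw = spark.read.option("header", True).csv("./data/events.csv")
clean = raw.filter(F.col("user_id").isNotNull())

# Persist to a Delta table, then query it with plain SQL.
clean.write.format("delta").mode("overwrite").save("./delta/events")
spark.read.format("delta").load("./delta/events").createOrReplaceTempView("events")
spark.sql("SELECT user_id, COUNT(*) AS n FROM events GROUP BY user_id").show()
```

In production, a step like this would typically be scheduled by ADF or Airflow rather than run by hand.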
Company Description
NonStop io Technologies, founded in August 2015, is a Bespoke Engineering Studio specializing in Product Development. With over 80 satisfied clients worldwide, we serve startups and enterprises across prominent technology hubs, including San Francisco, New York, Houston, Seattle, London, Pune, and Tokyo. Our experienced team brings over 10 years of expertise in building web and mobile products across multiple industries. Our work is grounded in empathy, creativity, collaboration, and clean code, striving to build products that matter and foster an environment of accountability and collaboration.
Role Description
This is a full-time hybrid role for a Java Software Engineer, based in Pune. The Java Software Engineer will be responsible for designing, developing, and maintaining software applications. Key responsibilities include working with microservices architecture, implementing and managing the Spring Framework, and programming in Java. Collaboration with cross-functional teams to define, design, and ship new features is also a key aspect of this role.
Responsibilities:
● Develop and Maintain: Write clean, efficient, and maintainable code for Java-based applications
● Collaborate: Work with cross-functional teams to gather requirements and translate them into technical solutions
● Code Reviews: Participate in code reviews to maintain high-quality standards
● Troubleshooting: Debug and resolve application issues in a timely manner
● Testing: Develop and execute unit and integration tests to ensure software reliability
● Optimize: Identify and address performance bottlenecks to enhance application performance
Qualifications & Skills:
● Strong knowledge of Java, Spring Framework (Spring Boot, Spring MVC), and Hibernate/JPA
● Familiarity with RESTful APIs and web services
● Proficiency in working with relational databases like MySQL or PostgreSQL
● Practical experience with AWS cloud services and building scalable, microservices-based architectures
● Experience with build tools like Maven or Gradle
● Understanding of version control systems, especially Git
● Strong understanding of object-oriented programming principles and design patterns
● Familiarity with automated testing frameworks and methodologies
● Excellent problem-solving skills and attention to detail
● Strong communication skills and ability to work effectively in a collaborative team environment
Why Join Us?
● Opportunity to work on cutting-edge technology products
● A collaborative and learning-driven environment
● Exposure to AI and software engineering innovations
● Excellent work ethic and culture
If you're passionate about technology and want to work on impactful projects, we'd love to hear from you.
Job Description -
Profile: .Net Full Stack Lead
Experience Required: 7–12 Years
Location: Pune, Bangalore, Chennai, Coimbatore, Delhi, Hosur, Hyderabad, Kochi, Kolkata, Trivandrum
Work Mode: Hybrid
Shift: Normal Shift
Key Responsibilities:
- Design, develop, and deploy scalable microservices using .NET Core and C#
- Build and maintain serverless applications using AWS services (Lambda, SQS, SNS)
- Develop RESTful APIs and integrate them with front-end applications
- Work with both SQL and NoSQL databases to optimize data storage and retrieval
- Implement Entity Framework for efficient database operations and ORM
- Lead technical discussions and provide architectural guidance to the team
- Write clean, maintainable, and testable code following best practices
- Collaborate with cross-functional teams to deliver high-quality solutions
- Participate in code reviews and mentor junior developers
- Troubleshoot and resolve production issues in a timely manner
Required Skills & Qualifications:
- 7–12 years of hands-on experience in .NET development
- Strong proficiency in .NET Framework, .NET Core, and C#
- Proven expertise with AWS services (Lambda, SQS, SNS)
- Solid understanding of SQL and NoSQL databases (SQL Server, MongoDB, DynamoDB, etc.)
- Experience building and deploying Microservices architecture
- Proficiency in Entity Framework or EF Core
- Strong knowledge of RESTful API design and development
- Experience with React or Angular is a plus
- Understanding of CI/CD pipelines and DevOps practices
- Strong debugging, performance optimization, and problem-solving skills
- Experience with design patterns, SOLID principles, and best coding practices
- Excellent communication and team leadership skills
Job Title : QA Lead (AI/ML Products)
Employment Type : Full Time
Experience : 4 to 8 Years
Location : On-site
Mandatory Skills : Strong hands-on experience in testing AI/ML (LLM, RAG) applications with deep expertise in API testing, SQL/NoSQL database validation, and advanced backend functional testing.
Role Overview :
We are looking for an experienced QA Lead who can own end-to-end quality for AI-influenced products and backend-heavy systems. This role requires strong expertise in advanced functional testing, API validation, database verification, and AI model behavior testing in non-deterministic environments.
Key Responsibilities :
- Define and implement comprehensive test strategies aligned with business and regulatory goals.
- Validate AI/ML and LLM-driven applications, including RAG pipelines, hallucination checks, prompt injection scenarios, and model response validation.
- Perform deep API testing using Postman/cURL and validate JSON/XML payloads.
- Execute complex SQL queries (MySQL/PostgreSQL) and work with MongoDB for backend and data integrity validation.
- Analyze server logs and transactional flows to debug issues and ensure system reliability.
- Conduct risk analysis and report key QA metrics such as defect leakage and release readiness.
- Establish and refine QA processes, templates, standards, and agile testing practices.
- Identify performance bottlenecks and basic security vulnerabilities (e.g., IDOR, data exposure).
- Collaborate closely with developers, product managers, and domain experts to translate business requirements into testable scenarios.
- Own feature quality independently from conception to release.
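To ground the API and database validation work above, here is a hedged sketch in Python; the endpoint, payload fields, and table are hypothetical, and sqlite3 stands in for a MySQL/PostgreSQL driver in a real suite.

```python
# Hedged sketch of backend functional testing: call an API, validate the JSON
# payload, then cross-check the stored record. Endpoint, fields, and table
# names are hypothetical; sqlite3 stands in for MySQL/PostgreSQL here.
import sqlite3

import requests

resp = requests.get("https://api.example.com/orders/42", timeout=10)
assert resp.status_code == 200, f"unexpected status: {resp.status_code}"

payload = resp.json()
assert {"id", "status", "total"} <= payload.keys(), "missing fields in payload"

# Data-integrity check: the API response must match what the backend stored.
conn = sqlite3.connect("orders.db")
row = conn.execute(
    "SELECT status, total FROM orders WHERE id = ?", (payload["id"],)
).fetchone()
assert row == (payload["status"], payload["total"]), "API/DB mismatch"
```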
Required Skills & Experience :
- 4+ years of hands-on experience in software testing and QA.
- Strong understanding of testing AI/ML products, LLM validation, and non-deterministic behavior testing.
- Expertise in API Testing, server log analysis, and backend validation.
- Proficiency in SQL (MySQL/PostgreSQL) and MongoDB.
- Deep knowledge of SDLC and Bug Life Cycle.
- Strong problem-solving ability and structured approach to ambiguous scenarios.
- Awareness of performance testing and basic security testing practices.
- Excellent communication skills to articulate defects and QA strategies.
What We’re Looking For :
A proactive QA professional who can go beyond UI testing, understands backend systems deeply, and can confidently test modern AI-driven applications while driving quality standards across the team.
Proficiency in Java 8+.
Solid understanding of REST APIs (Spring Boot), microservices, databases (SQL/NoSQL), and caching systems like Redis/Aerospike.
Familiarity with cloud platforms (AWS, GCP, Azure) and DevOps tools (Docker, Kubernetes, CI/CD).
Good understanding of data structures, algorithms, and software design principles.
What You’ll Do:
We are looking for a Staff Operations Engineer based in Pune, India who can master both DeepIntent’s data architectures and pharma research and analytics methodologies to make significant contributions to how health media is analyzed by our clients. This role requires an Engineer who not only understands DBA functions but also how they impact research objectives and can work with researchers and data scientists to achieve impactful results.
This role sits in the Engineering Operations team and requires close integration and partnership with the Engineering organization. The ideal candidate is an inquisitive self-starter who is not afraid to take on and learn from challenges, will constantly seek to improve the facets of the business they manage, and can demonstrate the ability to collaborate and partner with others.
- Serve as the Engineering interface between Analytics and Engineering teams.
- Develop and standardize all interface points for analysts to retrieve and analyze data with a focus on research methodologies and data-based decision-making.
- Optimize queries and data access efficiencies, serve as an expert in how to most efficiently attain desired data points.
- Build “mastered” versions of the data for Analytics-specific querying use cases.
- Establish a formal data practice for the Analytics team in conjunction with the rest of DeepIntent.
- Interpret analytics methodology requirements and apply them to data architecture to create standardized queries and operations for use by analytics teams.
- Implement DataOps practices.
- Master existing and new Data Pipelines and develop appropriate queries to meet analytics-specific objectives.
- Collaborate with various business stakeholders, software engineers, machine learning engineers, and analysts.
- Operate between Engineers and Analysts to unify both practices for analytics insight creation.
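As a small, hedged illustration of the query-efficiency work this role involves, the sketch below adds an index and inspects the query plan; sqlite3 is used only as a self-contained stand-in for the production warehouse, and the schema is illustrative.

```python
# Illustrative only: show how an index changes the query plan. sqlite3 is a
# self-contained stand-in for the real warehouse; schema names are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE impressions (campaign_id INTEGER, ts TEXT, cost REAL);
    CREATE INDEX idx_campaign ON impressions (campaign_id);
""")

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(cost) FROM impressions WHERE campaign_id = ?",
    (42,),
).fetchall()
print(plan)  # expect a SEARCH via idx_campaign rather than a full table SCAN
```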
Who You Are:
- 8+ years of experience in tech support, specialising in monitoring and maintaining data pipelines.
- Adept in market research methodologies and using data to deliver representative insights.
- Inquisitive, curious, understands how to query complicated data sets, move and combine data between databases.
- Deep SQL experience is a must.
- Exceptional communication skills with the ability to collaborate and translate between technical and non-technical needs.
- English Language Fluency and proven success working with teams in the U.S.
- Experience in designing, developing, and operating configurable data pipelines serving high-volume, high-velocity data.
- Experience working with public clouds like GCP/AWS.
- Good understanding of software engineering, DataOps, data architecture, and Agile/DevOps methodologies.
- Proficient with SQL, Python or a JVM-based language, and Bash.
- Experience with Apache open-source projects such as Spark, Druid, Beam, or Airflow, and big data databases like BigQuery or ClickHouse.
- Ability to think big, take bets and innovate, dive deep, hire and develop the best talent, learn and be curious.
- Experience in debugging UI and backend issues is an added advantage.
The Role
We are looking for a Senior/Lead Azure Data Engineer to join our team in Pune. You will be responsible for the end-to-end lifecycle of data solutions, from initial client requirement gathering and solution architecture design to leading the data engineering team through implementation. You will be the technical anchor for the project, ensuring that our data estates are scalable, governed, and high-performing.
Key Responsibilities
- Architecture & Design: Design robust data architectures using Microsoft Fabric and Azure Synapse, focusing on Medallion architecture and metadata-driven frameworks.
- End-to-End Delivery: Translate complex client business requirements into technical roadmaps and lead the team to deliver them on time.
- Data Governance: Implement and manage enterprise-grade governance, data discovery, and lineage using Microsoft Purview.
- Team Leadership: Act as the technical lead for the team, performing code reviews, mentoring junior engineers, and ensuring best practices in PySpark and SQL.
- Client Management: Interface directly with stakeholders to define project scope and provide technical consultancy.
What We’re Looking For
- 6+ Years in Data Engineering with at least 3+ years leading technical teams or designing architectures.
- Expertise in Microsoft Fabric/Synapse: Deep experience with Lakehouses, Warehouses, and Spark-based processing.
- Governance Specialist: Proven experience implementing Microsoft Purview for data cataloging, sensitivity labeling, and lineage.
- Technical Breadth: Strong proficiency in PySpark, SQL, and Data Factory. Familiarity with Infrastructure as Code (Bicep/Terraform) is a major plus.
Why Work with Us?
- Competitive Pay
- Flexible Hours
- Work on Microsoft’s latest (Fabric, Purview, Foundry) as a Designated Solutions Partner.
- High-Stakes Impact: Solve complex, client-facing problems for enterprise leaders
- Structured learning paths to help you master AI automation and Agentic AI.
Must have strong SQL skills (queries, optimization, stored procedures, triggers), with hands-on experience automating work through SQL.
Looking for candidates with 2+ years of experience who have worked on large datasets (1 crore rows or more), handling the challenges such data brings.
Must have Advanced Excel skills
Should have 3+ years of relevant experience
Should have Reporting + dashboard creation experience
Should have Database development & maintenance experience
Must have Strong communication for client interactions
Should have Ability to work independently
Willingness to work from client locations.
About the Role
We are looking for a motivated Full Stack Developer with 2–5 years of hands-on experience in building scalable web applications. You will work closely with senior engineers and product teams to develop new features, improve system performance, and ensure high-quality code delivery.
Responsibilities
- Develop and maintain full-stack applications.
- Implement clean, maintainable, and efficient code.
- Collaborate with designers, product managers, and backend engineers.
- Participate in code reviews and debugging.
- Work with REST APIs/GraphQL.
- Contribute to CI/CD pipelines.
- Ability to work independently as well as within a collaborative team environment.
Required Technical Skills
- Strong knowledge of JavaScript/TypeScript.
- Experience with React.js, Next.js.
- Backend experience with Node.js, Express, NestJS.
- Understanding of SQL/NoSQL databases.
- Experience with Git, APIs, debugging tools.
- Cloud familiarity (AWS/GCP/Azure).
AI and System Mindset
Experience working with AI-powered systems is a strong plus. Candidates should be comfortable integrating AI agents, third-party APIs, and automation workflows into applications, and should demonstrate curiosity and adaptability toward emerging AI technologies.
Soft Skills
- Strong problem-solving ability.
- Good communication and teamwork.
- Fast learner and adaptable.
Education
Bachelor's degree in Computer Science / Engineering or equivalent.
Experience - 10-20 Yrs
Job Location - CommerZone, Yerwada, Pune
Work Mode - Work from Office
Shifts - General Shift
Work days - 5 days
Qualification - Full-time graduation mandatory
Domain - Payment/Card/Banking/BFSI/ Retail Payments
Job Type - Full Time
Notice period - Immediate or 30 days
Interview Process -
1) Screening
2) Virtual L1 interview
3) Managerial Round Face to Face at Pune Office
4) HR Discussion
Job Description
Job Summary:
The Production/L2 Application Support Manager will be responsible for managing the banking applications that support our payment gateway systems in a production environment. You will oversee the deployment, monitoring, optimization, and maintenance of all application components. You will ensure that our systems run smoothly, meet business and regulatory requirements, and provide high availability for our customers.
Key Responsibilities:
- Manage and optimize the application for the payment gateway systems to ensure high availability, reliability, and scalability.
- Oversee the day-to-day operations of production environments, including managing cloud services (AWS), load balancing, database systems, and monitoring tools.
- Lead a team of application support engineers and administrators, providing technical guidance and support to ensure applications and infrastructure solutions are implemented efficiently and effectively.
- Collaborate with development, security, and product teams to ensure applications support the needs of the business and comply with relevant regulations.
- Monitor application performance and system health using monitoring tools and ensure quick resolution of any performance bottlenecks or system failures.
- Develop and maintain capacity planning, monitoring, and backup strategies to ensure scalability and minimal downtime during peak transaction periods.
- Drive continuous improvement of processes and tools for efficient production/application management.
- Ensure robust security practices are in place across production systems, including compliance with industry standards.
- Conduct incident response, root cause analysis, and post-mortem analysis to prevent recurring issues and improve system performance.
- Oversee regular patching, updates, and version control of production systems to minimize vulnerabilities.
- Develop and maintain application support documentation, including architecture diagrams, processes, and disaster recovery plans.
- Manage and execute on-call duties, ensuring timely resolution of application-related issues and ensuring proper support coverage.
Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience).
- 8+ years of experience managing L2 application support in high-availability, mission-critical environments, ideally within a payment gateway or fintech organization.
- Experience working in L2 production support for Java-based applications.
- Experience with database systems (SQL, NoSQL) and database management, including high availability and disaster recovery strategies.
- Excellent communication and leadership skills, with the ability to collaborate effectively across teams and drive initiatives forward.
- Ability to work well under pressure and in high-stakes situations, ensuring uptime and service continuity.
We are looking for a skilled and motivated Integration Engineer to join our dynamic team in the payment domain. This role involves the seamless integration of payment systems, APIs, and third-party services into our platform, ensuring smooth and secure payment processing. The ideal candidate will bring experience with payment technologies, integration methodologies, and a strong grasp of industry standards.
Key Responsibilities:
- System Integration:
- Design, develop, and maintain integrations between various payment processors, gateways, and internal platforms using RESTful APIs, SOAP, and related technologies.
- Payment Gateway Integration:
- Integrate third-party payment solutions such as Visa, MasterCard, PayPal, Stripe, and others into the platform.
- Troubleshooting & Support:
- Identify and resolve integration issues including transactional failures, connectivity issues, and third-party service disruptions.
- Testing & Validation:
- Conduct end-to-end integration testing to ensure payment system functionality across development, staging, and production environments.
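As a hedged sketch of what a gateway integration call often looks like (not any specific provider's API), the example below sends an authorized REST request with an idempotency key so retries after timeouts cannot double-charge:

```python
# Hedged sketch of a payment-gateway charge call: REST request with auth and
# an idempotency key, plus basic failure handling. The endpoint and fields
# are hypothetical, not any specific provider's API.
import uuid

import requests

def charge(amount_cents: int, currency: str, token: str) -> dict:
    resp = requests.post(
        "https://gateway.example.com/v1/charges",
        headers={
            "Authorization": f"Bearer {token}",
            "Idempotency-Key": str(uuid.uuid4()),  # safe retries on timeouts
        },
        json={"amount": amount_cents, "currency": currency},
        timeout=15,
    )
    resp.raise_for_status()  # surfaces transactional failures to the caller
    return resp.json()

if __name__ == "__main__":
    print(charge(1099, "INR", token="sk_test_placeholder"))
```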
Qualifications:
- Education:
- Bachelor’s degree in Computer Science, Engineering, Information Technology, or a related field. Equivalent work experience is also acceptable.
- Experience:
- 3+ years of hands-on experience in integrating payment systems and third-party services.
- Proven experience with payment gateways (e.g., Stripe, Square, PayPal, Adyen) and protocols (e.g., ISO 20022, EMV).
- Familiarity with payment processing systems and industry standards.
Desirable Skills:
- Strong understanding of API security, OAuth, and tokenization practices.
- Experience with PCI-DSS compliance.
- Excellent problem-solving and debugging skills.
- Effective communication and cross-functional collaboration capabilities.
We are seeking an experienced and highly skilled Java (Fullstack) Engineer to join our team.
The ideal candidate will have a strong background in back-end Java (Spring Boot, Spring Framework) and front-end JavaScript (React or Angular), with the ability to build scalable, high-performance applications.
Responsibilities
- Develop, test, and deploy scalable and robust back-end services using Java and Spring Boot
- Build responsive and user-friendly front-end applications using a modern JavaScript framework (React or Angular)
- Collaborate with architects and team members to design scalable, maintainable, and efficient systems
- Contribute to architectural decisions for microservices, APIs, and cloud solutions
- Implement and maintain RESTful APIs for seamless integration
- Write clean, efficient, and reusable code adhering to best practices
- Conduct code reviews, performance optimizations, and debugging
- Work with cross-functional teams, including UX/UI designers, product managers, and QA
- Mentor junior developers and provide technical guidance
Skills & Requirements
- Minimum 3 years of experience in backend/full-stack development
- Back-end: Core Java/Java 8, Spring Boot, Spring Framework, microservices, REST APIs, Kafka
- Front-end: JavaScript, HTML, CSS, TypeScript, Angular
- Database: MySQL
Preferred
- Experience with batch writing, application performance tuning, caching, and web security
- Experience working in fintech, payments, or high-scale production environments
We are seeking an experienced and highly skilled Java Lead to join our team. The ideal candidate will have a strong background in both front-end and back-end technologies, with expertise in Java and Spring. As a Lead, you will be responsible for overseeing the development team, architecting scalable applications, and ensuring best practices in software development. This role requires a hands-on leader with excellent problem-solving abilities and a passion for mentoring junior team members.
Responsibilities
- Lead and mentor a team of developers, providing guidance on coding standards, architecture, and best practices
- Architect, design, and develop end-to-end Java-based web applications, ensuring high performance, security, and scalability
- Work closely with cross-functional teams, including product managers, designers, and other developers, to ensure alignment on project requirements and deliverables
- Conduct code reviews and provide constructive feedback to team members to improve code quality and maintain a consistent codebase
- Participate in Agile/Scrum ceremonies such as stand-ups, sprint planning, and retrospectives to contribute to the development process
- Troubleshoot and resolve complex technical issues and ensure timely resolution of bugs and improvements
- Stay up to date with emerging technologies and industry trends, recommending and implementing improvements to keep our stack modern and effective
Skills & Requirements
- Minimum 8 years of experience in Java development, with at least 2 years in a lead developer role
- Back-end: Core Java/Java 8, Spring Boot, Spring Framework, microservices, REST APIs, Kafka
- Database: MySQL
- Must be working in the fintech/ Payments domain
Preferred
- Experience with batch writing, application performance tuning, caching, and web security
We are seeking an experienced and highly skilled Java (Fullstack) Engineer to join our team.
The ideal candidate will have a strong background in back-end Java (Spring Boot, Spring Framework) and front-end JavaScript (React or Angular), with the ability to build scalable, high-performance applications.
Responsibilities
- Develop, test, and deploy scalable and robust back-end services using Java and Spring Boot
- Build responsive and user-friendly front-end applications using a modern JavaScript framework (React or Angular)
- Collaborate with architects and team members to design scalable, maintainable, and efficient systems
- Contribute to architectural decisions for microservices, APIs, and cloud solutions
- Implement and maintain RESTful APIs for seamless integration
- Write clean, efficient, and reusable code adhering to best practices
- Conduct code reviews, performance optimizations, and debugging
- Work with cross-functional teams, including UX/UI designers, product managers, and QA
- Mentor junior developers and provide technical guidance
Skills & Requirements
- Minimum 5 years of experience in backend/full-stack development
- Back-end: Core Java/Java 8, Spring Boot, Spring Framework, microservices, REST APIs, Kafka
- Front-end: JavaScript, HTML, CSS, TypeScript, Angular
- Database: MySQL
Preferred
- Experience with batch writing, application performance tuning, caching, and web security
- Experience working in fintech, payments, or high-scale production environments
Job Title : Java Developer
Experience : 2 to 10 Years
Location : Pune (Must be currently in Pune)
Notice Period : Immediate to 15 Days (Serving NP acceptable)
Budget :
- 2 to 3.5 yrs → up to 13 LPA
- 3.5 to 5 yrs → up to 18 LPA
- 5+ yrs → up to 25 LPA
Mandatory Skills : Java 8/17, Spring Boot, REST APIs, Hibernate/JPA, SQL/RDBMS, OOPs, Design Patterns, Git/GitHub, Unit Testing, Microservices (Good Coding Skills Mandatory)
Role Overview :
Hiring multiple Java Developers to build scalable and performance-driven applications. Strong hands-on coding and problem-solving skills required.
Key Responsibilities :
- Develop and maintain Java-based applications & REST services
- Write clean, testable code with JUnit & unit tests
- Participate in code reviews, debugging & optimization
- Work with SQL databases, CI/CD & version control tools
- Collaborate with cross-functional teams in Agile setups
Good to Have :
- MongoDB, AWS, Docker, Jenkins/GitHub Actions, Prometheus, Grafana, Spring Actuators, Tomcat/JBoss
What You’ll Do:
As a Sr. Data Scientist, you will work closely across DeepIntent Data Science teams located in New York, India, and Bosnia. The role will focus on building predictive models and implementing data-driven solutions to maximize ad effectiveness. You will also lead efforts in generating analyses and insights related to the measurement of campaign outcomes, Rx, patient journey, and supporting the evolution of the DeepIntent product suite. Activities in this position include developing and deploying models in production, reading campaign results, analyzing medical claims, clinical, demographic, and clickstream data, performing analysis and creating actionable insights, and summarizing and presenting results and recommended actions to internal stakeholders and external clients, as needed.
- Explore ways to create better predictive models.
- Analyze medical claims, clinical, demographic and clickstream data to produce and present actionable insights.
- Explore ways of using inference, statistical, and machine learning techniques to improve the performance of existing algorithms and decision heuristics.
- Design and deploy new iterations of production-level code.
- Contribute posts to our upcoming technical blog.
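A hedged, minimal sketch of the predictive-modeling loop described above: fit a boosted classifier on tabular features and score it on a holdout set. It assumes scikit-learn is installed; the synthetic features merely stand in for claims/clickstream variables.

```python
# Hedged sketch: train and evaluate a simple predictive model.
# Synthetic data stands in for real claims/clickstream features.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))                          # e.g. visits, recency
y = (X[:, 0] + rng.normal(size=1000) > 0).astype(int)   # synthetic outcome

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)

auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"holdout AUC: {auc:.3f}")
```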
Who You Are:
- Bachelor’s degree in a STEM field, such as Statistics, Mathematics, Engineering, Biostatistics, Econometrics, Economics, Finance, or Data Science.
- 5+ years of working experience as a Data Scientist or Researcher in digital marketing, consumer advertisement, telecom, or other areas requiring customer-level predictive analytics.
- Advanced proficiency in performing statistical analysis in Python, including relevant libraries, is required.
- Experience working with data processing, transformation and building model pipelines using tools such as Spark, Airflow, and Docker.
- You have an understanding of the ad-tech ecosystem, digital marketing and advertising data and campaigns or familiarity with the US healthcare patient and provider systems (e.g. medical claims, medications).
- You have varied and hands-on predictive machine learning experience (deep learning, boosting algorithms, inference…).
- You are interested in translating complex quantitative results into meaningful findings and interpretable deliverables, and communicating with less technical audiences orally and in writing.
- You can write production-level code and work with Git repositories.
- Active Kaggle participant.
- Working experience with SQL.
- Familiarity with medical and healthcare data (medical claims, Rx) is preferred.
- Conversant with cloud technologies such as AWS or Google Cloud.
Company Description
AdElement is a leading digital advertising technology company that has been helping app publishers increase their ad revenue and reach untapped demand since 2011. With our expertise in connecting brands to app audiences on evolving screens, such as VR headsets and vehicle consoles, we enable our clients to be first to market. We have been recognized as the Google Agency of the Year and have offices globally, with our headquarters located in New Brunswick, New Jersey.
Job Description
Work alongside a highly skilled engineering team to design, develop, and maintain large-scale, highly performant, real-time applications.
Own feature development, driving it directly with product and other engineering teams.
Demonstrate excellent communication skills in working with technical and non-technical audiences.
Be an evangelist for best practices across all functions - developers, QA, and infrastructure/ops.
Be an evangelist for platform innovation and reuse.
Requirements:
2+ years of experience building large-scale and low-latency distributed systems.
Command of Java or C++.
Solid understanding of algorithms, data structures, performance optimization techniques, object-oriented programming, multi-threading, and real-time programming.
Experience with distributed caching, SQL/NoSQL, and other databases is a plus.
Experience with Big Data and cloud services such as AWS/GCP is a plus.
Experience in the advertising domain is a big plus.
B.S. or M.S. degree in Computer Science, Engineering, or equivalent.
Location: Pune, Maharashtra.
We are looking for Senior Software Engineers responsible for designing, developing, and maintaining large-scale distributed ad technology systems. This entails working on several different systems, platforms, and technologies, and collaborating with various engineering teams to meet a range of technological challenges. You will work with our product team to contribute to and influence the roadmap of our products and technologies, and also influence and inspire team members.
Experience
- 3 - 10 Years
Required Skills
- 3+ years of work experience and a degree in computer science or a similar field
- Knowledgeable about computer science fundamentals including data structures, algorithms, and coding
- Enjoy owning projects from creation to completion and wearing multiple hats
- Product focused mindset
- Experience building distributed systems capable of handling large volumes of traffic
- Fluency with Java, Vertex, Redis, Relational Databases
- Possess good communication skills
- Enjoy working in a team-oriented environment that values excellence
- Have a knack for solving very challenging problems
- (Preferred) Previous experience in advertising technology or gaming apps
- (Preferred) Hands-on experience with Spark, Kafka or similar open-source software
Responsibilities
- Creating design and architecture documents
- Conducting code reviews
- Collaborate with others in the engineering teams to meet a range of technological challenges
- Design, build, and develop large-scale advertising technology systems capable of handling tens of billions of events daily
Education
- UG: B.Tech/B.E. in Computers; PG: M.Tech in Computers
What We Offer:
- Competitive salary and benefits package.
- Opportunities for professional growth and development.
- A collaborative and inclusive work environment.
Salary budget: up to 50 LPA, or a 20% hike on current CTC.
You can message me on LinkedIn for a quick response.
Specific Knowledge/Skills
- 4-6 years of experience
- Proficiency in Python programming.
- Basic knowledge of front-end development.
- Basic knowledge of Data manipulation and analysis libraries
- Code versioning and collaboration (Git)
- Knowledge of libraries for extracting data from websites (web scraping)
- Knowledge of SQL and NoSQL databases
- Familiarity with RESTful APIs
- Familiarity with Cloud (Azure /AWS) technologies
🚀 Hiring: Java Developer at Deqode
⭐ Experience: 4+ Years
📍 Location: Indore, Pune, Mumbai, Nagpur, Noida, Kolkata, Bangalore, Chennai
⭐ Work Mode: Hybrid
⏱️ Notice Period: Immediate Joiners
(Only immediate joiners & candidates serving notice period)
Requirements
✅ Strong proficiency in Java (Java 8/11/17)
✅ Experience with Spring / Spring Boot
✅ Knowledge of REST APIs, Microservices architecture
✅ Familiarity with SQL/NoSQL databases
✅ Understanding of Git, CI/CD pipelines
✅ Problem-solving skills and attention to detail
Job Details
- Job Title: Lead I - Data Engineering
- Industry: Global digital transformation solutions provider
- Domain - Information technology (IT)
- Experience Required: 6-9 years
- Employment Type: Full Time
- Job Location: Pune
- CTC Range: Best in Industry
Job Description
Job Title: Senior Data Engineer (Kafka & AWS)
Responsibilities:
- Develop and maintain real-time data pipelines using Apache Kafka (MSK or Confluent) and AWS services.
- Configure and manage Kafka connectors, ensuring seamless data flow and integration across systems.
- Demonstrate strong expertise in the Kafka ecosystem, including producers, consumers, brokers, topics, and schema registry.
- Design and implement scalable ETL/ELT workflows to efficiently process large volumes of data.
- Optimize data lake and data warehouse solutions using AWS services such as Lambda, S3, and Glue.
- Implement robust monitoring, testing, and observability practices to ensure reliability and performance of data platforms.
- Uphold data security, governance, and compliance standards across all data operations.
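A minimal, hedged sketch of the producer/consumer plumbing behind such a pipeline, assuming the confluent-kafka package and a broker on localhost:9092; the topic and group names are illustrative.

```python
# Hedged sketch of a Kafka producer/consumer pair. Assumes confluent-kafka
# is installed and a broker on localhost:9092; topic/group names made up.
import json

from confluent_kafka import Consumer, Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})
producer.produce("events.raw", json.dumps({"user_id": 1, "action": "click"}))
producer.flush()

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "etl-workers",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["events.raw"])
msg = consumer.poll(timeout=5.0)
if msg is not None and msg.error() is None:
    print(json.loads(msg.value()))  # a downstream ETL/ELT step would go here
consumer.close()
```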
Requirements:
- Minimum of 5 years of experience in Data Engineering or related roles.
- Proven expertise with Apache Kafka and the AWS data stack (MSK, Glue, Lambda, S3, etc.).
- Proficient in coding with Python, SQL, and Java — with Java strongly preferred.
- Experience with Infrastructure-as-Code (IaC) tools (e.g., CloudFormation) and CI/CD pipelines.
- Excellent problem-solving, communication, and collaboration skills.
- Flexibility to write production-quality code in both Python and Java as required.
Skills: AWS, Kafka, Python
Notice period - 0 to 15 days only
We are seeking a motivated Data Analyst to support business operations by analyzing data, preparing reports, and delivering meaningful insights. The ideal candidate should be comfortable working with data, identifying patterns, and presenting findings in a clear and actionable way.
Key Responsibilities:
- Collect, clean, and organize data from internal and external sources
- Analyze large datasets to identify trends, patterns, and opportunities
- Prepare regular and ad-hoc reports for business stakeholders
- Create dashboards and visualizations using tools like Power BI or Tableau
- Work closely with cross-functional teams to understand data requirements
- Ensure data accuracy, consistency, and quality across reports
- Document data processes and analysis methods
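As a small, hedged sketch of the reporting work described (assuming pandas; the dataset and columns are illustrative), the snippet below organizes a table and pivots it into the shape a Power BI or Tableau dashboard would consume:

```python
# Hedged sketch: organize raw rows into a dashboard-ready summary table.
# Dataset and column names are illustrative.
import pandas as pd

sales = pd.DataFrame({
    "region": ["North", "North", "South", "South"],
    "month": ["Jan", "Feb", "Jan", "Feb"],
    "revenue": [1200, 1350, 900, 1100],
})

# Pivot into the region-by-month shape a BI tool would consume.
report = sales.pivot_table(index="region", columns="month",
                           values="revenue", aggfunc="sum")
report["total"] = report.sum(axis=1)
print(report)
```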
Job Description
Wissen Technology is seeking an experienced C# .NET Developer to build and maintain applications related to streaming market data. This role involves developing message-based C#/.NET applications to process, normalize, and summarize large volumes of market data efficiently. The candidate should have a strong foundation in Microsoft .NET technologies and experience working with message-driven, event-based architecture. Knowledge of capital markets and equity market data is highly desirable.
Responsibilities
- Design, develop, and maintain message-based C#/.NET applications for processing real-time and batch market data feeds.
- Build robust routines to download and process data from AWS S3 buckets on a frequent schedule.
- Implement daily data summarization and data normalization routines.
- Collaborate with business analysts, data providers, and other developers to deliver high-quality, scalable market data solutions.
- Troubleshoot and optimize market data pipelines to ensure low latency and high reliability.
- Contribute to documentation, code reviews, and team knowledge sharing.
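The posting is C#/.NET, but the download-and-summarize routine it describes can be sketched language-agnostically; below is a hedged Python version using boto3, with hypothetical bucket, key, and column names.

```python
# Hedged sketch: pull a daily file from S3 and compute a per-symbol summary.
# Assumes boto3 and AWS credentials; bucket/key/columns are hypothetical.
import csv
import io
from collections import defaultdict

import boto3

s3 = boto3.client("s3")
obj = s3.get_object(Bucket="marketdata-feed", Key="ticks/2024-06-01.csv")
body = obj["Body"].read().decode("utf-8")

# Daily summarization: tick count and last price per symbol.
summary = defaultdict(lambda: {"count": 0, "last": None})
for row in csv.DictReader(io.StringIO(body)):
    stats = summary[row["symbol"]]
    stats["count"] += 1
    stats["last"] = float(row["price"])

for symbol, stats in sorted(summary.items()):
    print(symbol, stats)
```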
Required Skills and Experience
- 5+ years of professional experience programming in C# and Microsoft .NET framework.
- Strong understanding of message-based and real-time programming architectures.
- Experience working with AWS services, specifically S3, for data retrieval and processing.
- Experience with SQL and Microsoft SQL Server.
- Familiarity with Equity market data, FX, Futures & Options, and capital markets concepts.
- Excellent interpersonal and communication skills.
- Highly motivated, curious, and analytical mindset with the ability to work well both independently and in a team environment.
What You’ll Do:
- Setting up formal data practices for the company.
- Building and running super stable and scalable data architectures.
- Making it easy for folks to add and use new data with self-service pipelines.
- Getting DataOps practices in place.
- Designing, developing, and running data pipelines to help out Products, Analytics, data scientists and machine learning engineers.
- Creating simple, reliable data storage, ingestion, and transformation solutions that are a breeze to deploy and manage.
- Writing and Managing reporting API for different products.
- Implementing different methodologies for different reporting needs.
- Teaming up with all sorts of people – business folks, other software engineers, machine learning engineers, and analysts.
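As a hedged sketch of what a self-service pipeline can look like when orchestrated, here is a minimal Airflow DAG (assumes Airflow 2.4+ for the `schedule` argument; the task bodies and names are placeholders):

```python
# Minimal Airflow DAG sketch (Airflow 2.4+ for the `schedule` argument).
# Task logic and names are placeholders for real ingest/transform steps.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    print("pull new data from the source system")

def transform():
    print("clean and load into reporting tables")

with DAG(
    dag_id="self_service_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    ingest_task >> transform_task
```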
Who You Are:
- Bachelor’s degree in engineering (CS / IT) or equivalent degree from a well-known Institute / University.
- 3.5+ years of experience in building and running data pipelines for tons of data.
- Experience with public clouds like GCP or AWS.
- Experience with Apache open-source projects like Spark, Druid, Airflow, and big data databases like BigQuery, Clickhouse.
- Experience making data architectures that are optimised for both performance and cost.
- Good grasp of software engineering, DataOps, data architecture, Agile, and DevOps.
- Proficient in SQL, Java, Spring Boot, Python, and Bash.
- Good communication skills for working with technical and non-technical people.
- Someone who thinks big, takes chances, innovates, dives deep, gets things done, hires and develops the best, and is always learning and curious.

Global digital transformation solutions provider.
Role Proficiency:
Performs tests in strict compliance, independently guides other testers, and assists test leads.
Additional Comments:
Position Title: Automation + Manual Tester
Primary Skills: Playwright, xUnit, Allure Report, Page Object Model, .NET, C#, Database Queries
Secondary Skills: Git, JIRA, Manual Testing
Experience: 4 to 5 years
ESSENTIAL FUNCTIONS AND BASIC DUTIES
1. Leadership in Automation Strategy:
- Assess the feasibility and scope of automation efforts to ensure they align with project timelines and requirements.
- Identify opportunities for process improvements and automation within the software development life cycle (SDLC).
2. Automation Test Framework Development:
- Design, develop, and implement reusable test automation frameworks for various testing phases (unit, integration, functional, performance, etc.).
- Ensure the automation frameworks integrate well with CI/CD pipelines and other development tools.
- Maintain and optimize test automation scripts and frameworks for continuous improvements.
3. Team Management:
- Lead and mentor a team of automation engineers, ensuring they follow best practices, write efficient test scripts, and develop scalable automation solutions.
- Conduct regular performance evaluations and provide constructive feedback.
- Facilitate knowledge-sharing sessions within the team.
4. Collaboration with Cross-functional Teams:
- Work closely with development, QA, and operations teams to ensure proper implementation of automated testing and automation practices.
- Collaborate with business analysts, product owners, and project managers to understand business requirements and translate them into automated test cases.
5. Continuous Integration & Delivery (CI/CD):
- Ensure that automated tests are integrated into the CI/CD pipelines to facilitate continuous testing.
- Identify and resolve issues related to the automation processes within the CI/CD pipeline.
6. Test Planning and Estimation:
- Contribute to the test planning phase by identifying key automation opportunities.
- Estimate effort and time required for automating test cases and other automation tasks.
7. Test Reporting and Metrics:
- Monitor automation test results and generate detailed reports on test coverage, defects, and progress.
- Analyze test results to identify trends, bottlenecks, or issues in the automation process and make necessary improvements.
8. Automation Tools Management:
- Evaluate, select, and manage automation tools and technologies that best meet the needs of the project.
- Ensure that the automation tools used align with the overall project requirements and help to achieve optimal efficiency.
9. Test Environment and Data Management:
- Work on setting up and maintaining the test environments needed for automation.
- Ensure automation scripts work across multiple environments, including staging, testing, and production environments.
10. Risk Management & Issue Resolution:
- Proactively identify risks associated with the automation efforts and provide solutions or mitigation strategies.
- Troubleshoot issues in the automation scripts, framework, and infrastructure to ensure minimal downtime and quick issue resolution.
11. Develop and Maintain Automated Tests: Write and maintain automated scripts for different testing levels, including regression, functional, and integration tests.
12. Bug Identification and Tracking: Report, track, and manage defects identified through automation testing to ensure quick resolution.
13. Improve Test Coverage: Identify gaps in test coverage and develop additional test scripts to improve test comprehensiveness.
14. Automation Documentation: Create and maintain detailed documentation for test automation processes, scripts, and frameworks.
15. Quality Assurance: Ensure that all automated testing activities meet the quality standards, contributing to delivering a high-quality software product.
16. Stakeholder Communication: Regularly update project stakeholders about automation progress, risks, and areas for improvement.
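The stack here is Playwright with .NET/C# and xUnit; as a hedged, language-agnostic illustration of the Page Object Model the posting names, the sketch below uses Playwright for Python (the URL and selectors are hypothetical):

```python
# Page Object Model sketch with Playwright for Python. The .NET/C# version
# is structurally identical; the URL and selectors here are hypothetical.
from playwright.sync_api import sync_playwright

class LoginPage:
    """Page object: selectors and actions live here, not in the test body."""
    def __init__(self, page):
        self.page = page

    def open(self):
        self.page.goto("https://example.com/login")

    def login(self, user: str, password: str):
        self.page.fill("#username", user)
        self.page.fill("#password", password)
        self.page.click("button[type=submit]")

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    login_page = LoginPage(page)
    login_page.open()
    login_page.login("qa_user", "not_a_real_secret")
    print(page.url)  # a test-framework assertion would replace this print
    browser.close()
```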
REQUIRED KNOWLEDGE
1. Automation Tools Expertise: Proficiency in tools like Playwright and Allure Reports, and their integration with CI/CD pipelines.
2. Programming Languages: Strong knowledge of the .NET platform (C#) and test frameworks like xUnit.
3. Version Control: Experience using Git for script management and collaboration.
4. Test Automation Frameworks: Ability to design scalable, reusable frameworks for different types of tests (functional, integration, etc.).
5. Leadership and Mentoring: Lead and mentor automation teams, ensuring adherence to best practices and continuous improvement.
6. Problem-Solving: Strong troubleshooting and analytical skills to identify and resolve automation issues quickly.
7. Collaboration and Communication: Excellent communication skills for working with cross-functional teams and presenting test results.
8. Time Management: Ability to estimate, prioritize, and manage automation tasks to meet project deadlines.
9. Quality Focus: Strong commitment to improving software quality, test coverage, and automation efficiency.
Skills: xUnit, Allure report, Playwright, C#
Required Skills: Strong SQL Expertise, Data Reporting & Analytics, Database Development, Stakeholder & Client Communication, Independent Problem-Solving & Automation Skills
Review Criteria
· Must have Strong SQL skills (queries, optimization, procedures, triggers)
· Must have Advanced Excel skills
· Should have 3+ years of relevant experience
· Should have Reporting + dashboard creation experience
· Should have Database development & maintenance experience
· Must have Strong communication for client interactions
· Should have Ability to work independently
· Willingness to work from client locations.
Description
Who is an ideal fit for us?
We seek professionals who are analytical, demonstrate self-motivation, exhibit a proactive mindset, and possess a strong sense of responsibility and ownership in their work.
What will you get to work on?
As a member of the Implementation & Analytics team, you will:
● Design, develop, and optimize complex SQL queries to extract, transform, and analyze data
● Create advanced reports and dashboards using SQL, stored procedures, and other reporting tools
● Develop and maintain database structures, stored procedures, functions, and triggers
● Optimize database performance by tuning SQL queries, and indexing to handle large datasets efficiently
● Collaborate with business stakeholders and analysts to understand analytics requirements
● Automate data extraction, transformation, and reporting processes to improve efficiency
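A small, hedged sketch of the window-function style of reporting SQL this role calls for, runnable self-contained via Python's sqlite3 (window functions need SQLite 3.25+; the table is illustrative):

```python
# Hedged sketch: rank customers by monthly spend with a window function.
# sqlite3 keeps it self-contained; table/columns are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, month TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('acme', '2024-01', 120.0), ('acme', '2024-02', 90.0),
        ('globex', '2024-01', 300.0), ('globex', '2024-02', 50.0);
""")

rows = conn.execute("""
    SELECT month, customer, amount,
           RANK() OVER (PARTITION BY month ORDER BY amount DESC) AS spend_rank
    FROM orders
""").fetchall()

for row in rows:
    print(row)
```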
What do we expect from you?
For the SQL/Oracle Developer role, we are seeking candidates with the following skills and Expertise:
● Proficiency in SQL (Window functions, stored procedures) and MS Excel (advanced Excel skills)
● More than 3 years of relevant experience
● Java / Python experience is a plus but not mandatory
● Strong communication skills to interact with customers to understand their requirements
● Capable of working independently with minimal guidance, showcasing self-reliance and initiative
● Previous experience in automation projects is preferred
● Work From Office: Bangalore/Navi Mumbai/Pune/Client locations
Entry Level | On-Site | Pune
Internship Opportunity: Data + AI Intern
Location: Pune, India (In-office)
Duration: 2 Months
Start Date: Between 11th July 2025 and 15th August 2025
Work Days: Monday to Friday
Stipend: As per company policy
About ImmersiveData.AI
Smarter Data. Smarter Decisions. Smarter Enterprises.™
At ImmersiveData.AI, we don’t just transform data—we challenge and redefine business models. By leveraging cutting-edge AI, intelligent automation, and modern data platforms, we empower enterprises to unlock new value and drive strategic transformation.
About the Internship
As a Data + AI Intern, you will gain hands-on experience at the intersection of data engineering and AI. You’ll be part of a collaborative team working on real-world data challenges using modern tools like Snowflake, DBT, Airflow, and LLM frameworks. This internship is a launchpad for students looking to enter the rapidly evolving field of Data & AI.
Key Responsibilities
- Assist in designing, building, and optimizing data pipelines and ETL workflows
- Work with structured and unstructured datasets across various sources
- Contribute to AI-driven automation and analytics use cases
- Support backend integration of large language models (LLMs)
- Collaborate in building data platforms using tools like Snowflake, DBT, and Airflow
Required Skills
- Proficiency in Python
- Strong understanding of SQL and relational databases
- Basic knowledge of Data Engineering and Data Analysis concepts
- Familiarity with cloud data platforms or willingness to learn (e.g., Snowflake)
Preferred Learning Certifications (Optional but Recommended)
- Python Programming
- SQL & MySQL/PostgreSQL
- Statistical Modeling
- Tableau / Power BI
- Voice App Development (Bonus)
Who Can Apply
Only candidates who:
- Are available full-time (in-office, Pune)
- Can start between 11th July and 15th August 2025
- Are available for a minimum of 2 months
- Have relevant skills and interest in data and AI
Perks
- Internship Certificate
- Letter of Recommendation
- Work with cutting-edge tools and technologies
- Informal dress code
- Exposure to real industry use cases and mentorship
Core Responsibilities:
- The MLE will design, build, test, and deploy scalable machine learning systems, optimizing model accuracy and efficiency
- Model Development: Algorithms and architectures span traditional statistical methods to deep learning along with employing LLMs in modern frameworks.
- Data Preparation: Prepare, cleanse, and transform data for model training and evaluation.
- Algorithm Implementation: Implement and optimize machine learning algorithms and statistical models.
- System Integration: Integrate models into existing systems and workflows.
- Model Deployment: Deploy models to production environments and monitor performance.
- Collaboration: Work closely with data scientists, software engineers, and other stakeholders.
- Continuous Improvement: Identify areas for improvement in model performance and systems.
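As a hedged sketch of the data-preparation step above (assuming pandas and NumPy; the columns are hypothetical), the snippet below imputes missing values and transforms features for training:

```python
# Hedged data-prep sketch: impute, encode, and transform features.
# Column names are hypothetical stand-ins for real training data.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "age": [34, None, 29, 51],
    "plan": ["basic", "pro", "pro", None],
    "spend": [10.0, 250.0, 80.0, 40.0],
})

df["age"] = df["age"].fillna(df["age"].median())     # impute numerics
df["plan"] = df["plan"].fillna("unknown")            # impute categoricals
features = pd.get_dummies(df, columns=["plan"])      # one-hot encode
features["log_spend"] = np.log1p(features["spend"])  # tame skewed values
print(features)
```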
Skills:
- Programming and Software Engineering: Knowledge of software engineering best practices (version control, testing, CI/CD).
- Data Engineering: Ability to handle data pipelines, data cleaning, and feature engineering. Proficiency in SQL for data manipulation, plus Kafka and ChaosSearch logs for troubleshooting; other tech touch points include ScyllaDB (similar to BigTable), OpenSearch, and Neo4j graph databases.
- Model Deployment and Monitoring: MLOps Experience in deploying ML models to production environments.
- Knowledge of model monitoring and performance evaluation.
Required experience:
- Amazon SageMaker: Deep understanding of SageMaker's capabilities for building, training, and deploying ML models; understanding of the SageMaker pipeline, with the ability to analyze gaps and recommend/implement improvements
- AWS Cloud Infrastructure: Familiarity with S3, EC2, Lambda and using these services in ML workflows
- AWS data: Redshift, Glue
- Containerization and Orchestration: Understanding of Docker and Kubernetes, and their implementation within AWS (EKS, ECS)
Skills: AWS, AWS Cloud, Amazon Redshift, EKS
Must-Haves
Amazon SageMaker, AWS Cloud Infrastructure (S3, EC2, Lambda), Docker and Kubernetes (EKS, ECS), SQL, AWS data (Redshift, Glue)
Skills: Machine Learning, MLOps, AWS Cloud, Redshift or Glue, Kubernetes, SageMaker
Notice period - 0 to 15 days only
Location : Pune & Hyderabad only
Job Description
Role: Java Developer
Location: PAN India
Experience: 4+ Years
Required Skills -
- 3+ years Java development experience
- Spring Boot framework expertise (MANDATORY)
- Microservices architecture design & implementation (MANDATORY)
- Hibernate/JPA for database operations (MANDATORY)
- RESTful API development (MANDATORY)
- Database design and optimization (MANDATORY)
- Container technologies (Docker/Kubernetes)
- Cloud platforms experience (AWS/Azure)
- CI/CD pipeline implementation
- Code review and quality assurance
- Problem-solving and debugging skills
- Agile/Scrum methodology
- Version control systems (Git)
Review Criteria
- Strong Senior Data Engineer profile
- 4+ years of hands-on Data Engineering experience
- Must have experience owning end-to-end data architecture and complex pipelines
- Must have advanced SQL capability (complex queries, large datasets, optimization)
- Must have strong Databricks hands-on experience
- Must be able to architect solutions, troubleshoot complex data issues, and work independently
- Must have Power BI integration experience
- The CTC structure is 80% fixed and 20% variable
Preferred
- Worked on Call center data, understand nuances of data generated in call centers
- Experience implementing data governance, quality checks, or lineage frameworks
- Experience with orchestration tools (Airflow, ADF, Glue Workflows), Python, Delta Lake, Lakehouse architecture
Job Specific Criteria
- CV Attachment is mandatory
- Are you comfortable integrating with Power BI datasets?
- We work alternate Saturdays; are you comfortable with WFH on the 1st and 4th Saturdays?
Role & Responsibilities
We are seeking a highly experienced Senior Data Engineer with strong architectural capability, excellent optimisation skills, and deep hands-on experience in modern data platforms. The ideal candidate will have advanced SQL skills, strong expertise in Databricks, and practical experience working across cloud environments such as AWS and Azure. This role requires end-to-end ownership of complex data engineering initiatives, including architecture design, data governance implementation, and performance optimisation. You will collaborate with cross-functional teams to build scalable, secure, and high-quality data solutions.
Key Responsibilities-
- Lead the design and implementation of scalable data architectures, pipelines, and integration frameworks.
- Develop, optimise, and maintain complex SQL queries, transformations, and Databricks-based data workflows.
- Architect and deliver high-performance ETL/ELT processes across cloud platforms.
- Implement and enforce data governance standards, including data quality, lineage, and access control.
- Partner with analytics, BI (Power BI), and business teams to enable reliable, governed, and high-value data delivery.
- Optimise large-scale data processing, ensuring efficiency, reliability, and cost-effectiveness.
- Monitor, troubleshoot, and continuously improve data pipelines and platform performance.
- Mentor junior engineers and contribute to engineering best practices, standards, and documentation.
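A hedged sketch of the kind of Databricks pipeline step this role owns: an incremental upsert into a Delta table via MERGE, followed by file compaction. It assumes a Databricks notebook where `spark` is predefined; the paths and key column are illustrative.

```python
# Hedged sketch of an incremental Delta upsert on Databricks. Assumes this
# runs in a notebook where `spark` exists; paths and columns are illustrative.
from delta.tables import DeltaTable
from pyspark.sql import functions as F

updates = (
    spark.read.format("json").load("/mnt/raw/calls/")  # new call-center events
    .withColumn("ingested_at", F.current_timestamp())
)

target = DeltaTable.forPath(spark, "/mnt/curated/calls")
(
    target.alias("t")
    .merge(updates.alias("u"), "t.call_id = u.call_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)

# Compact small files so downstream Power BI queries stay fast.
spark.sql("OPTIMIZE delta.`/mnt/curated/calls`")
```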
Ideal Candidate
- Proven industry experience as a Senior Data Engineer, with ownership of high-complexity projects.
- Advanced SQL skills with experience handling large, complex datasets.
- Strong expertise with Databricks for data engineering workloads.
- Hands-on experience with major cloud platforms — AWS and Azure.
- Deep understanding of data architecture, data modelling, and optimisation techniques.
- Familiarity with BI and reporting environments such as Power BI.
- Strong analytical and problem-solving abilities with a focus on data quality and governance
- Proficiency in Python or another programming language is a plus
PERKS, BENEFITS AND WORK CULTURE:
Our people define our passion and our audacious, incredibly rewarding achievements. The company is one of India’s most diversified Non-banking financial companies, and among Asia’s top 10 Large workplaces. If you have the drive to get ahead, we can help find you an opportunity at any of the 500+ locations we’re present in India.
ROLES AND RESPONSIBILITIES:
We are looking for a Junior Data Engineer who will work under guidance to support data engineering tasks, perform basic coding, and actively learn modern data platforms and tools. The ideal candidate should have foundational SQL knowledge and basic exposure to Databricks. This role is designed for early-career professionals who are eager to grow into full data engineering responsibilities while contributing to data pipeline operations and analytical support.
Key Responsibilities-
- Support the development and maintenance of data pipelines and ETL/ELT workflows under mentorship.
- Write basic SQL queries, transformations, and assist with Databricks notebook tasks (a runnable example follows this list).
- Help troubleshoot data issues and contribute to ensuring pipeline reliability.
- Work with senior engineers and analysts to understand data requirements and deliver small tasks.
- Assist in maintaining documentation, data dictionaries, and process notes.
- Learn and apply data engineering best practices, coding standards, and cloud fundamentals.
- Support basic tasks related to Power BI data preparation or integrations as needed.
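As a self-contained illustration of the "basic SQL queries" this role starts with, the sketch below runs sanity checks against an in-memory SQLite database; the table and values are made up.

```python
# Basic SQL sanity checks against an in-memory SQLite database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL);
    INSERT INTO orders VALUES (1, 'a', 10.0), (2, 'b', 25.0), (2, 'b', 25.0);
""")

# Row count vs. distinct ids flags duplicate records.
total, distinct = conn.execute(
    "SELECT COUNT(*), COUNT(DISTINCT id) FROM orders"
).fetchone()
print(f"rows={total}, distinct ids={distinct}, duplicates={total - distinct}")

# A simple aggregate of the kind used in pipeline sanity checks.
for customer, spend in conn.execute(
    "SELECT customer, SUM(amount) FROM orders GROUP BY customer"
):
    print(customer, spend)
```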
IDEAL CANDIDATE:
- Foundational SQL skills with the ability to write and understand basic queries.
- Basic exposure to Databricks, data transformation concepts, or similar data tools.
- Understanding of ETL/ELT concepts, data structures, and analytical workflows.
- Eagerness to learn modern data engineering tools, technologies, and best practices.
- Strong problem-solving attitude and willingness to work under guidance.
- Good communication and collaboration skills to work with senior engineers and analysts.
PERKS, BENEFITS AND WORK CULTURE:
Our people define our passion and our audacious, incredibly rewarding achievements. Bajaj Finance Limited is one of India’s most diversified non-banking financial companies, and among Asia’s top 10 large workplaces. If you have the drive to get ahead, we can help find you an opportunity at any of the 500+ locations where we’re present in India.
ROLES AND RESPONSIBILITIES:
We are seeking a skilled Data Engineer who can work independently on data pipeline development, troubleshooting, and optimisation tasks. The ideal candidate will have strong SQL skills, hands-on experience with Databricks, and familiarity with cloud platforms such as AWS and Azure. You will be responsible for building and maintaining reliable data workflows, supporting analytical teams, and ensuring high-quality, secure, and accessible data across the organisation.
KEY RESPONSIBILITIES:
- Design, develop, and maintain scalable data pipelines and ETL/ELT workflows.
- Build, optimise, and troubleshoot SQL queries, transformations, and Databricks data processes.
- Work with large datasets to deliver efficient, reliable, and high-performing data solutions.
- Collaborate closely with analysts, data scientists, and business teams to support data requirements.
- Ensure data quality, availability, and security across systems and workflows (a minimal data-quality sketch follows this list).
- Monitor pipeline performance, diagnose issues, and implement improvements.
- Contribute to documentation, standards, and best practices for data engineering processes.
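A minimal sketch of the kind of data-quality gate implied by the responsibilities above, written with pandas; the checks, column names, and sample data are illustrative.

```python
# Lightweight data-quality gate: nulls, ranges, and uniqueness checks.
import pandas as pd

df = pd.DataFrame({"order_id": [1, 2, 2], "amount": [10.0, None, 25.0]})

checks = {
    "no_null_amounts": df["amount"].notna().all(),
    "amounts_non_negative": (df["amount"].dropna() >= 0).all(),
    "order_id_unique": df["order_id"].is_unique,
}

failures = [name for name, passed in checks.items() if not passed]
if failures:
    # In a real pipeline this would fail the run or raise an alert;
    # the sample data above deliberately trips two of the checks.
    raise ValueError(f"data-quality checks failed: {failures}")
```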
IDEAL CANDIDATE:
- Proven experience as a Data Engineer or in a similar data-focused role (3+ years).
- Strong SQL skills with experience writing and optimising complex queries.
- Hands-on experience with Databricks for data engineering tasks.
- Experience with cloud platforms such as AWS and Azure.
- Understanding of ETL/ELT concepts, data modelling, and pipeline orchestration.
- Familiarity with Power BI and data integration with BI tools.
- Strong analytical and troubleshooting skills, with the ability to work independently.
- Experience working end-to-end on data engineering workflows and solutions.
PERKS, BENEFITS AND WORK CULTURE:
Our people define our passion and our audacious, incredibly rewarding achievements. The company is one of India’s most diversified non-banking financial companies, and among Asia’s top 10 large workplaces. If you have the drive to get ahead, we can help find you an opportunity at any of the 500+ locations where we’re present in India.
About the Company:
Verinite is a global technology consulting and services company laser-focused on the banking & financial services sector, especially cards, payments, lending, trade, and treasury.
They partner with banks, fintechs, payment processors, and other financial institutions to modernize their systems, improve operational resilience, and accelerate digital transformation. Their services include consulting, digital strategy, data, application modernization, quality engineering (testing), cloud & infrastructure, and application maintenance.
Skill – Authorization, Clearing and Settlement
1. Should have worked on card schemes (Visa, Amex, Discover, RuPay, and Mastercard), on either the authorization or clearing side.
2. Should be able to read scheme specifications and create business requirements/mappings for authorization and clearing.
3. Should have hands-on experience implementing scheme-related changes.
4. Should be able to validate and certify the change post-development, based on the mapping created.
5. Should be able to work with the Dev team, explaining and guiding on a time-to-time basis.
6. Able to communicate with various teams & senior stakeholders.
7. Go-getter and great googler.
8. Schemes – Visa/MC/Amex/JCB/CUP/Mercury – Discover and Diners, CBUAE, Jaywan (local scheme from the UAE).
9. Experience with the issuing side is a plus (good to have).
Hiring: Azure Data Engineer
⭐ Experience: 2+ Years
📍 Location: Pune, Bhopal, Jaipur, Gurgaon, Bangalore
⭐ Work Mode:- Hybrid
⏱️ Notice Period: Immediate Joiners
Passport: Mandatory & Valid
(Only immediate joiners & candidates serving notice period)
Mandatory Skills:
Azure Synapse, Azure Databricks, Azure Data Factory (ADF), SQL, Delta Lake, ADLS, ETL/ELT, PySpark.
Responsibilities:
- Build and maintain data pipelines using ADF, Databricks, and Synapse.
- Develop ETL/ELT workflows and optimize SQL queries.
- Implement Delta Lake for scalable lakehouse architecture (see the sketch after this list).
- Create Synapse data models and Spark/Databricks notebooks.
- Ensure data quality, performance, and security.
- Collaborate with cross-functional teams on data requirements.
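For the Delta Lake item above, a minimal upsert (MERGE) sketch as it might appear in a Databricks notebook. It assumes a Databricks runtime where `spark` and the Delta tables already exist; the table names are illustrative.

```python
# Upsert new and changed rows into a Delta table via MERGE.
staged = spark.read.table("staging.customer_updates")   # hypothetical staging table
staged.createOrReplaceTempView("updates")

spark.sql("""
    MERGE INTO lakehouse.customers AS t
    USING updates AS s
    ON t.customer_id = s.customer_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```

MERGE keeps the load idempotent: re-running it with the same staging data produces the same target state.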
Nice to Have:
Azure DevOps, Python, Streaming (Event Hub/Kafka), Power BI, Azure certifications (DP-203).
MUST-HAVES:
- Machine Learning + AWS + (EKS OR ECS OR Kubernetes) + (Redshift AND Glue) + SageMaker
- Notice period - 0 to 15 days only
- Hybrid work mode- 3 days office, 2 days at home
SKILLS: AWS, AWS CLOUD, AMAZON REDSHIFT, EKS
ADDITIONAL GUIDELINES:
- Interview process: 2 technical rounds + 1 client round
- 3 days in office, Hybrid model.
CORE RESPONSIBILITIES:
- The MLE will design, build, test, and deploy scalable machine learning systems, optimizing model accuracy and efficiency
- Model Development: Build models using algorithms and architectures spanning traditional statistical methods to deep learning, including the use of LLMs in modern frameworks.
- Data Preparation: Prepare, cleanse, and transform data for model training and evaluation.
- Algorithm Implementation: Implement and optimize machine learning algorithms and statistical models.
- System Integration: Integrate models into existing systems and workflows.
- Model Deployment: Deploy models to production environments and monitor performance.
- Collaboration: Work closely with data scientists, software engineers, and other stakeholders.
- Continuous Improvement: Identify areas for improvement in model performance and systems.
SKILLS:
- Programming and Software Engineering: Knowledge of software engineering best practices (version control, testing, CI/CD).
- Data Engineering: Ability to handle data pipelines, data cleaning, and feature engineering. Proficiency in SQL for data manipulation, plus Kafka, ChaosSearch logs, etc. for troubleshooting; other tech touch points are ScyllaDB (similar to Bigtable), OpenSearch, and the Neo4j graph database.
- Model Deployment and Monitoring: MLOps experience in deploying ML models to production environments.
- Knowledge of model monitoring and performance evaluation.
REQUIRED EXPERIENCE:
- Amazon SageMaker: Deep understanding of SageMaker's capabilities for building, training, and deploying ML models; understanding of the SageMaker pipeline, with the ability to analyze gaps and recommend/implement improvements (an illustrative sketch follows this list)
- AWS Cloud Infrastructure: Familiarity with S3, EC2, Lambda and using these services in ML workflows
- AWS data: Redshift, Glue
- Containerization and Orchestration: Understanding of Docker and Kubernetes, and their implementation within AWS (EKS, ECS)
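A sketch of the SageMaker train-and-deploy flow referenced above, using the SageMaker Python SDK (v2). The image URI, IAM role, and S3 paths are placeholders that would differ in any real account.

```python
# Train a model as a SageMaker job, then deploy it to a real-time endpoint.
import sagemaker
from sagemaker.estimator import Estimator

session = sagemaker.Session()
estimator = Estimator(
    image_uri="<training-image-uri>",                      # placeholder
    role="arn:aws:iam::123456789012:role/SageMakerRole",   # placeholder
    instance_count=1,
    instance_type="ml.m5.xlarge",
    output_path="s3://my-bucket/models/",                  # placeholder
    sagemaker_session=session,
)
estimator.fit({"train": "s3://my-bucket/train/"})          # launches the training job

predictor = estimator.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.large",                           # real-time endpoint
)
```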
Review Criteria
- Strong Lead – User Research & Analyst profile (behavioural/user/product/ux analytics)
- 10+ years of experience in Behavioral Data Analytics, User Research, or Product Insights, driving data-informed decision-making for B2C digital products (web and app).
- 6 months+ experience in analyzing user journeys, clickstream, and behavioral data using tools such as Google Analytics, Mixpanel, CleverTap, Firebase, or Amplitude.
- Experience in leading cross-functional user research and analytics initiatives in collaboration with Product, Design, Engineering, and Business teams to translate behavioral insights into actionable strategies.
- Strong expertise in A/B testing and experimentation, including hypothesis design, execution, statistical validation, and impact interpretation.
- Ability to identify behavioral patterns, funnel drop-offs, engagement trends, and user journey anomalies using large datasets and mixed-method analysis.
- Hands-on proficiency in SQL, Excel, and data visualization/storytelling tools such as Tableau, Power BI, or Looker for executive reporting and dashboard creation.
- Deep understanding of UX principles, customer journey mapping, and product experience design, with experience integrating qualitative and quantitative insights.
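As an illustration of the statistical validation called out in the criteria above, a minimal two-proportion z-test in Python using statsmodels; the conversion counts are made up.

```python
# Two-proportion z-test for a control-vs-variant A/B experiment.
from statsmodels.stats.proportion import proportions_ztest

conversions = [420, 480]        # control, variant (made-up counts)
samples = [10_000, 10_000]      # users exposed to each arm

z_stat, p_value = proportions_ztest(conversions, samples)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("difference is statistically significant at the 5% level")
```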
Preferred
- Ability to build insightful dashboards and executive reports highlighting user engagement, retention, and behavioral metrics; familiarity with mixed-method research, AI-assisted insight tools (Dovetail, EnjoyHQ, Qualtrics, UserZoom), and mentoring junior researchers
Job Specific Criteria
- CV Attachment is mandatory
- We work on alternate Saturdays. Are you comfortable working from home on the 1st and 4th Saturdays?
Role & Responsibilities
Product Conceptualization & UX Strategy Development:
- Conceptualize customer experience strategies
- Collaborate with product managers to conceptualize new products & align UX with product roadmaps.
- Develop and implement UX strategies that align with business objectives.
- Stay up-to-date with industry trends and best practices in UX & UI for AI.
- Assist in defining product requirements and features.
- Use data analytics to inform product strategy and prioritize features.
- Ensure product alignment with customer needs and business goals.
- Develop platform blueprints that include a features and functionalities map, ecosystem map, and information architecture.
- Create wireframes, prototypes, and mock-ups using tools like Figma
- Conduct usability testing and iterate designs based on feedback
- Employ tools like XMind for brainstorming and mind mapping
Customer Journey Analysis:
- Understand and map out customer journeys and scenarios.
- Identify pain points and opportunities for improvement.
- Develop customer personas and empathy maps.
Cross-Functional Collaboration:
- Work closely with internal units such as UX Research, Design, UX Content, and UX QA to ensure seamless delivery of CX initiatives.
- Coordinate with development teams to ensure UX designs are implemented accurately.
Data Analytics and Tools:
- Utilize clickstream and analytics tools like Google Analytics, CleverTap, and Medallia to gather and analyse user data.
- Leverage data to drive decisions and optimize customer experiences.
- Strong background in data analytics, including proficiency in interpreting complex datasets to inform UX decisions.
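A small pandas sketch of the funnel drop-off analysis this section describes; the events, step names, and ordering are illustrative.

```python
# Unique users per funnel step, and conversion relative to the first step.
import pandas as pd

events = pd.DataFrame({
    "user_id": [1, 1, 1, 2, 2, 3],
    "step": ["view", "add_to_cart", "checkout", "view", "add_to_cart", "view"],
})

funnel_order = ["view", "add_to_cart", "checkout"]
users_per_step = events.groupby("step")["user_id"].nunique().reindex(funnel_order)
conversion = users_per_step / users_per_step.iloc[0]
print(pd.DataFrame({"users": users_per_step, "conversion": conversion.round(2)}))
```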
Ideal Candidate
- Bachelor’s or Master’s degree in a relevant field (e.g., UX Design, Human-Computer Interaction, Computer Science, Marketing).
- 5+ years of experience in CX/UX roles, preferably in a B2C environment.
- Proficiency in analytics tools (Google Analytics, CleverTap, Medallia, Hotjar, etc).
- Strong understanding of wireframing and prototyping tools (Figma, XMind, etc).
- Excellent communication and collaboration skills.
- Proven experience in managing cross-functional teams and projects.
- Strong background in data analytics and data-driven decision-making.
- Expert understanding of user experience and user-centered design approaches
- Detail-orientation with experience and will to continuously learn, adapt and evolve
- Creating and measuring the success and impact of your CX designs
- Knowledge of testing tools like Maze, UsabilityHub, UserZoom would be a plus
- Experienced in designing responsive websites as well as mobile apps
- Understanding of iOS and Android design guidelines
- Passion for great customer-focused design, purposeful aesthetic sense and generating simple solutions to complex problems.
- Excellent communication skills to be able to present their work and ideas to the leadership team.
Job Title: Sr. ETL Test Engineer
Experience: 7+ Years
Location: Gurgaon / Noida / Pune (Work From Office)
Joining: Immediate joiners only (≤15 days notice)
About the Role
We are seeking an experienced ETL Test Engineer with strong expertise in cloud-based ETL tools, Azure ecosystem, and advanced SQL skills. The ideal candidate will have a proven track record in validating complex data pipelines, ensuring data integrity, and collaborating with cross-functional teams in an Agile environment.
Key Responsibilities
- Design, develop, and execute ETL test plans, test cases, and test scripts for cloud-based data pipelines.
- Perform data validation, transformation, and reconciliation between source and target systems (an illustrative reconciliation sketch follows this list).
- Work extensively with Azure Data Factory, Azure Synapse Analytics, Azure SQL Database, and related Azure services.
- Develop and run complex SQL queries for data extraction, analysis, and validation.
- Collaborate with developers, business analysts, and product owners to clarify requirements and ensure comprehensive test coverage.
- Perform regression, functional, and performance testing of ETL processes.
- Identify defects, log them, and work with development teams to ensure timely resolution.
- Participate in Agile ceremonies (daily stand-ups, sprint planning, retrospectives) and contribute to continuous improvement.
- Ensure adherence to data quality and compliance standards.
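A minimal sketch of the source-to-target reconciliation mentioned above. The SQL and table names are placeholders, and the cursor is assumed to come from whatever driver connects to the systems under test (e.g. pyodbc for Azure SQL).

```python
# Compare row counts and a control total between source and target tables.
RECONCILIATION_SQL = """
SELECT 'source' AS side, COUNT(*) AS row_count, SUM(CAST(amount AS BIGINT)) AS amount_sum
FROM src.orders
UNION ALL
SELECT 'target', COUNT(*), SUM(CAST(amount AS BIGINT))
FROM dw.orders;
"""

def reconcile(cursor):
    """Fail if source and target row counts or control totals diverge."""
    cursor.execute(RECONCILIATION_SQL)
    rows = {side: (count, total) for side, count, total in cursor.fetchall()}
    assert rows["source"] == rows["target"], f"reconciliation mismatch: {rows}"
```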
Required Skills & Experience
- 5+ years of experience in ETL testing, preferably with cloud-based ETL tools.
- Strong hands-on experience with Azure Data Factory, Azure Synapse Analytics, and Azure SQL.
- Advanced SQL query writing and performance tuning skills.
- Strong understanding of data warehousing concepts, data models, and data governance.
- Experience with Agile methodologies and working in a Scrum team.
- Excellent communication and stakeholder management skills.
- Strong problem-solving skills and attention to detail.
Preferred Skills
- Experience with Python, PySpark, or automation frameworks for ETL testing.
- Exposure to CI/CD pipelines in Azure DevOps or similar tools.
- Knowledge of data security, compliance, and privacy regulations.
Job Details
- Job Title: ML Engineer II - Aws, Aws Cloud
- Industry: Technology
- Domain - Information technology (IT)
- Experience Required: 6-12 years
- Employment Type: Full Time
- Job Location: Pune
- CTC Range: Best in Industry
Skills: AWS, AWS Cloud, Amazon Redshift, EKS
Must-Haves
AWS, AWS Cloud, Amazon Redshift, EKS
NP: Immediate – 30 Days
Job Title: .NET Developer
Experience: 3-5 Years
Location: Pune
Employment Type: Full-Time
About the Role:
We're looking for an experienced .NET Developer with 3-5 years of expertise to join our team in Pune. You'll be working on building scalable web applications using C#, .NET Core, and modern development practices.
What You'll Do:
- Develop and maintain web applications using C# and .NET Core (.NET 5+)
- Build RESTful APIs with ASP.NET Core Web API
- Design and optimize SQL Server databases with complex queries and stored procedures
- Implement OOP principles, Design Patterns, and SOLID principles
- Work with Entity Framework Core for data access
- Collaborate with teams using Agile/Scrum methodologies
- Perform code reviews and ensure code quality
- Manage code using Git version control
Required Skills:
Must-Have:
- Strong proficiency in C# and .NET Core (or .NET 5+) - 3-5 years
- Hands-on experience with ASP.NET Core Web API and RESTful services
- Expertise in SQL (T-SQL), stored procedures, and database design
- Experience with SQL Server is highly preferred
- Hands-on experience with Entity Framework Core
- Solid understanding of OOP, Design Patterns, and SOLID principles
- Experience with Git and Agile/Scrum methodologies
Good to Have:
- Front-end technologies (Angular, React)
- Microservices architecture
- Cloud platforms (Azure/AWS)
- CI/CD pipelines
- Unit testing frameworks
Qualifications:
- Bachelor's degree in Computer Science, IT, or related field
- 3-5 years of professional .NET development experience
Company Name – Wissen Technology
Location : Pune / Bangalore / Mumbai (Based on candidate preference)
Work mode: Hybrid
Experience: 5+ years
Job Description
Wissen Technology is seeking an experienced C# .NET Developer to build and maintain applications related to streaming market data. This role involves developing message-based C#/.NET applications to process, normalize, and summarize large volumes of market data efficiently. The candidate should have a strong foundation in Microsoft .NET technologies and experience working with message-driven, event-based architecture. Knowledge of capital markets and equity market data is highly desirable.
Responsibilities
- Design, develop, and maintain message-based C#/.NET applications for processing real-time and batch market data feeds.
- Build robust routines to download and process data from AWS S3 buckets on a frequent schedule (see the sketch after this list).
- Implement daily data summarization and data normalization routines.
- Collaborate with business analysts, data providers, and other developers to deliver high-quality, scalable market data solutions.
- Troubleshoot and optimize market data pipelines to ensure low latency and high reliability.
- Contribute to documentation, code reviews, and team knowledge sharing.
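The role itself is C#/.NET, but for illustration the S3 download pattern is sketched below in Python with boto3; the bucket name, prefix, and scheduling are placeholders.

```python
# Download all objects under a dated prefix from S3 for downstream processing.
import boto3

s3 = boto3.client("s3")
bucket = "market-data-feed"        # placeholder bucket
prefix = "eod/2024-06-01/"         # placeholder dated prefix

for obj in s3.list_objects_v2(Bucket=bucket, Prefix=prefix).get("Contents", []):
    key = obj["Key"]
    s3.download_file(bucket, key, key.split("/")[-1])
    # downstream: parse, normalise, and summarise the downloaded file
```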
Required Skills and Experience
- 5+ years of professional experience programming in C# and Microsoft .NET framework.
- Strong understanding of message-based and real-time programming architectures.
- Experience working with AWS services, specifically S3, for data retrieval and processing.
- Experience with SQL and Microsoft SQL Server.
- Familiarity with Equity market data, FX, Futures & Options, and capital markets concepts.
- Excellent interpersonal and communication skills.
- Highly motivated, curious, and analytical mindset with the ability to work well both independently and in a team environment.
Education
- Bachelor’s degree in Computer Science, Engineering, or a related technical field.
Job Title: Mid-Level .NET Developer (Agile/SCRUM)
Location: Mohali, Bangalore, Pune, Navi Mumbai, Chennai, Hyderabad, Panchkula, Gurugram (Delhi NCR), Dehradun
Night Shift from 6:30 pm to 3:30 am IST
Experience: 5+ Years
Job Summary:
We are seeking a proactive and detail-oriented Mid-Level .NET Developer to join our dynamic team. The ideal candidate will be responsible for designing, developing, and maintaining high-quality applications using Microsoft technologies with a strong emphasis on .NET Core, C#, Web API, and modern front-end frameworks. You will collaborate with cross-functional teams in an Agile/SCRUM environment and participate in the full software development lifecycle—from requirements gathering to deployment—while ensuring adherence to best coding and delivery practices.
Key Responsibilities:
- Design, develop, and maintain applications using C#, .NET, .NET Core, MVC, and databases such as SQL Server, PostgreSQL, and MongoDB.
- Create responsive and interactive user interfaces using JavaScript, TypeScript, Angular, HTML, and CSS.
- Develop and integrate RESTful APIs for multi-tier, distributed systems.
- Participate actively in Agile/SCRUM ceremonies, including sprint planning, daily stand-ups, and retrospectives.
- Write clean, efficient, and maintainable code following industry best practices.
- Conduct code reviews to ensure high-quality and consistent deliverables.
- Assist in configuring and maintaining CI/CD pipelines (Jenkins or similar tools).
- Troubleshoot, debug, and resolve application issues effectively.
- Collaborate with QA and product teams to validate requirements and ensure smooth delivery.
- Support release planning and deployment activities.
Required Skills & Qualifications:
- 4–6 years of professional experience in .NET development.
- Strong proficiency in C#, .NET Core, MVC, and relational databases such as SQL Server.
- Working knowledge of NoSQL databases like MongoDB.
- Solid understanding of JavaScript/TypeScript and the Angular framework.
- Experience in developing and integrating RESTful APIs.
- Familiarity with Agile/SCRUM methodologies.
- Basic knowledge of CI/CD pipelines and Git version control.
- Hands-on experience with AWS cloud services.
- Strong analytical, problem-solving, and debugging skills.
- Excellent communication and collaboration skills.
Preferred / Nice-to-Have Skills:
- Advanced experience with AWS services.
- Knowledge of Kubernetes or other container orchestration platforms.
- Familiarity with IIS web server configuration and management.
- Experience in the healthcare domain.
- Exposure to AI-assisted code development tools (e.g., GitHub Copilot, ChatGPT).
- Experience with application security and code quality tools such as Snyk or SonarQube.
- Strong understanding of SOLID principles and clean architecture patterns.
Technical Proficiencies:
- ASP.NET Core, ASP.NET MVC
- C#, Entity Framework, Razor Pages
- SQL Server, MongoDB
- REST API, jQuery, AJAX
- HTML, CSS, JavaScript, TypeScript, Angular
- Azure Services, Azure Functions, AWS
- Visual Studio
- CI/CD, Git

Mandatory Skills
- Backend: Java, Spring Boot
- Frontend: Angular
- Database: Oracle / SQL
- Node js
Job Description
Contribute to all software-development life-cycle phases including: domain and non-domain problem analysis, solution requirement gathering and analysis, solution design, implementation, code review, source-code control, source building deployment, validation, QA support, and production support.
Essential Duties and Responsibilities
• Maintain and enhance multi-tier messaging application suites (Java EE, Spring Framework, WAS, Oracle, DB2, MQ)
• Build and maintain IRIS4Health middle-tier message applications (IRIS Interop/Caché; Java, Drools, Kafka, RESTful, MLLP, SQL)
• Build and maintain multi-tier Clinical Toxicology application (Angular, Java EE, Spring Framework, WAS, RHOS, Caché, SQL)
• Maintain stat-tracking application (two-tier Delphi, MySQL)
• Maintain and enhance Cytogenetics three-tier application (Java EE, WAS, DB2, Oracle, SQL)
• Maintain and enhance Fibrosure application (Java EE, WAS, Derby)
• Define, develop, validate, and release software products via agile processes for small and large projects
• Support applications and people via Kanban processes
• Collaborate with laboratory users to analyze problems, design and implement solutions for enterprise systems
• Provide support and troubleshooting of production systems according to an on-call schedule
• Document problem analysis, solution design, implementations, and system support guidelines
• Coach and train team members across lab system organizations to support and develop Java applications
• Communicate effectively and constructively with developers, QA, business analysts, and system users
• Design and depict via UML relational DB table models, object-oriented class models, messaging models, configuration models
• Understand, document, support, and improve inherited code and processes
• Help document knowledge and discovery with peer developers
Minimum Requirements
• Solid Java EE experience (Servlets, JMS, JSP, EJB, JCA, and JPA) development and support
• Solid InterSystems Caché/IRIS for Health development and support
• A minimum of 1 year of JPA/ORM (Hibernate), JUnit, XML/XSD, JSON experience or equivalent
• Solid SQL (and optionally PLSQL) experience
• Experience with Oracle DB including explain plan and or other query optimization techniques/tools
• Excellent verbal and written communication skills
• Strong UML modeling, ER and OO design, and data-normalization techniques
• Strong code-factoring philosophies and techniques
• Eclipse or NetBeans (or equivalent) IDE
• Strong understanding of client/server design, and smart recognition of separation-of-concern like functional behavior versus non-functional performance
Desired Requirements
• Java EE, Angular
• InterSystems Caché and/or IRIS for Health
• Spring Framework
• Modern deployment architectures using containers, API Gateways, load balancers, and AWS cloud based environments
• WebSphere or WebLogic, RHOS
• RESTful Web Services
• JMS interfacing, Apache Kafka, and IBM MQ
• Node.js/NPM, Bootstrap, or similar frameworks
• Git/BitBucket (git flow), Maven, Nexus, UCD, Jira (Kanban and SCRUM), agile workflow
• Unix shell script, DOS script
• SQL (optionally PLSQL)
• Design patterns
• HTML5, CSS3, and TypeScript development
• Ability to transform specific domain requirements into generalized technical requirements, and design and implement abstract solutions that are understandable and scalable in performance and reuse
• HL7 and/or Healthcare and/or Clinical Toxicology
• Oracle, MySQL, Derby DB
Key Responsibilities:
- Perform comprehensive Functional and Integration Testing across Oracle modules and connected systems.
- Conduct detailed End-to-End (E2E) Testing to ensure business processes function seamlessly across applications.
- Collaborate with cross-functional teams, including Business Analysts, Developers, and Automation teams, to validate business requirements and deliver high-quality releases.
- Identify, document, and track functional defects, ensuring timely closure and root cause analysis.
- Execute and validate SQL queries for backend data verification and cross-system data consistency checks.
- Participate in regression cycles and support continuous improvement initiatives through data-driven analysis.
Required Skills & Competencies:
- Strong knowledge of Functional Testing processes and methodologies.
- Good to have: Oracle Fusion knowledge
- Solid understanding of Integration Flows between Oracle and peripheral systems.
- Proven ability in E2E Testing, including scenario design, execution, and defect management.
- Excellent Analytical and Logical Reasoning skills with attention to detail.
- Hands-on experience with SQL for data validation and analysis.
- Effective communication, documentation, and coordination skills.
Preferred Qualifications:
- Exposure to automation-assisted functional testing and cross-platform data validation.
- Experience in identifying test optimization opportunities and improving testing efficiency.
CTC: 15 LPA to 21 LPA
Exp: 5 to 8 Years
Mandatory
- Strong Behavioral Data Analyst Profiles
- Mandatory (Experience 1): Minimum 4+ years of experience in user analytics or behavioural data analysis, focusing on user app and web journeys
- Mandatory (Experience 2): Experience in analyzing clickstream and behavioral data using tools such as Google Analytics, Mixpanel, CleverTap, or Firebase
- Mandatory (Skills 1): Hands-on experience in A/B testing, including hypothesis design, experimentation, and result interpretation.
- Mandatory (Skills 2): Strong analytical ability to identify behavioral patterns, anomalies, funnel drop-offs, and engagement trends from large datasets.
- Mandatory (Skills 3): Hands-on proficiency in SQL, Excel, and data visualization tools such as Tableau or Power BI for dashboard creation and data storytelling.
- Mandatory (Skills 4): Basic understanding of UX principles and customer journey mapping, collaborating effectively with UX/CX teams
- Mandatory (Company): B2C product Companies (fintech, or e-commerce organizations with large user behavior dataset is a plus)
- Mandatory (Note): Not looking for pure data analysts, but for business/product/user analysts
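To make the engagement/retention side of the criteria above concrete, a small pandas sketch of week-over-week cohort retention; the event data is made up.

```python
# Share of the week-0 user cohort still active in each subsequent week.
import pandas as pd

events = pd.DataFrame({
    "user_id": [1, 2, 3, 1, 2, 1],
    "week": [0, 0, 0, 1, 1, 2],
})

cohort = set(events.loc[events["week"] == 0, "user_id"])
retention = events.groupby("week")["user_id"].apply(
    lambda users: len(set(users) & cohort) / len(cohort)
)
print(retention)   # week 0 -> 1.00, week 1 -> 0.67, week 2 -> 0.33
```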
Ideal Candidate:
- Bachelor’s or Master’s degree in a relevant field (e.g., UX Design, Human-Computer Interaction, Computer Science, Marketing).
- 5+ years of experience in CX/UX roles, preferably in a B2C environment.
- Proficiency in analytics tools (Google Analytics, CleverTap, Medallia, Hotjar, etc).
- Strong understanding of wireframing and prototyping tools (Figma, XMind, etc).
- Excellent communication and collaboration skills.
- Proven experience in managing cross-functional teams and projects.
- Strong background in data analytics and data-driven decision-making.
- Expert understanding of user experience and user-centered design approaches
- Detail-orientation with experience and will to continuously learn, adapt and evolve
- Creating and measuring the success and impact of your CX designs
- Knowledge of testing tools like Maze, UsabilityHub, UserZoom would be a plus
- Experienced in designing responsive websites as well as mobile apps
- Understanding of iOS and Android design guidelines
- Passion for great customer-focused design, purposeful aesthetic sense and generating simple solutions to complex problems.
- Excellent communication skills to be able to present their work and ideas to the leadership team.
If interested, kindly share your updated resume on 82008 31681
🚀 Hiring: PL/SQL Developer
⭐ Experience: 5+ Years
📍 Location: Pune
⭐ Work Mode:- Hybrid
⏱️ Notice Period: Immediate Joiners
(Only immediate joiners & candidates serving notice period)
What We’re Looking For:
☑️ Hands-on PL/SQL developer with strong database and scripting skills, ready to work onsite and collaborate with cross-functional financial domain teams.
Key Skills:
✅ Must Have: PL/SQL, SQL, Databases, Unix/Linux & Shell Scripting
✅ Nice to Have: DevOps tools (Jenkins, Artifactory, Docker, Kubernetes), AWS/Cloud, basic Python, AML/Fraud/Financial domain, Actimize (AIS/RCM/UDM)
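A minimal, illustrative PL/SQL block executed through python-oracledb; the credentials, DSN, table, and business rule are placeholders, not details from this posting.

```python
# Execute an anonymous PL/SQL block from Python.
import oracledb

# Placeholder connection details.
conn = oracledb.connect(user="app", password="secret", dsn="dbhost/orclpdb1")
with conn.cursor() as cur:
    cur.execute("""
        BEGIN
            UPDATE accounts
               SET status = 'CLOSED'
             WHERE last_activity < ADD_MONTHS(SYSDATE, -24);
            COMMIT;
        END;
    """)
```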
Role: Lead Java Developer
Work Location: Chennai, Pune
No of years’ experience: 8+ years
Hybrid (3 days office and 2 days home)
Type: Fulltime
Skill Set: Java + Spring Boot + SQL + Microservices + DevOps
Job Responsibilities:
Design, develop, and maintain high-quality software applications using Java and Spring Boot.
Develop and maintain RESTful APIs to support various business requirements.
Write and execute unit tests using TestNG to ensure code quality and reliability.
Work with NoSQL databases to design and implement data storage solutions.
Collaborate with cross-functional teams in an Agile environment to deliver high-quality software solutions.
Utilize Git for version control and collaborate with team members on code reviews and merge requests.
Troubleshoot and resolve software defects and issues in a timely manner.
Continuously improve software development processes and practices.
Description:
8+ years of professional experience in backend development using Java, including leading a team.
Strong expertise in Spring Boot, Apache Camel, Hibernate, JPA, and REST API design
Hands-on experience with PostgreSQL, MySQL, or other SQL-based databases
Working knowledge of AWS cloud services (EC2, S3, RDS, etc.)
Experience in DevOps activities.
Proficiency in using Docker for containerization and deployment.
Strong understanding of object-oriented programming, multithreading, and performance tuning
Self-driven and capable of working independently with minimal supervision
Key Responsibilities:
● Analyze and translate legacy MSSQL stored procedures into Snowflake Scripting (SQL) or JavaScript-based stored procedures (see the sketch after this list).
● Rebuild and optimize data pipelines and transformation logic in Snowflake.
● Implement performance-tuning techniques such as query pruning, clustering keys, appropriate warehouse sizing, and materialized views.
● Monitor query performance using the Snowflake Query Profile and resolve bottlenecks.
● Ensure procedures are idempotent, efficient, and scalable for high-volume workloads.
● Collaborate with architects and data teams to ensure accurate and performant data migration.
● Write test cases to validate functional correctness and performance.
● Document changes and follow version control best practices (e.g., Git, CI/CD).
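As a sketch of the T-SQL-to-Snowflake translation described above, a Snowflake Scripting procedure created through the Snowflake Python connector; the account details, table names, and merge logic are placeholders.

```python
# Create a Snowflake Scripting stored procedure from Python.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",   # placeholders
    warehouse="ETL_WH", database="ANALYTICS", schema="PUBLIC",
)

conn.cursor().execute("""
CREATE OR REPLACE PROCEDURE refresh_daily_sales()
RETURNS VARCHAR
LANGUAGE SQL
AS
$$
BEGIN
    MERGE INTO daily_sales AS t
    USING staging_sales AS s
    ON t.sale_date = s.sale_date
    WHEN MATCHED THEN UPDATE SET t.amount = s.amount
    WHEN NOT MATCHED THEN INSERT (sale_date, amount) VALUES (s.sale_date, s.amount);
    RETURN 'ok';
END;
$$
""")
```

Cursor- and temp-table-heavy T-SQL patterns typically become set-based MERGE/INSERT statements like this, which also makes the procedure easier to keep idempotent.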
Required Skills:
● 4+ years of SQL development experience, including strong T-SQL proficiency.
● 2+ years of hands-on experience with Snowflake, including stored procedure development.
● Deep knowledge of query optimization and performance tuning in Snowflake.
● Familiarity with Snowflake internals: automatic clustering, micro-partitioning, result caching, and warehouse scaling.
● Solid understanding of ETL/ELT processes, preferably with tools like DBT, Informatica, or Airflow.
● Experience with CI/CD pipelines and Git-based version control
Note: One face-to-face (F2F) round is mandatory, and as per the process, you will need to visit the office for this.
We are looking for an experienced Java Support Engineer with 4+ years of hands-on experience in supporting and maintaining Java/Spring Boot-based applications. The ideal candidate will be responsible for production support, debugging issues, and ensuring smooth application performance.
Key Responsibilities:
- Provide L2/L3 support for Java/Spring Boot applications in production and non-production environments.
- Perform incident analysis, root cause identification, and apply quick fixes or permanent solutions.
- Handle application deployments, environment monitoring, and performance tuning.
- Collaborate with development, DevOps, and database teams to resolve technical issues.
- Write and debug SQL queries, manage data fixes, and ensure database integrity.
- Use monitoring tools like Splunk, Kibana, or ELK Stack for issue investigation.
- Prepare documentation for recurring issues and maintain knowledge base.
Technical Skills:
- Strong in Core Java, Spring Boot, RESTful APIs
- Good knowledge of SQL / PL-SQL (Oracle / MySQL / PostgreSQL)
- Familiar with Linux/Unix commands and Shell scripting
- Exposure to microservices architecture and CI/CD tools (Jenkins, Maven)
- Hands-on experience with application monitoring and log analysis tools
- Knowledge of cloud (AWS / Azure) environments is a plus
Soft Skills:
- Strong problem-solving and analytical mindset
- Good communication and teamwork skills
- Ability to work under pressure and handle on-call support if required