
50+ SQL Jobs in India

Apply to 50+ SQL Jobs on CutShort.io. Find your next job, effortlessly. Browse SQL Jobs and apply today!

Bengaluru (Bangalore)
4 - 8 yrs
₹15L - ₹28L / yr
Business Analysis
Data integration
SQL
PMS
CRS
+2 more

Job Description: Business Analyst – Data Integrations

Location: Bangalore / Hybrid / Remote

Company: LodgIQ

Industry: Hospitality / SaaS / Machine Learning

About LodgIQ

Headquartered in New York, LodgIQ delivers a revolutionary B2B SaaS platform to the travel industry. By leveraging machine learning and artificial intelligence, we enable precise forecasting and optimized pricing for hotel revenue management. Backed by Highgate Ventures and Trilantic Capital Partners, LodgIQ is a well-funded, high-growth startup with a global presence.

About the Role

We’re looking for a skilled Business Analyst – Data Integrations who can bridge the gap between business operations and technology teams, ensuring smooth, efficient, and scalable integrations. If you’re passionate about hospitality tech and enjoy solving complex data challenges, we’d love to hear from you!

What You’ll Do

Key Responsibilities

  • Collaborate with vendors to gather requirements for API development and ensure technical feasibility.
  • Collect API documentation from vendors; document and explain business logic to use external data sources effectively.
  • Access vendor applications to create and validate sample data; ensure the accuracy and relevance of test datasets.
  • Translate complex business logic into documentation for developers, ensuring clarity for successful integration.
  • Monitor all integration activities and support tickets in Jira, proactively resolving critical issues.
  • Lead QA testing for integrations, overseeing pilot onboarding and ensuring solution viability before broader rollout.
  • Document onboarding processes and best practices to streamline future integrations and improve efficiency.
  • Build, train, and deploy machine learning models for forecasting, pricing, and optimization, supporting strategic goals.
  • Drive end-to-end execution of data integration projects, including scoping, planning, delivery, and stakeholder communication.
  • Gather and translate business requirements into actionable technical specifications, liaising with business and technical teams.
  • Oversee maintenance and enhancement of existing integrations, performing RCA and resolving integration-related issues.
  • Document workflows, processes, and best practices for current and future integration projects.
  • Continuously monitor system performance and scalability, recommending improvements to increase efficiency.
  • Coordinate closely with Operations for onboarding and support, ensuring seamless handover and issue resolution.
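The sample-data validation responsibility above can be sketched in a few lines of Python. This is a minimal illustration only; the vendor payload fields (`hotel_id`, `date`, `occupancy`) and the validation rules are hypothetical, not part of any actual LodgIQ integration:

```python
# Minimal sketch of validating vendor sample data before integration.
# Field names and rules are hypothetical, for illustration only.
REQUIRED_FIELDS = {"hotel_id": str, "date": str, "occupancy": float}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems found in one vendor record (empty = valid)."""
    problems = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            problems.append(f"bad type for {field}: {type(record[field]).__name__}")
    # Domain check: occupancy is a fraction, so it must lie in [0, 1].
    if isinstance(record.get("occupancy"), float) and not 0.0 <= record["occupancy"] <= 1.0:
        problems.append("occupancy out of range [0, 1]")
    return problems

sample = [
    {"hotel_id": "H1", "date": "2025-01-01", "occupancy": 0.82},
    {"hotel_id": "H2", "date": "2025-01-01", "occupancy": 1.7},  # out of range
    {"hotel_id": "H3", "occupancy": 0.5},                        # missing date
]
report = {r.get("hotel_id"): validate_record(r) for r in sample}
```

In practice such checks would run against data pulled from the vendor application, with the report fed back into the integration tickets in Jira.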

Desired Skills & Qualifications

  • Strong experience in API integration, data analysis, and documentation.
  • Familiarity with Jira for ticket management and project workflow.
  • Hands-on experience with machine learning model development and deployment.
  • Excellent communication skills for requirement gathering and stakeholder engagement.
  • Experience with QA test processes and pilot rollouts.
  • Proficiency in project management, data workflow documentation, and system monitoring.
  • Ability to manage multiple integrations simultaneously and work cross-functionally.

Required Qualifications

  • Experience: Minimum 4 years in hotel technology or business analytics, preferably handling data integration or system interoperability projects.
  • Technical Skills:
  • Basic proficiency in SQL or database querying.
  • Familiarity with data integration concepts such as APIs or ETL workflows (preferred but not mandatory).
  • Eagerness to learn and adapt to new tools, platforms, and technologies.
  • Hotel Technology Expertise: Understanding of systems such as PMS, CRS, Channel Managers, or RMS.
  • Project Management: Strong organizational and multitasking abilities.
  • Problem Solving: Analytical thinker capable of troubleshooting and driving resolution.
  • Communication: Excellent written and verbal skills to bridge technical and non-technical discussions.
  • Attention to Detail: Methodical approach to documentation, testing, and deployment.

Preferred Qualifications

  • Exposure to debugging tools and troubleshooting methodologies.
  • Familiarity with cloud environments (AWS).
  • Understanding of data security and privacy considerations in the hospitality industry.

Why LodgIQ?

  • Join a fast-growing, mission-driven company transforming the future of hospitality.
  • Work on intellectually challenging problems at the intersection of machine learning, decision science, and human behavior.
  • Be part of a high-impact, collaborative team with the autonomy to drive initiatives from ideation to production.
  • Competitive salary and performance bonuses.
  • For more information, visit https://www.lodgiq.com

Global digital transformation solutions provider.


Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
7 - 9 yrs
₹15L - ₹28L / yr
Databricks
Python
SQL
PySpark
Amazon Web Services (AWS)
+9 more

Role Proficiency:

This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept with ETL tools such as Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.
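The ingest–wrangle–join pattern described above can be sketched in plain Python; the datasets here are invented for illustration, and a production pipeline would of course use PySpark, Glue, or Databricks as the role describes:

```python
# Minimal illustration of ingesting, wrangling, and joining two sources.
# Data is invented for illustration; real pipelines would run on
# PySpark / Glue / Databricks as described in the role.
orders = [
    {"order_id": 1, "cust_id": "A", "amount": "120.50"},
    {"order_id": 2, "cust_id": "B", "amount": "80.00"},
    {"order_id": 3, "cust_id": "A", "amount": "15.25"},
]
customers = [{"cust_id": "A", "region": "APAC"}, {"cust_id": "B", "region": "EMEA"}]

# Wrangle: cast string amounts from the raw feed to floats.
for o in orders:
    o["amount"] = float(o["amount"])

# Join: attach each customer's region, then aggregate revenue per region.
region_of = {c["cust_id"]: c["region"] for c in customers}
revenue: dict[str, float] = {}
for o in orders:
    region = region_of[o["cust_id"]]
    revenue[region] = revenue.get(region, 0.0) + o["amount"]
```

The same three steps (ingest, wrangle, join/aggregate) map one-to-one onto a PySpark job reading from source tables and writing to a warehouse.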


Skill Examples:

  1. Proficiency in SQL, Python, or other programming languages used for data manipulation.
  2. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
  3. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery).
  4. Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
  5. Experience in performance tuning.
  6. Experience in data warehouse design and cost improvements.
  7. Apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
  8. Communicate and explain design/development aspects to customers.
  9. Estimate time and resource requirements for developing/debugging features/components.
  10. Participate in RFP responses and solutioning.
  11. Mentor team members and guide them in relevant upskilling and certification.

 

Knowledge Examples:

  1. Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, Azure ADF, and ADLF.
  2. Proficient in SQL for analytics and windowing functions.
  3. Understanding of data schemas and models.
  4. Familiarity with domain-related data.
  5. Knowledge of data warehouse optimization techniques.
  6. Understanding of data security concepts.
  7. Awareness of patterns, frameworks, and automation practices.


 

Additional Comments:

# of Resources: 22 | Role(s): Technical Role | Location(s): India | Planned Start Date: 1/1/2026 | Planned End Date: 6/30/2026

Project Overview:

Role Scope / Deliverables: We are seeking highly skilled Data Engineers with strong experience in Databricks, PySpark, Python, SQL, and AWS to join our data engineering team on or before the first week of December 2025.

The candidate will be responsible for designing, developing, and optimizing large-scale data pipelines and analytics solutions that drive business insights and operational efficiency.

  • Design, build, and maintain scalable data pipelines using Databricks and PySpark.
  • Develop and optimize complex SQL queries for data extraction, transformation, and analysis.
  • Implement data integration solutions across multiple AWS services (S3, Glue, Lambda, Redshift, EMR, etc.).
  • Collaborate with analytics, data science, and business teams to deliver clean, reliable, and timely datasets.
  • Ensure data quality, performance, and reliability across data workflows.
  • Participate in code reviews, data architecture discussions, and performance optimization initiatives.
  • Support migration and modernization efforts for legacy data systems to modern cloud-based solutions.


Key Skills:

  • Hands-on experience with Databricks, PySpark & Python for building ETL/ELT pipelines.
  • Proficiency in SQL (performance tuning, complex joins, CTEs, window functions).
  • Strong understanding of AWS services (S3, Glue, Lambda, Redshift, CloudWatch, etc.).
  • Experience with data modeling, schema design, and performance optimization.
  • Familiarity with CI/CD pipelines, version control (Git), and workflow orchestration (Airflow preferred).
  • Excellent problem-solving, communication, and collaboration skills.
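The SQL proficiencies listed above (CTEs, window functions) can be sketched with Python's built-in sqlite3 module (window functions need SQLite 3.25+). The table and values are invented for illustration; the same query shape applies on Redshift or Databricks SQL:

```python
import sqlite3

# Sketch of a CTE plus a window function, illustrating the SQL skills above.
# Table and data are invented for illustration.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE sales (region TEXT, amount INT);
    INSERT INTO sales VALUES
        ('north', 100), ('north', 300), ('south', 200), ('south', 50);
""")
rows = con.execute("""
    WITH ranked AS (                      -- CTE
        SELECT region, amount,
               RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
        FROM sales                        -- window function, ranked per region
    )
    SELECT region, amount FROM ranked WHERE rnk = 1 ORDER BY region
""").fetchall()
# rows holds the top sale per region
```

The CTE names an intermediate result, and `RANK() OVER (PARTITION BY ...)` computes a per-group ranking without collapsing rows, which is exactly what a plain GROUP BY cannot express.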

 

Skills: Databricks, PySpark & Python, SQL, AWS Services

 

Must-Haves

Python/PySpark (5+ years), SQL (5+ years), Databricks (3+ years), AWS Services (3+ years), ETL tools (Informatica, Glue, DataProc) (3+ years)

Hands-on experience with Databricks, PySpark & Python for ETL/ELT pipelines.

Proficiency in SQL (performance tuning, complex joins, CTEs, window functions).

Strong understanding of AWS services (S3, Glue, Lambda, Redshift, CloudWatch, etc.).

Experience with data modeling, schema design, and performance optimization.

Familiarity with CI/CD pipelines, Git, and workflow orchestration (Airflow preferred).


******

Notice period - Immediate to 15 days

Location: Bangalore

Mantle Solutions - A Lulu Group Company
Posted by Nikita Sinha
Bangalore (Whitefield)
2 - 4 yrs
Upto ₹20L / yr (Varies)
Python
SQL
Machine Learning (ML)
Data Analytics

We are seeking a hands-on eCommerce Analytics & Insights Lead to help establish and scale our newly launched eCommerce business. The ideal candidate is highly data-savvy, understands eCommerce deeply, and can lead KPI definition, performance tracking, insights generation, and data-driven decision-making.

You will work closely with cross-functional teams—Buying, Marketing, Operations, and Technology—to build dashboards, uncover growth opportunities, and guide the evolution of our online channel.


Key Responsibilities

Define & Monitor eCommerce KPIs

  • Set up and track KPIs across the customer journey: traffic, conversion, retention, AOV/basket size, repeat rate, etc.
  • Build KPI frameworks aligned with business goals.

Data Tracking & Infrastructure

  • Partner with marketing, merchandising, operations, and tech teams to define data tracking requirements.
  • Collaborate with eCommerce and data engineering teams to ensure data quality, completeness, and availability.

Dashboards & Reporting

  • Build dashboards and automated reports to track:
  • Overall site performance
  • Category & product performance
  • Marketing ROI and acquisition effectiveness

Insights & Performance Diagnosis

Identify trends, opportunities, and root causes of underperformance in areas such as:

  • Product availability & stock health
  • Pricing & promotions
  • Checkout funnel drop-offs
  • Customer retention & cohort behavior
  • Channel acquisition performance

Conduct:

  • Cohort analysis
  • Funnel analytics
  • Customer segmentation
  • Basket analysis
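The funnel analytics listed above can be sketched in a few lines of Python. The event data and step names are invented for illustration; in practice the events would come from GA4 or the site's clickstream:

```python
# Minimal funnel-analysis sketch: how many users survive each step.
# Event data and step names are invented for illustration.
events = [
    ("u1", "visit"), ("u1", "add_to_cart"), ("u1", "checkout"),
    ("u2", "visit"), ("u2", "add_to_cart"),
    ("u3", "visit"),
]
steps = ["visit", "add_to_cart", "checkout"]

# Users observed at each step.
users_at = {s: {u for u, e in events if e == s} for s in steps}

# A user counts at a step only if they also passed every earlier step.
funnel = []
survivors = users_at[steps[0]]
for step in steps:
    survivors = survivors & users_at[step]
    funnel.append((step, len(survivors)))
```

Reading the resulting counts step by step shows exactly where the drop-off happens (here, checkout loses half the cart adders); cohort analysis applies the same idea with users grouped by first-purchase period instead of funnel step.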

Data-Driven Growth Initiatives

  • Propose and evaluate experiments, optimization ideas, and quick wins.
  • Help business teams interpret KPIs and take informed decisions.

Required Skills & Experience

  • 2–5 years of experience in eCommerce analytics (grocery retail experience preferred).
  • Strong understanding of eCommerce metrics and analytics frameworks (Traffic → Conversion → Repeat → LTV).
  • Proficiency with tools such as:
  • Google Analytics / GA4
  • Excel
  • SQL
  • Power BI or Tableau
  • Experience working with:
  • Digital marketing data
  • CRM and customer data
  • Product/category performance data
  • Ability to convert business questions into analytical tasks and produce clear, actionable insights.
  • Familiarity with:
  • Customer journey mapping
  • Funnel analysis
  • Basket and behavioral analysis
  • Comfortable working in fast-paced, ambiguous, and build-from-scratch environments.
  • Strong communication and stakeholder management skills.
  • Strong technical capability in at least one programming language: SQL or PySpark.

Good to Have

  • Experience with eCommerce platforms (Shopify, Magento, Salesforce Commerce, etc.).
  • Exposure to A/B testing, recommendation engines, or personalization analytics.
  • Knowledge of Python/R for deeper analytics (optional).
  • Experience with tracking setup (GTM, event tagging, pixel/event instrumentation).


Loyalytics
Posted by Nikita Sinha
Bengaluru (Bangalore)
4 - 7 yrs
Upto ₹22L / yr (Varies)
SQL
Power BI
Data Analytics
Customer Relationship Management (CRM)

In this role, you will drive and support customer analytics for HP’s online store business across the APJ region. You will lead campaign performance analytics, customer database intelligence, and enable data-driven targeting for automation and trigger programs. Your insights will directly shape customer engagement, marketing strategy, and business decision-making.


You will be part of the International Customer Management team, which focuses on customer strategy, base value, monetization, and brand consideration. As part of HP’s Digital Direct organization, you will support the company’s strategic transformation toward direct-to-customer excellence.


Join HP—a US$50B global technology leader known for innovation and being #1 in several business domains.


Key Responsibilities

Customer Insights & Analytics

  • Design and deploy customer success and engagement metrics across APJ.
  • Analyze customer behavior and engagement to drive data-backed marketing decisions.
  • Apply statistical techniques to translate raw data into meaningful insights.

Campaign Performance & Optimization

  • Elevate marketing campaigns across APJ by enabling advanced targeting criteria, performance monitoring, and test-and-learn frameworks.
  • Conduct campaign measurement, identifying trends, patterns, and optimization opportunities.

Data Management & Reporting

  • Develop a deep understanding of business data across markets.
  • Build and maintain SQL-based data assets: tables, stored procedures, scripts, queries, and SQL views.
  • Provide reporting and dashboards for marketing, sales, and CRM teams using Tableau or Power BI.
  • Measure and monitor strategic initiatives against KPIs and provide uplift forecasts for prioritization.
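The SQL data assets described above (tables, views, analytics-ready datasets) can be sketched with sqlite3; the table, columns, and metric are invented for illustration, and the same pattern applies on any warehouse behind Tableau or Power BI:

```python
import sqlite3

# Sketch of an analytics-ready SQL view over raw campaign data.
# Table and column names are invented for illustration.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE campaign_sends (campaign TEXT, sent INT, clicks INT);
    INSERT INTO campaign_sends VALUES ('spring', 1000, 50), ('summer', 400, 40);

    -- Analytics-ready view: click-through rate per campaign.
    CREATE VIEW campaign_ctr AS
    SELECT campaign, clicks * 1.0 / sent AS ctr
    FROM campaign_sends;
""")
rows = con.execute(
    "SELECT campaign, ctr FROM campaign_ctr ORDER BY ctr DESC"
).fetchall()
```

Publishing the metric as a view keeps the CTR definition in one place, so dashboards and ad-hoc queries cannot drift apart on how it is computed.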

Required Experience

  • 4+ years of relevant experience (flexible for strong profiles).
  • Proficiency in SQL, including:
  • Database design principles
  • Query optimization
  • Data integrity checks
  • Building SQL views, stored procedures, and analytics-ready datasets
  • Experience translating analytics into business outcomes.
  • Hands-on experience analyzing campaign performance.
  • Expertise with data visualization tools such as Tableau or Power BI.
  • Experience with campaign management/marketing automation platforms (preferably Salesforce Marketing Cloud).

About You

  • Strong advocate of customer data–driven marketing.
  • Comfortable working hands-on with data and solving complex problems.
  • Confident communicator who can work with multiple cross-functional stakeholders.
  • Passionate about experimentation (test & learn) and continuous improvement.
  • Self-driven, accountable, and motivated by ownership.
  • Thrive in a diverse, international, dynamic environment.


Global digital transformation solutions provider.


Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Chennai, Kochi (Cochin), Trivandrum, Hyderabad, Thiruvananthapuram
8 - 10 yrs
₹10L - ₹25L / yr
Business Analysis
Data Visualization
PowerBI
SQL
Tableau
+18 more

Job Description – Senior Technical Business Analyst

Location: Trivandrum (Preferred) | Open to any location in India

Shift Timings: an 8-hour window between 7:30 PM IST and 4:30 AM IST

 

About the Role

We are seeking highly motivated and analytically strong Senior Technical Business Analysts who can work seamlessly with business and technology stakeholders to convert a one-line problem statement into a well-defined project or opportunity. This role is ideal for professionals with a strong foundation in data analytics, data engineering, data visualization, and data science, along with a strong drive to learn, collaborate, and grow in a dynamic, fast-paced environment.

As a Technical Business Analyst, you will be responsible for translating complex business challenges into actionable user stories, analytical models, and executable tasks in Jira. You will work across the entire data lifecycle—from understanding business context to delivering insights, solutions, and measurable outcomes.

 

Key Responsibilities

Business & Analytical Responsibilities

  • Partner with business teams to understand one-line problem statements and translate them into detailed business requirements, opportunities, and project scope.
  • Conduct exploratory data analysis (EDA) to uncover trends, patterns, and business insights.
  • Create documentation including Business Requirement Documents (BRDs), user stories, process flows, and analytical models.
  • Break down business needs into concise, actionable, and development-ready user stories in Jira.

Data & Technical Responsibilities

  • Collaborate with data engineering teams to design, review, and validate data pipelines, data models, and ETL/ELT workflows.
  • Build dashboards, reports, and data visualizations using leading BI tools to communicate insights effectively.
  • Apply foundational data science concepts such as statistical analysis, predictive modeling, and machine learning fundamentals.
  • Validate and ensure data quality, consistency, and accuracy across datasets and systems.

Collaboration & Execution

  • Work closely with product, engineering, BI, and operations teams to support the end-to-end delivery of analytical solutions.
  • Assist in development, testing, and rollout of data-driven solutions.
  • Present findings, insights, and recommendations clearly and confidently to both technical and non-technical stakeholders.

 

Required Skillsets

Core Technical Skills

  • 6+ years of Technical Business Analyst experience within an overall professional experience of 8+ years
  • Data Analytics: SQL, descriptive analytics, business problem framing.
  • Data Engineering (Foundational): Understanding of data warehousing, ETL/ELT processes, cloud data platforms (AWS/GCP/Azure preferred).
  • Data Visualization: Experience with Power BI, Tableau, or equivalent tools.
  • Data Science (Basic/Intermediate): Python/R, statistical methods, fundamentals of ML algorithms.

 

Soft Skills

  • Strong analytical thinking and structured problem-solving capability.
  • Ability to convert business problems into clear technical requirements.
  • Excellent communication, documentation, and presentation skills.
  • High curiosity, adaptability, and eagerness to learn new tools and techniques.

 

Educational Qualifications

  • BE/B.Tech or equivalent in:
  • Computer Science / IT
  • Data Science

 

What We Look For

  • Demonstrated passion for data and analytics through projects and certifications.
  • Strong commitment to continuous learning and innovation.
  • Ability to work both independently and in collaborative team environments.
  • Passion for solving business problems using data-driven approaches.
  • Proven ability (or aptitude) to convert a one-line business problem into a structured project or opportunity.

 

Why Join Us?

  • Exposure to modern data platforms, analytics tools, and AI technologies.
  • A culture that promotes innovation, ownership, and continuous learning.
  • Supportive environment to build a strong career in data and analytics.

 

Skills: Data Analytics, Business Analysis, SQL


Must-Haves

Technical Business Analyst (6+ years), SQL, Data Visualization (Power BI, Tableau), Data Engineering (ETL/ELT, cloud platforms), Python/R

 

******

Notice period - 0 to 15 days (Max 30 Days)

Educational Qualifications: BE/B.Tech or equivalent in Computer Science / IT or Data Science

Location: Trivandrum (Preferred) | Open to any location in India

Shift Timings: an 8-hour window between 7:30 PM IST and 4:30 AM IST

Quanteon Solutions
Posted by DurgaPrasad Sannamuri
Hyderabad
0 - 2 yrs
₹3L - ₹5L / yr
Java
Python
JavaScript
Selenium
Playwright
+13 more

About the Role

We are looking for a motivated QA Engineer with 0–2 years of experience to join our team at Quanteon. The role is evolving from traditional manual testing to modern, AI-driven and automation-focused QA practices. The ideal candidate should be open to learning development concepts, working with new AI tools, and contributing to intelligent test automation.

Key Responsibilities

  • Perform functional, regression, and integration testing.
  • Develop and execute test cases, test plans, and test scripts.
  • Work closely with developers to understand requirements and identify defects.
  • Learn and implement automation using AI-based test automation tools.
  • Assist in building automated test suites using scripting or programming fundamentals.
  • Analyze test results, document defects, and track issues to closure.
  • Contribute to improving QA processes and adopting modern QA methodologies.

Required Skills

  • Strong understanding of software testing concepts (STLC, SDLC, test design techniques).
  • Basic knowledge of programming concepts (Java, Python, or JavaScript preferred).
  • Understanding of API testing (Postman, Swagger is a plus).
  • Familiarity with automation concepts (Selenium, Playwright, or similar – optional but preferred).
  • Interest/experience in AI-powered testing tools (e.g., TestGPT, Mabl, Katalon AI, etc.).
  • Good analytical and problem-solving skills.
  • Strong communication and documentation abilities.

Preferred Skills (Good to Have)

  • Knowledge of version control (Git/GitHub).
  • Basic understanding of databases and SQL queries.
  • Exposure to CI/CD pipelines is an added advantage.
  • Experience with any bug tracking tools (JIRA, Azure DevOps, etc.).

Who Should Apply?

  • Freshers with strong testing fundamentals and willingness to learn automation & AI testing.
  • QA professionals with up to 2 years of experience looking to enhance their skills in AI-driven QA.
  • Candidates eager to grow into full-stack QA roles (Manual + Automation + AI tools).

Educational Qualifications

  • B.Tech / B.E in IT, CSE, AI/ML, ECE
  • M.Tech / M.E in IT, CSE, AI/ML, ECE
  • Strong academic foundation in programming, software engineering, or testing concepts is preferred
  • Certifications in Software Testing, Automation, or AI tools (optional but an added advantage)


Phi Commerce
Posted by Nikita Sinha
Pune
3 - 8 yrs
Upto ₹18L / yr (Varies)
Java
Spring Boot
Microservices
SQL
Angular (2+)

We are seeking skilled and experienced Java Full Stack Developers to join our engineering team. The ideal candidate will have strong backend expertise in Java, Spring Boot, Microservices and hands-on frontend experience with Angular (version 11 or higher). This role requires the ability to build scalable, high-performance applications while working closely across teams such as Product, QA, and Architecture.


Responsibilities

  • Develop, test, and deploy scalable and robust backend services using Java, Spring Boot, and Microservices.
  • Build responsive, user-friendly web applications using Angular (v11+).
  • Collaborate with architects and team members to design scalable, maintainable, and efficient systems.
  • Contribute to system architecture discussions for microservices, APIs, and integrations.
  • Implement and maintain RESTful APIs for seamless frontend-backend interaction.
  • Optimize application performance and perform debugging across the full stack.
  • Write clean, reusable, and maintainable code following engineering best practices.
  • Work cross-functionally with UI/UX, Product Management, QA, and DevOps teams.
  • Mentor junior engineers (for senior positions).

Mandatory Skills

Backend:

  • Java / Java 8
  • Spring Boot
  • Spring Framework
  • Microservices
  • REST API development
  • SQL (MySQL or similar relational database)

Frontend:

  • Angular 11 or higher (mandatory)
  • TypeScript, JavaScript
  • HTML, CSS
  • Note: React or other frameworks are not accepted

Other Mandatory Skills:

  • Strong experience working in Linux-based systems
  • Ability to troubleshoot issues across the full stack
  • Understanding of scalable architecture principles

Preferred Skills

  • Experience in Fintech / Payments / Banking domain
  • Knowledge of caching, performance optimization, and security best practices
  • Exposure to Kafka or messaging systems
  • Hands-on experience with CI/CD pipelines (good to have)

Candidate Profile

  • Strong communication and problem-solving skills
  • Ability to work in a fast-paced environment
  • Collaborative mindset with ownership mentality
  • Open to working from office (Pune, 5 days a week)
  • Willing to travel for the final in-person interview (if not based in Pune)


Phi Commerce
Posted by Nikita Sinha
Pune
3 - 8 yrs
Upto ₹18L / yr (Varies)
Java
Spring Boot
Microservices
SQL
Linux/Unix

We are seeking an experienced and highly skilled Backend Java Engineer to join our team.

The ideal candidate will have strong expertise in Core Java, Spring Boot, Microservices, and building high-performance, scalable backend applications.


Responsibilities

  • Develop, test, and deploy scalable and robust backend services using Java, Spring Boot, and Spring Framework.
  • Design and implement RESTful APIs for seamless integrations.
  • Contribute to architectural decisions involving microservices, APIs, and cloud-based solutions.
  • Write clean, efficient, and reusable code following coding standards and best practices.
  • Optimize application performance and participate in debugging and troubleshooting sessions.
  • Collaborate with architects, product managers, and QA engineers to deliver high-quality releases.
  • Conduct peer code reviews and ensure adherence to engineering best practices.
  • Mentor junior engineers and support their technical growth where required.

Skills & Requirements

  • Minimum 2 years of hands-on backend development experience.
  • Strong proficiency in:
  • Core Java / Java 8
  • Spring Boot, Spring Framework
  • Microservices architecture
  • REST APIs
  • Experience with:
  • Kafka (preferred)
  • MySQL or other relational databases
  • Batch processing, application performance tuning, caching strategies
  • Web security / application security
  • Solid understanding of software design principles and scalable system design.

Preferred

  • Male candidates preferred (client-mentioned requirement).
  • Experience working in fintech, payments, or high-scale production environments
Phi Commerce
Posted by Nikita Sinha
Pune
11 - 15 yrs
Upto ₹32L / yr (Varies)
Linux/Unix
SQL
Shell Scripting
Amazon Web Services (AWS)
CI/CD
+2 more

The Production Infrastructure Manager is responsible for overseeing and maintaining the infrastructure that powers our payment gateway systems in a high-availability production environment. This role requires deep technical expertise in cloud platforms, networking, and security, along with strong leadership capability to guide a team of infrastructure engineers. You will ensure the system’s reliability, performance, and compliance with regulatory standards while driving continuous improvement.


Key Responsibilities:

Infrastructure Management

  • Manage and optimize infrastructure for payment gateway systems to ensure high availability, reliability, and scalability.
  • Oversee daily operations of production environments, including AWS cloud services, load balancers, databases, and monitoring systems.
  • Implement and maintain infrastructure automation, provisioning, configuration management, and disaster recovery strategies.
  • Develop and maintain capacity planning, monitoring, and backup mechanisms to support peak transaction periods.
  • Oversee regular patching, updates, and version control to minimize vulnerabilities.

Team Leadership

  • Lead and mentor a team of infrastructure engineers and administrators.
  • Provide technical direction to ensure efficient and effective implementation of infrastructure solutions.

Cross-Functional Collaboration

  • Work closely with development, security, and product teams to ensure infrastructure aligns with business needs and regulatory requirements (PCI-DSS, GDPR).
  • Ensure infrastructure practices meet industry standards and security requirements (PCI-DSS, ISO 27001).

Monitoring & Incident Management

  • Monitor infrastructure performance using tools like Prometheus, Grafana, Datadog, etc.
  • Conduct incident response, root cause analysis, and post-mortems to prevent recurring issues.
  • Manage and execute on-call duties, ensuring timely resolution of infrastructure-related issues.

Documentation

  • Maintain comprehensive documentation, including architecture diagrams, processes, and disaster recovery plans.

Skills and Qualifications

Required

  • Bachelor’s degree in Computer Science, IT, or equivalent experience.
  • 8+ years of experience managing production infrastructure in high-availability, mission-critical environments (fintech or payment gateways preferred).
  • Expertise in AWS cloud environments.
  • Strong experience with Infrastructure as Code (IaC) tools such as Terraform or CloudFormation.
  • Deep understanding of:
      - Networking (load balancers, firewalls, VPNs, distributed systems)
      - Database systems (SQL/NoSQL), HA & DR strategies
      - Automation tools (Ansible, Chef, Puppet) and containerization/orchestration (Docker, Kubernetes)
      - Security best practices, encryption, vulnerability management, PCI-DSS compliance
  • Experience with monitoring tools (Prometheus, Grafana, Datadog).
  • Strong analytical and problem-solving skills.
  • Excellent communication and leadership capabilities.

Preferred

  • Experience in fintech/payment industry with regulatory exposure.
  • Ability to operate effectively under pressure and ensure service continuity.


Albert Invent
Posted by Nikita Sinha
Bengaluru (Bangalore)
4 - 6 yrs
Up to ₹30L / yr (Varies)
Python
AWS Lambda
Amazon Redshift
Snowflake schema
SQL

To design, build, and optimize scalable data infrastructure and pipelines that enable efficient data collection, transformation, and analysis across the organization. The Senior Data Engineer will play a key role in driving data architecture decisions, ensuring data quality and availability, and empowering analytics, product, and engineering teams with reliable, well-structured data to support business growth and strategic decision-making.


Responsibilities:

  • Develop and maintain SQL and NoSQL databases, ensuring high performance, scalability, and reliability.
  • Collaborate with the API team and Data Science team to build robust data pipelines and automations.
  • Work closely with stakeholders to understand database requirements and provide technical solutions.
  • Optimize database queries and perform performance tuning to enhance overall system efficiency.
  • Implement and maintain data security measures, including access controls and encryption.
  • Monitor database systems and troubleshoot issues proactively to ensure uninterrupted service.
  • Develop and enforce data quality standards and processes to maintain data integrity.
  • Create and maintain documentation for database architecture, processes, and procedures.
  • Stay updated with the latest database technologies and best practices to drive continuous improvement.
  • Apply expertise in SQL queries and stored procedures, optimizing and fine-tuning complex queries for performance and efficiency.
  • Use monitoring and visualization tools such as Grafana to track database performance and health.
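The query-optimization expertise the role asks for can be illustrated with a minimal sketch using Python's built-in sqlite3 module: the same lookup query goes from a full table scan to an index search once a covering index exists. The table and column names here are hypothetical, purely for illustration.

```python
import sqlite3

# Hypothetical example: inspect how an index changes the plan for a
# frequent lookup query. Table and column names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(1000)],
)

query = "SELECT SUM(amount) FROM orders WHERE customer_id = ?"

# Without an index, the planner falls back to a full table scan.
before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

# A covering index lets SQLite answer via an index search instead.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id, amount)")
after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

print(before[-1][-1])  # plan detail, e.g. a SCAN of orders
print(after[-1][-1])   # plan detail, e.g. a SEARCH using the covering index
```

The same technique, reading the engine's query plan before and after an index or rewrite, carries over to MySQL (`EXPLAIN`) and other engines, though the plan output format differs.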


Requirements:

  • 4+ years of experience in data engineering, with a focus on large-scale data systems.
  • Proven experience designing data models and access patterns across SQL and NoSQL ecosystems.
  • Hands-on experience with technologies like PostgreSQL, DynamoDB, S3, GraphQL, or vector databases.
  • Proficiency in SQL stored procedures, with extensive expertise in MySQL schema design, query optimization, and resolvers, along with hands-on experience building and maintaining data warehouses.
  • Strong programming skills in Python or JavaScript, with the ability to write efficient, maintainable code.
  • Familiarity with distributed systems, data partitioning, and consistency models.
  • Familiarity with observability stacks (Prometheus, Grafana, OpenTelemetry) and debugging production bottlenecks.
  • Deep understanding of cloud infrastructure (preferably AWS), including networking, IAM, and cost optimization.
  • Prior experience building multi-tenant systems with strict performance and isolation guarantees.
  • Excellent communication and collaboration skills to influence cross-functional technical decisions.

Global digital transformation solutions provider.

Agency job
via Peak Hire Solutions by Dhara Thakkar
Thiruvananthapuram, Chennai, Pune
4 - 7 yrs
₹10L - ₹20L / yr
C#
Test Automation (QA)
Manual testing
Playwright
SQL

Role Proficiency:

Performs tests in strict compliance with defined processes, independently guides other testers, and assists test leads.


Additional Comments:

Position Title: Automation + Manual Tester

Primary Skills: Playwright, xUnit, Allure Report, Page Object Model, .NET, C#, Database Queries

Secondary Skills: Git, JIRA, Manual Testing

Experience: 4 to 5 years

ESSENTIAL FUNCTIONS AND BASIC DUTIES

1. Leadership in Automation Strategy:
  • Assess the feasibility and scope of automation efforts to ensure they align with project timelines and requirements.
  • Identify opportunities for process improvements and automation within the software development life cycle (SDLC).

2. Automation Test Framework Development:
  • Design, develop, and implement reusable test automation frameworks for various testing phases (unit, integration, functional, performance, etc.).
  • Ensure the automation frameworks integrate well with CI/CD pipelines and other development tools.
  • Maintain and optimize test automation scripts and frameworks for continuous improvement.

3. Team Management:
  • Lead and mentor a team of automation engineers, ensuring they follow best practices, write efficient test scripts, and develop scalable automation solutions.
  • Conduct regular performance evaluations and provide constructive feedback.
  • Facilitate knowledge-sharing sessions within the team.

4. Collaboration with Cross-functional Teams:
  • Work closely with development, QA, and operations teams to ensure proper implementation of automated testing and automation practices.
  • Collaborate with business analysts, product owners, and project managers to understand business requirements and translate them into automated test cases.

5. Continuous Integration & Delivery (CI/CD):
  • Ensure that automated tests are integrated into the CI/CD pipelines to facilitate continuous testing.
  • Identify and resolve issues related to the automation processes within the CI/CD pipeline.

6. Test Planning and Estimation:
  • Contribute to the test planning phase by identifying key automation opportunities.
  • Estimate the effort and time required for automating test cases and other automation tasks.

7. Test Reporting and Metrics:
  • Monitor automation test results and generate detailed reports on test coverage, defects, and progress.
  • Analyze test results to identify trends, bottlenecks, or issues in the automation process and make necessary improvements.

8. Automation Tools Management:
  • Evaluate, select, and manage automation tools and technologies that best meet the needs of the project.
  • Ensure that the automation tools used align with the overall project requirements and help achieve optimal efficiency.

9. Test Environment and Data Management:
  • Set up and maintain the test environments needed for automation.
  • Ensure automation scripts work across multiple environments, including staging, testing, and production.

10. Risk Management & Issue Resolution:

• Proactively identify risks associated with the automation efforts and provide solutions or mitigation strategies.

• Troubleshoot issues in the automation scripts, framework, and infrastructure to ensure minimal downtime and quick issue resolution.

11. Develop and Maintain Automated Tests: Write and maintain automated scripts for different testing levels, including regression, functional, and integration tests.

12. Bug Identification and Tracking: Report, track, and manage defects identified through automation testing to ensure quick resolution.

13. Improve Test Coverage: Identify gaps in test coverage and develop additional test scripts to improve test comprehensiveness.

14. Automation Documentation: Create and maintain detailed documentation for test automation processes, scripts, and frameworks.

15. Quality Assurance: Ensure that all automated testing activities meet the quality standards, contributing to delivering a high-quality software product.

16. Stakeholder Communication: Regularly update project stakeholders about automation progress, risks, and areas for improvement.


REQUIRED KNOWLEDGE

1. Automation Tools Expertise: Proficiency in tools like Playwright, Allure reports and integration with CI/CD pipelines.

2. Programming Languages: Strong knowledge of languages such as .NET and test frameworks like xUnit.

3. Version Control: Experience using Git for script management and collaboration.

4. Test Automation Frameworks: Ability to design scalable, reusable frameworks for different types of tests (functional, integration, etc.).
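The framework-design skill above, and the Page Object Model named in the primary skills, can be sketched in a few lines. The posting's actual stack is .NET/C#/Playwright; this illustrative sketch uses Python with a stub driver purely to show the pattern, and all class, method, and selector names are hypothetical.

```python
# Minimal Page Object Model sketch. The real stack here would be
# C#/Playwright; a Python stub driver is used only to illustrate
# the pattern. All names and selectors are hypothetical.

class StubDriver:
    """Stands in for a real browser driver (a Playwright page, etc.)."""
    def __init__(self):
        self.fields = {}
        self.url = "/login"

    def fill(self, selector, value):
        self.fields[selector] = value

    def click(self, selector):
        # Pretend a successful login navigates to the dashboard.
        if selector == "#submit" and self.fields.get("#password") == "secret":
            self.url = "/dashboard"


class LoginPage:
    """Page object: encapsulates selectors and user actions for one page,
    so tests talk to intent ("login") rather than raw selectors."""
    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.fill("#user", user)
        self.driver.fill("#password", password)
        self.driver.click("#submit")
        return self.driver.url


page = LoginPage(StubDriver())
print(page.login("alice", "secret"))  # /dashboard
```

The payoff of the pattern is that when a selector changes, only the page object is edited; every test that calls `login` stays untouched.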

5. Leadership and Mentoring: Lead and mentor automation teams, ensuring adherence to best practices and continuous improvement.

6. Problem-Solving: Strong troubleshooting and analytical skills to identify and resolve automation issues quickly.

7. Collaboration and Communication: Excellent communication skills for working with cross-functional teams and presenting test results.

8. Time Management: Ability to estimate, prioritize, and manage automation tasks to meet project deadlines.

9. Quality Focus: Strong commitment to improving software quality, test coverage, and automation efficiency.


Skills: xUnit, Allure Report, Playwright, C#

CFRA
Posted by Bisman Gill
Remote only
3yrs+
Up to ₹15L / yr (Varies)
Amazon Web Services (AWS)
SQL
Selenium
Appium
Cypress

The Quality Engineer is responsible for planning, developing, and executing tests for CFRA's financial software. Responsibilities include designing and implementing tests, debugging, and defining corrective actions. The role plays an important part in our company's product development process. Our ideal candidate will conduct tests to ensure software runs efficiently and meets client needs while remaining cost-effective.

You will be part of the CFRA Data Collection Team, responsible for collecting, processing, and publishing financial market data for internal and external stakeholders. The team uses a contemporary stack in the AWS Cloud to design, build, and maintain a robust data architecture, data engineering pipelines, and large-scale data systems. You will be responsible for verifying and validating all data quality and completeness parameters for the automated (ETL) pipeline processes, both new and existing.

Key Responsibilities

  • Review requirements, specifications and technical design documents to provide timely and meaningful feedback
  • Create detailed, comprehensive and well-structured test plans and test cases
  • Estimate, prioritize, plan and coordinate testing activities
  • Identify, record, document thoroughly and track bugs
  • Develop and apply testing processes for new and existing products to meet client needs
  • Liaise with internal teams to identify system requirements and develop testing plans
  • Investigate the causes of non-conforming software and train users to implement solutions
  • Stay up-to-date with new testing tools and test strategies

Desired Skills

  • Proven work experience in software development and quality assurance
  • Strong knowledge of software QA methodologies, tools and processes
  • Experience in writing clear, concise and comprehensive test plans and test cases
  • Hands-on experience with automated testing tools
  • Acute attention to detail
  • Experience working in an Agile/Scrum development process
  • Excellent collaboration skills

Technical Skills

  • Proficient with SQL, and capable of developing queries for testing
  • Familiarity with Python, especially for scripting tests
  • Familiarity with Cloud Technology and working with remote servers
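The combination of SQL for testing and Python for scripting listed above often meets in pipeline completeness checks. A tiny sketch using the built-in sqlite3 module, with hypothetical `source` and `published` tables standing in for the real data-collection stores:

```python
import sqlite3

# Hypothetical completeness check of the kind a data-collection QA
# test might run: every source row should appear in the published
# table. Tables and data are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE source (id INTEGER PRIMARY KEY, price REAL);
CREATE TABLE published (id INTEGER PRIMARY KEY, price REAL);
INSERT INTO source VALUES (1, 9.5), (2, 8.0), (3, 7.25);
INSERT INTO published VALUES (1, 9.5), (3, 7.25);
""")

# An anti-join finds source rows with no published counterpart.
missing = conn.execute("""
SELECT s.id FROM source AS s
LEFT JOIN published AS p ON p.id = s.id
WHERE p.id IS NULL
""").fetchall()

# A QA assertion would flag the gap; here row 2 was never published.
print(missing)  # [(2,)]
```

In a real suite the same query would run against the production engine and the result would feed an assertion in the test framework rather than a print.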


CFRA
Posted by Bisman Gill
Remote only
4yrs+
Up to ₹23L / yr (Varies)
Amazon Web Services (AWS)
SQL
Python
NodeJS (Node.js)
Java

The Senior Software Developer is responsible for development of CFRA’s report generation framework using a modern technology stack: Python on AWS cloud infrastructure, SQL, and Web technologies. This is an opportunity to make an impact on both the team and the organization by being part of the design and development of a new customer-facing report generation framework that will serve as the foundation for all future report development at CFRA.

The ideal candidate has a passion for solving business problems with technology and can effectively communicate business and technical needs to stakeholders. We are looking for candidates that value collaboration with colleagues and having an immediate, tangible impact for a leading global independent financial insights and data company.


Key Responsibilities

  • Analyst Workflows: Design and development of CFRA’s integrated content publishing platform using a proprietary 3rd party editorial and publishing platform for integrated digital publishing.
  • Designing and Developing APIs: Design and development of robust, scalable, and secure APIs on AWS, considering factors like performance, reliability, and cost-efficiency.
  • AWS Service Integration: Integrate APIs with various AWS services such as AWS Lambda, Amazon API Gateway, Amazon SQS, Amazon SNS, AWS Glue, and others, to build comprehensive and efficient solutions.
  • Performance Optimization: Identify and implement optimizations to improve performance, scalability, and efficiency, leveraging AWS services and tools.
  • Security and Compliance: Ensure APIs are developed following best security practices, including authentication, authorization, encryption, and compliance with relevant standards and regulations.
  • Monitoring and Logging: Implement monitoring and logging solutions for APIs using AWS CloudWatch, AWS X-Ray, or similar tools, to ensure availability, performance, and reliability.
  • Continuous Integration and Deployment (CI/CD): Establish and maintain CI/CD pipelines for API development, automating testing, deployment, and monitoring processes on AWS.
  • Documentation and Training: Create and maintain comprehensive documentation for internal and external users, and provide training and support to developers and stakeholders.
  • Team Collaboration: Collaborate effectively with cross-functional teams, including product managers, designers, and other developers, to deliver high-quality solutions that meet business requirements.
  • Problem Solving: Drive troubleshooting efforts, identifying root causes and implementing solutions to ensure system stability and performance.
  • Stay Updated: Stay updated with the latest trends, tools, and technologies related to development on AWS, and continuously improve your skills and knowledge.

Desired Skills and Experience

  • Development: 5+ years of extensive experience designing, developing, and deploying software using modern technologies, with a focus on scalability, performance, and security.
  • AWS Services: Proficiency in using AWS services such as AWS Lambda, Amazon API Gateway, Amazon SQS, Amazon SNS, Amazon SES, Amazon RDS, and Amazon DynamoDB to build and deploy API solutions.
  • Programming Languages: Proficiency in programming languages commonly used for development, such as Python or Node.js, as well as experience with serverless frameworks on AWS.
  • Architecture Design: Ability to design scalable and resilient API architectures using microservices, serverless, or other modern architectural patterns, considering factors like performance, reliability, and cost-efficiency.
  • Security: Strong understanding of security principles and best practices, including authentication, authorization, encryption, and compliance with standards like OAuth, OpenID Connect, and AWS IAM.
  • DevOps Practices: Familiarity with DevOps practices and tools, including CI/CD pipelines, infrastructure as code (IaC), and automated testing, to ensure efficient and reliable deployment on AWS.
  • Problem-solving Skills: Excellent problem-solving skills, with the ability to troubleshoot complex issues, identify root causes, and implement effective solutions to ensure stability and performance.
  • Communication Skills: Strong communication skills, with the ability to effectively communicate technical concepts to both technical and non-technical stakeholders, and collaborate with cross-functional teams.
  • Agile Methodologies: Experience working in Agile development environments, following practices like Scrum or Kanban, and ability to adapt to changing requirements and priorities.
  • Continuous Learning: A commitment to continuous learning and staying updated with the latest trends, tools, and technologies related to development and AWS services.
  • Bachelor's Degree: A bachelor's degree in Computer Science, Software Engineering, or a related field is often preferred, although equivalent experience and certifications can also be valuable.




CFRA
Posted by Bisman Gill
Remote only
7yrs+
Up to ₹36L / yr (Varies)
Amazon Web Services (AWS)
SQL
Python
NodeJS (Node.js)
Java

The Lead Software Developer is responsible for development of CFRA’s report generation framework using a modern technology stack: Python on AWS cloud infrastructure, SQL, and Web technologies. This is an opportunity to make an impact on both the team and the organization by being part of the design and development of a new customer-facing report generation framework that will serve as the foundation for all future report development at CFRA.

The ideal candidate has a passion for solving business problems with technology and can effectively communicate business and technical needs to stakeholders. We are looking for candidates that value collaboration with colleagues and having an immediate, tangible impact for a leading global independent financial insights and data company.


Key Responsibilities

  • Analyst Workflows: Lead the design and development of CFRA’s integrated content publishing platform using a proprietary 3rd party editorial and publishing platform for integrated digital publishing.
  • Designing and Developing APIs: Lead the design and development of robust, scalable, and secure APIs on AWS, considering factors like performance, reliability, and cost-efficiency.
  • Architecture Planning: Collaborate with architects and stakeholders to define architecture, including API gateway, microservices, and serverless components, ensuring alignment with business goals and AWS best practices.
  • Technical Leadership: Provide technical guidance and leadership to the development team, ensuring adherence to coding standards, best practices, and AWS guidelines.
  • AWS Service Integration: Integrate APIs with various AWS services such as AWS Lambda, Amazon API Gateway, Amazon SQS, Amazon SNS, AWS Glue, and others, to build comprehensive and efficient solutions.
  • Performance Optimization: Identify and implement optimizations to improve performance, scalability, and efficiency, leveraging AWS services and tools.
  • Security and Compliance: Ensure APIs are developed following best security practices, including authentication, authorization, encryption, and compliance with relevant standards and regulations.
  • Monitoring and Logging: Implement monitoring and logging solutions for APIs using AWS CloudWatch, AWS X-Ray, or similar tools, to ensure availability, performance, and reliability.
  • Continuous Integration and Deployment (CI/CD): Establish and maintain CI/CD pipelines for API development, automating testing, deployment, and monitoring processes on AWS.
  • Documentation and Training: Create and maintain comprehensive documentation for internal and external users, and provide training and support to developers and stakeholders.
  • Team Collaboration: Collaborate effectively with cross-functional teams, including product managers, designers, and other developers, to deliver high-quality solutions that meet business requirements.
  • Problem Solving: Lead troubleshooting efforts, identifying root causes and implementing solutions to ensure system stability and performance.
  • Stay Updated: Stay updated with the latest trends, tools, and technologies related to development on AWS, and continuously improve your skills and knowledge.


Desired Skills and Experience

  • Development: 10+ years of extensive experience designing, developing, and deploying software using modern technologies, with a focus on scalability, performance, and security.
  • AWS Services: Strong proficiency in using AWS services such as AWS Lambda, Amazon API Gateway, Amazon SQS, Amazon SNS, Amazon SES, Amazon RDS, Amazon DynamoDB, and others, to build and deploy API solutions.
  • Programming Languages: Proficiency in programming languages commonly used for development, such as Python or Node.js, as well as experience with serverless frameworks on AWS.
  • Architecture Design: Ability to design scalable and resilient API architectures using microservices, serverless, or other modern architectural patterns, considering factors like performance, reliability, and cost-efficiency.
  • Security: Strong understanding of security principles and best practices, including authentication, authorization, encryption, and compliance with standards like OAuth, OpenID Connect, and AWS IAM.
  • DevOps Practices: Familiarity with DevOps practices and tools, including CI/CD pipelines, infrastructure as code (IaC), and automated testing, to ensure efficient and reliable deployment on AWS.
  • Problem-solving Skills: Excellent problem-solving skills, with the ability to troubleshoot complex issues, identify root causes, and implement effective solutions to ensure stability and performance.
  • Team Leadership: Experience leading and mentoring a team of developers, providing technical guidance, code reviews, and fostering a collaborative and innovative environment.
  • Communication Skills: Strong communication skills, with the ability to effectively communicate technical concepts to both technical and non-technical stakeholders, and collaborate with cross-functional teams.
  • Agile Methodologies: Experience working in Agile development environments, following practices like Scrum or Kanban, and ability to adapt to changing requirements and priorities.
  • Continuous Learning: A commitment to continuous learning and staying updated with the latest trends, tools, and technologies related to development and AWS services.
  • Bachelor's Degree: A bachelor's degree in Computer Science, Software Engineering, or a related field is often preferred, although equivalent experience and certifications can also be valuable.


Financial Services Company

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Delhi
3 - 6 yrs
₹10L - ₹25L / yr
Project Management
SQL
JIRA
SQL Query Analyzer
Confluence

Required Skills: Excellent communication skills, project management, SQL queries, and expertise with tools such as Jira and Confluence.


Criteria:

  • Candidate must have Project management experience.
  • Candidate must have strong experience in accounting principles, financial workflows, and R2R (Record to Report) processes.
  • Candidate should have an academic background in Commerce or MBA Finance.
  • Candidates must come from a fintech/financial services background only.
  • Good experience with SQL and must have MIS experience.
  • Must have experience in Treasury Module.
  • 3+ years of implementation experience is required.
  • Candidate should have Hands-on experience with tools such as Jira, Confluence, Excel, and project management platforms.
  • Candidates must be based in Bangalore or Delhi/NCR only.
  • Immediate joiners or candidates with up to a 30-day notice period only.

 

Description

Position Overview

We are looking for an experienced Implementation Lead with deep expertise in financial workflows, R2R processes, and treasury operations to drive client onboarding and end-to-end implementations. The ideal candidate will bring a strong Commerce / MBA Finance background, proven project management experience, and technical skills in SQL and ETL to ensure seamless deployments for fintech and financial services clients.


Key Responsibilities

  • Lead end-to-end implementation projects for enterprise fintech clients
  • Translate client requirements into detailed implementation plans and configure solutions accordingly.
  • Write and optimize complex SQL queries for data analysis, validation, and integration
  • Oversee ETL processes – extract, transform, and load financial data across systems
  • Collaborate with cross-functional teams including Product, Engineering, and Support
  • Ensure timely, high-quality delivery across multiple stakeholders and client touchpoints
  • Document processes, client requirements, and integration flows in detail.
  • Configure and deploy company solutions for R2R, treasury, and reporting workflows.


Required Qualifications

  • Bachelor’s degree in Commerce or an MBA in Finance (mandatory).
  • 3+ years of hands-on implementation/project management experience
  • Proven experience delivering projects in Fintech, SaaS, or ERP environments
  • Strong expertise in accounting principles, R2R (Record-to-Report), treasury, and financial workflows.
  • Hands-on SQL experience, including the ability to write and debug complex queries (joins, CTEs, subqueries)
  • Experience working with ETL pipelines or data migration processes
  • Proficiency in tools like Jira, Confluence, Excel, and project tracking systems
  • Strong communication and stakeholder management skills
  • Ability to manage multiple projects simultaneously and drive client success
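The "complex queries (joins, CTEs, subqueries)" requirement above can be made concrete with a small reconciliation-style query of the kind an R2R implementation might run, expressed with a CTE and a join. The sketch uses Python's built-in sqlite3, and the ledger schema and figures are entirely hypothetical.

```python
import sqlite3

# Hypothetical R2R-style check: find ledger accounts whose booked
# total differs from the sub-ledger total, using a CTE and a join.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE gl (account TEXT, amount REAL);
CREATE TABLE subledger (account TEXT, amount REAL);
INSERT INTO gl VALUES ('cash', 100.0), ('ar', 250.0);
INSERT INTO subledger VALUES ('cash', 60.0), ('cash', 40.0), ('ar', 240.0);
""")

mismatches = conn.execute("""
WITH sub_totals AS (
    SELECT account, SUM(amount) AS total
    FROM subledger
    GROUP BY account
)
SELECT g.account, g.amount, s.total
FROM gl AS g
JOIN sub_totals AS s ON s.account = g.account
WHERE g.amount <> s.total
""").fetchall()

# 'cash' reconciles (60 + 40 = 100); 'ar' is off by 10.
print(mismatches)  # [('ar', 250.0, 240.0)]
```

The CTE keeps the aggregation readable and reusable; the same shape scales to the multi-step reconciliation queries these implementations typically involve.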


Preferred Qualifications

  • Prior experience implementing financial automation tools (e.g., SAP, Oracle, Anaplan, Blackline)
  • Familiarity with API integrations and basic data mapping
  • Experience in agile/scrum-based implementation environments
  • Exposure to reconciliation, book closure, AR/AP, and reporting systems
  • PMP, CSM, or similar certifications



Skills & Competencies

Functional Skills

  • Financial process knowledge (e.g., reconciliation, accounting, reporting)
  • Business analysis and solutioning
  • Client onboarding and training
  • UAT coordination
  • Documentation and SOP creation

 

Project Skills

  • Project planning and risk management
  • Task prioritization and resource coordination
  • KPI tracking and stakeholder reporting

 

Soft Skills

  • Cross-functional collaboration
  • Communication with technical and non-technical teams
  • Attention to detail and customer empathy
  • Conflict resolution and crisis management


What We Offer

  • An opportunity to shape fintech implementations across fast-growing companies
  • Work in a dynamic environment with cross-functional experts
  • Competitive compensation and rapid career growth
  • A collaborative and meritocratic culture


 


Wissen Technology
Posted by Swet Patel
Bengaluru (Bangalore)
5 - 13 yrs
Best in industry
Databricks
Python
SQL
PySpark
Spark


We are seeking an experienced Data Engineer with a strong background in Databricks, Python, Spark/PySpark and SQL to design, develop, and optimize large-scale data processing applications. The ideal candidate will build scalable, high-performance data engineering solutions and ensure seamless data flow across cloud and on-premise platforms.

Key Responsibilities:

  • Design, develop, and maintain scalable data processing applications using Databricks, Python, and PySpark/Spark.
  • Write and optimize complex SQL queries for data extraction, transformation, and analysis.
  • Collaborate with data engineers, data scientists, and other stakeholders to understand business requirements and deliver high-quality solutions.
  • Ensure data integrity, performance, and reliability across all data processing pipelines.
  • Perform data analysis and implement data validation to ensure high data quality.
  • Implement and manage CI/CD pipelines for automated testing, integration, and deployment.
  • Contribute to continuous improvement of data engineering processes and tools.
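The data-validation responsibility above can be sketched as a row-level quality gate. In Databricks this logic would typically be expressed in PySpark; plain Python is used here so the sketch stays self-contained, and the field names and rules are hypothetical.

```python
# Sketch of a row-level data-quality check of the kind a pipeline
# might run before loading. In Databricks this would usually live in
# PySpark; plain Python keeps the sketch self-contained. Field names
# and rules are hypothetical.

RULES = {
    "id": lambda v: isinstance(v, int) and v > 0,
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
    "currency": lambda v: v in {"INR", "USD", "EUR"},
}

def validate(rows):
    """Split rows into valid rows and rejected rows with failure reasons."""
    valid, rejected = [], []
    for row in rows:
        errors = [f for f, ok in RULES.items() if not ok(row.get(f))]
        (valid if not errors else rejected).append((row, errors))
    return [r for r, _ in valid], rejected

good, bad = validate([
    {"id": 1, "amount": 10.5, "currency": "INR"},
    {"id": -2, "amount": 3.0, "currency": "GBP"},
])
print(len(good), bad[0][1])  # 1 ['id', 'currency']
```

Routing rejected rows to a quarantine table with their failure reasons, rather than dropping them, is what makes the data-integrity requirement auditable downstream.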

Required Skills & Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  • Proven experience with Databricks, with strong expertise in Python, SQL, and Spark/PySpark.
  • Strong proficiency in SQL, including working with relational databases and writing optimized queries.
  • Solid programming experience in Python, including data processing and automation.


Travel Tech

Agency job
via Jobdost by Saida Pathan
Gurugram
1 - 2 yrs
₹15L - ₹18L / yr
Python
Tableau
Power BI
SQL
JIRA

What you will do:-


● Partnering with Product Managers and cross-functional teams to define metrics, build dashboards, and track product performance.

● Conducting deep-dive analyses of large-scale data to identify trends, user behavior patterns, growth gaps, and improvement opportunities.

● Performing competitive benchmarking and industry research to support product strategy and prioritization.

● Generating data-backed insights to drive feature enhancements, product experiments, and business decisions.

● Tracking post-launch impact by measuring adoption, engagement, retention, and ROI of new features.

● Working with Data, Engineering, Business, and Ops teams to design and measure experiments (A/B tests, cohorts, funnels).

● Creating reports, visualizations, and presentations that simplify complex data for stakeholders and leadership.

● Supporting the product lifecycle with relevant data inputs during research, ideation, launch, and optimization phases.


What we are looking for:-

● Bachelor’s degree in engineering, statistics, business, economics, mathematics, data science, or a related field.

● Strong analytical, quantitative, and problem-solving skills.

● Proficiency in SQL and ability to work with large datasets.

● Experience with data visualization/reporting tools (e.g., Excel, Google Sheets, Power BI, Tableau, Looker, Mixpanel, GA).

● Excellent communication skills — able to turn data into clear narratives and actionable recommendations.

● Ability to work collaboratively in cross-functional teams.

● Passion for product, user behavior, and data-driven decision-making.

● Prior internship or work experience in product analytics, business analysis, consulting, or growth teams.

● Familiarity with experimentation techniques (A/B testing, funnels, cohorts, retention metrics).

● Understanding of product management concepts and tools (Jira, Confluence, etc.).

● Knowledge of Python or R for data analysis (optional but beneficial).

● Exposure to consumer tech, mobility, travel, or marketplaces.
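The experimentation familiarity mentioned above (A/B testing in particular) often comes down to a two-proportion z-test on conversion rates. A standard-library-only sketch; the conversion numbers are illustrative, not from any real experiment.

```python
from math import sqrt
from statistics import NormalDist

def ab_test_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test for conversion rates (illustrative)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both arms convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution.
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Illustrative numbers: 5% vs 6.5% conversion on 10,000 users per arm.
p = ab_test_p_value(500, 10_000, 650, 10_000)
print(f"p-value: {p:.2e}")
```

In practice an analyst would pair this with a pre-registered sample size and guardrail metrics; the test itself is only the last step of a well-designed experiment.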


- Candidate must be a graduate of IIT, NIT, NSUT, or DTU.


- 1–2 years of pure Product Analyst experience is mandatory.


- Candidate must have strong hands-on experience in Product + Data Analysis + Python.

- Candidate should have Python proficiency of at least 3 out of 5.


- Proficiency in SQL and ability to work with large datasets.


- The candidate must have experience with A/B testing, cohorts, funnels, retention, and product metrics.


- Hands-on experience with data visualization tools (Tableau, Power BI, Looker, Mixpanel, GA, etc.).


- Candidate must have experience with Jira.



Financial Services

Agency job
via Jobdost by Saida Pathan
Bengaluru (Bangalore), Delhi, Gurugram, Noida, Ghaziabad, Faridabad
3 - 6 yrs
₹20L - ₹25L / yr
Project Management
SQL
JIRA
confluence

Position Overview

We are looking for an experienced Implementation Lead with deep expertise in financial workflows, R2R processes, and treasury operations to drive client onboarding and end-to-end implementations. The ideal candidate will bring a strong Commerce / MBA Finance background, proven project management experience, and technical skills in SQL and ETL to ensure seamless deployments for fintech and financial services clients.


Key Responsibilities

  • Lead end-to-end implementation projects for enterprise fintech clients
  • Translate client requirements into detailed implementation plans and configure solutions accordingly.
  • Write and optimize complex SQL queries for data analysis, validation, and integration
  • Oversee ETL processes – extract, transform, and load financial data across systems
  • Collaborate with cross-functional teams including Product, Engineering, and Support
  • Ensure timely, high-quality delivery across multiple stakeholders and client touchpoints
  • Document processes, client requirements, and integration flows in detail.
  • Configure and deploy Bluecopa solutions for R2R, treasury, and reporting workflows.


Required Qualifications

  • Bachelor’s degree with a Commerce background or an MBA in Finance (mandatory).
  • 3+ years of hands-on implementation/project management experience
  • Proven experience delivering projects in Fintech, SaaS, or ERP environments
  • Strong expertise in accounting principles, R2R (Record-to-Report), treasury, and financial workflows.
  • Hands-on SQL experience, including the ability to write and debug complex queries (joins, CTEs, subqueries)
  • Experience working with ETL pipelines or data migration processes
  • Proficiency in tools like Jira, Confluence, Excel, and project tracking systems
  • Strong communication and stakeholder management skills
  • Ability to manage multiple projects simultaneously and drive client success
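The SQL expectation above (joins, CTEs, subqueries) often shows up in data-migration validation work. The sketch below is illustrative only: table names and data are invented, and SQLite (via Python's sqlite3) stands in for a client database:

```python
import sqlite3

# Reconciliation-style validation: find rows that did not migrate or whose
# amounts drifted. Table and column names are invented; SQLite stands in
# for the client database.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE source_ledger (id INTEGER PRIMARY KEY, amount REAL);
CREATE TABLE target_ledger (id INTEGER PRIMARY KEY, amount REAL);
INSERT INTO source_ledger VALUES (1, 100.0), (2, 250.0), (3, 75.0);
INSERT INTO target_ledger VALUES (1, 100.0), (2, 249.0);
""")
rows = con.execute("""
WITH mismatches AS (                 -- CTE isolating the problem rows
    SELECT s.id, s.amount AS src, t.amount AS tgt
    FROM source_ledger s
    LEFT JOIN target_ledger t ON s.id = t.id
    WHERE t.id IS NULL OR s.amount != t.amount
)
SELECT * FROM mismatches ORDER BY id
""").fetchall()
print(rows)  # [(2, 250.0, 249.0), (3, 75.0, None)]
```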

Preferred Qualifications

  • Prior experience implementing financial automation tools (e.g., SAP, Oracle, Anaplan, Blackline)
  • Familiarity with API integrations and basic data mapping
  • Experience in agile/scrum-based implementation environments
  • Exposure to reconciliation, book closure, AR/AP, and reporting systems
  • PMP, CSM, or similar certifications


Skills & Competencies

Functional Skills

  • Financial process knowledge (e.g., reconciliation, accounting, reporting)
  • Business analysis and solutioning
  • Client onboarding and training
  • UAT coordination
  • Documentation and SOP creation

Project Skills

  • Project planning and risk management
  • Task prioritization and resource coordination
  • KPI tracking and stakeholder reporting

Soft Skills

  • Cross-functional collaboration
  • Communication with technical and non-technical teams
  • Attention to detail and customer empathy
  • Conflict resolution and crisis management


What We Offer

  • An opportunity to shape fintech implementations across fast-growing companies
  • Work in a dynamic environment with cross-functional experts
  • Competitive compensation and rapid career growth
  • A collaborative and meritocratic culture


Read more
Capace Software Private Limited
Bhopal, Bengaluru (Bangalore)
7 - 13 yrs
₹9L - ₹12L / yr
Android
skill iconAndroid Development
frontend
Backend testing
fintech
+16 more

Job Description -Technical Project Manager

Job Title: Technical Project Manager

Location: Bhopal / Bangalore (On-site)

Experience Required: 7+ Years

Industry: Fintech / SaaS / Software Development

Role Overview

We are looking for a Technical Project Manager (TPM) who can bridge the gap between management and developers. The TPM will manage Android, Frontend, and Backend teams, ensure smooth development processes, track progress, evaluate output quality, resolve technical issues, and deliver timely reports.

Key Responsibilities

Project & Team Management

  • Manage daily tasks for Android, Frontend, and Backend developers
  • Conduct daily stand-ups, weekly planning, and reviews
  • Track progress, identify blockers, and ensure timely delivery
  • Maintain sprint boards, task estimations, and timelines

Technical Requirement Translation

  • Convert business requirements into technical tasks
  • Communicate requirements clearly to developers
  • Create user stories, flow diagrams, and PRDs
  • Ensure requirements are understood and implemented correctly

Quality & Build Review

  • Validate build quality, UI/UX flow, functionality
  • Check API integrations, errors, performance issues
  • Ensure coding practices and architecture guidelines are followed
  • Perform preliminary QA before handover to testing or clients

Issue Resolution

  • Identify development issues early
  • Coordinate with developers to fix bugs
  • Escalate major issues to founders with clear insights

Reporting & Documentation

  • Daily/weekly reports to management
  • Sprint documentation, release notes
  • Maintain project documentation & version control processes

Cross-Team Communication

  • Act as the single point of contact for management
  • Align multiple tech teams with business goals
  • Coordinate with HR and operations for resource planning

Required Skills

  • Strong understanding of Android, Web (Frontend/React), Backend development flows
  • Knowledge of APIs, Git, CI/CD, basic testing
  • Experience with Agile/Scrum methodologies
  • Ability to review builds and suggest improvements
  • Strong documentation skills (Jira, Notion, Trello, Asana)
  • Excellent communication & leadership
  • Ability to handle pressure and multiple projects

Good to Have

  • Prior experience in Fintech projects
  • Basic knowledge of UI/UX
  • Experience in preparing FSD/BRD/PRD
  • QA experience or understanding of test cases

Salary Range: 9 to 12 LPA

Read more
Navi Mumbai
4 - 8 yrs
₹8L - ₹10L / yr
Oracle SQL Developer
MySQL
ETL
Database Design
SQL
+1 more

Company Name : Enlink Managed Services

Company Website : https://enlinkit.com/

Location : Turbhe , Navi Mumbai

Shift Time : 12 pm to 9:30 pm

Working Days : 5 Days Working(Sat-Sun Fixed Off)

SQL Developer 

Roles & Responsibilities :

Designing databases and writing stored procedures and complex, dynamic queries in SQL

Creating indexes, views, complex triggers, effective functions, and appropriate stored procedures to facilitate efficient data manipulation and data consistency

Implementing database architecture, ETL and development activities

Troubleshooting data load, ETL and application support related issues

Demonstrating the ability to communicate effectively in both technical and business environments

Troubleshooting failed batch jobs, correcting outstanding issues and resubmitting scheduled jobs to ensure completion

Troubleshoot, optimize, and tune SQL processes and complex SQL queries
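As a hedged illustration of the trigger-and-index work described above: the schema is invented, and SQLite syntax (via Python's sqlite3) is shown for brevity, although the role itself targets MySQL:

```python
import sqlite3

# Trigger + index sketch for trigger-based data consistency. Schema is
# invented; SQLite is used for illustration, not MySQL.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE orders (id INTEGER PRIMARY KEY, qty INTEGER);
CREATE TABLE audit (order_id INTEGER, qty INTEGER);
CREATE TRIGGER orders_audit AFTER INSERT ON orders
BEGIN
    INSERT INTO audit VALUES (NEW.id, NEW.qty);  -- keep an audit trail
END;
CREATE INDEX idx_audit_order ON audit(order_id); -- speed up audit lookups
""")
con.execute("INSERT INTO orders VALUES (1, 5)")
print(con.execute("SELECT * FROM audit").fetchall())  # [(1, 5)]
```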

Required Qualifications/Experience

4+ years of experience in the design and optimization of MySQL databases

General database development using MySQL

Advanced level of writing stored procedures, reading query plans, tuning indexes and troubleshooting performance bottlenecks

Troubleshoot, optimize, and tune SQL processes and complex SQL queries

Experienced in creating sophisticated MySQL databases that handle complex queries quickly

Problem solving, analytical and fluent communication

Read more
CSI Interfusion
Sujitha Kotipalli
Posted by Sujitha Kotipalli
Remote, Hyderabad
5 - 10 yrs
₹35L - ₹45L / yr
skill iconReact.js
skill iconC#
skill icon.NET
SQL
Microsoft Windows Azure

1. Job Responsibilities:

Backend Development (.NET)

  • Design and implement ASP.NET Core WebAPIs
  • Design and implement background jobs using Azure Function Apps
  • Optimize performance for long-running operations, ensuring high concurrency and system stability.
  • Develop efficient and scalable task scheduling solutions to execute periodic tasks

Frontend Development (React)

  • Build high-performance, maintainable React applications and optimize component rendering.
  • Continuously improve front-end performance using best practices.

Deployment & Operations

  • Deploy React applications on Azure platforms (Azure Web Apps), ensuring smooth and reliable delivery.
  • Collaborate with DevOps teams to enhance CI/CD pipelines and improve deployment efficiency.

2. Job Requirements:

Tech Stack:

  • Backend: ASP.NET Core Web API, C#
  • Frontend: React, JavaScript/TypeScript, Redux or other state management libraries
  • Azure: Function Apps, Web Apps, Logic Apps
  • Database: Cosmos DB, SQL Server

  • Strong knowledge of asynchronous programming, performance optimization, and task scheduling

  • Proficiency in React performance optimization techniques, understanding of virtual DOM and component lifecycle.
  • Experience with cloud deployment, preferably Azure App Service or Azure Static Web Apps.
  • Familiarity with Git and CI/CD workflows, with strong coding standards.

3. Project Background:

Mission: Transform Microsoft Cloud customers into fans by delivering exceptional support and engagement.​

Scope:

  • Customer reliability engineering
  • Advanced cloud engineering and supportability
  • Business management and operations
  • Product and platform orchestration

Activities:

  • Technical skilling programs
  • AI strategy for customer experience
  • Handling escalations and service reliability issues

4. Project Highlights:

React.js, ASP.NET Core Web API, Azure Function Apps, Cosmos DB

 

Read more
Truetech
Nithya A
Posted by Nithya A
Chennai
4 - 10 yrs
₹15L - ₹25L / yr
OFSAA
SQL
AML
FCCM

Job Title: OFSAA Consultant

 

Please find the detailed Job Description below:

Role: OFSAA Consultant

Location: Chennai (Anna salai)

Job Mode: WFO

Company: Equitas Small Finance Bank

 

Detailed JD (Roles and Responsibilities):

 


 

Job Summary:

We are seeking an experienced OFSAA Consultant with strong expertise in Oracle Financial Services Analytical Applications (OFSAA) and Financial Crime & Compliance Management (FCCM) modules, including AML, KYC, ECM, and Customer Screening (CS). The ideal candidate will have hands-on experience in end-to-end implementation, configuration, customization, and technical support of FCCM solutions, along with strong SQL development skills. A solid understanding of banking and financial domain processes is essential.

 

Key Responsibilities

OFSAA & FCCM Implementation

  • Perform end-to-end installation, configuration, and deployment of FCCM modules such as AML, ECM, KYC, Customer Screening (CS).
  • Customize and configure OFSAA components, data models, scenarios, alert workflows, and thresholds.
  • Integrate OFSAA/FCCM systems with upstream and downstream banking applications.

Technical Development & Support

  • Develop complex SQL queries, stored procedures, and performance-optimized scripts for data extraction, transformation, and loading (ETL).
  • Troubleshoot technical issues across OFSAA, FCCM, and AML platforms.
  • Work on scenario tuning, data mapping, model validation, and rule configurations.
  • Support UAT, SIT, and production migration activities.

Domain & Functional Expertise

  • Collaborate with compliance, risk, and business teams to understand AML/KYC regulatory requirements.
  • Analyze banking data flows and transactional patterns relevant to AML monitoring.
  • Assist in parameterization of AML scenarios and thresholds based on regulatory needs.

Documentation & Stakeholder Interaction

  • Prepare technical documents, installation guides, configuration reports, and support materials.
  • Work closely with business users, architects, and cross-functional teams for successful project delivery.

 

Required Skills & Qualifications

  • 3–10+ years of experience as an OFSAA / FCCM Technical Consultant (or as specified by employer).
  • Strong hands-on experience in OFSAA FCCM modules: AML, KYC, ECM, and CS.
  • Expertise in SQL Development and relational databases (Oracle preferred).
  • Practical experience in installation, configuration, patching, and environment setup for OFSAA/FCCM.
  • Knowledge of banking processes, AML regulations, customer due diligence (CDD), and compliance workflows.
  • Strong analytical, problem-solving, and communication skills.


Read more
Appler
Appler Solutions
Posted by Appler Solutions
Remote only
4 - 6 yrs
₹7L - ₹10L / yr
skill iconJavascript
skill iconReact Native
skill iconReact.js
skill iconNodeJS (Node.js)
SQL

Job Title: Sr. Frontend Developer (Javascript)

Location: Remote Only

Experience Required: 4–6 years

Salary Range: 7L – 10L per year

About the Role:

We are looking for an experienced Sr. Frontend Developer with strong expertise in Javascript to join our remote team. The ideal candidate will have 4–6 years of hands-on experience in frontend development, with a focus on building responsive, high-performance web applications. You will work closely with cross-functional teams to design, develop, and implement user-facing features that align with business goals and enhance user experience.

Key Responsibilities:

  • Develop and maintain scalable, reusable frontend components and applications using modern Javascript frameworks and libraries.
  • Collaborate with UI/UX designers, product managers, and backend developers to deliver seamless user experiences.
  • Optimize applications for maximum speed, scalability, and accessibility.
  • Write clean, modular, and well-documented code following best practices.
  • Participate in code reviews, sprint planning, and agile development processes.
  • Troubleshoot, debug, and resolve frontend-related issues.
  • Stay updated with emerging frontend technologies and industry trends.

Must-Have Skills:

  • Javascript (ES6+)
  • React.js
  • React Native
  • NodeJS (Node.js)
  • SQL

Nice-to-Have Skills:

  • Experience with state management libraries (Redux, Context API, etc.)
  • Familiarity with testing frameworks (Jest, Cypress, React Testing Library)
  • Knowledge of frontend build tools (Webpack, Babel, NPM/Yarn)
  • Understanding of RESTful APIs and GraphQL
  • Experience with version control systems (Git)
  • Familiarity with CI/CD pipelines and deployment processes

Qualifications:

  • 4–6 years of professional frontend development experience.
  • Proven track record of delivering high-quality, production-ready applications.
  • Strong understanding of responsive design, cross-browser compatibility, and web performance optimization.
  • Excellent problem-solving skills and attention to detail.
  • Ability to work independently in a remote environment and communicate effectively with distributed teams.

What We Offer:

  • Competitive salary within the range of 7L – 10L per year.
  • Fully remote work flexibility.
  • Opportunity to work on innovative projects with a talented and supportive team.
  • Professional growth and skill development opportunities.


Read more
Banking Industry

Banking Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Mangalore, Pune, Mumbai
3 - 5 yrs
₹8L - ₹11L / yr
skill iconData Analytics
SQL
Relational Database (RDBMS)
skill iconJava
skill iconPython
+1 more

Required Skills: Strong SQL Expertise, Data Reporting & Analytics, Database Development, Stakeholder & Client Communication, Independent Problem-Solving & Automation Skills

 

Review Criteria

· Must have Strong SQL skills (queries, optimization, procedures, triggers)

· Must have Advanced Excel skills

· Should have 3+ years of relevant experience

· Should have Reporting + dashboard creation experience

· Should have Database development & maintenance experience

· Must have Strong communication for client interactions

· Should have Ability to work independently

· Willingness to work from client locations.

 

Description

Who is an ideal fit for us?

We seek professionals who are analytical, demonstrate self-motivation, exhibit a proactive mindset, and possess a strong sense of responsibility and ownership in their work.

 

What will you get to work on?

As a member of the Implementation & Analytics team, you will:

● Design, develop, and optimize complex SQL queries to extract, transform, and analyze data

● Create advanced reports and dashboards using SQL, stored procedures, and other reporting tools

● Develop and maintain database structures, stored procedures, functions, and triggers

● Optimize database performance by tuning SQL queries, and indexing to handle large datasets efficiently

● Collaborate with business stakeholders and analysts to understand analytics requirements

● Automate data extraction, transformation, and reporting processes to improve efficiency
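One way to sketch the kind of analytical SQL this role involves is a window-function query. The table and data below are hypothetical, and SQLite (via Python's sqlite3) is used purely for illustration:

```python
import sqlite3

# Window-function sketch: rank each sale within its region. The table and
# data are invented; SQLite is used only to demonstrate the SQL.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE sales (region TEXT, amount INTEGER);
INSERT INTO sales VALUES ('N', 10), ('N', 30), ('S', 20);
""")
rows = con.execute("""
SELECT region, amount,
       RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
FROM sales ORDER BY region, rnk
""").fetchall()
print(rows)  # [('N', 30, 1), ('N', 10, 2), ('S', 20, 1)]
```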


What do we expect from you?

For the SQL/Oracle Developer role, we are seeking candidates with the following skills and expertise:

● Proficiency in SQL (Window functions, stored procedures) and MS Excel (advanced Excel skills)

● More than 3 years of relevant experience

● Java / Python experience is a plus but not mandatory

● Strong communication skills to interact with customers to understand their requirements

● Capable of working independently with minimal guidance, showcasing self-reliance and initiative

● Previous experience in automation projects is preferred

● Work From Office: Bangalore/Navi Mumbai/Pune/Client locations

 

Read more
Retail Industry
Kolkata
4 - 7 yrs
₹10L - ₹20L / yr
Program Management
Project Management
Business Analysis
Product Management
Relational Database (RDBMS)
+10 more

Review Criteria

  • Strong Business Analyst Profile
  • Total 4+ YOE, of which 2+ Business Analysis
  • Must have expertise in Workflows, API documentation and UI/UX specifications
  • Must have experience with API integrations and troubleshooting and conducting UATs
  • Must have working knowledge of RDBMS and basic SQL and Excel
  • Must be a commerce graduate
  • Any company (Preferred B2B SaaS (RetailTech, ERP, OMS, POS, E-commerce SaaS, Logistics)
  • Candidate must be based in Kolkata, or be a native of Kolkata or nearby states (Bihar/Jharkhand) wanting to relocate back


Preferred

  • Proficiency in using tools such as Jira, Confluence, Lucidchart, and Visily.
  • Experience in Order Management System or POS product


Job Specific Criteria

  • CV Attachment is mandatory
  • Which product line you have worked on (ERP, POS, OMS, other SaaS, e-commerce)?
  • Reason for job change?
  • What's your under-graduation degree (BE / Btech / Bcom / BBA)


Role & Responsibilities

We are looking for a Business Analyst who will be the vital link between our information technology capacity and our business objectives by supporting and ensuring the successful completion of analysis, testing and release to production tasks of our software product’s features.

  • Converting business problems into functional requirements (User stories, API Documentation, UI/UX, workflow, scenarios).
  • Troubleshoot problems encountered in integration by understanding the API response - relating to the functional understanding.
  • Interacting with the solution architect in supporting to architect the solution.
  • Coordinating with the development team on a day-to-day basis to develop the specifications into the product solution.
  • Conduct user acceptance test to ensure requirement fulfilment.
  • At times he/she would also have to interact with the clients to clarify requirements.
  • Conduct training and provide documentation to share the knowledge of newly developed features.
  • Provide support on gaps (if any) on the developed features.
  • Exposure to product management, as the client requirements backlog and ideas need to be managed. Good exposure for aspiring product managers.
  • Use the world’s most popular SDLC tools, like Jira, Confluence, Lucidchart, and Visily.


Ideal Candidate

  • 4-7 Years experience required.
  • Excellent analytical aptitude and problem-solving abilities.
  • Process mapping - understanding of tools like Lucidchart.
  • Very Methodical.
  • Understanding of API Integrations.
  • Understanding of RDBMS - Basic SQL.
  • Deep understanding of the omni retail or e-commerce or POS.
  • Basic/Advance - Excel.
  • Good communication, Documentation and presentation skills.
  • Commerce or BTech background.



Read more
Intellipro
Arthy R
Posted by Arthy R
Bengaluru (Bangalore), Chennai
3 - 7 yrs
₹10L - ₹18L / yr
Delphi
SQL

📢 Hiring: Delphi Developer – 6 Months Contract

Locations: Chennai & Bangalore | Immediate Joiners | Service-Based Project

We are hiring experienced Delphi Developers for a 6-month contractual role with a reputed service-based IT organization. Candidates with strong Delphi expertise who can contribute independently in a fast-paced environment are encouraged to apply.


🔧 Key Highlights

3–7 years of experience in software development

Strong hands-on experience in Delphi

Proficiency in SQL, ADO, and understanding of OOP, data structures, and design patterns

Exposure to JavaScript frameworks (Knockout/Angular) and modern UI concepts

Good communication, analytical, and problem-solving skills

Ability to work independently and multitask effectively

Preferred: Experience in Payments, Retail, EMV, C-Store, or Logistics domains


📍 Locations: Chennai & Bangalore

⏳ Contract Duration: 6 Months

🚀 Start Date: Immediate


Read more
ImmersiveDataAI
Ishan  Agrawal
Posted by Ishan Agrawal
Pune
0 - 1 yrs
₹10000 - ₹20000 / mo
skill iconPython
SQL
Large Language Models (LLM) tuning
Data engineering

Entry Level | On-Site | Pune

Internship Opportunity: Data + AI Intern

Location: Pune, India (In-office)

Duration: 2 Months

Start Date: Between 11th July 2025 and 15th August 2025

Work Days: Monday to Friday

Stipend: As per company policy

About ImmersiveData.AI

Smarter Data. Smarter Decisions. Smarter Enterprises.™

At ImmersiveData.AI, we don’t just transform data—we challenge and redefine business models. By leveraging cutting-edge AI, intelligent automation, and modern data platforms, we empower enterprises to unlock new value and drive strategic transformation.

About the Internship

As a Data + AI Intern, you will gain hands-on experience at the intersection of data engineering and AI. You’ll be part of a collaborative team working on real-world data challenges using modern tools like Snowflake, DBT, Airflow, and LLM frameworks. This internship is a launchpad for students looking to enter the rapidly evolving field of Data & AI.

Key Responsibilities

  • Assist in designing, building, and optimizing data pipelines and ETL workflows
  • Work with structured and unstructured datasets across various sources
  • Contribute to AI-driven automation and analytics use cases
  • Support backend integration of large language models (LLMs)
  • Collaborate in building data platforms using tools like Snowflake, DBT, and Airflow
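A toy version of the extract-transform-load work described above can be written in plain Python. The records and field names are invented; real pipelines would use tools like DBT or Airflow rather than a hand-rolled loop:

```python
import json

# Toy extract-transform-load pass. Records and field names are invented.
raw = ['{"user": "a", "spend": "10"}', '{"user": "b", "spend": "x"}']

def etl(lines):
    out = []
    for line in lines:            # extract: parse each raw record
        rec = json.loads(line)
        try:                      # transform: coerce and validate spend
            rec["spend"] = int(rec["spend"])
        except ValueError:
            continue              # drop malformed rows
        out.append(rec)           # load: append to an in-memory sink
    return out

print(etl(raw))  # [{'user': 'a', 'spend': 10}]
```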

Required Skills

  • Proficiency in Python
  • Strong understanding of SQL and relational databases
  • Basic knowledge of Data Engineering and Data Analysis concepts
  • Familiarity with cloud data platforms or willingness to learn (e.g., Snowflake)

Preferred Learning Certifications (Optional but Recommended)

  • Python Programming
  • SQL & MySQL/PostgreSQL
  • Statistical Modeling
  • Tableau / Power BI
  • Voice App Development (Bonus)

Who Can Apply

Only candidates who:

  • Are available full-time (in-office, Pune)
  • Can start between 11th July and 15th August 2025
  • Are available for a minimum of 2 months
  • Have relevant skills and interest in data and AI

Perks

  • Internship Certificate
  • Letter of Recommendation
  • Work with cutting-edge tools and technologies
  • Informal dress code
  • Exposure to real industry use cases and mentorship


Read more
Inteliment Technologies

at Inteliment Technologies

2 candid answers
Ariba Khan
Posted by Ariba Khan
Pune
4 - 7 yrs
Upto ₹20L / yr (Varies)
SQL
Data modeling
Data Vault
ERwin
Star schema
+1 more

About the company:

Inteliment is a niche business analytics company with an almost two-decade proven track record of partnering with hundreds of Fortune 500 global companies. Inteliment operates its ISO-certified development centre in Pune, India, and has business operations in multiple countries through subsidiaries in Singapore and Europe, with its headquarters in India.


About the Role:

We are seeking an experienced Technical Data Professional with hands-on expertise in designing and implementing dimensional data models using Erwin or any dimensional modeling tool and building SQL-based solutions adhering to Data Vault 2.0 and Information Mart standards. The ideal candidate will have strong data analysis capabilities, exceptional SQL skills, and a deep understanding of data relationships, metrics, and the granularity of the data structures.


Qualifications:

  • Bachelor’s degree in computer science, Information Technology, or a related field.
  • Certifications with related field will be an added advantage.


Key Competencies:

1. Technical Expertise:

  • Proficiency in Erwin for data modeling.
  • Advanced SQL skills with experience in writing and optimizing performance driven queries.
  • Hands-on experience with Data Vault 2.0 and Information Mart standards is highly preferred.
  • Solid understanding of Star Schema, Facts & Dimensions, and bus architecture.

2. Analytical Skills:

  • Strong data analysis skills to evaluate data relationships, metrics, and granularities.
  • Capability to troubleshoot and resolve complex data modeling and performance issues.

3. Soft Skills:

  • Strong problem-solving and decision-making skills.
  • Excellent communication and stakeholder management abilities.
  • Proactive and detail-oriented with a focus on delivering high-quality results.


Key Responsibilities:

1. Dimensional Data Modeling:

  • Design and develop dimensional data models using Erwin with a focus on Star Schema and BUS architecture (Fact and Dimension tables).
  • Ensure models align with business requirements and provide scalability, performance, and maintainability.

2. SQL Development:

  • Implement data models in SQL using best practices for view creation, ensuring high performance.
  • Write, optimize, and refactor complex SQL queries for efficiency and performance in large-scale databases.
  • Develop solutions adhering to Information Mart and Data Vault 2.0 standards (i.e., a dimensional model built from Raw Data Vault tables: Hubs, Links, Satellites, Effectivity Satellites, Bridge, and PIT tables).
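The Star Schema structure referenced above (fact table keyed to dimension tables) can be sketched minimally. All table and column names below are hypothetical, and SQLite (via Python's sqlite3) is used only for illustration:

```python
import sqlite3

# Star-schema sketch: one fact table joined to its dimensions. All table
# and column names are hypothetical.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       REAL
);
INSERT INTO dim_customer VALUES (1, 'Acme');
INSERT INTO dim_date VALUES (20240101, '2024-01-01');
INSERT INTO fact_sales VALUES (1, 20240101, 99.5);
""")
rows = con.execute("""
SELECT c.name, d.iso_date, f.amount
FROM fact_sales f
JOIN dim_customer c USING (customer_key)
JOIN dim_date d USING (date_key)
""").fetchall()
print(rows)  # [('Acme', '2024-01-01', 99.5)]
```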

3. Data Analysis & Relationship Metrics:

  • Perform in-depth data analysis to identify patterns, relationships, and metrics at different levels of granularity.
  • Ensure data integrity and quality by validating data models against business expectations.

4. Performance Optimization:

  • Conduct performance tuning of existing data structures, queries, and ETL processes.
  • Provide guidance on database indexing, partitioning, and query optimization techniques.

5. Collaboration:

  • Work closely with business stakeholders, data engineers, and analysts to understand and translate business needs into effective data solutions.
  • Support cross-functional teams to ensure seamless integration and delivery of data solutions
Read more
Pivotree

Pivotree

Agency job
via AccioJob by AccioJobHiring Board
Bengaluru (Bangalore)
0 - 1 yrs
₹5L - ₹5.5L / yr
skill iconJava
DSA
SQL

AccioJob is conducting a Walk-In Hiring Drive with Pivotree for the position of Technical Support.


To apply, register and select your slot here: https://go.acciojob.com/mFZkWn


Required Skills: Java, DSA, SQL


Eligibility:

  • Degree: BTech./BE
  • Branch: Computer Science/CSE/Other CS related branch, Electrical/Other electrical related branches
  • Graduation Year: 2024, 2025


Work Details:

Work Location: Bangalore (Onsite)

CTC: 5 LPA to 5.5 LPA


Evaluation Process:

Round 1: Offline Assessment at AccioJob Bangalore Centre


Further Rounds (for shortlisted candidates only):

Resume Evaluation, Technical Interview 1, HR Discussion


Important Note: Bring your laptop & earphones for the test.


Register here: https://go.acciojob.com/mFZkWn


FAST SLOT BOOKING

[ DOWNLOAD ACCIOJOB APP ]

https://go.acciojob.com/9hznVG

Read more
Inteliment Technologies

at Inteliment Technologies

2 candid answers
Ariba Khan
Posted by Ariba Khan
Pune
3 - 5 yrs
Upto ₹16L / yr (Varies)
SQL
skill iconPython
ETL
skill iconAmazon Web Services (AWS)
Azure
+1 more

About the company:

Inteliment is a niche business analytics company with an almost two-decade proven track record of partnering with hundreds of Fortune 500 global companies. Inteliment operates its ISO-certified development centre in Pune, India, and has business operations in multiple countries through subsidiaries in Singapore and Europe, with its headquarters in India.


About the Role:

As a Data Engineer, you will contribute to cutting-edge global projects and innovative product initiatives, delivering impactful solutions for our Fortune clients. In this role, you will take ownership of the entire data pipeline and infrastructure development lifecycle—from ideation and design to implementation and ongoing optimization. Your efforts will ensure the delivery of high-performance, scalable, and reliable data solutions. Join us to become a driving force in shaping the future of data infrastructure and innovation, paving the way for transformative advancements in the data ecosystem.


Qualifications:

  • Bachelor’s or master’s degree in computer science, Information Technology, or a related field.
  • Certifications with related field will be an added advantage.


Key Competencies:

  • Must have experience with SQL, Python and Hadoop
  • Good to have experience with Cloud Computing Platforms (AWS, Azure, GCP, etc.), DevOps Practices, Agile Development Methodologies
  • ETL or other similar technologies will be an advantage.
  • Core Skills: Proficiency in SQL, Python, or Scala for data processing and manipulation
  • Data Platforms: Experience with cloud platforms such as AWS, Azure, or Google Cloud.
  • Tools: Familiarity with tools like Apache Spark, Kafka, and modern data warehouses (e.g., Snowflake, Big Query, Redshift).
  • Soft Skills: Strong problem-solving abilities, collaboration, and communication skills to work effectively with technical and non-technical teams.
  • Additional: Knowledge of SAP would be an advantage 


Key Responsibilities:

  • Data Pipeline Development: Build, maintain, and optimize ETL/ELT pipelines for seamless data flow.
  • Data Integration: Consolidate data from various sources into unified systems.
  • Database Management: Design and optimize scalable data storage solutions.
  • Data Quality Assurance: Ensure data accuracy, consistency, and completeness.
  • Collaboration: Work with analysts, scientists, and stakeholders to meet data needs.
  • Performance Optimization: Enhance pipeline efficiency and database performance.
  • Data Security: Implement and maintain robust data security and governance policies
  • Innovation: Adopt new tools and design scalable solutions for future growth.
  • Monitoring: Continuously monitor and maintain data systems for reliability.
  • Data Engineers ensure reliable, high-quality data infrastructure for analytics and decision-making.
Read more
Ladera Technology
Bengaluru (Bangalore)
7 - 11 yrs
₹10L - ₹22L / yr
skill iconJava
skill iconSpring Boot
Spring Security
APM
AWS Lambda
+9 more

Job Title: Software Developer (7-10 Years Experience)


Job Summary: We are seeking an experienced Software Developer with 7-10 years of hands-on development expertise in designing, building, and maintaining enterprise-level applications and scalable APIs.


Key Responsibilities:

• Design, develop, and maintain microservices based applications using the Spring framework.

• Build secure, scalable REST and SOAP web services.

• Implement API security protocols including OAuth, JWT, SSL/TLS, X.509 certificates, SAML, and mTLS.

• Develop and deploy applications by leveraging AWS services such as EC2, Lambda, API Gateway, SQS, S3, SNS.

• Work with Azure cloud services and OpenShift for deployment and orchestration.

• Integrate JMS/messaging systems and work with middleware technologies such as MQ.

• Utilize SQL and NoSQL databases, including MySQL, PostgreSQL, and DynamoDB.

• Work with Netflix Conductor or Zuul for orchestration and routing.

• Collaborate with cross-functional teams to deliver robust solutions in an Agile setup.


Required Skills:

• Strong Java OOP fundamentals.

• Strong proficiency in Spring Framework (Spring Boot, Spring Cloud, Spring Security).

• Solid experience in microservices architecture.

• Hands-on experience with the AWS cloud and OpenShift ecosystem.

• Familiarity with Azure services.

• Strong understanding of API security mechanisms.

• Expertise in building RESTful APIs.

• Experience working with SQL/NoSQL databases.

• Should have worked on integration with AppDynamics or similar APM tools

• Strong analytical and problem-solving skills.

Good to have skills:

• SOAP web services and GraphQL.

• Experience with JMS, messaging middleware, and MQ.


Qualifications:

• Bachelor’s or Master's degree in computer science or related field.

• 7-10 years of experience in backend or full-stack development roles.

Read more
Client based out in Mohali

Client based out in Mohali

Agency job
via TrueTech Solutions by Poorvi S
Mohali, Punjab
5 - 10 yrs
₹15L - ₹15L / yr
Database administrator
MySQL DBA
MSSQL
SQL
MS SQLServer

We are seeking an experienced Database Administrator (DBA) to manage, maintain, and optimize all existing database systems — including Microsoft SQL Server, MySQL, and Azure SQL Managed Instances.

The ideal candidate will be responsible for ensuring high availability, performance, security, and data integrity across all database environments.

This role requires deep technical expertise in database administration, T-SQL development, backup/recovery strategies, and monitoring/troubleshooting.


Key Responsibilities

Database Administration & Maintenance

  • Manage and maintain MS SQL Server, MySQL, and Azure SQL Managed Instance environments (production, UAT, and development).
  • Install, configure, patch, and upgrade database servers and tools.
  • Implement and manage database backup and recovery strategies (native backups, BCP, export/import utilities, and automated backup jobs).
  • Ensure databases are highly available, secure, and performant.
  • Regularly perform health checks, index optimization, statistics updates, and space management.
Performance Monitoring & Troubleshooting

  • Monitor database performance using tools such as SQL Profiler, Azure Monitor, SSMS Performance Dashboard, SolarWinds, Redgate SQL Monitor, Nagios, or similar.
  • Identify and resolve performance bottlenecks, blocking/deadlock issues, and slow-running queries.
  • Perform root cause analysis (RCA) for database outages or performance issues and implement preventive measures.

Development & Query Optimization

  • Write and optimize T-SQL queries, stored procedures, functions, views, and triggers.
  • Collaborate with developers to ensure efficient database design and query performance tuning.
  • Support deployment of scripts and changes across environments following best practices.

Security & Compliance

  • Manage database users, roles, and permissions in alignment with company security policies.
  • Ensure data encryption, auditing, and compliance with regulatory standards (e.g., GDPR, HIPAA if applicable).

Automation & Documentation

  • Automate routine DBA tasks (e.g., backups, monitoring, alerts, and maintenance plans).
  • Maintain up-to-date documentation for all databases, configurations, procedures, and change logs.
  • Support disaster recovery planning and testing.


Required Skills & Experience

  • 5+ years of experience as a Database Administrator managing SQL Server and MySQL environments.
  • Strong expertise in:
      • T-SQL (queries, stored procedures, triggers, views)
      • Database backup/restore (native, maintenance plans, BCP, logical/physical)
      • Performance tuning and query optimization
      • SQL Agent Jobs and automation
  • Working knowledge of Azure SQL Managed Instance or other cloud-based database services.
  • Experience with monitoring and alerting tools (Redgate, SolarWinds DPA, Azure Monitor, etc.).
  • Strong understanding of replication, mirroring, Always On Availability Groups, and log shipping.
  • Familiarity with MySQL administration, including replication and backup utilities.
  • Ability to diagnose and troubleshoot complex database and performance issues.
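The performance-tuning skill the role asks for is easy to illustrate: the same lookup query can flip from a full table scan to an index search once a supporting index exists. The sketch below uses SQLite's EXPLAIN QUERY PLAN purely as a portable stand-in for SQL Server's execution plans; the customers table and index name are invented:

```python
# Sketch of index-driven query tuning, shown via SQLite's EXPLAIN QUERY PLAN.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(i, f"user{i}@example.com") for i in range(1000)])

def plan(sql):
    # Flatten the plan rows into one string for easy inspection.
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(str(r) for r in rows)

query = "SELECT id FROM customers WHERE email = 'user42@example.com'"
before = plan(query)   # without an index: a full table scan
conn.execute("CREATE INDEX idx_customers_email ON customers(email)")
after = plan(query)    # with the index: an index search
```

The same before/after discipline (capture the plan, change one thing, re-capture) carries over directly to SSMS execution plans and slow-query triage.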


Preferred Qualifications

  • Microsoft Certified: Azure Database Administrator Associate or SQL Server Database Administrator.
  • Experience with PowerShell or Python scripting for automation.
  • Familiarity with DevOps tools and CI/CD pipelines for database deployments.
  • Knowledge of Linux/Windows Server administration related to DB operations.


Soft Skills

  • Strong analytical and problem-solving skills.
  • Excellent communication and documentation abilities.
  • Ability to work under pressure in production environments.
  • Collaborative mindset with cross-functional teams (Developers, DevOps, Infrastructure).
Read more
Banking Industry

Banking Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Kochi (Cochin), Mumbai, Bengaluru (Bangalore)
3 - 5 yrs
₹10L - ₹17L / yr
Project Management
skill iconData Analytics
Program Management
SQL
Client Management
+7 more

Required Skills: Project Management, Data Analysis, SQL queries, Client Engagement

 

Criteria:

  • Must have 3+ years of project/program management experience in Financial Services/Banking/NBFC/Fintech companies only.
  • Hands-on proficiency in data analysis and SQL querying, with ability to work on large datasets
  • Ability to lead end-to-end implementation projects and manage cross-functional teams effectively.
  • Experience in process analysis, optimization, and mapping for operational efficiency.
  • Strong client-facing communication and stakeholder management capabilities.
  • Good expertise in financial operations processes and workflows with proven implementation experience.

 

Description

Position Overview:

We are seeking a dynamic and experienced Technical Program Manager to join our team. The successful candidate will be responsible for managing the implementation of the company’s solutions at existing and new clients. This role requires a deep understanding of financial operation processes, exceptional problem-solving skills, and the ability to analyze large volumes of data. The Technical Program Manager will drive process excellence and ensure outstanding customer satisfaction throughout the implementation lifecycle and beyond.

 

Key Responsibilities:

● Client Engagement: Serve as the primary point of contact for assigned clients, understanding their unique operation processes and requirements. Build and maintain strong relationships to facilitate successful implementations.

● Project Management: Lead the end-to-end implementation of company’s solutions, ensuring projects are delivered on time, within scope, and within budget. Coordinate with cross-functional teams to align resources and objectives.

● Process Analysis and Improvement: Evaluate clients' existing operation workflows, identify inefficiencies, and recommend optimized processes leveraging company’s platform. Utilize process mapping and data analysis to drive continuous improvement.

● Data Analysis: Analyze substantial datasets to ensure accurate configuration and integration of company’s solutions. Employ statistical tools and SQL-based queries to interpret data and provide actionable insights.

● Problem Solving: Break down complex problems into manageable components, developing effective solutions in collaboration with clients and internal teams.

● Process Excellence: Advocate for and implement best practices in process management, utilizing methodologies such as Lean Six Sigma to enhance operational efficiency.

● Customer Excellence: Ensure a superior customer experience by proactively addressing client needs, providing training and support, and promptly resolving any issues that arise.
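The "SQL-based queries on large datasets" part of the data-analysis responsibility typically means aggregations that surface operational exceptions. A hedged sketch, with an entirely invented payments table and figures, showing the kind of per-client breakdown such an analysis starts from:

```python
# Illustrative aggregation: failed vs. total amounts per client,
# the sort of starting point for a process-efficiency review.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (client TEXT, status TEXT, amount REAL)")
conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", [
    ("acme", "settled", 100.0),
    ("acme", "failed",   40.0),
    ("acme", "settled",  60.0),
    ("zen",  "settled",  80.0),
])

rows = conn.execute("""
    SELECT client,
           SUM(CASE WHEN status = 'failed' THEN amount ELSE 0 END) AS failed_amt,
           SUM(amount) AS total_amt
    FROM payments
    GROUP BY client
    ORDER BY client
""").fetchall()
```

The conditional-SUM pattern (`SUM(CASE WHEN … THEN … ELSE 0 END)`) generalizes to most reconciliation and exception-rate reporting.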

 

Qualifications:

● Minimum of 3 years of experience in project management, preferably in financial services, software implementation, consulting or analytics.

● Strong analytical skills with experience in data analysis, SQL querying, and handling large datasets.

● Excellent communication and interpersonal skills, with the ability to manage client relationships effectively.

● Demonstrated ability to lead cross-functional teams and manage multiple projects concurrently.

● Proven expertise in financial operation processes and related software solutions is a plus

● Proficiency in developing business intelligence solutions or with low-code tools is a plus

 

Why Join company?

● Opportunity to work with a cutting-edge financial technology company.

● Collaborative and innovative work environment.

● Competitive compensation and benefits package.

● Professional development and growth opportunities.

Read more
Inteliment Technologies

at Inteliment Technologies

2 candid answers
Ariba Khan
Posted by Ariba Khan
Pune
3 - 5 yrs
Up to ₹12L / yr (varies)
PowerBI
SQL
DAX
Power Query

About the company:

Inteliment is a niche business analytics company with a nearly two-decade proven track record of partnering with hundreds of Fortune 500 global companies. Inteliment operates an ISO-certified development centre in Pune, India, and has business operations in multiple countries through subsidiaries in Singapore and Europe, with its headquarters in India.


About the role:

As a Power BI Developer, you will work closely with business analysts, data engineers, and key stakeholders to transform complex datasets into actionable insights. Your expertise will be pivotal in designing and delivering visually engaging reports, dashboards, and data-driven stories that empower informed decision-making across the organization. By translating raw data into meaningful visuals, you will play a critical role in driving strategic initiatives and fostering a culture of data-driven excellence.


Qualifications:

  • Bachelor’s or master’s degree in Computer Science, Information Technology, or a related field.
  • Certifications in a related field will be an added advantage.


Key Competencies:

  • Technical Skills: Proficiency in Power BI, DAX, Power Query, SQL, and data visualization best practices.
  • Additional Tools: Familiarity with Azure Data Factory, Power Automate, and other components of the Power Platform is advantageous.
  • Soft Skills: Strong analytical thinking, problem-solving, and communication skills for interacting with technical and non-technical audiences.
  • Additional skills: Domain understanding is a plus


Key Responsibilities:

1. Data Integration & Modelling

  • Extract, transform, and load (ETL) data from various sources (SQL, Excel, APIs, etc.).
  • Design and develop efficient data models to support reporting needs.
  • Ensure data integrity and optimize performance through best practices.

2. Report Development

  • Understand the business requirement and build reports to provide analytical insights
  • Build visually appealing, interactive dashboards and reports in Power BI.
  • Implement DAX (Data Analysis Expressions) for complex calculations and measures.
  • Design user-friendly layouts that align with stakeholder requirements.

3. Collaboration

  • Work with stakeholders to gather business requirements and translate them into technical solutions.
  • Collaborate with data engineers and analysts to ensure cohesive reporting strategies.
  • Provide support and training for end-users to maximize adoption and usage of Power BI solutions

4. Performance Optimization

  • Optimize dashboards and reports for better speed and responsiveness.
  • Monitor and improve data refresh processes for real-time reporting.

5. Governance and Security

  • Implement row-level security (RLS) and adhere to organizational data governance policies.
  • Manage Power BI workspaces and permissions.

6. Continuous Improvement

  • Stay updated with Power BI features and industry trends.
  • Proactively recommend enhancements to existing solutions
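The row-level security responsibility in section 5 boils down to one idea: each role carries a filter predicate, and every row a user sees must pass it. In Power BI this is expressed as a DAX filter on a role; the sketch below shows the same concept in plain Python over invented data, not the Power BI API itself:

```python
# Conceptual sketch of row-level security: a role maps to a predicate,
# and users only ever see the rows their role's predicate allows.
rows = [
    {"region": "North", "sales": 120},
    {"region": "South", "sales": 200},
    {"region": "North", "sales": 80},
]

role_filters = {
    "north_manager": lambda r: r["region"] == "North",  # scoped view
    "admin": lambda r: True,                            # unrestricted view
}

def visible_rows(role, data):
    predicate = role_filters[role]
    return [r for r in data if predicate(r)]

north_view = visible_rows("north_manager", rows)
admin_view = visible_rows("admin", rows)
```

In Power BI the equivalent predicate for the scoped role would be a DAX expression on the table (conceptually, [Region] = "North"), evaluated before any visual sees the data.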
Read more
Vy Systems

at Vy Systems

1 recruiter
Kalki K
Posted by Kalki K
Remote only
4 - 12 yrs
₹18L - ₹28L / yr
databricks
skill iconAmazon Web Services (AWS)
SQL
skill iconPython
PySpark

Job Summary


We are seeking an experienced Databricks Developer with strong skills in PySpark, SQL, and Python, and hands-on experience deploying data solutions on AWS (preferred) or Azure. The role involves designing, developing, and optimizing scalable data pipelines and analytics workflows on the Databricks platform.


Key Responsibilities

- Develop and optimize ETL/ELT pipelines using Databricks and PySpark.

- Build scalable data workflows on AWS (EC2, S3, Glue, Lambda, IAM) or Azure (ADF, ADLS, Synapse).

- Implement and manage Delta Lake (ACID, schema evolution, time travel).

- Write efficient, complex SQL for transformation and analytics.

- Build and support batch and streaming ingestion (Kafka, Kinesis, EventHub).

- Optimize Databricks clusters, jobs, notebooks, and PySpark performance.

- Collaborate with cross-functional teams to deliver reliable data solutions.

- Ensure data governance, security, and compliance.

- Troubleshoot pipelines and support CI/CD deployments.


Required Skills & Experience

- 4–8 years in Data Engineering / Big Data development.

- Strong hands-on experience with Databricks (clusters, jobs, workflows).

- Advanced PySpark and strong Python skills.

- Expert-level SQL (complex queries, window functions).

- Practical experience with AWS (preferred) or Azure cloud services.

- Experience with Delta Lake, Parquet, and data lake architectures.

- Familiarity with CI/CD tools (GitHub Actions, Azure DevOps, Jenkins).

- Good understanding of data modeling, optimization, and distributed systems.
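The "expert-level SQL (window functions)" requirement is worth a concrete example. The sketch below runs a running total per partition, the bread-and-butter window-function transformation, on SQLite (3.25+) purely for portability; in the role itself the same query would run on Databricks SQL. The sales table and values are invented:

```python
# Window-function sketch: running total per region, ordered by day.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (day TEXT, region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("2024-01-01", "N", 10.0),
    ("2024-01-02", "N", 20.0),
    ("2024-01-01", "S", 5.0),
    ("2024-01-02", "S", 15.0),
])

rows = conn.execute("""
    SELECT region, day,
           SUM(amount) OVER (PARTITION BY region ORDER BY day) AS running_total
    FROM sales
    ORDER BY region, day
""").fetchall()
```

The ORDER BY inside OVER makes the SUM cumulative within each partition rather than a single per-group total.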

Read more
IT Industry

IT Industry

Agency job
Gurugram, Delhi, Noida, Ghaziabad, Faridabad
5 - 10 yrs
₹10L - ₹18L / yr
skill icon.NET
skill iconC#
SQL
ASP.NET MVC
Web API

We’re Hiring: .NET Developer

📍 Location: Gurgaon

🧑‍💻 Experience: 5+ Years

🏢 5 Days Working | WFO

🔧 Roles & Responsibilities:

• 🖥️ Develop & maintain .NET apps (.NET Framework/Core)

• ⚙️ Build secure, scalable web apps (ASP.NET, MVC, Web API)

• 🗄️ Work with Entity Framework & SQL Server

• 🎨 Frontend: HTML, CSS, JS, Angular/React

• 🧪 Write clean, testable code

• 🤝 Collaborate with cross-functional teams

• 🛠️ Code reviews, debugging & performance tuning

• 🔐 Ensure security, scalability & reliability

🧠 Skills & Qualifications:

• 💼 5+ yrs in .NET development

• 💻 Strong in C#, ASP.NET MVC, .NET Core, Web API

• 🗃️ EF + SQL expertise

• 🧱 Full-stack capabilities

• 🔧 OOP & design patterns

• 🗂️ Git knowledge

• 💡 Strong problem-solving skills

Read more
Fountane inc
Remote only
2 - 4 yrs
₹10L - ₹18L / yr
Firebase
skill iconMongoDB
skill iconExpress
skill iconNodeJS (Node.js)
SQL
+8 more

JOB TITLE: Senior Full Stack Developer (SDE-3)

 

LOCATION: Remote/Hybrid.

 

A LITTLE BIT ABOUT THE ROLE:

 

As a Full Stack Developer, you will be responsible for developing digital systems that deliver optimal end-to-end solutions to our business needs. The work will cover all aspects of software delivery, including working with staff, vendors, and outsourced contributors to build, release and maintain the product.

 

Fountane operates a scrum-based Agile delivery cycle, and you will be working within this. You will work with product owners, user experience, test, infrastructure, and operations professionals to build the most effective solutions.

 

WHAT YOU WILL BE DOING:

 

  • Full-stack development on a multinational team on various products across different technologies and industries.
  • Optimize the development process and identify continuing improvements.
  • Monitor technology landscape, assess and introduce new technology. Own and communicate development processes and standards.
  • The job title does not define or limit your duties, and you may be required to carry out other work within your abilities from time to time at our request. We reserve the right to introduce changes in line with technological developments which may impact your job duties or methods of working.

 

 

WHAT YOU WILL NEED TO BE GREAT IN THIS ROLE:

 

  • Minimum of 3+ years of full-stack development, combined back and front-end experience building fast, reliable web and/or mobile applications.
  • Experience with web frameworks (e.g., React, Angular, or Vue) and/or mobile development (e.g., React Native, NativeScript)
  • Proficient in at least one JavaScript framework or library such as React, Node.js, Angular (2.x), or jQuery.
  • Ability to optimize product development by leveraging software development processes.
  • Bachelor's degree or equivalent work experience (minimum six years); with an Associate’s degree, a minimum of 4 years of work experience.
  • Fountane's current technology stack driving our digital products includes React.js, Node.js, React Native, Angular, Firebase, Bootstrap, MongoDB, Express, Hasura, GraphQL, Amazon Web Services (AWS), and Google Cloud Platform.

 

SOFT SKILLS:

 

  • Collaboration - Ability to work in teams across the world
  • Adaptability - situations are unexpected, and you need to be quick to adapt
  • Open-mindedness - Expect to see things outside the ordinary

 

LIFE AT FOUNTANE:

 

  • Fountane offers an environment where all members are supported, challenged, recognized & given opportunities to grow to their fullest potential.
  • Competitive pay
  • Health insurance
  • Individual/team bonuses
  • Employee stock ownership plan
  • Fun/challenging variety of projects/industries
  • Flexible workplace policy - remote/physical
  • Flat organization - no micromanagement
  • Individual contribution - set your deadlines
  • Above all - culture that helps you grow exponentially.


Qualifications - No bachelor's degree required. Good communication skills are a must!


A LITTLE BIT ABOUT THE COMPANY:

Established in 2017, Fountane Inc is a Ventures Lab incubating and investing in new competitive technology businesses from scratch. Thus far, we’ve created half a dozen multi-million valuation companies in the US and a handful of sister ventures for large corporations, including Target, US Ventures, and Imprint Engine.

We’re a team of 80 strong from around the world that is radically open-minded and believes in excellence and respecting one another.

Read more
Blurgs AI

at Blurgs AI

2 candid answers
Nikita Sinha
Posted by Nikita Sinha
Hyderabad
4 - 8 yrs
Up to ₹25L / yr (varies)
skill iconPython
Data engineering
skill iconMongoDB
SQL
skill iconDocker
+1 more

We are seeking a Technical Lead with strong expertise in backend engineering, real-time data streaming, and platform/infrastructure development to lead the architecture and delivery of our on-premise systems.

You will design and build high-throughput streaming pipelines (Apache Pulsar, Apache Flink), backend services (FastAPI), data storage models (MongoDB, ClickHouse), and internal dashboards/tools (Angular).

In this role, you will guide engineers, drive architectural decisions, and ensure reliable systems deployed on Docker + Kubernetes clusters.


Key Responsibilities

1. Technical Leadership & Architecture

  • Own the end-to-end architecture for backend, streaming, and data systems.
  • Drive system design decisions for ingestion, processing, storage, and DevOps.
  • Review code, enforce engineering best practices, and ensure production readiness.
  • Collaborate closely with founders and domain experts to translate requirements into technical deliverables.

2. Data Pipeline & Streaming Systems

  • Architect and implement real-time, high-throughput data pipelines using Apache Pulsar and Apache Flink.
  • Build scalable ingestion, enrichment, and stateful processing workflows.
  • Integrate multi-sensor maritime data into reliable, unified streaming systems.
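The stateful-processing workflows described above can be sketched at toy scale: a stream operator that carries state across events, here deduplicating deliveries and tracking a last-seen position per vessel. This is plain Python standing in for the equivalent Flink keyed-state logic, and the event shape and field names are invented:

```python
# Toy sketch of stateful stream processing: dedupe events by id and
# keep the latest position per vessel, as a Flink-style operator would.
def process_stream(events):
    seen_ids = set()       # operator state: ids already processed
    last_position = {}     # operator state: vessel -> latest position
    out = []
    for event in events:
        if event["id"] in seen_ids:
            continue       # drop duplicate deliveries (at-least-once input)
        seen_ids.add(event["id"])
        last_position[event["vessel"]] = event["pos"]
        out.append(event)
    return out, last_position

events = [
    {"id": 1, "vessel": "A", "pos": (10.0, 72.0)},
    {"id": 2, "vessel": "B", "pos": (11.0, 70.5)},
    {"id": 1, "vessel": "A", "pos": (10.0, 72.0)},  # duplicate delivery
    {"id": 3, "vessel": "A", "pos": (10.2, 72.1)},
]
out, positions = process_stream(events)
```

In a real Pulsar + Flink pipeline the same state would be keyed by vessel, checkpointed, and partitioned across the cluster rather than held in local Python structures.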

3. Backend Services & Platform Engineering

  • Lead development of microservices and internal APIs using FastAPI (or equivalent backend frameworks).
  • Build orchestration, ETL, and system-control services.
  • Optimize backend systems for latency, throughput, resilience, and long-term maintainability.

4. Data Storage & Modeling

  • Design scalable, efficient data models using MongoDB, ClickHouse, and other on-prem databases.
  • Implement indexing, partitioning, retention, and lifecycle strategies for large datasets.
  • Ensure high-performance APIs and analytics workflows.

5. Infrastructure, DevOps & Containerization

  • Deploy and manage distributed systems using Docker and Kubernetes.
  • Own observability, monitoring, logging, and alerting for all critical services.
  • Implement CI/CD pipelines tailored for on-prem and hybrid cloud environments.

6. Team Management & Mentorship

  • Provide technical guidance to engineers across backend, data, and DevOps teams.
  • Break down complex tasks, review designs, and ensure high-quality execution.
  • Foster a culture of clarity, ownership, collaboration, and engineering excellence.

Required Skills & Experience

  • 5–10+ years of strong software engineering experience.
  • Expertise with streaming platforms like Apache Pulsar, Apache Flink, or similar technologies.
  • Strong backend engineering proficiency — preferably FastAPI, Python, Java, or Scala.
  • Hands-on experience with MongoDB and ClickHouse.
  • Solid experience deploying, scaling, and managing services on Docker + Kubernetes.
  • Strong understanding of distributed systems, high-performance data flows, and system tuning.
  • Experience working with Angular for internal dashboards is a plus.
  • Excellent system-design, debugging, and performance-optimization skills.
  • Prior experience owning critical technical components or leading engineering teams.

Nice to Have

  • Experience with sensor data (AIS, Radar, SAR, EO/IR).
  • Exposure to maritime, defence, or geospatial technology.
  • Experience with bare-metal / on-premise deployments.


Read more
Global Digital Transformation Solutions Provider

Global Digital Transformation Solutions Provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Pune
6 - 12 yrs
₹15L - ₹30L / yr
skill iconMachine Learning (ML)
skill iconAmazon Web Services (AWS)
skill iconKubernetes
ECS
Amazon Redshift
+14 more

Core Responsibilities:

  • The MLE will design, build, test, and deploy scalable machine learning systems, optimizing model accuracy and efficiency
  • Model Development: Algorithms and architectures span traditional statistical methods to deep learning along with employing LLMs in modern frameworks.
  • Data Preparation: Prepare, cleanse, and transform data for model training and evaluation.
  • Algorithm Implementation: Implement and optimize machine learning algorithms and statistical models.
  • System Integration: Integrate models into existing systems and workflows.
  • Model Deployment: Deploy models to production environments and monitor performance.
  • Collaboration: Work closely with data scientists, software engineers, and other stakeholders.
  • Continuous Improvement: Identify areas for improvement in model performance and systems.

 

Skills:

  • Programming and Software Engineering: Knowledge of software engineering best practices (version control, testing, CI/CD).
  • Data Engineering: Ability to handle data pipelines, data cleaning, and feature engineering. Proficiency in SQL for data manipulation, plus Kafka, ChaosSearch logs, etc. for troubleshooting; other tech touch points are ScyllaDB (Bigtable-like), OpenSearch, and the Neo4j graph database.
  • Model Deployment and Monitoring: MLOps Experience in deploying ML models to production environments.
  • Knowledge of model monitoring and performance evaluation.

 

Required experience:

  • Amazon SageMaker: Deep understanding of SageMaker's capabilities for building, training, and deploying ML models; understanding of the SageMaker pipeline, with the ability to analyze gaps and recommend/implement improvements
  • AWS Cloud Infrastructure: Familiarity with S3, EC2, Lambda and using these services in ML workflows
  • AWS data: Redshift, Glue
  • Containerization and Orchestration: Understanding of Docker and Kubernetes, and their implementation within AWS (EKS, ECS)

 

Skills: AWS, AWS Cloud, Amazon Redshift, EKS

 

Must-Haves

Machine Learning + AWS + (EKS OR ECS OR Kubernetes) + (Redshift AND Glue) + SageMaker

Notice period - 0 to 15 days only

Hybrid work mode - 3 days in office, 2 days at home

Read more
Deqode

at Deqode

1 recruiter
Samiksha Agrawal
Posted by Samiksha Agrawal
Mumbai, Pune, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Indore, Bengaluru (Bangalore)
4 - 7 yrs
₹4L - ₹10L / yr
skill iconJava
skill iconSpring Boot
Microservices
SQL
Hibernate (Java)

Job Description

Role: Java Developer

Location: PAN India

Experience: 4+ Years

Required Skills -

  1. 3+ years Java development experience
  2. Spring Boot framework expertise (MANDATORY)
  3. Microservices architecture design & implementation (MANDATORY)
  4. Hibernate/JPA for database operations (MANDATORY)
  5. RESTful API development (MANDATORY)
  6. Database design and optimization (MANDATORY)
  7. Container technologies (Docker/Kubernetes)
  8. Cloud platforms experience (AWS/Azure)
  9. CI/CD pipeline implementation
  10. Code review and quality assurance
  11. Problem-solving and debugging skills
  12. Agile/Scrum methodology
  13. Version control systems (Git)


Read more
Service Co
Hyderabad, Visakhapatnam, Vijayawada, Warangal
4 - 5 yrs
₹5L - ₹10L / yr
skill iconC#
Dot Net Core
skill icon.NET
RESTful APIs
skill iconAngular (2+)
+2 more

Hiring for Dot Net Full Stack Developer


Exp : 4 - 5 yrs

Work Location : Hyderabad WFO

F2F Interview

Notice Period : Immediate - 15 days


Skills :


C#, .NET Core, REST APIs, Angular 10+, TypeScript, HTML, CSS, SQL Server, CI/CD, Git

Read more
Remote only
5 - 15 yrs
₹10L - ₹15L / yr
FastAPI
skill iconPython
RESTful APIs
SQL
NOSQL Databases
+5 more


Summary:

We are seeking a highly skilled Python Backend Developer with proven expertise in FastAPI to join our team as a full-time contractor for 12 months. The ideal candidate will have 5+ years of experience in backend development, a strong understanding of API design, and the ability to deliver scalable, secure solutions. Knowledge of front-end technologies is an added advantage. Immediate joiners are preferred. This role requires full-time commitment—please apply only if you are not engaged in other projects.

Job Type:

Full-Time Contractor (12 months)

Location:

Remote / On-site (Jaipur preferred, as per project needs)

Experience:

5+ years in backend development

Key Responsibilities:

  • Design, develop, and maintain robust backend services using Python and FastAPI.
  •  Implement and manage Prisma ORM for database operations.
  • Build scalable APIs and integrate with SQL databases and third-party services.
  • Deploy and manage backend services using Azure Function Apps and Microsoft Azure Cloud.
  • Collaborate with front-end developers and other team members to deliver high-quality web applications.
  • Ensure application performance, security, and reliability.
  • Participate in code reviews, testing, and deployment processes.

Required Skills:

  • Expertise in Python backend development with strong experience in FastAPI.
  • Solid understanding of RESTful API design and implementation.
  • Proficiency in SQL databases and ORM tools (preferably Prisma)
  • Hands-on experience with Microsoft Azure Cloud and Azure Function Apps.
  • Familiarity with CI/CD pipelines and containerization (Docker).
  • Knowledge of cloud architecture best practices.

Added Advantage:

  • Front-end development knowledge (React, Angular, or similar frameworks).
  • Exposure to AWS/GCP cloud platforms.
  • Experience with NoSQL databases.

Eligibility:

  • Minimum 5 years of professional experience in backend development.
  • Available for full-time engagement.
  • Please excuse if you are currently engaged in other projects—we require dedicated availability.

 

Read more
InEvolution

at InEvolution

2 candid answers
Pavan P K
Posted by Pavan P K
Remote only
5 - 7 yrs
₹6L - ₹10L / yr
SQL
ETL architecture
Data modeling
Data Transformation Tool (DBT)
SQL server
+9 more

Role Overview


As a Senior SQL Developer, you’ll be responsible for data extracts, updating, and maintaining reports as requested by stakeholders. You’ll work closely with finance operations and developers to ensure data requests are appropriately managed.


Key Responsibilities


  • Design, develop, and optimize complex SQL queries, stored procedures, functions, and tasks across multiple databases/schemas.
  • Transform cost-intensive models from full refreshes to incremental loads based on upstream data.
  • Help design proactive monitoring of data to catch data issues/data delays.
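The full-refresh-to-incremental responsibility above follows a standard watermark pattern: remember the latest timestamp loaded, and on each run pull only rows newer than it. A minimal sketch with invented src/dst tables, using SQLite in place of the production warehouse:

```python
# Watermark-based incremental load: only rows with updated_at beyond the
# stored watermark are copied each run, instead of a full refresh.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER, updated_at TEXT, value TEXT)")
conn.execute("CREATE TABLE dst (id INTEGER, updated_at TEXT, value TEXT)")
conn.executemany("INSERT INTO src VALUES (?, ?, ?)", [
    (1, "2024-01-01", "a"),
    (2, "2024-01-05", "b"),
    (3, "2024-01-09", "c"),
])

def incremental_load(watermark):
    rows = conn.execute(
        "SELECT id, updated_at, value FROM src WHERE updated_at > ?",
        (watermark,),
    ).fetchall()
    conn.executemany("INSERT INTO dst VALUES (?, ?, ?)", rows)
    # New watermark = max timestamp loaded (or the old one if nothing new).
    return max([r[1] for r in rows], default=watermark)

wm = incremental_load("2024-01-03")   # picks up ids 2 and 3 only
loaded = conn.execute("SELECT COUNT(*) FROM dst").fetchone()[0]
```

In dbt terms this is roughly what an incremental model's `is_incremental()` filter expresses; the watermark would be persisted between runs rather than returned from a function.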


Qualifications


  • 5+ years of experience as a SQL developer, preferably in a B2C or tech environment.
  • Ability to translate requirements into datasets.
  • Understanding of dbt framework for transformations.
  • Basic usage of git - branching/ PR generation.
  • Detail-oriented with strong organizational and time management skills.
  • Ability to work cross-functionally and manage multiple projects simultaneously.


Bonus Points


  • Experience with Snowflake and AWS data technologies.
  • Experience with Python and containers (Docker)
Read more
AI company

AI company

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
5 - 10 yrs
₹20L - ₹40L / yr
Oracle
Oracle Data Integrator
Oracle ERP
Implementation
Process automation
+30 more

Review Criteria

  • Strong Oracle Integration Cloud (OIC) Implementation profile
  • 5+ years in enterprise integration / middleware roles, with minimum 3+ years of hands-on Oracle Integration Cloud (OIC) implementation experience
  • Strong experience designing and delivering integrations using OIC Integrations, Adapters (File, FTP, DB, SOAP/REST, Oracle ERP), Orchestrations, Mappings, Process Automation, Visual Builder (VBCS), and OIC Insight/Monitoring
  • Proven experience building integrations across Oracle Fusion/ERP/HCM, Salesforce, on-prem systems (AS/400, JDE), APIs, file feeds (FBDI/HDL), databases, and third-party SaaS.
  • Strong expertise in REST/JSON, SOAP/XML, WSDL, XSD, XPath, XSLT, JSON Schema, and web-service–based integrations
  • Good working knowledge of OCI components (API Gateway, Vault, Autonomous DB) and hybrid integration patterns
  • Strong SQL & PL/SQL skills for debugging, data manipulation, and integration troubleshooting
  • Hands-on experience owning end-to-end integration delivery including architecture reviews, deployments, versioning, CI/CD of OIC artifacts, automated testing, environment migrations (Dev→Test→Prod), integration governance, reusable patterns, error-handling frameworks, and observability using OIC/OCI monitoring & logging tools
  • Experience providing technical leadership, reviewing integration designs/code, and mentoring integration developers; must be comfortable driving RCA, performance tuning, and production issue resolution
  • Strong stakeholder management, communication (written + verbal), problem-solving, and ability to collaborate with business/product/architect teams

 

Preferred

  • Preferred (Certification) – Oracle OIC or Oracle Cloud certification
  • Preferred (Domain Exposure) – Experience with Oracle Fusion functional modules (Finance, SCM, HCM), business events/REST APIs, SOA/OSB background, or multi-tenant/API-governed integration environments


Job Specific Criteria

  • CV Attachment is mandatory
  • How many years of experience do you have with Oracle Integration Cloud (OIC)?
  • Which is your preferred job location (Mumbai / Bengaluru / Hyderabad / Gurgaon)?
  • Are you okay with 3 Days WFO?
  • Virtual Interview requires video to be on, are you okay with it?


Role & Responsibilities

Company is seeking an experienced OIC Lead to own the design, development, and deployment of enterprise integrations. The ideal candidate will have at least 6 years of prior experience across integration technologies, with solid hands-on experience implementing OIC integration capabilities. This role offers an exciting opportunity to work on diverse projects, collaborating with cross-functional teams to design, build, and optimize integrations and the supporting infrastructure.

 

Responsibilities:

  • Lead the design and delivery of integration solutions using Oracle Integration Cloud (Integration, Process Automation, Visual Builder, Insight) and related Oracle PaaS components.
  • Build and maintain integrations between Oracle Fusion/ERP/HCM, Salesforce, on-prem applications (e.g., AS/400, JDE), APIs, file feeds (FBDI/HDL), databases and third-party SaaS.
  • Own end-to-end integration delivery - from architecture/design reviews through deployment, monitoring, and post-production support.
  • Create reusable integration patterns, error-handling frameworks, security patterns (OAuth2, client credentials), and governance for APIs and integrations.
  • Own CI/CD, versioning and migration of OIC artifacts across environments (Dev → Test → Prod); implement automated tests and promotion pipelines.
  • Define integration architecture standards and reference patterns for hybrid (cloud/on-prem) deployments.
  • Ensure security, scalability, and fault tolerance are built into all integration designs.
  • Drive performance tuning, monitoring and incident response for integrations; implement observability using OIC/OCI monitoring and logging tools.
  • Provide technical leadership and mentorship to a team of integration developers; review designs and code; run hands-on troubleshooting and production support rotations.
  • Work with business stakeholders, product owners and solution architects to translate requirements into integration designs, data mappings and runbooks.
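The security patterns listed above (OAuth2 with client credentials) ultimately come down to a standard token request. A minimal Python sketch, assuming a hypothetical IDCS-style token URL — it only builds the request pieces and does not call any real Oracle endpoint:

```python
import base64
import urllib.parse

def build_client_credentials_request(token_url, client_id, client_secret, scope=None):
    """Construct the pieces of an OAuth2 client-credentials token request
    (RFC 6749, section 4.4). Returns (url, headers, body) without sending it."""
    creds = f"{client_id}:{client_secret}".encode()
    headers = {
        "Authorization": "Basic " + base64.b64encode(creds).decode(),
        "Content-Type": "application/x-www-form-urlencoded",
    }
    form = {"grant_type": "client_credentials"}
    if scope:
        form["scope"] = scope
    return token_url, headers, urllib.parse.urlencode(form)

# Endpoint, client id/secret, and scope below are all hypothetical.
url, headers, body = build_client_credentials_request(
    "https://idcs.example.com/oauth2/v1/token",
    "my-client-id", "my-client-secret",
    scope="urn:opc:resource:consumer::all",
)
print(body)
```

In OIC itself this exchange is handled by the connection's security policy; the sketch only makes the grant's moving parts (Basic auth header, form-encoded body) concrete.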

 

Ideal Candidate

  • 5+ years in integration/enterprise middleware roles with at least 3 years of hands-on OIC (Oracle Integration Cloud) implementation experience.
  • Strong experience with OIC components: Integrations, Adapters (File, FTP, Database, SOAP, REST, Oracle ERP), Orchestrations/Maps, OIC Insight/Monitoring, Visual Builder (VBCS) or similar
  • Expert in web services and message formats: REST/JSON, SOAP/XML, WSDL, XSD, XPath, XSLT, JSON Schema
  • Good knowledge of Oracle Cloud stack / OCI (API Gateway, Vault, Autonomous DB) and on-prem integration patterns
  • SQL & PL/SQL skills for data manipulation and troubleshooting; exposure to FBDI/HDL (for bulk loads) is desirable
  • Strong problem-solving, stakeholder management, written/verbal communication and team mentoring experience

 

Nice-to-have / Preferred:

  • Oracle OIC certification(s) or Oracle Cloud certifications
  • Exposure to OCI services (API Gateway, Vault, Monitoring) and Autonomous Database
  • Experience with Oracle Fusion functional areas (Finance, Supply Chain, HCM) and business events/REST APIs preferred.
  • Background with SOA Suite/Oracle Service Bus (useful if migrating legacy SOA to OIC)
  • Experience designing multi-tenant integrations, rate limiting/throttling and API monetization strategies.


AI company

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
5 - 10 yrs
₹20L - ₹45L / yr
Data architecture
Data engineering
SQL
Data modeling
GCS
+21 more

Review Criteria

  • Strong Dremio / Lakehouse Data Architect profile
  • 5+ years of experience in Data Architecture / Data Engineering, with minimum 3+ years hands-on in Dremio
  • Strong expertise in SQL optimization, data modeling, query performance tuning, and designing analytical schemas for large-scale systems
  • Deep experience with cloud object storage (S3 / ADLS / GCS) and file formats such as Parquet, Delta, Iceberg along with distributed query planning concepts
  • Hands-on experience integrating data via APIs, JDBC, Delta/Parquet, object storage, and coordinating with data engineering pipelines (Airflow, DBT, Kafka, Spark, etc.)
  • Proven experience designing and implementing lakehouse architecture including ingestion, curation, semantic modeling, reflections/caching optimization, and enabling governed analytics
  • Strong understanding of data governance, lineage, RBAC-based access control, and enterprise security best practices
  • Excellent communication skills with ability to work closely with BI, data science, and engineering teams; strong documentation discipline
  • Candidates must come from enterprise data modernization, cloud-native, or analytics-driven companies


Preferred

  • Preferred (Nice-to-have) – Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) or data catalogs (Collibra, Alation, Purview); familiarity with Snowflake, Databricks, or BigQuery environments


Job Specific Criteria

  • CV Attachment is mandatory
  • How many years of experience do you have with Dremio?
  • Which is your preferred job location (Mumbai / Bengaluru / Hyderabad / Gurgaon)?
  • Are you okay with 3 Days WFO?
  • Virtual Interview requires video to be on, are you okay with it?


Role & Responsibilities

You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.

  • Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
  • Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
  • Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
  • Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
  • Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
  • Support self-service analytics by enabling governed data products and semantic layers.
  • Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
  • Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.


Ideal Candidate

  • Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
  • 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
  • Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
  • Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
  • Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
  • Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
  • Excellent problem-solving, documentation, and stakeholder communication skills.
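One of the distributed query-planning concepts mentioned above — partition pruning over Parquet files in object storage — can be illustrated with a toy sketch. The bucket paths and layout below are hypothetical:

```python
# Conceptual sketch of partition pruning over a Hive-style layout
# (dt=YYYY-MM-DD directories): the planner skips files whose partition
# value cannot satisfy the predicate, so they are never scanned.
def prune_partitions(paths, column, predicate):
    """Keep only file paths whose partition value satisfies the predicate."""
    kept = []
    for p in paths:
        for part in p.split("/"):
            if part.startswith(column + "="):
                if predicate(part.split("=", 1)[1]):
                    kept.append(p)
    return kept

files = [
    "s3://lake/sales/dt=2024-01-01/part-0.parquet",
    "s3://lake/sales/dt=2024-01-02/part-0.parquet",
    "s3://lake/sales/dt=2024-02-01/part-0.parquet",
]
pruned = prune_partitions(files, "dt", lambda v: v >= "2024-02-01")
print(pruned)  # only the February file survives
```

Real engines (Dremio, Trino, Athena) also consult Parquet footer statistics and table-format metadata (Delta/Iceberg manifests) to skip files, but the path-level idea is the same.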


Tecblic Private LImited
Ahmedabad
5 - 6 yrs
₹5L - ₹15L / yr
Windows Azure
Python
SQL
Data Warehouse (DWH)
Data modeling
+5 more

Job Description: Data Engineer

Location: Ahmedabad

Experience: 5 to 6 years

Employment Type: Full-Time



We are looking for a highly motivated and experienced Data Engineer to join our team. As a Data Engineer, you will play a critical role in designing, building, and optimizing data pipelines that ensure the availability, reliability, and performance of our data infrastructure. You will collaborate closely with data scientists, analysts, and cross-functional teams to provide timely and efficient data solutions.



Responsibilities

  • Design and optimize data pipelines for various data sources
  • Design and implement efficient data storage and retrieval mechanisms
  • Develop data modelling solutions and data validation mechanisms
  • Troubleshoot data-related issues and recommend process improvements
  • Collaborate with data scientists and stakeholders to provide data-driven insights and solutions
  • Coach and mentor junior data engineers in the team
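The data validation mechanisms mentioned above can be sketched minimally; the field names and rules here are illustrative, not a prescribed schema:

```python
def validate_rows(rows, required, types):
    """Split rows into (valid, rejected) based on required fields and
    expected types — a minimal stand-in for a pipeline's validation stage."""
    valid, rejected = [], []
    for row in rows:
        ok = all(row.get(f) is not None for f in required) and \
             all(isinstance(row.get(f), t)
                 for f, t in types.items() if row.get(f) is not None)
        (valid if ok else rejected).append(row)
    return valid, rejected

rows = [
    {"id": 1, "amount": 10.5},
    {"id": 2, "amount": "bad"},   # wrong type -> rejected
    {"id": None, "amount": 3.0},  # missing key field -> rejected
]
good, bad = validate_rows(rows, required=["id"], types={"amount": float})
```

In production this role would typically lean on framework-level checks (e.g. Great Expectations or Spark schema enforcement) rather than hand-rolled predicates, but the split-and-quarantine pattern is the same.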




Skills Required:

  • Minimum 4 years of experience in data engineering or a related field
  • Proficient in designing and optimizing data pipelines and data modelling
  • Strong programming expertise in Python
  • Hands-on experience with big data technologies such as Hadoop, Spark, and Hive
  • Extensive experience with cloud data services such as AWS, Azure, and GCP
  • Advanced knowledge of database technologies like SQL, NoSQL, and data warehousing
  • Knowledge of distributed computing and storage systems
  • Familiarity with DevOps practices, Power Automate, and Microsoft Fabric will be an added advantage
  • Strong analytical and problem-solving skills with outstanding communication and collaboration abilities




Qualifications

  • Bachelor's degree in Computer Science, Data Science, or a computer-related field


Codemonk

Posted by Bisman Gill
Bengaluru (Bangalore)
1yr+
Upto ₹18L / yr (Varies)
Go Programming (Golang)
Amazon Web Services (AWS)
Microsoft Windows Azure
Google Cloud Platform (GCP)
React.js
+5 more

Key Responsibilities:

  1. Application Development: Design and implement both client-side and server-side architecture using JavaScript frameworks and back-end technologies like Golang.
  2. Database Management: Develop and maintain relational and non-relational databases (MySQL, PostgreSQL, MongoDB) and optimize database queries and schema design.
  3. API Development: Build and maintain RESTful APIs and/or GraphQL services to integrate with front-end applications and third-party services.
  4. Code Quality & Performance: Write clean, maintainable code and implement best practices for scalability, performance, and security.
  5. Testing & Debugging: Perform testing and debugging to ensure the stability and reliability of applications across different environments and devices.
  6. Collaboration: Work closely with product managers, designers, and DevOps engineers to deliver features aligned with business goals.
  7. Documentation: Create and maintain documentation for code, systems, and application architecture to ensure knowledge transfer and team alignment.

Requirements:

  1. Experience: 1+ years in backend development in a microservices ecosystem, with proven experience in front-end and back-end frameworks.
  2. 1+ years of experience in Golang is mandatory.
  3. Problem-Solving & DSA: Strong analytical skills and attention to detail.
  4. Front-End Skills: Proficiency in JavaScript and modern front-end frameworks (React, Angular, Vue.js) and familiarity with HTML/CSS.
  5. Back-End Skills: Experience with server-side languages and frameworks like Node.js, Express, Python, or Golang.
  6. Database Knowledge: Strong knowledge of relational databases (MySQL, PostgreSQL) and NoSQL databases (MongoDB).
  7. API Development: Hands-on experience with RESTful API design and integration, with GraphQL a plus.
  8. DevOps Understanding: Familiarity with cloud platforms (AWS, Azure, GCP) and containerization (Docker, Kubernetes) is a bonus.
  9. Soft Skills: Excellent problem-solving skills, teamwork, and strong communication abilities.

Nice-to-Have:

  1. UI/UX Sensibility: Understanding of responsive design and user experience principles.
  2. CI/CD Knowledge: Familiarity with CI/CD tools and workflows (Jenkins, GitLab CI).
  3. Security Awareness: Basic understanding of web security standards and best practices.
Banking Industry

Agency job
via Jobdost by Saida Pathan
Mangalore, Mumbai, Pune, Bengaluru (Bangalore)
3 - 5 yrs
₹8L - ₹10L / yr
SQL
Dashboard
Data Analytics
Database Development

Who is an ideal fit for us?

We seek professionals who are analytical, demonstrate self-motivation, exhibit a proactive mindset, and possess a strong sense of responsibility and ownership in their work.

 

What will you get to work on?

As a member of the Implementation & Analytics team, you will:

  • Design, develop, and optimize complex SQL queries to extract, transform, and analyze data
  • Create advanced reports and dashboards using SQL, stored procedures, and other reporting tools
  • Develop and maintain database structures, stored procedures, functions, and triggers
  • Optimize database performance by tuning SQL queries and indexing to handle large datasets efficiently
  • Collaborate with business stakeholders and analysts to understand analytics requirements
  • Automate data extraction, transformation, and reporting processes to improve efficiency


What do we expect from you?

For the SQL/Oracle Developer role, we are seeking candidates with the following skills and expertise:

  • Proficiency in SQL (window functions, stored procedures) and MS Excel (advanced Excel skills)
  • More than 3 years of relevant experience
  • Java / Python experience is a plus but not mandatory
  • Strong communication skills to interact with customers and understand their requirements
  • Capable of working independently with minimal guidance, showcasing self-reliance and initiative
  • Previous experience in automation projects is preferred
  • Work From Office: Bangalore/Navi Mumbai/Pune/Client locations
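The window-function proficiency asked for above can be exercised locally: SQLite (3.25+, bundled with recent Python) supports the same ROW_NUMBER and running-total idioms. The table and data are illustrative:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE txns (acct TEXT, ts INTEGER, amt REAL)")
con.executemany("INSERT INTO txns VALUES (?,?,?)",
                [("A", 1, 100.0), ("A", 2, 50.0), ("B", 1, 70.0)])

# Per-account running total and row number — classic window-function tasks.
rows = con.execute("""
    SELECT acct, ts, amt,
           SUM(amt) OVER (PARTITION BY acct ORDER BY ts) AS running_total,
           ROW_NUMBER() OVER (PARTITION BY acct ORDER BY ts) AS rn
    FROM txns ORDER BY acct, ts
""").fetchall()
print(rows)
```

The same SQL runs unchanged (modulo dialect details) on Oracle, SQL Server, and PostgreSQL.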


Deqode

Posted by purvisha Bhavsar
Remote only
5 - 7 yrs
₹10L - ₹25L / yr
Windows Azure
Data engineering
SQL
CI/CD
databricks

Role: Senior Data Engineer (Azure)

Experience: 5+ Years

Location: Anywhere in India

Work Mode: Remote

Notice Period: Immediate joiners or candidates serving notice period

Key Responsibilities:

  • Data processing on Azure using ADF, Streaming Analytics, Event Hubs, Azure Databricks, Data Migration Services, and Data Pipelines
  • Provisioning, configuring, and developing Azure solutions (ADB, ADF, ADW, etc.)
  • Designing and implementing scalable data models and migration strategies
  • Working on distributed big data batch or streaming pipelines (Kafka or similar)
  • Developing data integration & transformation solutions for structured and unstructured data
  • Collaborating with cross-functional teams for performance tuning and optimization
  • Monitoring data workflows and ensuring compliance with governance and quality standards
  • Driving continuous improvement through automation and DevOps practices

Mandatory Skills & Experience:

  • 5–10 years of experience as a Data Engineer
  • Strong proficiency in Azure Databricks, PySpark, Python, SQL, and Azure Data Factory
  • Experience in Data Modelling, Data Migration, and Data Warehousing
  • Good understanding of database structure principles and schema design
  • Hands-on experience with MS SQL Server, Oracle, or similar RDBMS platforms
  • Experience with DevOps tools (Azure DevOps, Jenkins, Airflow, Azure Monitor) — good to have
  • Knowledge of distributed data processing and real-time streaming (Kafka/Event Hub)
  • Familiarity with visualization tools like Power BI or Tableau
  • Strong analytical, problem-solving, and debugging skills
  • Self-motivated, detail-oriented, and capable of managing priorities effectively
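A recurring pattern behind the data migration and pipeline work above is the watermark-based incremental load: copy only rows newer than the last processed timestamp. A minimal sketch using SQLite as a stand-in for the source system and warehouse (in ADF/Databricks the same idea is expressed with lookup activities or Delta merge):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE source (id INTEGER, updated_at INTEGER)")
con.execute("CREATE TABLE target (id INTEGER, updated_at INTEGER)")
con.execute("CREATE TABLE watermark (last_ts INTEGER)")
con.execute("INSERT INTO watermark VALUES (0)")
con.executemany("INSERT INTO source VALUES (?,?)", [(1, 10), (2, 20), (3, 30)])

def incremental_load(con):
    """Copy rows newer than the watermark, then advance the watermark."""
    (last_ts,) = con.execute("SELECT last_ts FROM watermark").fetchone()
    con.execute("INSERT INTO target SELECT * FROM source WHERE updated_at > ?",
                (last_ts,))
    con.execute("UPDATE watermark SET last_ts = "
                "(SELECT COALESCE(MAX(updated_at), ?) FROM target)", (last_ts,))

incremental_load(con)                       # first run: copies all three rows
con.execute("INSERT INTO source VALUES (4, 40)")
incremental_load(con)                       # second run: copies only the new row
count = con.execute("SELECT COUNT(*) FROM target").fetchone()[0]
print(count)  # 4 — no row copied twice
```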


Travelling Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Gurugram
1 - 2 yrs
₹15L - ₹18L / yr
Data Visualization
Data Analytics
Python
Product Management
Data-flow analysis
+15 more

MANDATORY CRITERIA:

- Candidate must be a graduate of IIT, NIT, NSUT, or DTU.

- 1–2 years of pure Product Analyst experience is mandatory.

- Strong hands-on experience in product analysis, data analysis, and Python.

- Python skill of at least 3/5.

- Proficiency in SQL and ability to work with large datasets.

- Experience with A/B testing, cohorts, funnels, retention, and product metrics is a must.

- Hands-on experience with data visualization tools (Tableau, Power BI, Looker, Mixpanel, GA, etc.).

- Experience with Jira is required.

- Strong communication skills with the ability to work with Product, Engineering, Business, and Ops teams.


OVERVIEW:

As a Product Analyst, you will play a critical role in driving product decisions through data insights and operational understanding. You’ll work closely with Product Managers, Engineering, Business, and Operations teams to analyze user behavior, monitor feature performance, and identify opportunities that accelerate growth, improve user experience, and increase revenue. Your focus will be on translating data into actionable strategies, supporting product roadmaps, and enabling informed decision-making across demand-side projects and operations.


WHAT YOU WILL DO?

● Partnering with Product Managers and cross-functional teams to define metrics, build dashboards, and track product performance.

● Conducting deep-dive analyses of large-scale data to identify trends, user behavior patterns, growth gaps, and improvement opportunities.

● Performing competitive benchmarking and industry research to support product strategy and prioritization.

● Generating data-backed insights to drive feature enhancements, product experiments, and business decisions.

● Tracking post-launch impact by measuring adoption, engagement, retention, and ROI of new features.

● Working with Data, Engineering, Business, and Ops teams to design and measure experiments (A/B tests, cohorts, funnels).

● Creating reports, visualizations, and presentations that simplify complex data for stakeholders and leadership.

● Supporting the product lifecycle with relevant data inputs during research, ideation, launch, and optimization phases.
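The A/B-test measurement mentioned above usually reduces to comparing two conversion rates. A minimal two-proportion z-test sketch with pooled standard error (the sample numbers are illustrative):

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """z-statistic for comparing conversion rates of control (A) vs
    variant (B), using the pooled-proportion standard error."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)          # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# 5.0% vs 6.5% conversion on 4,000 users per arm:
z = two_proportion_ztest(conv_a=200, n_a=4000, conv_b=260, n_b=4000)
print(round(z, 2))  # positive z favours B; |z| > 1.96 => significant at 5%
```

Real experimentation stacks (Mixpanel, GA, in-house platforms) report this for you, but knowing what the statistic means keeps funnel and retention readouts honest.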


WHAT WE ARE LOOKING FOR?

● Bachelor’s degree in engineering, statistics, business, economics, mathematics, data science, or a related field.

● Strong analytical, quantitative, and problem-solving skills.

● Proficiency in SQL and ability to work with large datasets.

● Experience with data visualization/reporting tools (e.g., Excel, Google Sheets, Power BI, Tableau, Looker, Mixpanel, GA).

● Excellent communication skills — able to turn data into clear narratives and actionable recommendations.

● Ability to work collaboratively in cross-functional teams.

● Passion for product, user behavior, and data-driven decision-making.

● Prior internship or work experience in product analytics, business analysis, consulting, or growth teams.

● Familiarity with experimentation techniques (A/B testing, funnels, cohorts, retention metrics).

● Understanding of product management concepts and tools (Jira, Confluence, etc.).

● Knowledge of Python or R for data analysis (optional but beneficial).

● Exposure to consumer tech, mobility, travel, or marketplaces.
