
50+ SQL Jobs in India

Apply to 50+ SQL Jobs on CutShort.io. Find your next job, effortlessly. Browse SQL Jobs and apply today!

Remote only
8 - 13 yrs
₹10L - ₹33L / yr
Python
PySpark
Big Data
SQL

Role: Lead Data Engineer Core

Responsibilities: Lead end-to-end design, development, and delivery of complex cloud-based data pipelines.

Collaborate with architects and stakeholders to translate business requirements into technical data solutions.

Ensure scalability, reliability, and performance of data systems across environments. Provide mentorship and technical leadership to data engineering teams. Define and enforce best practices for data modeling, transformation, and governance.


Optimize data ingestion and transformation frameworks for efficiency and cost management. Contribute to data architecture design and review sessions across projects.


Qualifications: Bachelor’s or Master’s degree in Computer Science, Engineering, or related field.

8+ years of experience in data engineering with proven leadership in designing cloud-native data systems.


Strong expertise in Python, SQL, Apache Spark, and at least one cloud platform (Azure, AWS, or GCP). Experience with Big Data, Data Lake, Delta Lake, and Lakehouse architectures. Proficient in one or more database technologies (e.g., PostgreSQL, Redshift, Snowflake, and NoSQL databases).


Ability to recommend and implement scalable data pipelines.

Preferred Qualifications: Cloud certification (AWS, Azure, or GCP). Experience with Databricks, Snowflake, or Terraform. Familiarity with data governance, lineage, and observability tools. Strong collaboration skills and ability to influence data-driven decisions across teams.
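As a small, hedged illustration of the pipeline work described above (the posting names PySpark; plain Python is used here for brevity, and all record and field names are invented): deduplicating late-arriving records so only the latest version per key lands in the curated table.

```python
from typing import Iterable

def latest_by_key(records: Iterable[dict], key: str = "id", ts: str = "updated_at") -> list[dict]:
    """Keep only the most recent record per key, a common step when
    landing change-data-capture feeds into a Delta/Lakehouse table."""
    latest: dict = {}
    for rec in records:
        k = rec[key]
        # Replace the stored record if this one is newer (or unseen).
        if k not in latest or rec[ts] > latest[k][ts]:
            latest[k] = rec
    return sorted(latest.values(), key=lambda r: r[key])

rows = [
    {"id": 1, "updated_at": "2024-01-01", "status": "new"},
    {"id": 1, "updated_at": "2024-01-03", "status": "shipped"},
    {"id": 2, "updated_at": "2024-01-02", "status": "new"},
]
print(latest_by_key(rows))
```

In PySpark the same logic is typically expressed as a window over the key ordered by the timestamp, keeping row number 1.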

Industry Automation

Agency job
via Michael Page by Pramod P
Bengaluru (Bangalore)
5 - 9 yrs
₹20L - ₹30L / yr
C#
Microsoft Windows Azure
API
SQL
NOSQL Databases

Your job:

• Develop and maintain software components, including APIs and microservices

• Optimize backend systems on Microsoft Azure using App Services, Functions, and AzureSQL

• Contribute to frontend development as needed in a full-stack capacity

• Participate in code reviews, unit testing, and bug fixing to ensure high code quality

• Collaborate with the development team, product owner, and DevOps team in agile projects

• Maintain clear and comprehensive technical documentation for all features and APIs


Your qualification:

• Master’s or bachelor’s degree in computer science

• 5 to 8 years of experience in backend web application development

• Expertise in backend technologies such as C#/.NET Core and in databases, including SQL and NoSQL (AzureSQL, Cosmos DB)

• Experience with Microsoft Azure services (App Services, Functions, SQL) and familiarity with frontend technologies (JavaScript/TypeScript and/ or Angular) would be an added advantage

• Proficiency in cloud-based backend development, full-stack development, and software optimization

• Experience with agile methodologies, unit testing, automated testing, and CI/CD pipelines would be beneficial

• Excellent written and spoken English communication skills

Service Co

Agency job
via Vikash Technologies by Rishika Teja
Mumbai, Navi Mumbai, Pune
3 - 5 yrs
₹12L - ₹15L / yr
PowerBI
SQL
SQL Server Reporting Services (SSRS)

Hiring for Power BI Developer


Exp : 3 - 5 yrs

Edu : Any Graduates

Work Location : Mumbai, Airoli 

Notice Period : Immediate - 15 days


Skills:

Power BI Desktop/Service

SSRS

SQL Server


Good to Have - SSIS, SSAS, Report Migration, Performance Optimization

Ekloud INC
Posted by ashwini rathod
Remote only
8 - 15 yrs
₹7L - ₹30L / yr
Java
Fullstack Developer
Angular (2+)
Spring Boot
SQL

Java Angular Fullstack Developer

 

Job Description:


Technical Lead – Full Stack

Experience: 8–12 years (strong candidates: Java 50%, Angular 50%)

Location: Remote

PF number is mandatory.



Tech Stack: Java, Spring Boot, Microservices, Angular, SQL

Focus: Hands-on coding, solution design, team leadership, delivery ownership

 

Must-Have Skills (Depth)



Java (8+): Streams, concurrency, collections, JVM internals (GC), exception handling.

Spring Boot: Security, Actuator, Data/JPA, Feign/RestTemplate, validation, profiles, configuration management.

Microservices: API design, service discovery, resilience patterns (Hystrix/Resilience4j), messaging (Kafka/RabbitMQ) optional.

React: Hooks, component lifecycle, state management, error boundaries, testing (Jest/RTL).

SQL: Joins, aggregations, indexing, query optimization, transaction isolation, schema design.

Testing: JUnit/Mockito for backend; Jest/RTL/Cypress for frontend.

DevOps: Git, CI/CD, containers (Docker), familiarity with deployment environments.
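The SQL depth the role asks for (joins, aggregations, indexing, schema design) can be sketched in a few self-contained lines; SQLite stands in for the production database here, and the schema and data are purely illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    -- Index on the join/filter column so lookups avoid a full scan.
    CREATE INDEX idx_orders_customer ON orders(customer_id);
    INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ravi');
    INSERT INTO orders VALUES (1, 1, 120.0), (2, 1, 80.0), (3, 2, 40.0);
""")
# Join plus aggregation: total order value per customer, largest first.
rows = conn.execute("""
    SELECT c.name, SUM(o.amount) AS total
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('Asha', 200.0), ('Ravi', 40.0)]
```

On a real engine, interview-level depth also means reading the query plan (e.g. `EXPLAIN`) to confirm the index is actually used.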

Kanerika Software

Posted by Bisman Gill
Hyderabad, Ahmedabad, Indore
3 - 5 yrs
Upto ₹18L / yr (Varies)
.NET
Angular (2+)
SQL
Training and Development
Azure

Who we are:

Kanerika Inc. is a premier global software products and services firm that specializes in providing innovative solutions and services for data-driven enterprises. Our focus is to empower businesses to achieve their digital transformation goals and maximize their business impact through the effective use of data and AI. We leverage cutting-edge technologies in data analytics, data governance, AI-ML, GenAI/ LLM and industry best practices to deliver custom solutions that help organizations optimize their operations, enhance customer experiences, and drive growth.

Awards and Recognitions

Kanerika has won several awards over the years, including:

  1. Best Place to Work 2023 by Great Place to Work®
  2. Top 10 Most Recommended RPA Start-Ups in 2022 by RPA Today
  3. NASSCOM Emerge 50 Award in 2014
  4. Frost & Sullivan India 2021 Technology Innovation Award for its Kompass composable solution architecture
  5. Kanerika has also been recognized for its commitment to customer privacy and data security, having achieved ISO 27701, SOC2, and GDPR compliances.

Working for us

Kanerika is rated 4.6/5 on Glassdoor, for many good reasons. We truly value our employees' growth, well-being, and diversity, and people’s experiences bear this out. At Kanerika, we offer a host of enticing benefits that create an environment where you can thrive both personally and professionally. From our inclusive hiring practices and mandatory training on creating a safe work environment to our flexible working hours and generous parental leave, we prioritize the well-being and success of our employees. Our commitment to professional development is evident through our mentorship programs, job training initiatives, and support for professional certifications. Additionally, our company-sponsored outings and various time-off benefits ensure a healthy work-life balance. Join us at Kanerika and become part of a vibrant and diverse community where your talents are recognized, your growth is nurtured, and your contributions make a real impact. See the benefits section below for the perks you’ll get while working for Kanerika.


Locations

We are located in Austin (USA), Singapore, Hyderabad, Indore and Ahmedabad (India). 

Job Location: Hyderabad, Indore and Ahmedabad.

Role:

We are looking for a highly skilled Full Stack .NET Developer with strong hands-on experience in C#, .NET Core, ASP.NET Core, Web API, and Microservices Architecture, proficient in developing scalable and high-performing applications using SQL Server, NoSQL databases, and Entity Framework (v6+). The role calls for excellent troubleshooting, problem-solving, and communication skills, and the ability to collaborate effectively with cross-functional and international teams, including US counterparts.

Technical Skills

  • Programming Languages: C#, TypeScript, JavaScript
  • Frameworks & Technologies: .NET Core, ASP.NET Core, Web API, Angular (v10+), Entity Framework (v6+), Microservices Architecture
  • Databases: SQL Server, NoSQL
  • Cloud Platform: Microsoft Azure
  • Design & Architecture: OOPs Concepts, Design Patterns, Reusable Libraries, Microservices Implementation
  • Front-End Development: Angular Material, HTML5, CSS3, Responsive UI Development
  • Additional Skills: Excellent troubleshooting abilities, strong communication (verbal & written), and effective collaboration with US counterparts

What You’ll Bring:

  • Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent work experience.
  • 2-5 years of experience.
  • Proven experience delivering high-quality web applications.

Mandatory Skills

  • Strong hands-on experience with C#, SQL Server, OOP concepts, and microservices architecture.
  • Solid experience with .NET Core, ASP.NET Core, Web API, SQL, NoSQL, Entity Framework 6 or above, Azure, and applying design patterns.
  • Strong proficiency in the Angular framework (v10+ preferred) and TypeScript, with a solid understanding of HTML5, CSS3, and JavaScript.
  • Skill in writing reusable libraries and experience with Angular Material or other UI component libraries.
  • Excellent communication skills, both oral and written.
  • Excellent troubleshooting skills and the ability to communicate clearly with US counterparts.

Preferred Skills (Nice to Have):

  • Self-starter with solid analytical and problem-solving skills.
  • Willingness to work extra hours to meet deliverables.
  • Understanding of Agile/Scrum methodologies.
  • Exposure to cloud platforms like AWS/Azure.

Why join us? 

  • Work with a passionate and innovative team in a fast-paced, growth-oriented environment.
  • Gain hands-on experience in content marketing with exposure to real-world projects.
  • Opportunity to learn from experienced professionals and enhance your marketing skills.
  • Contribute to exciting initiatives and make an impact from day one. 
  • Competitive stipend and potential for growth within the company.

Employee Benefits

1. Culture:

  1. Open Door Policy: Encourages open communication and accessibility to management.
  2. Open Office Floor Plan: Fosters a collaborative and interactive work environment.
  3. Flexible Working Hours: Allows employees to have flexibility in their work schedules.
  4. Employee Referral Bonus: Rewards employees for referring qualified candidates.
  5. Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.

2. Inclusivity and Diversity:

  1. Hiring practices that promote diversity: Ensures a diverse and inclusive workforce.
  2. Mandatory POSH training: Promotes a safe and respectful work environment.

3. Health Insurance and Wellness Benefits:

  1. GMC and Term Insurance: Offers medical coverage and financial protection.
  2. Health Insurance: Provides coverage for medical expenses.
  3. Disability Insurance: Offers financial support in case of disability.

4. Child Care & Parental Leave Benefits:

  1. Company-sponsored family events: Creates opportunities for employees and their families to bond.
  2. Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child.
  3. Family Medical Leave: Offers leave for employees to take care of family members' medical needs.

5. Perks and Time-Off Benefits:

  1. Company-sponsored outings: Organizes recreational activities for employees.
  2. Gratuity: Provides a monetary benefit as a token of appreciation.
  3. Provident Fund: Helps employees save for retirement.
  4. Generous PTO: Offers more than the industry standard for paid time off.
  5. Paid sick days: Allows employees to take paid time off when they are unwell.
  6. Paid holidays: Gives employees paid time off for designated holidays.
  7. Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.

6. Professional Development Benefits:

  1. L&D with FLEX- Enterprise Learning Repository: Provides access to a learning repository for professional development.
  2. Mentorship Program: Offers guidance and support from experienced professionals.
  3. Job Training: Provides training to enhance job-related skills.
  4. Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
  5. Promote from Within: Encourages internal growth and advancement opportunities.



Grey Chain Technology

Posted by Pratikshya Pusty
Remote, Gurugram
3 - 6 yrs
₹6L - ₹8L / yr
Functional testing
SQL
API

Job Description

3-5 years of hands-on experience in manual testing involving functional, non-functional, regression, and integration testing in a structured environment.

Candidate should have exceptional communication skills.

Should have a minimum of 1 year of work experience in data comparison testing.

Experience in testing web-based applications.

Able to define the scope of testing.

Experience in testing large-scale solutions integrating multiple source and target systems.

Experience in API testing.

Experience in Database verification using SQL queries.

Experience working in an Agile team.

Should be able to attend Agile ceremonies in UK hours.

Having a good understanding of Data Migration projects will be a plus.
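The data-comparison testing mentioned above might look like the following hedged sketch: verifying that records extracted from a source system match the target after migration, with mismatches reported for defect logging. The function name and all data are invented for the example.

```python
def compare_datasets(source: list[dict], target: list[dict], key: str = "id") -> dict:
    """Compare source and target extracts by primary key, reporting
    rows missing from the target and field-level mismatches."""
    src = {r[key]: r for r in source}
    tgt = {r[key]: r for r in target}
    # Keys present in source but absent from target.
    missing = sorted(src.keys() - tgt.keys())
    # Keys present in both where any field differs.
    mismatched = [k for k in sorted(src.keys() & tgt.keys()) if src[k] != tgt[k]]
    return {"missing_in_target": missing, "mismatched": mismatched}

source = [{"id": 1, "amt": 100}, {"id": 2, "amt": 50}, {"id": 3, "amt": 75}]
target = [{"id": 1, "amt": 100}, {"id": 2, "amt": 55}]
print(compare_datasets(source, target))
# {'missing_in_target': [3], 'mismatched': [2]}
```

In practice the two lists would come from SQL queries against the source and target databases rather than literals.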

Kanerika Software

Posted by Bisman Gill
Hyderabad, Indore, Ahmedabad
6 - 9 yrs
Upto ₹30L / yr (Varies)
.NET
Angular (2+)
SQL

Who we are:

Kanerika Inc. is a premier global software products and services firm that specializes in providing innovative solutions and services for data-driven enterprises. Our focus is to empower businesses to achieve their digital transformation goals and maximize their business impact through the effective use of data and AI.

We leverage cutting-edge technologies in data analytics, data governance, AI-ML, GenAI/ LLM and industry best practices to deliver custom solutions that help organizations optimize their operations, enhance customer experiences, and drive growth.


Awards and Recognitions:

Kanerika has won several awards over the years, including:

1. Best Place to Work 2023 by Great Place to Work®

2. Top 10 Most Recommended RPA Start-Ups in 2022 by RPA Today

3. NASSCOM Emerge 50 Award in 2014

4. Frost & Sullivan India 2021 Technology Innovation Award for its Kompass composable solution architecture

5. Kanerika has also been recognized for its commitment to customer privacy and data security, having achieved ISO 27701, SOC2, and GDPR compliances.


Working for us:

Kanerika is rated 4.6/5 on Glassdoor, for many good reasons. We truly value our employees' growth, well-being, and diversity, and people’s experiences bear this out. At Kanerika, we offer a host of enticing benefits that create an environment where you can thrive both personally and professionally. From our inclusive hiring practices and mandatory training on creating a safe work environment to our flexible working hours and generous parental leave, we prioritize the well-being and success of our employees.


Our commitment to professional development is evident through our mentorship programs, job training initiatives, and support for professional certifications. Additionally, our company-sponsored outings and various time-off benefits ensure a healthy work-life balance. Join us at Kanerika and become part of a vibrant and diverse community where your talents are recognized, your growth is nurtured, and your contributions make a real impact. See the benefits section below for the perks you’ll get while working for Kanerika.


About the Role:

We are looking for a highly skilled Full Stack .NET Developer with strong hands-on experience in C#, .NET Core, ASP.NET Core, Web API, and Microservices Architecture, proficient in developing scalable and high-performing applications using SQL Server, NoSQL databases, and Entity Framework (v6+). The role calls for excellent troubleshooting, problem-solving, and communication skills, and the ability to collaborate effectively with cross-functional and international teams, including US counterparts.


Technical Skills:

  • Programming Languages: C#, TypeScript, JavaScript
  • Frameworks & Technologies: .NET Core, ASP.NET Core, Web API, Angular (v10+), Entity Framework (v6+), Microservices Architecture
  • Databases: SQL Server, NoSQL
  • Cloud Platform: Microsoft Azure
  • Design & Architecture: OOPs Concepts, Design Patterns, Reusable Libraries, Microservices Implementation
  • Front-End Development: Angular Material, HTML5, CSS3, Responsive UI Development
  • Additional Skills: Excellent troubleshooting abilities, strong communication (verbal & written), and effective collaboration with US counterparts


What You’ll Bring:

  • Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent work experience.
  • 6+ years of experience
  • Proven experience delivering high-quality web applications.


Mandatory Skills:

  • Strong hands-on experience with C#, SQL Server, OOP concepts, and microservices architecture.
  • Solid experience with .NET Core, ASP.NET Core, Web API, SQL, NoSQL, Entity Framework 6 or above, Azure, and applying design patterns. Strong proficiency in the Angular framework (v10+ preferred) and TypeScript, with a solid understanding of HTML5, CSS3, and JavaScript.
  • Skill in writing reusable libraries and experience with Angular Material or other UI component libraries.
  • Excellent communication skills, both oral and written.
  • Excellent troubleshooting skills and the ability to communicate clearly with US counterparts.


Preferred Skills (Nice to Have):

  • Self-starter with solid analytical and problem-solving skills. Willingness to work extra hours to meet deliverables.
  • Understanding of Agile/Scrum methodologies.
  • Exposure to cloud platforms like AWS/Azure.


Employee Benefits:

1. Culture:

  • Open Door Policy: Encourages open communication and accessibility to management.
  • Open Office Floor Plan: Fosters a collaborative and interactive work environment.
  • Flexible Working Hours: Allows employees to have flexibility in their work schedules.
  • Employee Referral Bonus: Rewards employees for referring qualified candidates.
  • Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.


2. Inclusivity and Diversity:

  • Hiring practices that promote diversity: Ensures a diverse and inclusive workforce.
  • Mandatory POSH training: Promotes a safe and respectful work environment.


3. Health Insurance and Wellness Benefits:

  • GMC and Term Insurance: Offers medical coverage and financial protection.
  • Health Insurance: Provides coverage for medical expenses.
  • Disability Insurance: Offers financial support in case of disability.


4. Child Care & Parental Leave Benefits:

  • Company-sponsored family events: Creates opportunities for employees and their families to bond.
  • Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child.
  • Family Medical Leave: Offers leave for employees to take care of family members' medical needs.


5. Perks and Time-Off Benefits:

  • Company-sponsored outings: Organizes recreational activities for employees.
  • Gratuity: Provides a monetary benefit as a token of appreciation.
  • Provident Fund: Helps employees save for retirement.
  • Generous PTO: Offers more than the industry standard for paid time off.
  • Paid sick days: Allows employees to take paid time off when they are unwell.
  • Paid holidays: Gives employees paid time off for designated holidays.
  • Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.


6. Professional Development Benefits:

  • L&D with FLEX- Enterprise Learning Repository: Provides access to a learning repository for professional development.
  • Mentorship Program: Offers guidance and support from experienced professionals.
  • Job Training: Provides training to enhance job-related skills.
  • Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
  • Promote from Within: Encourages internal growth and advancement opportunities.
Aryush Infotech India Pvt Ltd
Posted by Nitin Gupta
Bengaluru (Bangalore), Bhopal
2 - 3 yrs
₹3L - ₹4L / yr
Fintech
Test Automation (QA)
Manual testing
Postman
JIRA

Job Title: QA Tester – FinTech (Manual + Automation Testing)

Location: Bangalore, India

Job Type: Full-Time

Experience Required: 3 Years

Industry: FinTech / Financial Services

Function: Quality Assurance / Software Testing

 

About the Role:

We are looking for a skilled QA Tester with 3 years of experience in both manual and automation testing, ideally in the FinTech domain. The candidate will work closely with development and product teams to ensure that our financial applications meet the highest standards of quality, performance, and security.

 

Key Responsibilities:

  • Analyze business and functional requirements for financial products and translate them into test scenarios.
  • Design, write, and execute manual test cases for new features, enhancements, and bug fixes.
  • Develop and maintain automated test scripts using tools such as Selenium, TestNG, or similar frameworks.
  • Conduct API testing using Postman, Rest Assured, or similar tools.
  • Perform functional, regression, integration, and system testing across web and mobile platforms.
  • Work in an Agile/Scrum environment and actively participate in sprint planning, stand-ups, and retrospectives.
  • Log and track defects using JIRA or a similar defect management tool.
  • Collaborate with developers, BAs, and DevOps teams to improve quality across the SDLC.
  • Ensure test coverage for critical fintech workflows like transactions, KYC, lending, payments, and compliance.
  • Assist in setting up CI/CD pipelines for automated test execution using tools like Jenkins, GitLab CI, etc.

 

Required Skills and Experience:

  • 3+ years of hands-on experience in manual and automation testing.
  • Solid understanding of QA methodologies, STLC, and SDLC.
  • Experience in testing FinTech applications such as digital wallets, online banking, investment platforms, etc.
  • Strong experience with Selenium WebDriver, TestNG, Postman, and JIRA.
  • Knowledge of API testing, including RESTful services.
  • Familiarity with SQL to validate data in databases.
  • Understanding of CI/CD processes and basic scripting for automation integration.
  • Good problem-solving skills and attention to detail.
  • Excellent communication and documentation skills.
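The posting names Selenium and TestNG; as a browser-free illustration of the automated-check style described above, here is a hypothetical unittest suite for a payment-validation rule. The `validate_payment` function and its rules are invented for the example, not taken from any real product.

```python
import unittest

def validate_payment(payload: dict) -> list[str]:
    """Hypothetical rule check for a payment request (illustrative only)."""
    errors = []
    if payload.get("amount", 0) <= 0:
        errors.append("amount must be positive")
    if not payload.get("kyc_verified", False):
        errors.append("KYC not verified")
    if payload.get("currency") not in {"INR", "USD"}:
        errors.append("unsupported currency")
    return errors

class PaymentValidationTests(unittest.TestCase):
    def test_valid_payment_passes(self):
        payload = {"amount": 250.0, "currency": "INR", "kyc_verified": True}
        self.assertEqual(validate_payment(payload), [])

    def test_unverified_kyc_is_rejected(self):
        payload = {"amount": 250.0, "currency": "INR", "kyc_verified": False}
        self.assertIn("KYC not verified", validate_payment(payload))
```

In a real fintech suite the same structure would wrap Selenium page actions or API calls instead of a pure function, and run in CI on every commit.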

 

Preferred Qualifications:

  • Exposure to financial compliance and regulatory testing (e.g., PCI DSS, AML/KYC).
  • Experience with mobile app testing (iOS/Android).
  • Working knowledge of test management tools like TestRail, Zephyr, or Xray.
  • Performance testing experience (e.g., JMeter, LoadRunner) is a plus.
  • Basic knowledge of version control systems (e.g., Git).


Global digital transformation solutions provider

Agency job
via Peak Hire Solutions by Dhara Thakkar
Pune
6 - 9 yrs
₹15L - ₹25L / yr
Data engineering
Apache Kafka
Python
Amazon Web Services (AWS)
AWS Lambda

Job Details

- Job Title: Lead I - Data Engineering 

- Industry: Global digital transformation solutions provider

- Domain: Information Technology (IT)

- Experience Required: 6-9 years

- Employment Type: Full Time

- Job Location: Pune

- CTC Range: Best in Industry


Job Description

Job Title: Senior Data Engineer (Kafka & AWS)

Responsibilities:

  • Develop and maintain real-time data pipelines using Apache Kafka (MSK or Confluent) and AWS services.
  • Configure and manage Kafka connectors, ensuring seamless data flow and integration across systems.
  • Demonstrate strong expertise in the Kafka ecosystem, including producers, consumers, brokers, topics, and schema registry.
  • Design and implement scalable ETL/ELT workflows to efficiently process large volumes of data.
  • Optimize data lake and data warehouse solutions using AWS services such as Lambda, S3, and Glue.
  • Implement robust monitoring, testing, and observability practices to ensure reliability and performance of data platforms.
  • Uphold data security, governance, and compliance standards across all data operations.
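A minimal sketch of the consume-transform-produce step such pipelines implement: the per-record transform is real Python, while the Kafka wiring is shown only in comments (it would require the confluent-kafka client, and the topic and field names are illustrative, not from any actual system).

```python
import json

def enrich_event(raw: bytes) -> bytes:
    """Transform applied to each Kafka record: parse, normalize units,
    and re-serialize before producing to the curated topic."""
    event = json.loads(raw)
    # Normalize a decimal rupee amount to integer paise.
    event["amount_paise"] = int(round(float(event["amount"]) * 100))
    event.pop("amount")
    return json.dumps(event, sort_keys=True).encode()

# Wiring sketch (assumes confluent-kafka; names are illustrative):
#   consumer.subscribe(["orders.raw"])
#   while True:
#       msg = consumer.poll(1.0)
#       if msg and not msg.error():
#           producer.produce("orders.curated", enrich_event(msg.value()))

print(enrich_event(b'{"order_id": 7, "amount": "12.50"}'))
```

Keeping the transform a pure function of bytes in, bytes out makes it unit-testable without a running broker, which supports the monitoring and testing practices the role describes.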

 

Requirements:

  • Minimum of 5 years of experience in Data Engineering or related roles.
  • Proven expertise with Apache Kafka and the AWS data stack (MSK, Glue, Lambda, S3, etc.).
  • Proficient in coding with Python, SQL, and Java — with Java strongly preferred.
  • Experience with Infrastructure-as-Code (IaC) tools (e.g., CloudFormation) and CI/CD pipelines.
  • Excellent problem-solving, communication, and collaboration skills.
  • Flexibility to write production-quality code in both Python and Java as required.

 

Skills: AWS, Kafka, Python


Must-Haves

Minimum of 5 years of experience in Data Engineering or related roles.

Proven expertise with Apache Kafka and the AWS data stack (MSK, Glue, Lambda, S3, etc.).

Proficient in coding with Python, SQL, and Java — with Java strongly preferred.

Experience with Infrastructure-as-Code (IaC) tools (e.g., CloudFormation) and CI/CD pipelines.

Excellent problem-solving, communication, and collaboration skills.

Flexibility to write production-quality code in both Python and Java as required.

Skills: AWS, Kafka, Python

Notice period - 0 to 15 days only

AI-First Company

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Mumbai, Hyderabad, Gurugram
5 - 10 yrs
₹20L - ₹45L / yr
Dremio
Lakehouse
Data architecture
Data engineering
SQL

ROLES AND RESPONSIBILITIES:

You will be responsible for architecting, implementing, and optimizing Dremio-based data Lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.


  • Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
  • Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
  • Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
  • Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
  • Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
  • Support self-service analytics by enabling governed data products and semantic layers.
  • Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
  • Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.


IDEAL CANDIDATE:

  • Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
  • 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
  • Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
  • Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
  • Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
  • Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
  • Excellent problem-solving, documentation, and stakeholder communication skills.


PREFERRED:

  • Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
  • Exposure to Snowflake, Databricks, or BigQuery environments.
  • Experience in high-tech, manufacturing, or enterprise data modernization programs.
LogIQ Labs Pvt.Ltd.

Posted by HR eShipz
Bengaluru (Bangalore)
3 - 5 yrs
₹4L - ₹8L / yr
Python
API
SQL

An L2 Technical Support Engineer with Python knowledge is responsible for handling escalated, more complex technical issues that the Level 1 (L1) support team cannot resolve. Your primary goal is to perform deep-dive analysis, troubleshooting, and problem resolution to minimize customer downtime and ensure system stability.

Python is a key skill, used for scripting, automation, debugging, and data analysis in this role.

Key Responsibilities

  • Advanced Troubleshooting & Incident Management:
  • Serve as the escalation point for complex technical issues (often involving software bugs, system integrations, backend services, and APIs) that L1 support cannot resolve.
  • Diagnose, analyze, and resolve problems, often requiring in-depth log analysis, code review, and database querying.
  • Own the technical resolution of incidents end-to-end, adhering strictly to established Service Level Agreements (SLAs).
  • Participate in on-call rotation for critical (P1) incident support outside of regular business hours.
  • Python-Specific Tasks:
  • Develop and maintain Python scripts for automation of repetitive support tasks, system health checks, and data manipulation.
  • Use Python for debugging and troubleshooting by analyzing application code, API responses, or data pipeline issues.
  • Write ad-hoc scripts to extract, analyze, or modify data in databases for diagnostic or resolution purposes.
  • Potentially apply basic-to-intermediate code fixes in Python applications in collaboration with development teams.
  • Collaboration and Escalation:
  • Collaborate closely with L3 Support, Software Engineers, DevOps, and Product Teams to report bugs, propose permanent fixes, and provide comprehensive investigation details.
  • Escalate issues that require significant product changes or deeper engineering expertise to the L3 team, providing clear, detailed documentation of all steps taken.
  • Documentation and Process Improvement:
  • Conduct Root Cause Analysis (RCA) for major incidents, documenting the cause, resolution, and preventative actions.
  • Create and maintain a Knowledge Base (KB), runbooks, and Standard Operating Procedures (SOPs) for recurring issues to empower L1 and enable customer self-service.
  • Proactively identify technical deficiencies in processes and systems and recommend improvements to enhance service quality.
  • Customer Communication:
  • Maintain professional, clear, and timely communication with customers, explaining complex technical issues and resolutions in an understandable manner.
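The ad-hoc diagnostic scripting described above can be sketched as follows. The table and column names are hypothetical (invented for this example), and a real investigation would run against a production replica rather than an in-memory database:

```python
import sqlite3

# Hypothetical shipments table standing in for a production schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE shipments (id INTEGER, status TEXT, updated_at TEXT)")
conn.executemany(
    "INSERT INTO shipments VALUES (?, ?, ?)",
    [(1, "DELIVERED", "2024-05-01"),
     (2, "IN_TRANSIT", "2024-04-20"),
     (3, "IN_TRANSIT", "2024-05-02")],
)

def stuck_shipments(conn, cutoff):
    """Return IDs of shipments still in transit with no update since `cutoff`."""
    rows = conn.execute(
        "SELECT id FROM shipments WHERE status = 'IN_TRANSIT' AND updated_at < ?",
        (cutoff,),
    ).fetchall()
    return [r[0] for r in rows]

print(stuck_shipments(conn, "2024-05-01"))  # [2]
```

Scripts like this turn a one-off investigation into a reusable health check that L1 can run on demand.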

Required Technical Skills

  • Programming/Scripting:
  • Strong proficiency in Python (for scripting, automation, debugging, and data manipulation).
  • Experience with other shell scripting languages such as Bash.
  • Databases:
  • Proficiency in SQL for complex querying, debugging data flow issues, and data extraction.
  • Application/Web Technologies:
  • Understanding of API concepts (RESTful/SOAP) and experience troubleshooting them using tools like Postman or curl.
  • Knowledge of application architectures (e.g., microservices, SOA) is a plus.
  • Monitoring & Tools:
  • Experience with support ticketing systems (e.g., JIRA, ServiceNow).
  • Familiarity with log aggregation and monitoring tools such as Splunk, Grafana, or the ELK Stack (Elasticsearch, Logstash, Kibana).
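As a minimal illustration of API troubleshooting in this role, the snippet below triages a captured REST error body. The payload shape is invented for the example; real error formats vary by service:

```python
import json

# A captured API response body (hypothetical payload for illustration).
raw = '{"status": "error", "code": 429, "message": "rate limit exceeded"}'

def triage(raw_body):
    """Classify a REST error response the way you might while working a ticket."""
    body = json.loads(raw_body)
    code = body.get("code")
    if code == 429:
        return "throttled: back off and retry with exponential delay"
    if code and 500 <= code < 600:
        return "server-side fault: escalate to L3 with logs"
    return "inspect payload: " + body.get("message", "no message")

print(triage(raw))
```

In practice the same logic sits behind a tool like Postman or curl: capture the response, classify it, and decide whether to resolve or escalate.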


Read more
LogIQ Labs Pvt.Ltd.

at LogIQ Labs Pvt.Ltd.

2 recruiters
HR eShipz
Posted by HR eShipz
Bengaluru (Bangalore)
3 - 4 yrs
₹4L - ₹10L / yr
skill iconPython
API
SQL

An L2 Technical Support Engineer with Python knowledge is responsible for handling escalated, more complex technical issues that the Level 1 (L1) support team cannot resolve. Your primary goal is to perform deep-dive analysis, troubleshooting, and problem resolution to minimize customer downtime and ensure system stability.

Python is a key skill, used for scripting, automation, debugging, and data analysis in this role.

Key Responsibilities

  • Advanced Troubleshooting & Incident Management:
  • Serve as the escalation point for complex technical issues (often involving software bugs, system integrations, backend services, and APIs) that L1 support cannot resolve.
  • Diagnose, analyze, and resolve problems, often requiring in-depth log analysis, code review, and database querying.
  • Own the technical resolution of incidents end-to-end, adhering strictly to established Service Level Agreements (SLAs).
  • Participate in on-call rotation for critical (P1) incident support outside of regular business hours.
  • Python-Specific Tasks:
  • Develop and maintain Python scripts for automation of repetitive support tasks, system health checks, and data manipulation.
  • Use Python for debugging and troubleshooting by analyzing application code, API responses, or data pipeline issues.
  • Write ad-hoc scripts to extract, analyze, or modify data in databases for diagnostic or resolution purposes.
  • Potentially apply basic-to-intermediate code fixes in Python applications in collaboration with development teams.
  • Collaboration and Escalation:
  • Collaborate closely with L3 Support, Software Engineers, DevOps, and Product Teams to report bugs, propose permanent fixes, and provide comprehensive investigation details.
  • Escalate issues that require significant product changes or deeper engineering expertise to the L3 team, providing clear, detailed documentation of all steps taken.
  • Documentation and Process Improvement:
  • Conduct Root Cause Analysis (RCA) for major incidents, documenting the cause, resolution, and preventative actions.
  • Create and maintain a Knowledge Base (KB), runbooks, and Standard Operating Procedures (SOPs) for recurring issues to empower L1 and enable customer self-service.
  • Proactively identify technical deficiencies in processes and systems and recommend improvements to enhance service quality.
  • Customer Communication:
  • Maintain professional, clear, and timely communication with customers, explaining complex technical issues and resolutions in an understandable manner.

Required Technical Skills

  • Programming/Scripting:
  • Strong proficiency in Python (for scripting, automation, debugging, and data manipulation).
  • Experience with other shell scripting languages such as Bash.
  • Databases:
  • Proficiency in SQL for complex querying, debugging data flow issues, and data extraction.
  • Application/Web Technologies:
  • Understanding of API concepts (RESTful/SOAP) and experience troubleshooting them using tools like Postman or curl.
  • Knowledge of application architectures (e.g., microservices, SOA) is a plus.
  • Monitoring & Tools:
  • Experience with support ticketing systems (e.g., JIRA, ServiceNow).
  • Familiarity with log aggregation and monitoring tools such as Splunk, Grafana, or the ELK Stack (Elasticsearch, Logstash, Kibana).


Read more
Remote only
5 - 10 yrs
₹25L - ₹55L / yr
Data engineering
Databases
skill iconPython
SQL
skill iconPostgreSQL
+4 more

Role: Full-Time, Long-Term
Required: Python, SQL
Preferred: Experience with financial or crypto data


OVERVIEW

We are seeking a data engineer to join as a core member of our technical team. This is a long-term position for someone who wants to build robust, production-grade data infrastructure and grow with a small, focused team. You will own the data layer that feeds our machine learning pipeline—from ingestion and validation through transformation, storage, and delivery.


The ideal candidate is meticulous about data quality, thinks deeply about failure modes, and builds systems that run reliably without constant attention. You understand that downstream ML models are only as good as the data they consume.


CORE TECHNICAL REQUIREMENTS

Python (Required): Professional-level proficiency. You write clean, maintainable code for data pipelines—not throwaway scripts. Comfortable with Pandas, NumPy, and their performance characteristics. You know when to use Python and when to push computation to the database.


SQL (Required): Advanced SQL skills. Complex queries, query optimization, schema design, execution plans. PostgreSQL experience strongly preferred. You think about indexing, partitioning, and query performance as second nature.


Data Pipeline Design (Required): You build pipelines that handle real-world messiness gracefully. You understand idempotency, exactly-once semantics, backfill strategies, and incremental versus full recomputation tradeoffs. You design for failure—what happens when an upstream source is late, returns malformed data, or goes down entirely. Experience with workflow orchestration required: Airflow, Prefect, Dagster, or similar.
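Idempotency, one of the properties named above, can be sketched in a toy form. SQLite stands in for the real warehouse and the schema is invented: the point is that replaying a batch (a retry or a backfill) must not duplicate rows.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# A composite primary key makes the natural identity of a row explicit.
conn.execute(
    "CREATE TABLE bars (symbol TEXT, ts TEXT, close REAL, PRIMARY KEY (symbol, ts))"
)

def load_bars(conn, rows):
    """Idempotent load: replaying the same batch leaves the table unchanged."""
    conn.executemany("INSERT OR REPLACE INTO bars VALUES (?, ?, ?)", rows)

batch = [("BTC", "2024-05-01T00:00", 58000.0),
         ("BTC", "2024-05-01T01:00", 58210.5)]
load_bars(conn, batch)
load_bars(conn, batch)  # a retry or backfill re-run is safe
count = conn.execute("SELECT COUNT(*) FROM bars").fetchone()[0]
print(count)  # 2, not 4
```

The same upsert-on-key pattern (`ON CONFLICT ... DO UPDATE` in PostgreSQL) is what makes backfills and late-arriving data safe to replay.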


Data Quality (Required): You treat data quality as a first-class concern. You implement validation checks, anomaly detection, and monitoring. You know the difference between data that is missing versus data that should not exist. You build systems that catch problems before they propagate downstream.
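A minimal sketch of the kind of validation gate described above, with illustrative completeness, range, and freshness checks (the field names and thresholds are assumptions for the example):

```python
from datetime import datetime, timedelta, timezone

def validate(rows, now, max_age=timedelta(minutes=10)):
    """Return a list of data-quality violations; an empty list means the batch passes."""
    problems = []
    for i, row in enumerate(rows):
        if row["price"] is None:
            problems.append(f"row {i}: missing price")          # completeness
        elif row["price"] <= 0:
            problems.append(f"row {i}: non-positive price {row['price']}")  # range
        if now - row["ts"] > max_age:
            problems.append(f"row {i}: stale (ts={row['ts'].isoformat()})")  # freshness
    return problems

now = datetime(2024, 5, 1, 12, 0, tzinfo=timezone.utc)
rows = [
    {"ts": now - timedelta(minutes=1), "price": 101.5},
    {"ts": now - timedelta(hours=2), "price": -3.0},
]
print(validate(rows, now))
```

Running a gate like this between ingestion and transformation is what catches problems before they propagate downstream.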


WHAT YOU WILL BUILD

Data Ingestion: Pipelines pulling from diverse sources—crypto exchanges, traditional market feeds, on-chain data, alternative data. Handling rate limits, API quirks, authentication, and source-specific idiosyncrasies.


Data Validation: Checks ensuring completeness, consistency, and correctness. Schema validation, range checks, freshness monitoring, cross-source reconciliation.


Transformation Layer: Converting raw data into clean, analysis-ready formats. Time series alignment, handling different frequencies and timezones, managing gaps.
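Time series alignment with gaps can be sketched as a forward fill onto a regular grid. This is a simplified stand-in for what pandas `reindex`/`ffill` would do at scale:

```python
from datetime import datetime, timedelta

def align_ffill(series, grid):
    """Align an irregular {timestamp: value} series onto a regular time grid,
    carrying forward the most recent observation at or before each grid point."""
    points = sorted(series.items())
    out, i, last = {}, 0, None
    for t in grid:
        while i < len(points) and points[i][0] <= t:
            last = points[i][1]
            i += 1
        out[t] = last  # None until the first observation arrives
    return out

t0 = datetime(2024, 5, 1, 0, 0)
grid = [t0 + timedelta(hours=h) for h in range(4)]
series = {t0 + timedelta(minutes=30): 100.0,
          t0 + timedelta(hours=2, minutes=5): 103.0}
aligned = align_ffill(series, grid)
print([aligned[t] for t in grid])  # [None, 100.0, 100.0, 103.0]
```

Note the 02:05 observation does not appear at the 02:00 grid point: alignment only ever looks backward in time.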


Storage and Access: Schema design optimized for both write patterns (ingestion) and read patterns (ML training, feature computation). Data lifecycle and retention management.

Monitoring and Alerting: Observability into pipeline health. Knowing when something breaks before it affects downstream systems.


DOMAIN EXPERIENCE

Preference for candidates with experience in financial or crypto data—understanding market data conventions, exchange-specific quirks, and point-in-time correctness. You know why look-ahead bias is dangerous and how to prevent it.


Time series data at scale—hundreds of symbols with years of history, multiple frequencies, derived features. You understand temporal joins, windowed computations, and time-aligned data challenges.
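The point-in-time correctness mentioned above comes down to an "as-of" lookup: for each event time, use only the latest value published strictly before it. A minimal sketch (the data is invented):

```python
import bisect

def asof(published, event_time):
    """As-of lookup over a sorted list of (publish_time, value) pairs.
    Returns the latest value published strictly BEFORE event_time; joining on
    values published at or after the event is look-ahead bias."""
    times = [t for t, _ in published]
    i = bisect.bisect_left(times, event_time)  # first publish >= event_time
    return published[i - 1][1] if i > 0 else None

published = [(1, "estimate_v1"), (5, "estimate_v2")]
print(asof(published, 5))  # estimate_v1: v2 is published AT t=5, not before
print(asof(published, 6))  # estimate_v2
print(asof(published, 0))  # None: nothing known yet
```

A backtest that accidentally joins on the revised value at t=5 would "know" information before it existed, which is exactly why look-ahead bias produces results that evaporate in production.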


High-dimensional feature stores—we work with hundreds of thousands of derived features. Experience managing, versioning, and serving large feature sets is valuable.


ENGINEERING STANDARDS

Reliability: Pipelines run unattended. Failures are graceful with clear errors, not silent corruption. Recovery is straightforward.


Reproducibility: Same inputs and code version produce identical outputs. You version schemas, track lineage, and can reconstruct historical states.


Documentation: Schemas, data dictionaries, pipeline dependencies, operational runbooks. Others can understand and maintain your systems.


Testing: You write tests for pipelines—validation logic, transformation correctness, edge cases. Untested pipelines are broken pipelines waiting to happen.


TECHNICAL ENVIRONMENT

PostgreSQL, Python, workflow orchestration (flexible on tool), cloud infrastructure (GCP preferred but flexible), Git.


WHAT WE ARE LOOKING FOR

Attention to Detail: You notice when something is slightly off and investigate rather than ignore.


Defensive Thinking: You assume sources will send bad data, APIs will fail, schemas will change. You build accordingly.


Self-Direction: You identify problems, propose solutions, and execute without waiting to be told.


Long-Term Orientation: You build systems you will maintain for years.


Communication: You document clearly, explain data issues to non-engineers, and surface problems early.


EDUCATION

University degree in a quantitative/technical field preferred: Computer Science, Mathematics, Statistics, Engineering. Equivalent demonstrated expertise also considered.


TO APPLY

Include: (1) CV/resume, (2) Brief description of a data pipeline you built and maintained, (3) Links to relevant work if available, (4) Availability and timezone.

Read more
SimplyFI Softech

at SimplyFI Softech

2 candid answers
Nikita Sinha
Posted by Nikita Sinha
Mumbai
4 - 8 yrs
Up to ₹20L / yr (varies)
skill iconReact.js
skill iconPython
skill iconDjango
API
SQL
+1 more

SimplyFI is a fast-growing AI- and blockchain-powered product company transforming trade finance and banking through digital innovation. We build scalable, intelligent platforms that simplify complex financial workflows for enterprises and financial institutions.

We are looking for a Full Stack Tech Lead with strong expertise in ReactJS (primary) and solid working knowledge of Python (secondary) to join our team in Thane, Mumbai.


Role: Full Stack Tech Lead (ReactJS + Python)


Key Responsibilities:

  • Design, develop, and maintain scalable full-stack applications, with ReactJS as the primary frontend technology
  • Build and integrate backend services using Python (Flask / Django / FastAPI)
  • Design and manage RESTful APIs for internal and external system integrations
  • Collaborate on AI-driven product features and support machine-learning model integrations when required
  • Work closely with DevOps teams to deploy, monitor, and optimize applications on AWS
  • Ensure performance, scalability, security, and code quality across the application stack
  • Collaborate with product managers, designers, and QA teams to deliver high-quality features
  • Write clean, maintainable, and testable code following engineering best practices
  • Participate in agile processes, including code reviews, sprint planning, and daily stand-ups


Required Skills & Qualifications:

  • Strong hands-on experience with ReactJS, including hooks, state management, Redux, and API integrations
  • Proficiency in backend development using Python (Flask, Django, or FastAPI)
  • Solid understanding of RESTful API design and secure authentication mechanisms (OAuth2, JWT)
  • Experience working with databases such as MySQL, PostgreSQL, and MongoDB
  • Familiarity with microservices architecture and modern software design patterns
  • Hands-on experience with Git, CI/CD pipelines, Docker, and Kubernetes
  • Strong problem-solving, debugging, and performance optimization skills
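To illustrate the JWT item above, here is a minimal HS256 token built by hand to show its structure (base64url header, payload, and HMAC signature). This is a teaching sketch only; production code should use a vetted library such as PyJWT:

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    """Base64url without padding, as required by the JWT format."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: bytes) -> str:
    """Build a minimal HS256 JWT: base64url(header).base64url(payload).signature."""
    header = {"alg": "HS256", "typ": "JWT"}
    signing_input = (b64url(json.dumps(header, separators=(",", ":")).encode())
                     + "."
                     + b64url(json.dumps(payload, separators=(",", ":")).encode()))
    sig = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + b64url(sig)

def verify_jwt(token: str, secret: bytes) -> bool:
    """Recompute the signature and compare in constant time."""
    signing_input, _, sig = token.rpartition(".")
    expected = b64url(hmac.new(secret, signing_input.encode(), hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)

token = sign_jwt({"sub": "user-42", "role": "lead"}, b"demo-secret")
print(verify_jwt(token, b"demo-secret"))   # True
print(verify_jwt(token, b"wrong-secret"))  # False
```

The same structure underlies the tokens a Flask/Django/FastAPI backend would issue after an OAuth2 login.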
Read more
Bengaluru (Bangalore)
1 - 4 yrs
₹5L - ₹15L / yr
skill iconDjango
skill iconFlask
skill iconHTML/CSS
SQL

Job Responsibilities :


- Work closely with product managers and other cross functional teams to help define, scope and deliver world-class products and high quality features addressing key user needs.


- Translate requirements into system architecture and implement code while considering performance issues of dealing with billions of rows of data and serving millions of API requests every hour.


- Ability to take full ownership of the software development lifecycle from requirement to release.


- Writing and maintaining clear technical documentation enabling other engineers to step in and deliver efficiently.


- Embrace design and code reviews to deliver quality code.


- Play a key role in taking Trendlyne to the next level as a world-class engineering team.


-Develop and iterate on best practices for the development team, ensuring adherence through code reviews.


- As part of the core team, you will be working on cutting-edge technologies like AI products, online backtesting, data visualization, and machine learning.


- Develop and maintain scalable, robust backend systems using Python and Django framework.


- Demonstrate a strong understanding of web and mobile application performance.


- Mentor junior developers and foster skill development within the team.


Job Requirements :


- 1+ years of experience with Python and Django.


- Strong understanding of relational databases like PostgreSQL or MySQL and Redis.


- (Optional): Experience with web front-end technologies such as JavaScript, HTML, and CSS.


Who are we :


Trendlyne is a Series-A product startup in the financial markets space, building cutting-edge analytics products for businesses in stock markets and mutual funds.


Our founders are IIT + IIM graduates, with strong tech, analytics, and marketing experience. We have top finance and management experts on the Board of Directors.


What do we do :


We build powerful analytics products in the stock market space that are best in class. Organic growth in B2B and B2C products has already made the company profitable. We serve 900 million+ API requests every month for B2B customers. Trendlyne analytics deals with hundreds of millions of rows of data to generate insights, scores, and visualizations that are an industry benchmark.

Read more
SCA Technologies

at SCA Technologies

4 candid answers
1 video
Reshika Mendiratta
Posted by Reshika Mendiratta
Gurugram
4yrs+
Up to ₹40L / yr (varies)
skill iconJava
skill iconSpring Boot
skill iconJavascript
skill iconNodeJS (Node.js)
skill iconPython
+10 more

Job Responsibilities:

  • Develop features across multiple sub-modules within our applications, including collaboration in requirements definition, prototyping, design, coding, testing, debugging, effort estimation, deployment, and continuous quality improvement of the design and code.
  • Design and implement new features, provide fixes/workarounds to bugs, and innovate in alternate solutions.
  • Provide quick solutions to problems and take a feature/component through the entire life cycle, improving space–time performance and usability/reliability.
  • Design, implement, and adhere to the overall architecture to fulfill the functional requirements through software components.
  • Take accountability for the successful delivery of functionality or modules contributing to the overall product objective.
  • Create consistent design specifications using flowcharts, class diagrams, Entity Relationship Diagrams (ERDs), and other visual techniques to convey the development approach to the lead developer and other stakeholders.
  • Conduct source code walkthroughs, refactoring, and ensure adherence to documentation standards.
  • Support troubleshooting efforts in production systems and fulfill support requests from developers.

Experience and Skills:

  • Bachelor’s degree in Computer Science or similar technical discipline required; Master’s degree preferred.
  • Strong experience as a software engineer with demonstrated success developing a variety of software systems and increasing responsibility in analysis, design, implementation, and deployment tasks with a reputed software product company.
  • Hands-on experience in product development using Java 8, J2EE, Spring Boot, Spring MVC, JSF, REST API, JSON, SQL Server, PostgreSQL, Oracle, Redis Cache, Amber, JavaScript/jQuery.
  • Good to have experience in Handlebars.js, Flyway, PrimeFaces.
  • Experience developing data-driven applications utilizing major relational database engines (SQL Server, Oracle, DB2) including writing complex queries, stored procedures, and performing query optimization.
  • Experience building web-based software systems with N-tier architectures, dynamic content, scalable solutions, and complex security implementations.
  • Strong understanding of Design Patterns, system architecture, and configurations for enterprise web applications.
  • Exposure to development environments such as Eclipse, GitHub/Bitbucket.
  • Comfortable with source code management concepts (version control).
  • Self-motivated, energetic, fast learner with excellent communication skills (interaction with remote teams required).
  • Experience with Agile software development is a plus.

Travel: Based on business needs.

Location: Gurgaon

Read more
Mumbai
3 - 5 yrs
₹6L - ₹12L / yr
SQL
skill iconPython
confluence
JIRA
Shell Scripting
+2 more

Job Title: Jira, Confluence, and Bitbucket Administrator

Location: Mumbai, India (candidate must be willing to attend onsite interviews in Malad, Mumbai)

Job Type: Full-time

Experience: 3–4 years (immediate joiners or candidates available within 15–30 days; local profiles preferred)

Key Responsibilities:

·       Administer, configure, and maintain Jira, Confluence, and Bitbucket environments, ensuring optimal performance and reliability.

·       Work closely with cross-functional teams to gather requirements and deliver effective solutions using Atlassian tools.

·       Implement and manage user access controls, roles, and permissions within Jira, Confluence, and Bitbucket.

·       Collaborate with development and project teams to gather requirements and provide solutions using Jira workflows and Confluence documentation.

·       Create and maintain custom scripts using Groovy for automation, improvements, and enhancements across the Atlassian suite.

·       Develop and implement project management features, dashboards, and reports to support various stakeholders.

·       Troubleshoot and resolve issues related to Jira, Confluence, and Bitbucket, providing timely support to users.

·       Conduct training sessions and workshops to inform users of best practices and new features in the Atlassian tools.

·       Stay up-to-date with new releases and features from Atlassian and evaluate their applicability to our processes.


Qualifications:

·       Bachelor's degree in Computer Science, Information Technology, or a related field.

·       3-4 years of experience in administering Jira, Confluence, and Bitbucket in a corporate environment.

·       Proficiency in Groovy scripting for customizing and automating Atlassian products.

·       Strong analytical and problem-solving skills.

·       Excellent communication and collaboration abilities.

·       Familiarity with Agile methodologies and project management principles.

·       Experience with other development tools and practices is a plus.


Read more
Highfly Sourcing

at Highfly Sourcing

2 candid answers
Highfly Hr
Posted by Highfly Hr
Singapore, Switzerland, New Zealand, Dubai, Dublin, Ireland, Augsburg, Germany, Manchester (United Kingdom), Qatar, Kuwait, Malaysia, Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Pune, Hyderabad, Goa
3 - 5 yrs
₹15L - ₹25L / yr
SQL
skill iconPHP
skill iconPython
Data Visualization
Data Structures
+5 more

We are seeking a motivated Data Analyst to support business operations by analyzing data, preparing reports, and delivering meaningful insights. The ideal candidate should be comfortable working with data, identifying patterns, and presenting findings in a clear and actionable way.

Key Responsibilities:

  • Collect, clean, and organize data from internal and external sources
  • Analyze large datasets to identify trends, patterns, and opportunities
  • Prepare regular and ad-hoc reports for business stakeholders
  • Create dashboards and visualizations using tools like Power BI or Tableau
  • Work closely with cross-functional teams to understand data requirements
  • Ensure data accuracy, consistency, and quality across reports
  • Document data processes and analysis methods


Read more
Xebo.ai

Xebo.ai

Agency job
via AccioJob by AccioJobHiring Board
Noida
0 - 1 yrs
₹6L - ₹7L / yr
DSA
SQL
Object Oriented Programming (OOPs)
skill iconJavascript
skill iconReact.js

AccioJob is conducting a Walk-In Hiring Drive with Xebo.ai for the position of Software Engineer.


To apply, register and select your slot here: https://go.acciojob.com/pPWDDm


Required Skills: DSA, SQL, OOPS, JavaScript, React


Eligibility:

Degree: BTech./BE

Branch: Computer Science/CSE/Other CS related branch, IT

Graduation Year: 2025, 2026


Work Details:

Work Location: Noida (Onsite)

CTC: ₹6 LPA to ₹7 LPA


Evaluation Process:

Round 1: Offline Assessment at AccioJob Noida Centre


Further Rounds (for shortlisted candidates only):

Resume Evaluation, Technical Interview 1, Technical Interview 2, Technical Interview 3, HR Discussion


Important Note: Bring your laptop & earphones for the test.


Register here: https://go.acciojob.com/pPWDDm


👇 FAST SLOT BOOKING 👇

[ 📲 DOWNLOAD ACCIOJOB APP ]

https://go.acciojob.com/HSeuqZ

Read more
Uni Cards

at Uni Cards

4 candid answers
2 recruiters
Bisman Gill
Posted by Bisman Gill
Bengaluru (Bangalore)
1yr+
Up to ₹22L / yr (varies)
SQL
Stakeholder management
Agile/Scrum
JIRA
Asana
+5 more

We’re looking for a Program Manager-1 to join our Growth team: someone who thrives in fast-paced environments and can turn user insights into measurable impact. You’ll work across product and business functions to drive growth, optimize funnels, and enhance the user journey.


What You’ll Do

  • Own parts of the user journey and drive improvements across acquisition, activation, and retention funnels.
  • Partner with Product, Marketing, Engineering, and Design teams to identify growth opportunities and execute data-backed experiments.
  • Use data and user insights to pinpoint drop-offs and design solutions that improve conversion.
  • Build, track, and measure growth metrics and KPIs.
  • Bring structure and clarity to ambiguous problems and drive alignment across teams.
  • Stay on top of product trends and best practices to inspire new growth ideas.


What We’re Looking For

  • Graduate from a Tier 1 institute (IITs, IIMs, ISB, BITS, etc.)
  • 2 - 2.5 years of experience, preferably in a B2C startup (not early-stage).
  • Exposure to digital products or services is a plus.
  • Experience working closely with product and business teams.
  • Strong analytical skills and structured thinking.
Read more
DeepIntent

at DeepIntent

2 candid answers
17 recruiters
Amruta Mundale
Posted by Amruta Mundale
Pune
8 - 10 yrs
Best in industry
SQL
Apache
Google Cloud Platform (GCP)
skill iconAmazon Web Services (AWS)
Data architecture
+1 more

What You’ll Do:

We are looking for a Staff Software Engineer based in Pune, India who can master both DeepIntent’s data architectures and pharma research and analytics methodologies to make significant contributions to how health media is analyzed by our clients. This role requires an Engineer who not only understands DBA functions but also how they impact research objectives and can work with researchers and data scientists to achieve impactful results.  

This role will sit in the Analytics Organization and will require integration and partnership with the Engineering Organization. The ideal candidate is an inquisitive self-starter who is not afraid to take on and learn from challenges, constantly seeks to improve the facets of the business they manage, and can demonstrate the ability to collaborate and partner with others.

  • Serve as the Engineering interface between Analytics and Engineering teams.
  • Develop and standardize all interface points for analysts to retrieve and analyze data with a focus on research methodologies and data-based decision-making.
  • Optimize queries and data access efficiencies, serve as an expert in how to most efficiently attain desired data points.
  • Build “mastered” versions of the data for Analytics-specific querying use cases.
  • Help with data ETL, table performance optimization.
  • Establish a formal data practice for the Analytics team in conjunction with the rest of DeepIntent.
  • Build & operate scalable and robust data architectures.
  • Interpret analytics methodology requirements and apply them to data architecture to create standardized queries and operations for use by analytics teams.
  • Implement DataOps practices.
  • Master existing and new Data Pipelines and develop appropriate queries to meet analytics-specific objectives.
  • Collaborate with various business stakeholders, software engineers, machine learning engineers, and analysts.
  • Operate between Engineers and Analysts to unify both practices for analytics insight creation.

Who You Are:

  • 8+ years of experience in tech support, specialising in monitoring and maintaining data pipelines.
  • Adept in market research methodologies and using data to deliver representative insights.
  • Inquisitive, curious, understands how to query complicated data sets, move and combine data between databases.
  • Deep SQL experience is a must.
  • Exceptional communication skills with the ability to collaborate and translate between technical and non-technical needs.
  • English language fluency and proven success working with teams in the U.S.
  • Experience in designing, developing and operating configurable Data pipelines serving high-volume and velocity data.
  • Experience working with public clouds like GCP/AWS.
  • Good understanding of software engineering, DataOps, and data architecture, Agile and DevOps methodologies.
  • Experience building Data architectures that optimize performance and cost, whether the components are prepackaged or homegrown.
  • Proficient with SQL, Python or JVM-based language, Bash.
  • Experience with any of Apache open-source projects such as Spark, Druid, Beam, Airflow etc. and big data databases like BigQuery, Clickhouse, etc. 
  • Ability to think big, take bets and innovate, dive deep, hire and develop the best talent, learn and be curious.


Read more
Trential Technologies

at Trential Technologies

1 candid answer
Garima Jangid
Posted by Garima Jangid
Gurugram
3 - 5 yrs
₹20L - ₹35L / yr
skill iconJavascript
skill iconNodeJS (Node.js)
skill iconAmazon Web Services (AWS)
NOSQL Databases
Google Cloud Platform (GCP)
+7 more

What you'll be doing:

As a Software Developer at Trential, you will be the bridge between technical strategy and hands-on execution. You will work with our dedicated engineering team designing, building, and deploying our core platforms and APIs, and you will build and maintain back-end interfaces using modern frameworks. You will ensure our solutions are scalable, secure, interoperable, and aligned with open standards and our core vision.

  • Design & Implement: Lead the design, implementation and management of Trential’s products.
  • Code Quality & Best Practices: Enforce high standards for code quality, security, and performance through rigorous code reviews, automated testing, and continuous delivery pipelines.
  • Standards Adherence: Ensure all solutions comply with relevant open standards like W3C Verifiable Credentials (VCs), Decentralized Identifiers (DIDs) & Privacy Laws, maintaining global interoperability.
  • Continuous Improvement: Lead the charge to continuously evaluate and improve the products & processes. Instill a culture of metrics-driven process improvement to boost team efficiency and product quality.
  • Cross-Functional Collaboration: Work closely with the Co-Founders & Product Team to translate business requirements and market needs into clear, actionable technical specifications and stories. Represent Trential in interactions with external stakeholders for integrations.


What we're looking for:

  • 3+ years of experience in backend development.
  • Deep proficiency in JavaScript and Node.js, with experience building and operating distributed, fault-tolerant systems.
  • Hands-on experience with cloud platforms (AWS & GCP) and modern DevOps practices (e.g., CI/CD, Infrastructure as Code, Docker).
  • Strong knowledge of SQL/NoSQL databases and data modeling for high-throughput, secure applications.

Preferred Qualifications (Nice to Have)

  • Knowledge of decentralized identity principles, Verifiable Credentials (W3C VCs), DIDs, and relevant protocols (e.g., OpenID4VC, DIDComm)
  • Familiarity with data privacy and security standards (GDPR, SOC 2, ISO 27001) and experience designing systems that comply with them.
  • Experience integrating AI/ML models into verification or data extraction workflows.
Read more
Wissen Technology

at Wissen Technology

4 recruiters
Praffull Shinde
Posted by Praffull Shinde
Pune, Mumbai, Bengaluru (Bangalore)
5 - 10 yrs
Best in industry
skill icon.NET
skill iconC#
SQL

Job Description

Wissen Technology is seeking an experienced C# .NET Developer to build and maintain applications related to streaming market data. This role involves developing message-based C#/.NET applications to process, normalize, and summarize large volumes of market data efficiently. The candidate should have a strong foundation in Microsoft .NET technologies and experience working with message-driven, event-based architecture. Knowledge of capital markets and equity market data is highly desirable.


Responsibilities

  • Design, develop, and maintain message-based C#/.NET applications for processing real-time and batch market data feeds.
  • Build robust routines to download and process data from AWS S3 buckets on a frequent schedule.
  • Implement daily data summarization and data normalization routines.
  • Collaborate with business analysts, data providers, and other developers to deliver high-quality, scalable market data solutions.
  • Troubleshoot and optimize market data pipelines to ensure low latency and high reliability.
  • Contribute to documentation, code reviews, and team knowledge sharing.

Required Skills and Experience

  • 5+ years of professional experience programming in C# and Microsoft .NET framework.
  • Strong understanding of message-based and real-time programming architectures.
  • Experience working with AWS services, specifically S3, for data retrieval and processing.
  • Experience with SQL and Microsoft SQL Server.
  • Familiarity with equity market data, FX, futures & options, and capital markets concepts.
  • Excellent interpersonal and communication skills.
  • Highly motivated, curious, and analytical mindset with the ability to work well both independently and in a team environment.


Read more
Hashone Careers
Madhavan I
Posted by Madhavan I
Hyderabad
6 - 10 yrs
₹15L - ₹28L / yr
skill iconData Analytics
skill iconPython
SQL
Data Warehouse (DWH)
Data modeling

Job Description

Role: Data Analyst

Experience: 6 - 9 Years

Location: Hyderabad

WorkMode: Work from Office (5 Days)


Overview

We are seeking a highly skilled Data Analyst with 6+ years of experience in analytics, data modeling, and advanced SQL. The ideal candidate has strong expertise in building scalable data models using dbt, writing efficient Python scripts, and delivering high-quality insights that support data-driven decision-making.


Key Responsibilities

Design, develop, and maintain data models using dbt (Core and dbt Cloud).

Build and optimize complex SQL queries to support reporting, analytics, and data pipelines.

Write Python scripts for data transformation, automation, and analytics workflows.

Ensure data quality, integrity, and consistency across multiple data sources.

Collaborate with cross-functional teams (Engineering, Product, Business) to understand data needs.

Develop dashboards and reports to visualize insights (using tools such as Tableau, Looker, or Power BI).

Perform deep-dive exploratory analysis to identify trends, patterns, and business opportunities.

Document data models, pipelines, and processes.

Contribute to scaling the analytics stack and improving data architecture.


Required Qualifications

6 - 9 years of hands-on experience in data analytics or data engineering.

Expert-level skills in SQL (complex joins, window functions, performance tuning).

Strong experience building and maintaining dbt data models.

Proficiency in Python for data manipulation, scripting, and automation.

Solid understanding of data warehousing concepts (e.g., dimensional modeling, ELT/ETL pipelines).

Familiarity with cloud data platforms (Snowflake, BigQuery, Redshift, etc.).

Strong analytical thinking and problem-solving skills.

Excellent communication skills with the ability to present insights to stakeholders.

Experience with Trino and lakehouse architecture is good to have.
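
As an illustration of the window-function SQL called out above, a running total per customer can be computed like this (using an in-memory SQLite database; the table and data are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, order_day TEXT, amount REAL);
INSERT INTO orders VALUES
  ('a', '2024-01-01', 100), ('a', '2024-01-03', 50),
  ('b', '2024-01-02', 75),  ('a', '2024-01-05', 25);
""")
# Running total per customer -- a typical window-function pattern
rows = conn.execute("""
    SELECT customer, order_day, amount,
           SUM(amount) OVER (PARTITION BY customer
                             ORDER BY order_day) AS running_total
    FROM orders
    ORDER BY customer, order_day
""").fetchall()
for r in rows:
    print(r)
```

The same PARTITION BY / ORDER BY shape carries over directly to dbt models on Snowflake, BigQuery, or Redshift.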


Read more
Bengaluru (Bangalore)
3 - 6 yrs
₹8L - ₹18L / yr
Dot Net
skill iconAngular (2+)
Windows Azure
SQL
skill iconC#
+3 more

Skills required:

  • Strong expertise in .NET Core / ASP.NET MVC
  • Candidate must have 4+ years of experience in .NET.
  • Candidate must have experience with Angular.
  • Hands-on experience with Entity Framework & LINQ
  • Experience with SQL Server (performance tuning, stored procedures, indexing)
  • Understanding of multi-tenancy architecture
  • Experience with Microservices / API development (REST, GraphQL)
  • Hands-on experience in Azure Services (App Services, Azure SQL, Blob Storage, Key Vault, Functions, etc.)
  • Experience in CI/CD pipelines with Azure DevOps
  • Knowledge of security best practices in cloud-based applications
  • Familiarity with Agile/Scrum methodologies
  • Willingness to use Copilot or other AI tools to write automated test cases and speed up development

Roles and Responsibilities:

- Good communication skills are a must.

- Develop features across multiple subsystems within our applications, including collaboration in requirements definition, prototyping, design, coding, testing, and deployment.

- Understand how our applications operate, how they are structured, and how customers use them.

- Provide engineering support (when necessary) to our technical operations staff when they are building, deploying, configuring, and supporting systems for customers.

Read more
Nuware Systems
Bengaluru (Bangalore)
5 - 10 yrs
Upto ₹25L / yr (Varies)
UFT
Software Testing (QA)
SQL
Shell Scripting
Manual testing

About Nuware

NuWare is a global technology and IT services company built on the belief that organizations require transformational strategies to scale, grow and build into the future owing to a dynamically evolving ecosystem. We strive towards our clients’ success in today’s hyper-competitive market by servicing their needs with next-gen technologies - AI/ML, NLP, chatbots, digital and automation tools.


We empower businesses to enhance their competencies, processes and technologies to fully leverage opportunities and accelerate impact. Through our focus on market differentiation and innovation - we offer services that are agile, streamlined, efficient and customer-centric.


Headquartered in Iselin, NJ, NuWare has been creating business value and generating growth opportunities for clients through its network of partners, global resources, highly skilled talent and SME’s for 25 years. NuWare is technology agnostic and offers services for Systems Integration, Cloud, Infrastructure Management, Mobility, Test automation, Data Sciences and Social & Big Data Analytics.


Skills Required

  • Automation testing with UFT, strong SQL skills, and good communication skills
  • 5 years of experience in automation testing
  • Experience with UFT for at least 3 years
  • Good knowledge of VB Scripting
  • Knowledge of Manual testing
  • Knowledge of automation frameworks
Read more
Financial Services Industry

Financial Services Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Hyderabad
4 - 5 yrs
₹10L - ₹20L / yr
skill iconPython
CI/CD
SQL
skill iconKubernetes
Stakeholder management
+14 more

Required Skills: CI/CD Pipeline, Kubernetes, SQL Database, Excellent Communication & Stakeholder Management, Python

 

Criteria:

Looking for candidates with a notice period of 15 to a maximum of 30 days.

Looking for candidates from the Hyderabad location only.

Looking for candidates from EPAM only.

1. 4+ years of software development experience

2. Strong experience with Kubernetes, Docker, and CI/CD pipelines in cloud-native environments.

3. Hands-on with NATS for event-driven architecture and streaming.

4. Skilled in microservices, RESTful APIs, and containerized app performance optimization.

5. Strong in problem-solving, team collaboration, clean code practices, and continuous learning.

6.  Proficient in Python (Flask) for building scalable applications and APIs.

7. Focus: Java, Python, Kubernetes, Cloud-native development

8. SQL database 

 

Description

Position Overview

We are seeking a skilled Developer to join our engineering team. The ideal candidate will have strong expertise in Java and Python ecosystems, with hands-on experience in modern web technologies, messaging systems, and cloud-native development using Kubernetes.


Key Responsibilities

  • Design, develop, and maintain scalable applications using Java and Spring Boot framework
  • Build robust web services and APIs using Python and Flask framework
  • Implement event-driven architectures using NATS messaging server
  • Deploy, manage, and optimize applications in Kubernetes environments
  • Develop microservices following best practices and design patterns
  • Collaborate with cross-functional teams to deliver high-quality software solutions
  • Write clean, maintainable code with comprehensive documentation
  • Participate in code reviews and contribute to technical architecture decisions
  • Troubleshoot and optimize application performance in containerized environments
  • Implement CI/CD pipelines and follow DevOps best practices

Required Qualifications

  • Bachelor's degree in Computer Science, Information Technology, or related field
  • 4+ years of experience in software development
  • Strong proficiency in Java with deep understanding of web technology stack
  • Hands-on experience developing applications with Spring Boot framework
  • Solid understanding of Python programming language with practical Flask framework experience
  • Working knowledge of NATS server for messaging and streaming data
  • Experience deploying and managing applications in Kubernetes
  • Understanding of microservices architecture and RESTful API design
  • Familiarity with containerization technologies (Docker)
  • Experience with version control systems (Git)


Skills & Competencies

  • Technical Skills: Java (Spring Boot, Spring Cloud, Spring Security)
  • Python (Flask, SQL Alchemy, REST APIs)
  • NATS messaging patterns (pub/sub, request/reply, queue groups)
  • Kubernetes (deployments, services, ingress, ConfigMaps, Secrets)
  • Web technologies (HTTP, REST, WebSocket, gRPC)
  • Container orchestration and management
  • Soft Skills: Problem-solving and analytical thinking
  • Strong communication and collaboration
  • Self-motivated with ability to work independently
  • Attention to detail and code quality
  • Continuous learning mindset
  • Team player with mentoring capabilities
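
The NATS messaging patterns listed above have simple semantics: plain subscribers on a subject each receive every message, while members of a queue group share messages round-robin so each message is handled once. A toy in-process illustration (no real NATS client involved; MiniBroker is a made-up stand-in):

```python
from collections import defaultdict

class MiniBroker:
    """Toy broker mimicking NATS semantics: plain subscribers all
    receive a message; members of one queue group share the work."""
    def __init__(self):
        self.subs = defaultdict(list)     # subject -> [callback]
        self.groups = defaultdict(dict)   # subject -> {group: [callbacks]}

    def subscribe(self, subject, cb, queue=None):
        if queue is None:
            self.subs[subject].append(cb)
        else:
            self.groups[subject].setdefault(queue, []).append(cb)

    def publish(self, subject, msg):
        for cb in self.subs[subject]:
            cb(msg)                       # fan-out to every plain subscriber
        for members in self.groups[subject].values():
            members[0](msg)               # one member per group gets it
            members.append(members.pop(0))  # round-robin rotation

seen_a, seen_b = [], []
broker = MiniBroker()
broker.subscribe("orders", seen_a.append, queue="workers")
broker.subscribe("orders", seen_b.append, queue="workers")
for i in range(4):
    broker.publish("orders", i)
print(seen_a, seen_b)
```

With a real NATS server the same split maps to `subscribe(subject)` versus `subscribe(subject, queue=...)` in the client libraries.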


Read more
Techno Wise
Ishita Panwar
Posted by Ishita Panwar
Pune
6 - 10 yrs
₹30L - ₹35L / yr
Microsoft Windows Azure
SQL
Informatica MDM
skill iconAmazon Web Services (AWS)
Informatica PowerCenter
+2 more

Profile: Senior Data Engineer (Informatica MDM)


Primary Purpose:

The Senior Data Engineer will be responsible for building new segments in a Customer Data Platform (CDP), maintaining those segments, and understanding the data requirements, data integrity, data quality, and data sources involved in building specific use cases. The candidate should also have an understanding of ETL processes, of integrations with cloud services such as Microsoft Azure, Azure Data Lake Services, and Azure Data Factory, and of cloud data warehouse platforms as well as Enterprise Data Warehouse environments. The ideal candidate will also have proven experience in data analysis and management, with excellent analytical and problem-solving abilities.


Major Functions/Responsibilities

• Design, develop and implement robust and extensible solutions to build segmentations using Customer Data Platform.

• Work closely with subject matter experts to identify and document based on the business requirements, functional specs and translate them into appropriate technical solutions.

• Responsible for estimating, planning, and managing the user stories, tasks and reports on Agile Projects.

• Develop advanced SQL Procedures, Functions and SQL jobs.

• Performance tuning and optimization of ETL Jobs, SQL Queries and Scripts.

• Configure and maintain scheduled ETL jobs, data segments and refresh.

• Support exploratory data analysis, statistical analysis, and predictive analytics.

• Support production issues and maintain existing data systems by researching and troubleshooting issues in a timely manner.

• Proactive, great attention to detail, results-oriented problem solver.


Preferred Experience

• 6+ years of experience in writing SQL queries and stored procedures to extract, manipulate and load data.

• 6+ years of experience designing, building, testing, and maintaining data integrations for data marts and data warehouses.

• 3+ years of experience in integrations Azure / AWS Data Lakes, Azure Data Factory & IDMC (Informatica Cloud Services).

• In depth understanding of database management systems, online analytical processing (OLAP) and ETL (Extract, transform, load) framework.

• Excellent verbal and written communication skills

• Collaboration with both onshore and offshore development teams.

• Good understanding of marketing tools like Salesforce Marketing Cloud, Adobe Marketing, or Microsoft Customer Insights Journey and Customer Data Platform will be important to this role.


Communication

• Facilitate project team meetings effectively.

• Effectively communicate relevant project information to superiors

• Deliver engaging, informative, well-organized presentations that are effectively tailored to the intended audience.

• Serve as a technical liaison with development partner.

• Serve as a communication bridge between applications team, developers and infrastructure team members to facilitate understanding of current systems

• Resolve and/or escalate issues in a timely fashion.

• Understand how to communicate difficult/sensitive information tactfully.

• Works under the direction of the Technical Data Lead / Data Architect.


Education

Bachelor’s Degree or higher in Engineering, Technology, or a related field is required.

Read more
Matchmaking platform

Matchmaking platform

Agency job
via Peak Hire Solutions by Dhara Thakkar
Mumbai
2 - 5 yrs
₹15L - ₹28L / yr
skill iconData Science
skill iconPython
Natural Language Processing (NLP)
MySQL
skill iconMachine Learning (ML)
+15 more

Review Criteria

  • Strong Data Scientist / Machine Learning / AI Engineer profile
  • 2+ years of hands-on experience as a Data Scientist or Machine Learning Engineer building ML models
  • Strong expertise in Python with the ability to implement classical ML algorithms including linear regression, logistic regression, decision trees, gradient boosting, etc.
  • Hands-on experience in at least 2 of the following use cases: recommendation systems, image data, fraud/risk detection, price modelling, propensity models
  • Strong exposure to NLP, including text generation or text classification, embeddings, similarity models, user profiling, and feature extraction from unstructured text
  • Experience productionizing ML models through APIs/CI/CD/Docker and working on AWS or GCP environments
  • Preferred (Company) – Must be from product companies

 

Job Specific Criteria

  • CV Attachment is mandatory
  • What's your current company?
  • Which use cases you have hands on experience?
  • Are you ok for Mumbai location (if candidate is from outside Mumbai)?
  • Reason for change (if candidate has been in current company for less than 1 year)?
  • Reason for hike (if greater than 25%)?

 

Role & Responsibilities

  • Partner with Product to spot high-leverage ML opportunities tied to business metrics.
  • Wrangle large structured and unstructured datasets; build reliable features and data contracts.
  • Build and ship models to:
  • Enhance customer experiences and personalization
  • Boost revenue via pricing/discount optimization
  • Power user-to-user discovery and ranking (matchmaking at scale)
  • Detect and block fraud/risk in real time
  • Score conversion/churn/acceptance propensity for targeted actions
  • Collaborate with Engineering to productionize via APIs/CI/CD/Docker on AWS.
  • Design and run A/B tests with guardrails.
  • Build monitoring for model/data drift and business KPIs


Ideal Candidate

  • 2–5 years of DS/ML experience in consumer internet / B2C products, with 7–8 models shipped to production end-to-end.
  • Proven, hands-on success in at least two (preferably 3–4) of the following:
  • Recommender systems (retrieval + ranking, NDCG/Recall, online lift; bandits a plus)
  • Fraud/risk detection (severe class imbalance, PR-AUC)
  • Pricing models (elasticity, demand curves, margin vs. win-rate trade-offs, guardrails/simulation)
  • Propensity models (payment/churn)
  • Programming: strong Python and SQL; solid git, Docker, CI/CD.
  • Cloud and data: experience with AWS or GCP; familiarity with warehouses/dashboards (Redshift/BigQuery, Looker/Tableau).
  • ML breadth: recommender systems, NLP or user profiling, anomaly detection.
  • Communication: clear storytelling with data; can align stakeholders and drive decisions.
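
As a rough illustration of the propensity models mentioned above, a churn-propensity prototype can be a plain logistic regression trained by gradient descent. In practice this would use scikit-learn or gradient boosting; the features below are invented:

```python
import math

def train_logreg(X, y, lr=0.1, epochs=500):
    """Tiny per-sample gradient-descent logistic regression (bias + weights)."""
    w = [0.0] * (len(X[0]) + 1)              # w[0] is the bias term
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))   # predicted propensity
            err = p - yi
            w[0] -= lr * err
            for j, xj in enumerate(xi):
                w[j + 1] -= lr * err * xj
    return w

def predict(w, xi):
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1.0 / (1.0 + math.exp(-z))

# Toy churn data: [logins_per_week, support_tickets] -> churned?
X = [[10, 0], [8, 1], [1, 4], [0, 5], [9, 0], [2, 3]]
y = [0, 0, 1, 1, 0, 1]
w = train_logreg(X, y)
print(predict(w, [9, 0]), predict(w, [1, 5]))
```

The resulting scores are exactly the "propensity" used for targeted actions: low for an engaged user, high for an at-risk one.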



Read more
Technology Industry

Technology Industry

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
3 - 7 yrs
₹10L - ₹24L / yr
SaaS
Software implementation
Customer Success
Implementation
Tech Support
+8 more

Review Criteria

  • Strong Implementation Manager / Customer Success Implementation / Technical Solutions / Post-Sales SaaS Delivery
  • 3+ years of hands-on experience in software/tech Implementation roles within technical B2B SaaS companies, preferably working with global or US-based clients
  • Must have direct experience leading end-to-end SaaS product implementations — including onboarding, workflow configuration, API integrations, data setup, and customer training
  • Must have strong technical understanding — including ability to read and write basic SQL queries, debug API workflows, and interpret JSON payloads for troubleshooting or configuration validation.
  • Must have worked in post-sales environments, owning customer success and delivery after deal closure, ensuring product adoption, accurate setup, and smooth go-live.
  • Must have experience collaborating cross-functionally with product, engineering, and sales teams to ensure timely resolution of implementation blockers and seamless client onboarding.
  • (Company): B2B SaaS startup or growth-stage company
  • Mandatory (Note): Good growth opportunity, this role will have team leading option after a few months
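
Interpreting JSON payloads for configuration validation, as required above, typically looks like the following; the webhook payload and field names are hypothetical:

```python
import json

# Hypothetical invoice-webhook payload an implementation manager
# might inspect while validating a customer's billing configuration.
payload = """{
  "event": "invoice.created",
  "data": {"customer_id": "cus_123", "currency": "USD",
           "line_items": [{"sku": "seats", "qty": 12, "unit_price": 9.0},
                          {"sku": "api_calls", "qty": 3, "unit_price": 50.0}]}
}"""

doc = json.loads(payload)
items = doc["data"]["line_items"]
total = sum(i["qty"] * i["unit_price"] for i in items)
print(doc["event"], total)   # sanity-check the computed invoice total
```

Cross-checking a computed total like this against what the product invoiced is a common go-live validation step.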


Preferred

  • Preferred (Experience): Previous experience in FinTech SaaS like BillingTech, finance automation, or subscription management platforms will be a strong plus


Job Specific Criteria

  • CV Attachment is mandatory
  • Are you open to work in US timings (4/5:00 PM - 3:00 AM) - to target the US market?
  • Please provide CTC Breakup (Fixed + Variable)?
  • It’s a hybrid role with 1-3 days of work from office (Indiranagar) with in-office hours 3:00 pm to 10:00 pm IST; are you ok with hybrid mode?

 

Role & Responsibilities

As the new hire in this role, you'll be the voice of the customer in the company, and lead the charge in developing our customer-centric approach, working closely with our tech, design, and product teams.

 

What you will be doing:

You will be responsible for converting, onboarding, managing, and proactively ensuring success for our customers/prospective clients.

  • Implementation
  • Understand client billing models and configure company contracts, pricing, metering, and invoicing accurately.
  • Lead pilots and implementation for new customers, ensuring complete onboarding within 3–8 weeks.
  • Translate complex business requirements into structured company workflows and setup.
  • Pre-sales & Technical Discovery
  • Support sales with live demos, sandbox setups, and RFP responses.
  • Participate in technical discovery calls to map company capabilities to client needs.
  • Create and maintain demo environments showcasing relevant use cases.
  • Internal Coordination & Escalation
  • Act as the voice of the customer internally — share structured feedback with product and engineering.
  • Create clear, well-scoped handoff documents when working with technical teams.
  • Escalate time-sensitive issues appropriately and follow through on resolution.
  • Documentation & Enablement
  • Create client-specific documentation (e.g., onboarding guides, configuration references).
  • Contribute to internal wikis, training material, and product documentation.
  • Write simple, to-the-point communication — clear enough for a CXO and detailed enough for a developer.

 

Ideal Candidate

  • 3-7 years of relevant experience
  • Willing to work in the US time zone (until ~4:30 am IST) on weekdays (Mon-Fri)
  • Ability to understand and shape the product at a granular level
  • Ability to empathize with the customers, and understand their pain points
  • Understanding of SaaS architecture and APIs conceptually — ability to debug API workflows and usage issues
  • Previous experience with Salesforce CRM
  • Entrepreneurial drive, and willingness to wear multiple hats as per company’s requirements
  • Strong analytical skills and a structured problem-solving approach
  • (Strongly preferred) Computer science background and basic coding experience
  • Ability to understand functional aspects related to the product e.g., accounting/revenue recognition, receivables, billing etc
  • Self-motivated and proactive in managing tasks and responsibilities, requiring minimal follow-ups.
  • Self-driven individual with high ownership and strong work ethic
  • Not taking yourself too seriously.


Read more
Analytical Brains Education
Remote only
1 - 5 yrs
₹8L - ₹12L / yr
skill iconPython
Shell Scripting
Powershell
SQL
skill iconJava

Job Description

We are looking for motivated IT professionals with at least one year of industry experience. The ideal candidate should have hands-on experience in AWS, Azure, AI, or Cloud technologies, or should be enthusiastic and ready to upskill and shift to new and emerging technologies. This role is primarily remote; however, candidates may be required to visit the office occasionally for meetings or project needs.

Key Requirements

  • Minimum 1 year of experience in the IT industry
  • Exposure to AWS / Azure / AI / Cloud platforms (any one or more)
  • Willingness to learn and adapt to new technologies
  • Strong problem-solving and communication skills
  • Ability to work independently in a remote setup
  • Must have a proper work-from-home environment (laptop, stable internet, quiet workspace)

Education Qualification

  • B.Tech / BE / MCA / M.Sc (IT) / equivalent


Read more
Oneture Technologies

at Oneture Technologies

1 recruiter
Eman Khan
Posted by Eman Khan
Mumbai
1 - 5 yrs
₹7L - ₹15L / yr
skill iconGo Programming (Golang)
SQL
Microservices
RESTful APIs
skill iconJava
+1 more

Role Overview

We are looking for a passionate Software Engineer with 1–3 years of hands-on experience in backend engineering to join our team in Mumbai. The ideal candidate will have strong programming skills in GoLang, a solid understanding of SQL databases, and exposure to or interest in High Performance Computing (HPC) concepts. You will be responsible for designing, developing, optimizing, and maintaining backend services that are scalable, efficient, and secure.


Key Responsibilities

  • Develop, build, and maintain backend services and microservices using GoLang
  • Design and optimize database schemas and write efficient SQL queries for relational databases
  • Work on high-performance applications by optimizing code, memory usage, and execution speed
  • Collaborate with cross-functional teams including frontend, DevOps, QA, and product
  • Participate in code reviews, troubleshoot production issues, and follow best engineering practices
  • Contribute to improving system performance, reliability, and scalability
  • Stay up to date with emerging backend technologies, tools, and frameworks


Required Skills

Technical Skills

  • 1–5 years of experience in backend development
  • Strong hands-on experience with GoLang (Golang)
  • Good understanding of SQL and relational database design
  • Exposure to or understanding of HPC concepts such as concurrency, parallelism, distributed processing, or performance optimization
  • Experience with RESTful APIs and microservice architectures
  • Familiarity with version control systems (Git)

Soft Skills

  • Strong analytical and problem-solving abilities
  • Ability to work effectively in a fast-paced, collaborative team environment
  • Good communication and documentation skills
  • Strong ownership mindset with a willingness to learn

Good to Have

  • Experience with cloud platforms (AWS, Azure, or GCP)
  • Knowledge of Docker or other containerization tools
  • Understanding of CI/CD pipelines
  • Experience with performance profiling and monitoring tools


Education

  • Bachelor’s degree in Computer Science, Engineering, or a related technical field


Why Join Oneture Technologies?

  • Opportunity to work on high-impact, modern technology projects
  • Learning-driven culture with strong mentorship and continuous upskilling
  • Exposure to cloud-native and cutting-edge backend technologies
  • Collaborative, startup-like environment with real ownership of projects
Read more
LogIQ Labs Pvt.Ltd.
Bengaluru (Bangalore), Pune, Hyderabad, Noida
3 - 5 yrs
₹4L - ₹10L / yr
Playwright
SQL

Functional Testing & Validation

  • Web Application Testing: Design, document, and execute comprehensive functional test plans and test cases for complex, highly interactive web applications, ensuring they meet specified requirements and provide an excellent user experience.
  • Backend API Testing: Possess deep expertise in validating backend RESTful and/or SOAP APIs. This includes testing request/response payloads, status codes, data integrity, security, and robust error handling mechanisms.
  • Data Validation with SQL: Write and execute complex SQL queries (joins, aggregations, conditional logic) to perform backend data checks, verify application states, and ensure data integrity across integration points.
  • UI Automation (Playwright & TypeScript):
  • Design, develop, and maintain robust, scalable, and reusable UI automation scripts using Playwright and TypeScript.
  • Integrate automation suites into Continuous Integration/Continuous Deployment (CI/CD) pipelines.
  • Implement advanced automation patterns and frameworks (e.g., Page Object Model) to enhance maintainability.
  • Prompt-Based Automation: Demonstrate familiarity or hands-on experience with emerging AI-driven or prompt-based automation approaches and tools to accelerate test case generation and execution.
  • API Automation: Develop and maintain automated test suites for APIs to ensure reliability and performance.
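
The backend data checks described above often reduce to reconciliation queries: join what the UI reports against the backend table and flag mismatches. A sketch using an in-memory SQLite database with hypothetical tables:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE ui_orders (id INTEGER, status TEXT);
CREATE TABLE db_orders (id INTEGER, status TEXT);
INSERT INTO ui_orders VALUES (1,'SHIPPED'), (2,'PENDING'), (3,'SHIPPED');
INSERT INTO db_orders VALUES (1,'SHIPPED'), (2,'CANCELLED'), (3,'SHIPPED');
""")
# Join UI-visible state against the backend table and flag mismatches
mismatches = conn.execute("""
    SELECT u.id, u.status AS ui_status, d.status AS db_status
    FROM ui_orders u
    JOIN db_orders d ON d.id = u.id
    WHERE u.status <> d.status
""").fetchall()
print(mismatches)
```

An empty result means the integration point is consistent; any row is a defect to investigate.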

Performance & Load Testing

  • JMeter Proficiency: Utilize Apache JMeter to design, script, and execute robust API load testing and stress testing scenarios.
  • Analyse performance metrics, identify bottlenecks (e.g., response time, throughput), and provide actionable reports to development teams.


🛠️ Required Skills and Qualifications

  • Experience: 4+ years of professional experience in Quality Assurance and Software Testing, with a strong focus on automation.
  • Automation Stack: Expert-level proficiency in developing and maintaining automation scripts using Playwright and TypeScript.
  • Testing Tools: Proven experience with API testing tools (e.g., Postman, Swagger) and strong functional testing methodologies.
  • Database Skills: Highly proficient in writing and executing complex SQL queries for data validation and backend verification.
  • Performance: Hands-on experience with Apache JMeter for API performance and load testing.
  • Communication: Excellent communication and collaboration skills to work effectively with cross-functional teams (Developers, Product Managers).
  • Problem-Solving: Strong analytical and debugging skills to efficiently isolate and report defects.


Read more
AryuPay Technologies
Bhavana Chaudhari
Posted by Bhavana Chaudhari
Bengaluru (Bangalore), Bhopal
2 - 3 yrs
₹3L - ₹5L / yr
Search Engine Optimization (SEO)
SQL
On-page Optimization
off page seo
skill iconGoogle Analytics
+3 more

Job Description – SEO Specialist

Company: Capace Software Pvt. Ltd.

Location: Bhopal / Bangalore (On-site)

Experience: 2+ Years

Budget: Up to ₹4 LPA

Position: Full-Time


About the Role

Capace Software Pvt. Ltd. is looking for a skilled SEO Specialist with strong expertise in On-Page SEO, Off-Page SEO, and Technical SEO. The ideal candidate will be responsible for improving our search engine ranking, driving organic traffic, and ensuring technical search requirements are met across websites.


Key Responsibilities

🔹 On-Page SEO

  • Optimize meta titles, descriptions, header tags, and URLs
  • Conduct in-depth keyword research and implement strategic keyword placement
  • Optimize website content for relevancy and readability
  • Implement internal linking strategies
  • Optimize images, schema, and site structure for SEO
  • Ensure webpages follow SEO best practices

🔹 Off-Page SEO

  • Create and execute backlink strategies
  • Manage directory submissions, social bookmarking, classified listings
  • Conduct competitor backlink analysis
  • Build high-quality guest post links and outreach
  • Improve brand visibility through digital promotions


🔹 Technical SEO

  • Conduct website audits (crawl errors, index issues, technical fixes)
  • Optimize website speed and performance
  • Implement schema markup and structured data
  • Manage XML sitemaps and robots.txt
  • Resolve indexing, crawling, and canonical issues
  • Work with developers to implement technical updates
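
The robots.txt work mentioned above can be sanity-checked programmatically; Python's standard library ships a parser, and the rules below are a hypothetical example:

```python
from urllib import robotparser

# Hypothetical robots.txt content for a site being audited
rules = """
User-agent: *
Disallow: /admin/
Allow: /blog/
Sitemap: https://example.com/sitemap.xml
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)
print(rp.can_fetch("*", "https://example.com/blog/post-1"))
print(rp.can_fetch("*", "https://example.com/admin/login"))
```

Running a check like this before and after a robots.txt change helps catch accidental crawl blocks on pages that should be indexed.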


Requirements

  • Minimum 2+ years of experience in SEO
  • Strong knowledge of On-Page, Off-Page & Technical SEO
  • Experience with tools like:
  • Google Analytics
  • Google Search Console
  • Ahrefs / SEMrush / Ubersuggest
  • Screaming Frog (good to have)
  • Understanding of HTML, CSS basics (preferred)
  • Strong analytical and reporting skills
  • Good communication and documentation skills


What We Offer

  • Competitive salary up to ₹4 LPA
  • Opportunity to work on multiple SaaS products and websites
  • Supportive team & learning-focused environment
  • Career growth in digital marketing & SEO domain
Read more
Tirupati
4 - 8 yrs
₹8L - ₹16L / yr
skill iconPython
API
FastAPI
RESTful APIs
SQL
+1 more

Role Overview

We are seeking an experienced Python Backend Developer with strong expertise in SDK development, API design, and application security. The ideal candidate will build robust backend systems, integrate third-party services, and ensure secure, scalable backend operations.

Key Responsibilities

  • Design, develop, and maintain backend services using Python and modern frameworks (e.g., FastAPI, Django, Flask).
  • Build and maintain SDKs to support internal and external integrations.
  • Develop clean, scalable, and reusable RESTful and/or GraphQL APIs.
  • Implement and enforce security best practices, including authentication, authorization, encryption, secrets management, and OWASP guidelines.
  • Collaborate with frontend, DevOps, and product teams to deliver end-to-end features.
  • Integrate external APIs and third-party services efficiently and securely.
  • Optimize backend performance, scalability, logging, and monitoring.
  • Write automated tests and maintain high code quality through CI/CD pipelines.
  • Work with client SMEs to understand existing workflows, formulas, rules, and translate them into maintainable backend services

  • Consume and work with existing data models and database schemas (SQL/NoSQL) to support analytical workflows, operational planning applications, and integration of machine learning outputs into backend services.
  • Leverage Redis (or similar in-memory stores) for caching and performance optimization, ensuring fast response times for data-driven APIs and applications.
  • Utilize middleware, message queues, and streaming technologies (e.g., Kafka, Event Hubs, RabbitMQ) to build reliable, scalable data flows and event-driven backend services.
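
The Redis caching pattern referenced above (check the cache, fall back to the slow source, store with a TTL) can be illustrated with an in-process stand-in; TTLCache below is a made-up class, not a Redis client:

```python
import time

class TTLCache:
    """Minimal in-process stand-in for the Redis caching pattern:
    get -> miss -> compute -> set with an expiry."""
    def __init__(self):
        self._store = {}                     # key -> (value, expires_at)

    def get(self, key):
        hit = self._store.get(key)
        if hit is None or hit[1] < time.monotonic():
            return None                      # missing or expired
        return hit[0]

    def set(self, key, value, ttl_seconds):
        self._store[key] = (value, time.monotonic() + ttl_seconds)

calls = 0
cache = TTLCache()

def expensive_lookup(key):
    global calls
    cached = cache.get(key)
    if cached is not None:
        return cached
    calls += 1                               # simulate the slow backend hit
    result = key.upper()
    cache.set(key, result, ttl_seconds=60)
    return result

print(expensive_lookup("plan_a"), expensive_lookup("plan_a"), calls)
```

With Redis the `get`/`set` calls become `GET`/`SETEX` against the shared store, which is what keeps data-driven APIs fast across processes.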

Required Skills & Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Artificial Intelligence, Software Engineering, Data Science or a related field
  • Proven experience of 5+ years as a Python Developer specializing in backend systems.
  • Hands-on experience with SDK design, development, and documentation.
  • Strong knowledge of API development (REST, GraphQL), API versioning, and standards.
  • Strong understanding of data modeling, multi-source data integration (SQL/NoSQL/warehouse), and analytical data flows.
  • Solid understanding of application security, including:
  • OAuth2, JWT, API keys
  • Secure coding practices
  • Data privacy & encryption
  • Security testing & vulnerability mitigation
  • Experience with Python frameworks such as FastAPI, Django, Flask.
  • Knowledge of databases (PostgreSQL, MySQL, MongoDB, Redis).
  • Familiarity with CI/CD, Git, Docker, Kubernetes and cloud platforms (AWS, GCP, Azure).
  • Experience with caching (Redis), asynchronous processing, and performance tuning for low-latency user interactions.
  • Knowledge of message brokers (Kafka, Event Hubs, RabbitMQ) and event-driven architecture for workflow orchestration.
  • Strong analytical skills with complex Excel models, including familiarity with advanced formulas, pivot tables, and user-defined Excel functions
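
To make the OAuth2/JWT item above concrete, a JWT is just two base64url-encoded JSON segments plus an HMAC signature. A minimal HS256 sign/verify sketch using only the standard library (production code should use a vetted library such as PyJWT):

```python
import base64, hashlib, hmac, json

def _b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(claims: dict, secret: bytes) -> str:
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(claims).encode())
    sig = hmac.new(secret, f"{header}.{body}".encode(), hashlib.sha256).digest()
    return f"{header}.{body}.{_b64url(sig)}"

def verify_jwt(token: str, secret: bytes) -> dict:
    header, body, sig = token.split(".")
    expected = hmac.new(secret, f"{header}.{body}".encode(),
                        hashlib.sha256).digest()
    if not hmac.compare_digest(_b64url(expected), sig):
        raise ValueError("bad signature")
    padded = body + "=" * (-len(body) % 4)   # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(padded))

token = sign_jwt({"sub": "user_42", "role": "admin"}, b"s3cret")
print(verify_jwt(token, b"s3cret"))
```

Note the constant-time comparison (`hmac.compare_digest`) and that real deployments also validate registered claims such as `exp` and `aud`.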

Preferred Qualifications

  • Experience building public or enterprise-level SDKs.
  • Hands-on experience with event-driven architectures, message queues, or streaming technologies
  • Familiarity with workflow orchestration tools (e.g., Airflow, Prefect, Dagster, Azure Data Factory)
  • Familiarity with data warehousing or analytical query optimization (Snowflake, BigQuery, Synapse, Redshift).
  • Exposure to MLOps tools like MLflow, BentoML, Seldon, SageMaker, Vertex AI, or Databricks ML.

Competencies:

  • Tech Savvy - Anticipating and adopting innovations in business-building digital and technology applications.
  • Self-Development - Actively seeking new ways to grow and be challenged using both formal and informal development channels.
  • Action Oriented - Taking on new opportunities and tough challenges with a sense of urgency, high energy, and enthusiasm.
  • Customer Focus - Building strong customer relationships and delivering customer-centric solutions.
  • Optimize Work Processes - Knowing the most effective and efficient processes to get things done, with a focus on continuous improvement.

Why Join Us?

  • Be part of a collaborative and agile team driving cutting-edge AI and data engineering solutions.
  • Work on impactful projects that make a difference across industries.
  • Opportunities for professional growth and continuous learning.
  • Competitive salary and benefits package.

Application Details

Ready to make an impact? Apply today and become part of the QX Impact team!


Read more
Tarento Group

at Tarento Group

3 candid answers
1 recruiter
Reshika Mendiratta
Posted by Reshika Mendiratta
Bengaluru (Bangalore)
4yrs+
Best in industry
skill iconJava
skill iconSpring Boot
Microservices
Windows Azure
RESTful APIs
+5 more

Job Summary:

We are seeking a highly skilled and self-driven Java Backend Developer with strong experience in designing and deploying scalable microservices using Spring Boot and Azure Cloud. The ideal candidate will have hands-on expertise in modern Java development, containerization, messaging systems like Kafka, and knowledge of CI/CD and DevOps practices.

Key Responsibilities:

  • Design, develop, and deploy microservices using Spring Boot on Azure cloud platforms.
  • Implement and maintain RESTful APIs, ensuring high performance and scalability.
  • Work with Java 11+ features including Streams, Functional Programming, and Collections framework.
  • Develop and manage Docker containers, enabling efficient development and deployment pipelines.
  • Integrate messaging services like Apache Kafka into microservice architectures.
  • Design and maintain data models using PostgreSQL or other SQL databases.
  • Implement unit testing using JUnit and mocking frameworks to ensure code quality.
  • Develop and execute API automation tests using Cucumber or similar tools.
  • Collaborate with QA, DevOps, and other teams for seamless CI/CD integration and deployment pipelines.
  • Work with Kubernetes for orchestrating containerized services.
  • Utilize Couchbase or similar NoSQL technologies when necessary.
  • Participate in code reviews, design discussions, and contribute to best practices and standards.

Required Skills & Qualifications:

  • Strong experience in Java (11 or above) and Spring Boot framework.
  • Solid understanding of microservices architecture and deployment on Azure.
  • Hands-on experience with Docker, and exposure to Kubernetes.
  • Proficiency in Kafka, with real-world project experience.
  • Working knowledge of PostgreSQL (or any SQL DB) and data modeling principles.
  • Experience in writing unit tests using JUnit and mocking tools.
  • Experience with Cucumber or similar frameworks for API automation testing.
  • Exposure to CI/CD tools, DevOps processes, and Git-based workflows.

Nice to Have:

  • Azure certifications (e.g., Azure Developer Associate)
  • Familiarity with Couchbase or other NoSQL databases.
  • Familiarity with other cloud providers (AWS, GCP)
  • Knowledge of observability tools (Prometheus, Grafana, ELK)

Soft Skills:

  • Strong problem-solving and analytical skills.
  • Excellent verbal and written communication.
  • Ability to work in an agile environment and contribute to continuous improvement.

Why Join Us:

  • Work on cutting-edge microservice architectures
  • Strong learning and development culture
  • Opportunity to innovate and influence technical decisions
  • Collaborative and inclusive work environment
Read more
RADCOM
Shreya Tiwari
Posted by Shreya Tiwari
Delhi, Gurugram, Noida, Ghaziabad, Faridabad
3 - 6 yrs
₹4L - ₹10L / yr
Linux/Unix
SQL
DOS/4G
Telecom
skill iconKubernetes
+1 more

Dear Candidate


Candidate must have:

 

  • Minimum 3-5 years of experience working as a NOC Engineer / Senior NOC Engineer in the telecom/product industry (preferably telecom monitoring).
  • BE in CS, EE, or Telecommunications from a recognized university.
  • Knowledge of NOC Process
  • Technology exposure to telecom (5G, 4G, IMS) with a solid understanding of telecom performance KPIs and/or Radio Access Networks; knowledge of call flows will be an advantage.
  • Experience with Linux OS and SQL – mandatory.
  • Residence in Delhi – mandatory.
  • Ready to work in a 24×7 environment.
  • Ability to monitor alarms based on our environment.
  • Capability to identify and resolve issues occurring in the RADCOM environment.
  • Any relevant technical certification will be an added advantage.

 

Responsibilities:

 

  • Based in RADCOM India offices, Delhi.
  • Responsible for all NOC monitoring and technical support (T1/T2) aspects required by the process for RADCOM’s solutions.
  • Ready to participate under Customer Planned activities / execution and monitoring.

 

Read more
DeepIntent

at DeepIntent

2 candid answers
17 recruiters
Amruta Mundale
Posted by Amruta Mundale
Pune
4 - 8 yrs
Best in industry
skill iconJava
SQL
skill iconSpring Boot
Apache
skill iconAmazon Web Services (AWS)
+1 more

What You’ll Do:

  • Setting up formal data practices for the company.
  • Building and running super stable and scalable data architectures.
  • Making it easy for folks to add and use new data with self-service pipelines.
  • Getting DataOps practices in place.
  • Designing, developing, and running data pipelines to help out Products, Analytics, data scientists and machine learning engineers.
  • Creating simple, reliable data storage, ingestion, and transformation solutions that are a breeze to deploy and manage.
  • Writing and Managing reporting API for different products.
  • Implementing different methodologies for different reporting needs.
  • Teaming up with all sorts of people – business folks, other software engineers, machine learning engineers, and analysts.

Who You Are:

  • Bachelor’s degree in engineering (CS / IT) or equivalent degree from a well-known Institute / University.
  • 3.5+ years of experience in building and running data pipelines for tons of data.
  • Experience with public clouds like GCP or AWS.
  • Experience with Apache open-source projects like Spark, Druid, Airflow, and big data databases like BigQuery, Clickhouse.
  • Experience making data architectures that are optimised for both performance and cost.
  • Good grasp of software engineering, DataOps, data architecture, Agile, and DevOps.
  • Proficient in SQL, Java, Spring Boot, Python, and Bash.
  • Good communication skills for working with technical and non-technical people.
  • Someone who thinks big, takes chances, innovates, dives deep, gets things done, hires and develops the best, and is always learning and curious.


Read more
Bengaluru (Bangalore)
6 - 10 yrs
₹15L - ₹28L / yr
Business Analysis
Data integration
SQL
PMS
CRS
+2 more

Job Description: Business Analyst – Data Integrations

Location: Bangalore / Hybrid / Remote

Company: LodgIQ

Industry: Hospitality / SaaS / Machine Learning

About LodgIQ

Headquartered in New York, LodgIQ delivers a revolutionary B2B SaaS platform to the travel industry. By leveraging machine learning and artificial intelligence, we enable precise forecasting and optimized pricing for hotel revenue management. Backed by Highgate Ventures and Trilantic Capital Partners, LodgIQ is a well-funded, high-growth startup with a global presence.

About the Role

We’re looking for a skilled Business Analyst – Data Integrations who can bridge the gap between business operations and technology teams, ensuring smooth, efficient, and scalable integrations. If you’re passionate about hospitality tech and enjoy solving complex data challenges, we’d love to hear from you!

What You’ll Do

Key Responsibilities

  • Collaborate with vendors to gather requirements for API development and ensure technical feasibility.
  • Collect API documentation from vendors; document and explain business logic to use external data sources effectively.
  • Access vendor applications to create and validate sample data; ensure the accuracy and relevance of test datasets.
  • Translate complex business logic into documentation for developers, ensuring clarity for successful integration.
  • Monitor all integration activities and support tickets in Jira, proactively resolving critical issues.
  • Lead QA testing for integrations, overseeing pilot onboarding and ensuring solution viability before broader rollout.
  • Document onboarding processes and best practices to streamline future integrations and improve efficiency.
  • Build, train, and deploy machine learning models for forecasting, pricing, and optimization, supporting strategic goals.
  • Drive end-to-end execution of data integration projects, including scoping, planning, delivery, and stakeholder communication.
  • Gather and translate business requirements into actionable technical specifications, liaising with business and technical teams.
  • Oversee maintenance and enhancement of existing integrations, performing RCA and resolving integration-related issues.
  • Document workflows, processes, and best practices for current and future integration projects.
  • Continuously monitor system performance and scalability, recommending improvements to increase efficiency.
  • Coordinate closely with Operations for onboarding and support, ensuring seamless handover and issue resolution.

Desired Skills & Qualifications

  • Strong experience in API integration, data analysis, and documentation.
  • Familiarity with Jira for ticket management and project workflow.
  • Hands-on experience with machine learning model development and deployment.
  • Excellent communication skills for requirement gathering and stakeholder engagement.
  • Experience with QA test processes and pilot rollouts.
  • Proficiency in project management, data workflow documentation, and system monitoring.
  • Ability to manage multiple integrations simultaneously and work cross-functionally.

Required Qualifications

  • Experience: Minimum 4 years in hotel technology or business analytics, preferably handling data integration or system interoperability projects.
  • Technical Skills:
  • Basic proficiency in SQL or database querying.
  • Familiarity with data integration concepts such as APIs or ETL workflows (preferred but not mandatory).
  • Eagerness to learn and adapt to new tools, platforms, and technologies.
  • Hotel Technology Expertise: Understanding of systems such as PMS, CRS, Channel Managers, or RMS.
  • Project Management: Strong organizational and multitasking abilities.
  • Problem Solving: Analytical thinker capable of troubleshooting and driving resolution.
  • Communication: Excellent written and verbal skills to bridge technical and non-technical discussions.
  • Attention to Detail: Methodical approach to documentation, testing, and deployment.

Preferred Qualification

  • Exposure to debugging tools and troubleshooting methodologies.
  • Familiarity with cloud environments (AWS).
  • Understanding of data security and privacy considerations in the hospitality industry.

Why LodgIQ?

  • Join a fast-growing, mission-driven company transforming the future of hospitality.
  • Work on intellectually challenging problems at the intersection of machine learning, decision science, and human behavior.
  • Be part of a high-impact, collaborative team with the autonomy to drive initiatives from ideation to production.
  • Competitive salary and performance bonuses.

For more information, visit https://www.lodgiq.com

Read more
Global digital transformation solutions provider.

Global digital transformation solutions provider.

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore)
7 - 9 yrs
₹15L - ₹28L / yr
databricks
skill iconPython
SQL
PySpark
skill iconAmazon Web Services (AWS)
+9 more

Role Proficiency:

This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept with ETL tools such as Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.


Skill Examples:

  1. Proficiency in SQL, Python, or other programming languages used for data manipulation.
  2. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
  3. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery).
  4. Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
  5. Experience in performance tuning.
  6. Experience in data warehouse design and cost improvements.
  7. Apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
  8. Communicate and explain design/development aspects to customers.
  9. Estimate time and resource requirements for developing/debugging features/components.
  10. Participate in RFP responses and solutioning.
  11. Mentor team members and guide them in relevant upskilling and certification.

 

Knowledge Examples:

  1. Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, Azure ADF, and ADLF.
  2. Proficient in SQL for analytics and windowing functions.
  3. Understanding of data schemas and models.
  4. Familiarity with domain-related data.
  5. Knowledge of data warehouse optimization techniques.
  6. Understanding of data security concepts.
  7. Awareness of patterns, frameworks, and automation practices.
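The "SQL for analytics and windowing functions" item above can be illustrated with a ranking query. This sketch uses SQLite via Python's `sqlite3` (window functions need SQLite 3.25+); the `sales` table and its values are invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("north", 100), ("north", 300), ("south", 200), ("south", 50)],
)

# Rank each sale within its region and compute a per-region total:
# PARTITION BY restarts the window per region, ORDER BY drives the ranking.
rows = conn.execute(
    """
    SELECT region,
           amount,
           ROW_NUMBER() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk,
           SUM(amount)  OVER (PARTITION BY region)                      AS region_total
    FROM sales
    ORDER BY region, rnk
    """
).fetchall()

for row in rows:
    print(row)
```

Unlike `GROUP BY`, the window aggregate keeps every detail row while attaching the per-region total alongside it.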


 

Additional Comments:

# of Resources: 22 | Role(s): Technical Role | Location(s): India | Planned Start Date: 1/1/2026 | Planned End Date: 6/30/2026

Project Overview:

Role Scope / Deliverables: We are seeking a highly skilled Data Engineer with strong experience in Databricks, PySpark, Python, SQL, and AWS to join our data engineering team on or before the first week of December 2025.

The candidate will be responsible for designing, developing, and optimizing large-scale data pipelines and analytics solutions that drive business insights and operational efficiency.

Design, build, and maintain scalable data pipelines using Databricks and PySpark.

Develop and optimize complex SQL queries for data extraction, transformation, and analysis.

Implement data integration solutions across multiple AWS services (S3, Glue, Lambda, Redshift, EMR, etc.).

Collaborate with analytics, data science, and business teams to deliver clean, reliable, and timely datasets.

Ensure data quality, performance, and reliability across data workflows.

Participate in code reviews, data architecture discussions, and performance optimization initiatives.

Support migration and modernization efforts for legacy data systems to modern cloud-based solutions.


Key Skills:

Hands-on experience with Databricks, PySpark & Python for building ETL/ELT pipelines.

Proficiency in SQL (performance tuning, complex joins, CTEs, window functions).

Strong understanding of AWS services (S3, Glue, Lambda, Redshift, CloudWatch, etc.).

Experience with data modeling, schema design, and performance optimization.

Familiarity with CI/CD pipelines, version control (Git), and workflow orchestration (Airflow preferred).

Excellent problem-solving, communication, and collaboration skills.

 

Skills: Databricks, PySpark & Python, SQL, AWS Services

 

Must-Haves

Python/PySpark (5+ years), SQL (5+ years), Databricks (3+ years), AWS Services (3+ years), ETL tools (Informatica, Glue, DataProc) (3+ years)

Hands-on experience with Databricks, PySpark & Python for ETL/ELT pipelines.

Proficiency in SQL (performance tuning, complex joins, CTEs, window functions).

Strong understanding of AWS services (S3, Glue, Lambda, Redshift, CloudWatch, etc.).

Experience with data modeling, schema design, and performance optimization.

Familiarity with CI/CD pipelines, Git, and workflow orchestration (Airflow preferred).


******

Notice period - Immediate to 15 days

Location: Bangalore

Read more
Mantle Solutions- A Lulu Group Company
Nikita Sinha
Posted by Nikita Sinha
Bangalore (Whitefield)
2 - 4 yrs
Upto ₹20L / yr (Varies)
skill iconPython
SQL
skill iconMachine Learning (ML)
skill iconData Analytics

We are seeking a hands-on eCommerce Analytics & Insights Lead to help establish and scale our newly launched eCommerce business. The ideal candidate is highly data-savvy, understands eCommerce deeply, and can lead KPI definition, performance tracking, insights generation, and data-driven decision-making.

You will work closely with cross-functional teams—Buying, Marketing, Operations, and Technology—to build dashboards, uncover growth opportunities, and guide the evolution of our online channel.


Key Responsibilities

Define & Monitor eCommerce KPIs

  • Set up and track KPIs across the customer journey: traffic, conversion, retention, AOV/basket size, repeat rate, etc.
  • Build KPI frameworks aligned with business goals.

Data Tracking & Infrastructure

  • Partner with marketing, merchandising, operations, and tech teams to define data tracking requirements.
  • Collaborate with eCommerce and data engineering teams to ensure data quality, completeness, and availability.

Dashboards & Reporting

  • Build dashboards and automated reports to track:
  • Overall site performance
  • Category & product performance
  • Marketing ROI and acquisition effectiveness

Insights & Performance Diagnosis

Identify trends, opportunities, and root causes of underperformance in areas such as:

  • Product availability & stock health
  • Pricing & promotions
  • Checkout funnel drop-offs
  • Customer retention & cohort behavior
  • Channel acquisition performance

Conduct:

  • Cohort analysis
  • Funnel analytics
  • Customer segmentation
  • Basket analysis
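Funnel analytics of the kind listed above reduces to step-over-step conversion: count distinct users reaching each step and divide by the previous step's count. A minimal sketch with invented event data:

```python
# (user_id, step) pairs — an illustrative clickstream.
events = [
    (1, "visit"), (1, "add_to_cart"), (1, "checkout"),
    (2, "visit"), (2, "add_to_cart"),
    (3, "visit"),
]
steps = ["visit", "add_to_cart", "checkout"]

def funnel(events, steps):
    # Distinct users observed at each step.
    users_at = {s: {u for u, e in events if e == s} for s in steps}
    report = []
    for prev, cur in zip(steps, steps[1:]):
        reached = len(users_at[cur])
        base = len(users_at[prev])
        # (step, users reaching it, conversion from the previous step)
        report.append((cur, reached, round(reached / base, 2) if base else 0.0))
    return report

print(funnel(events, steps))
# → [('add_to_cart', 2, 0.67), ('checkout', 1, 0.5)]
```

A strict funnel would additionally require each user to have completed every earlier step; this simplified version counts users per step independently, which is often the first cut before moving to SQL or a product-analytics tool.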

Data-Driven Growth Initiatives

  • Propose and evaluate experiments, optimization ideas, and quick wins.
  • Help business teams interpret KPIs and take informed decisions.

Required Skills & Experience

  • 2–5 years of experience in eCommerce analytics (grocery retail experience preferred).
  • Strong understanding of eCommerce metrics and analytics frameworks (Traffic → Conversion → Repeat → LTV).
  • Proficiency with tools such as:
  • Google Analytics / GA4
  • Excel
  • SQL
  • Power BI or Tableau
  • Experience working with:
  • Digital marketing data
  • CRM and customer data
  • Product/category performance data
  • Ability to convert business questions into analytical tasks and produce clear, actionable insights.
  • Familiarity with:
  • Customer journey mapping
  • Funnel analysis
  • Basket and behavioral analysis
  • Comfortable working in fast-paced, ambiguous, and build-from-scratch environments.
  • Strong communication and stakeholder management skills.
  • Strong technical capability in at least one programming language: SQL or PySpark.

Good to Have

  • Experience with eCommerce platforms (Shopify, Magento, Salesforce Commerce, etc.).
  • Exposure to A/B testing, recommendation engines, or personalization analytics.
  • Knowledge of Python/R for deeper analytics (optional).
  • Experience with tracking setup (GTM, event tagging, pixel/event instrumentation).
Read more
Loyalytics

at Loyalytics

2 recruiters
Nikita Sinha
Posted by Nikita Sinha
Bengaluru (Bangalore)
4 - 7 yrs
Upto ₹22L / yr (Varies)
SQL
PowerBI
skill iconData Analytics
Customer Relationship Management (CRM)

In this role, you will drive and support customer analytics for HP’s online store business across the APJ region. You will lead campaign performance analytics, customer database intelligence, and enable data-driven targeting for automation and trigger programs. Your insights will directly shape customer engagement, marketing strategy, and business decision-making.


You will be part of the International Customer Management team, which focuses on customer strategy, base value, monetization, and brand consideration. As part of HP’s Digital Direct organization, you will support the company’s strategic transformation toward direct-to-customer excellence.


Join HP—a US$50B global technology leader known for innovation and being #1 in several business domains.


Key Responsibilities

Customer Insights & Analytics

  • Design and deploy customer success and engagement metrics across APJ.
  • Analyze customer behavior and engagement to drive data-backed marketing decisions.
  • Apply statistical techniques to translate raw data into meaningful insights.

Campaign Performance & Optimization

  • Elevate marketing campaigns across APJ by enabling advanced targeting criteria, performance monitoring, and test-and-learn frameworks.
  • Conduct campaign measurement, identifying trends, patterns, and optimization opportunities.

Data Management & Reporting

  • Develop a deep understanding of business data across markets.
  • Build and maintain SQL-based data assets: tables, stored procedures, scripts, queries, and SQL views.
  • Provide reporting and dashboards for marketing, sales, and CRM teams using Tableau or Power BI.
  • Measure and monitor strategic initiatives against KPIs and provide uplift forecasts for prioritization.
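The "SQL views" responsibility above is about giving reporting tools a stable, analytics-ready interface over raw event tables. A small sketch using SQLite via `sqlite3` (the `sends` table and campaign names are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sends (campaign TEXT, opened INTEGER)")
conn.executemany(
    "INSERT INTO sends VALUES (?, ?)",
    [("spring", 1), ("spring", 0), ("spring", 1), ("summer", 0)],
)

# The view pre-aggregates per campaign, so dashboards query
# campaign_summary instead of re-deriving the logic from raw sends.
conn.execute(
    """
    CREATE VIEW campaign_summary AS
    SELECT campaign,
           COUNT(*)    AS sends,
           AVG(opened) AS open_rate
    FROM sends
    GROUP BY campaign
    """
)
rows = conn.execute(
    "SELECT campaign, sends, open_rate FROM campaign_summary ORDER BY campaign"
).fetchall()
print(rows)
```

Because a view is just stored query text, the aggregation logic can be fixed in one place and every downstream dashboard picks up the change on its next query.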

Required Experience

  • 4+ years of relevant experience (flexible for strong profiles).
  • Proficiency in SQL, including:
  • Database design principles
  • Query optimization
  • Data integrity checks
  • Building SQL views, stored procedures, and analytics-ready datasets
  • Experience translating analytics into business outcomes.
  • Hands-on experience analyzing campaign performance.
  • Expertise with data visualization tools such as Tableau or Power BI.
  • Experience with campaign management/marketing automation platforms (preferably Salesforce Marketing Cloud).

About You

  • Strong advocate of customer data–driven marketing.
  • Comfortable working hands-on with data and solving complex problems.
  • Confident communicator who can work with multiple cross-functional stakeholders.
  • Passionate about experimentation (test & learn) and continuous improvement.
  • Self-driven, accountable, and motivated by ownership.
  • Thrive in a diverse, international, dynamic environment.


Read more
Global digital transformation solutions provider.

Global digital transformation solutions provider.

Agency job
via Peak Hire Solutions by Dhara Thakkar
Bengaluru (Bangalore), Chennai, Kochi (Cochin), Trivandrum, Hyderabad, Thiruvananthapuram
8 - 10 yrs
₹10L - ₹25L / yr
Business Analysis
Data Visualization
PowerBI
SQL
Tableau
+18 more

Job Description – Senior Technical Business Analyst

Location: Trivandrum (Preferred) | Open to any location in India

Shift Timings - 8 hours window between the 7:30 PM IST - 4:30 AM IST

 

About the Role

We are seeking highly motivated and analytically strong Senior Technical Business Analysts who can work seamlessly with business and technology stakeholders to convert a one-line problem statement into a well-defined project or opportunity. This role is ideal for candidates who have a strong foundation in data analytics, data engineering, data visualization, and data science, along with a strong drive to learn, collaborate, and grow in a dynamic, fast-paced environment.

As a Technical Business Analyst, you will be responsible for translating complex business challenges into actionable user stories, analytical models, and executable tasks in Jira. You will work across the entire data lifecycle—from understanding business context to delivering insights, solutions, and measurable outcomes.

 

Key Responsibilities

Business & Analytical Responsibilities

  • Partner with business teams to understand one-line problem statements and translate them into detailed business requirements, opportunities, and project scope.
  • Conduct exploratory data analysis (EDA) to uncover trends, patterns, and business insights.
  • Create documentation including Business Requirement Documents (BRDs), user stories, process flows, and analytical models.
  • Break down business needs into concise, actionable, and development-ready user stories in Jira.

Data & Technical Responsibilities

  • Collaborate with data engineering teams to design, review, and validate data pipelines, data models, and ETL/ELT workflows.
  • Build dashboards, reports, and data visualizations using leading BI tools to communicate insights effectively.
  • Apply foundational data science concepts such as statistical analysis, predictive modeling, and machine learning fundamentals.
  • Validate and ensure data quality, consistency, and accuracy across datasets and systems.

Collaboration & Execution

  • Work closely with product, engineering, BI, and operations teams to support the end-to-end delivery of analytical solutions.
  • Assist in development, testing, and rollout of data-driven solutions.
  • Present findings, insights, and recommendations clearly and confidently to both technical and non-technical stakeholders.

 

Required Skillsets

Core Technical Skills

  • 6+ years of Technical Business Analyst experience within an overall professional experience of 8+ years
  • Data Analytics: SQL, descriptive analytics, business problem framing.
  • Data Engineering (Foundational): Understanding of data warehousing, ETL/ELT processes, cloud data platforms (AWS/GCP/Azure preferred).
  • Data Visualization: Experience with Power BI, Tableau, or equivalent tools.
  • Data Science (Basic/Intermediate): Python/R, statistical methods, fundamentals of ML algorithms.

 

Soft Skills

  • Strong analytical thinking and structured problem-solving capability.
  • Ability to convert business problems into clear technical requirements.
  • Excellent communication, documentation, and presentation skills.
  • High curiosity, adaptability, and eagerness to learn new tools and techniques.

 

Educational Qualifications

  • BE/B.Tech or equivalent in:
  • Computer Science / IT
  • Data Science

 

What We Look For

  • Demonstrated passion for data and analytics through projects and certifications.
  • Strong commitment to continuous learning and innovation.
  • Ability to work both independently and in collaborative team environments.
  • Passion for solving business problems using data-driven approaches.
  • Proven ability (or aptitude) to convert a one-line business problem into a structured project or opportunity.

 

Why Join Us?

  • Exposure to modern data platforms, analytics tools, and AI technologies.
  • A culture that promotes innovation, ownership, and continuous learning.
  • Supportive environment to build a strong career in data and analytics.

 

Skills: Data Analytics, Business Analysis, SQL


Must-Haves

Technical Business Analyst (6+ years), SQL, Data Visualization (Power BI, Tableau), Data Engineering (ETL/ELT, cloud platforms), Python/R

 

******

Notice period - 0 to 15 days (Max 30 Days)

Educational Qualifications: BE/B.Tech or equivalent in: (Computer Science / IT) /Data Science

Location: Trivandrum (Preferred) | Open to any location in India

Shift Timings - 8 hours window between the 7:30 PM IST - 4:30 AM IST

Read more
Quanteon Solutions
DurgaPrasad Sannamuri
Posted by DurgaPrasad Sannamuri
Hyderabad
0 - 2 yrs
₹3L - ₹5L / yr
skill iconJava
skill iconPython
skill iconJavascript
Selenium
Playwright
+13 more

About the Role

We are looking for a strong, self-driven QA Engineer who can perform a hybrid role in the new testing paradigm — acting as both a Business Analyst (BA) and a Quality Assurance (QA) professional. The ideal candidate should be capable of understanding business needs under direction, translating them into clear requirements, and then validating them through effective QA practices.

This role requires someone who can leverage AI tools extensively to automate and optimize both requirements documentation and QA activities, reducing manual effort while improving speed and accuracy.

 

Key Responsibilities

Business Analysis Responsibilities

  • Work under direction to understand business problems, workflows, and client expectations
  • Elicit, analyze, and document business and functional requirements
  • Create and maintain BRDs, FRDs, user stories, acceptance criteria, and process flows
  • Collaborate with stakeholders, developers, and product teams to clarify requirements
  • Use AI tools to assist with requirement generation, refinement, documentation, and validation

Quality Assurance Responsibilities

  • Design, develop, and execute manual and automated test cases based on documented requirements
  • Perform functional, regression, smoke, sanity, and UAT testing
  • Ensure traceability between requirements and test cases
  • Identify, log, track, and retest defects using defect tracking tools
  • Collaborate closely with development teams to ensure quality delivery
  • Use AI-powered QA tools to automate test case creation, execution, and maintenance

AI & Automation Focus

  • Use AI tools to:
  • Generate and refine requirements and user stories
  • Auto-create test cases from requirements
  • Optimize regression test suites
  • Perform test data generation and defect analysis
  • Continuously identify areas where AI can reduce manual effort and improve efficiency
  • Ensure quality, accuracy, and business alignment of AI-generated outputs

 

Required Skills & Qualifications

  • 1–3 years of experience in QA / Software Testing, with exposure to Business Analysis activities
  • Strong understanding of SDLC, STLC, and Agile methodologies
  • Proven ability to understand requirements and translate them into effective test scenarios
  • Experience with QA Automation tools (Selenium, Cypress, Playwright, or similar)
  • Hands-on experience using AI tools for QA and documentation (AI test generators, AI copilots, testRigor, Gen AI tools, etc.)
  • Good knowledge of test case design techniques and requirement traceability
  • Basic to intermediate knowledge of programming/scripting languages (Java, JavaScript, or Python)
  • Experience with API testing (Postman or similar tools)
  • Familiarity with JIRA, Confluence, or similar tools
  • Strong analytical, problem-solving, and documentation skills
  • Ability to take direction, work independently, and deliver with minimal supervision

Educational Qualifications

  • B.Tech / B.E in IT, CSE, AI/ML, ECE
  • M.Tech / M.E in IT, CSE, AI/ML, ECE
  • Strong academic foundation in programming, software engineering, or testing concepts is preferred
  • Certifications in Software Testing, Automation, or AI tools (optional but an added advantage)
Read more
Phi Commerce

at Phi Commerce

2 candid answers
Nikita Sinha
Posted by Nikita Sinha
Pune
3 - 8 yrs
Upto ₹18L / yr (Varies)
Java
Spring Boot
Microservices
SQL
Angular (2+)

We are seeking skilled and experienced Java Full Stack Developers to join our engineering team. The ideal candidate will have strong backend expertise in Java, Spring Boot, Microservices and hands-on frontend experience with Angular (version 11 or higher). This role requires the ability to build scalable, high-performance applications while working closely across teams such as Product, QA, and Architecture.


Responsibilities

  • Develop, test, and deploy scalable and robust backend services using Java, Spring Boot, and Microservices.
  • Build responsive, user-friendly web applications using Angular (v11+).
  • Collaborate with architects and team members to design scalable, maintainable, and efficient systems.
  • Contribute to system architecture discussions for microservices, APIs, and integrations.
  • Implement and maintain RESTful APIs for seamless frontend-backend interaction.
  • Optimize application performance and perform debugging across the full stack.
  • Write clean, reusable, and maintainable code following engineering best practices.
  • Work cross-functionally with UI/UX, Product Management, QA, and DevOps teams.
  • Mentor junior engineers (for senior positions).

Mandatory Skills

Backend:

  • Java / Java 8
  • Spring Boot
  • Spring Framework
  • Microservices
  • REST API development
  • SQL (MySQL or similar relational database)

Frontend:

  • Angular 11 or higher (mandatory)
  • TypeScript, JavaScript
  • HTML, CSS
  • Note: React or other frameworks are not accepted

Other Mandatory Skills:

  • Strong experience working in Linux-based systems
  • Ability to troubleshoot issues across the full stack
  • Understanding of scalable architecture principles

Preferred Skills

  • Experience in Fintech / Payments / Banking domain
  • Knowledge of caching, performance optimization, and security best practices
  • Exposure to Kafka or messaging systems
  • Hands-on experience with CI/CD pipelines (good to have)

Candidate Profile

  • Strong communication and problem-solving skills
  • Ability to work in a fast-paced environment
  • Collaborative mindset with ownership mentality
  • Open to working from office (Pune, 5 days a week)
  • Willing to travel for the final in-person interview (if not based in Pune)


Phi Commerce

at Phi Commerce

2 candid answers
Nikita Sinha
Posted by Nikita Sinha
Pune
3 - 8 yrs
Upto ₹18L / yr (Varies)
Java
Spring Boot
Microservices
SQL
Linux/Unix

We are seeking an experienced and highly skilled Backend Java Engineer to join our team.

The ideal candidate will have strong expertise in Core Java, Spring Boot, Microservices, and building high-performance, scalable backend applications.


Responsibilities

  • Develop, test, and deploy scalable and robust backend services using Java, Spring Boot, and Spring Framework.
  • Design and implement RESTful APIs for seamless integrations.
  • Contribute to architectural decisions involving microservices, APIs, and cloud-based solutions.
  • Write clean, efficient, and reusable code following coding standards and best practices.
  • Optimize application performance and participate in debugging and troubleshooting sessions.
  • Collaborate with architects, product managers, and QA engineers to deliver high-quality releases.
  • Conduct peer code reviews and ensure adherence to engineering best practices.
  • Mentor junior engineers and support their technical growth where required.

Skills & Requirements

  • Minimum 2 years of hands-on backend development experience.
  • Strong proficiency in:
      • Core Java / Java 8
      • Spring Boot, Spring Framework
      • Microservices architecture
      • REST APIs
  • Experience with:
      • Kafka (preferred)
      • MySQL or other relational databases
      • Batch processing, application performance tuning, caching strategies
      • Web security / application security
  • Solid understanding of software design principles and scalable system design.

Preferred

  • Male candidates preferred (client-mentioned requirement).
  • Experience working in fintech, payments, or high-scale production environments
Phi Commerce

at Phi Commerce

2 candid answers
Nikita Sinha
Posted by Nikita Sinha
Pune
11 - 15 yrs
Upto ₹32L / yr (Varies)
Linux/Unix
SQL
Shell Scripting
Amazon Web Services (AWS)
CI/CD
+2 more

The Production Infrastructure Manager is responsible for overseeing and maintaining the infrastructure that powers our payment gateway systems in a high-availability production environment. This role requires deep technical expertise in cloud platforms, networking, and security, along with strong leadership capability to guide a team of infrastructure engineers. You will ensure the system’s reliability, performance, and compliance with regulatory standards while driving continuous improvement.


Key Responsibilities:

Infrastructure Management

  • Manage and optimize infrastructure for payment gateway systems to ensure high availability, reliability, and scalability.
  • Oversee daily operations of production environments, including AWS cloud services, load balancers, databases, and monitoring systems.
  • Implement and maintain infrastructure automation, provisioning, configuration management, and disaster recovery strategies.
  • Develop and maintain capacity planning, monitoring, and backup mechanisms to support peak transaction periods.
  • Oversee regular patching, updates, and version control to minimize vulnerabilities.

Team Leadership

  • Lead and mentor a team of infrastructure engineers and administrators.
  • Provide technical direction to ensure efficient and effective implementation of infrastructure solutions.

Cross-Functional Collaboration

  • Work closely with development, security, and product teams to ensure infrastructure aligns with business needs and regulatory requirements (PCI-DSS, GDPR).
  • Ensure infrastructure practices meet industry standards and security requirements (PCI-DSS, ISO 27001).

Monitoring & Incident Management

  • Monitor infrastructure performance using tools like Prometheus, Grafana, Datadog, etc.
  • Conduct incident response, root cause analysis, and post-mortems to prevent recurring issues.
  • Manage and execute on-call duties, ensuring timely resolution of infrastructure-related issues.

Documentation

  • Maintain comprehensive documentation, including architecture diagrams, processes, and disaster recovery plans.

Skills and Qualifications

Required

  • Bachelor’s degree in Computer Science, IT, or equivalent experience.
  • 8+ years of experience managing production infrastructure in high-availability, mission-critical environments (fintech or payment gateways preferred).
  • Expertise in AWS cloud environments.
  • Strong experience with Infrastructure as Code (IaC) tools such as Terraform or CloudFormation.
  • Deep understanding of:
      • Networking (load balancers, firewalls, VPNs, distributed systems)
      • Database systems (SQL/NoSQL), HA & DR strategies
      • Automation tools (Ansible, Chef, Puppet) and containerization/orchestration (Docker, Kubernetes)
      • Security best practices, encryption, vulnerability management, PCI-DSS compliance
  • Experience with monitoring tools (Prometheus, Grafana, Datadog).
  • Strong analytical and problem-solving skills.
  • Excellent communication and leadership capabilities.

Preferred

  • Experience in fintech/payment industry with regulatory exposure.
  • Ability to operate effectively under pressure and ensure service continuity.


Albert Invent

at Albert Invent

4 candid answers
3 recruiters
Nikita Sinha
Posted by Nikita Sinha
Bengaluru (Bangalore)
4 - 6 yrs
Upto ₹30L / yr (Varies)
Python
AWS Lambda
Amazon Redshift
Snowflake schema
SQL

To design, build, and optimize scalable data infrastructure and pipelines that enable efficient data collection, transformation, and analysis across the organization. The Senior Data Engineer will play a key role in driving data architecture decisions, ensuring data quality and availability, and empowering analytics, product, and engineering teams with reliable, well-structured data to support business growth and strategic decision-making.


Responsibilities:

  • Develop and maintain SQL and NoSQL databases, ensuring high performance, scalability, and reliability.
  • Collaborate with the API team and Data Science team to build robust data pipelines and automations.
  • Work closely with stakeholders to understand database requirements and provide technical solutions.
  • Optimize database queries and tune performance to enhance overall system efficiency.
  • Implement and maintain data security measures, including access controls and encryption.
  • Monitor database systems and troubleshoot issues proactively to ensure uninterrupted service.
  • Develop and enforce data quality standards and processes to maintain data integrity.
  • Create and maintain documentation for database architecture, processes, and procedures.
  • Stay updated with the latest database technologies and best practices to drive continuous improvement.
  • Expertise in SQL queries and stored procedures, with the ability to optimize and fine-tune complex queries for performance and efficiency.
  • Experience with monitoring and visualization tools such as Grafana to monitor database performance and health.
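The query fine-tuning described above often comes down to checking what the planner does before and after adding an index. A minimal sketch, using Python's built-in sqlite3 as a stand-in for a real database (the orders table and column names are hypothetical):

```python
import sqlite3

# In-memory database with a hypothetical orders table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(1000)])

query = "SELECT SUM(total) FROM orders WHERE customer_id = ?"

# Without an index on the filtered column, the planner scans the whole table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

# After indexing customer_id, the planner can seek directly to matching rows.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

print(plan_before[0][-1])  # full-table scan
print(plan_after[0][-1])   # index search on idx_orders_customer
```

The same before/after discipline applies to MySQL or PostgreSQL via their own `EXPLAIN` output; only the plan syntax differs.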


Requirements:

  • 4+ years of experience in data engineering, with a focus on large-scale data systems.
  • Proven experience designing data models and access patterns across SQL and NoSQL ecosystems.
  • Hands-on experience with technologies like PostgreSQL, DynamoDB, S3, GraphQL, or vector databases.
  • Proficient in SQL stored procedures, with extensive expertise in MySQL schema design, query optimization, and resolvers, along with hands-on experience in building and maintaining data warehouses.
  • Strong programming skills in Python or JavaScript, with the ability to write efficient, maintainable code.
  • Familiarity with distributed systems, data partitioning, and consistency models.
  • Familiarity with observability stacks (Prometheus, Grafana, OpenTelemetry) and debugging production bottlenecks.
  • Deep understanding of cloud infrastructure (preferably AWS), including networking, IAM, and cost optimization.
  • Prior experience building multi-tenant systems with strict performance and isolation guarantees.
  • Excellent communication and collaboration skills to influence cross-functional technical decisions.

Global digital transformation solutions provider.

Global digital transformation solutions provider.

Agency job
via Peak Hire Solutions by Dhara Thakkar
Thiruvananthapuram, Chennai, Pune
4 - 7 yrs
₹10L - ₹20L / yr
skill iconC#
Test Automation (QA)
Manual testing
Play Framework
SQL
+6 more

Role Proficiency:

Performs tests in strict compliance with defined processes, independently guides other testers, and assists test leads.


Additional Comments:

Position Title: Automation + Manual Tester
Primary Skills: Playwright, xUnit, Allure Report, Page Object Model, .NET, C#, Database Queries
Secondary Skills: Git, JIRA, Manual Testing
Experience: 4 to 5 years

ESSENTIAL FUNCTIONS AND BASIC DUTIES

1. Leadership in Automation Strategy:
      • Assess the feasibility and scope of automation efforts to ensure they align with project timelines and requirements.
      • Identify opportunities for process improvements and automation within the software development life cycle (SDLC).

2. Automation Test Framework Development:
      • Design, develop, and implement reusable test automation frameworks for various testing phases (unit, integration, functional, performance, etc.).
      • Ensure the automation frameworks integrate well with CI/CD pipelines and other development tools.
      • Maintain and optimize test automation scripts and frameworks for continuous improvement.

3. Team Management:
      • Lead and mentor a team of automation engineers, ensuring they follow best practices, write efficient test scripts, and develop scalable automation solutions.
      • Conduct regular performance evaluations and provide constructive feedback.
      • Facilitate knowledge-sharing sessions within the team.

4. Collaboration with Cross-functional Teams:
      • Work closely with development, QA, and operations teams to ensure proper implementation of automated testing and automation practices.
      • Collaborate with business analysts, product owners, and project managers to understand business requirements and translate them into automated test cases.

5. Continuous Integration & Delivery (CI/CD):
      • Ensure that automated tests are integrated into the CI/CD pipelines to facilitate continuous testing.
      • Identify and resolve issues related to the automation processes within the CI/CD pipeline.

6. Test Planning and Estimation:
      • Contribute to the test planning phase by identifying key automation opportunities.
      • Estimate the effort and time required for automating test cases and other automation tasks.

7. Test Reporting and Metrics:
      • Monitor automation test results and generate detailed reports on test coverage, defects, and progress.
      • Analyze test results to identify trends, bottlenecks, or issues in the automation process and make necessary improvements.

8. Automation Tools Management:
      • Evaluate, select, and manage automation tools and technologies that best meet the needs of the project.
      • Ensure that the automation tools used align with the overall project requirements and help achieve optimal efficiency.

9. Test Environment and Data Management:
      • Work on setting up and maintaining the test environments needed for automation.
      • Ensure automation scripts work across multiple environments, including staging, testing, and production.

10. Risk Management & Issue Resolution:
      • Proactively identify risks associated with the automation efforts and provide solutions or mitigation strategies.
      • Troubleshoot issues in the automation scripts, framework, and infrastructure to ensure minimal downtime and quick issue resolution.

11. Develop and Maintain Automated Tests: Write and maintain automated scripts for different testing levels, including regression, functional, and integration tests.

12. Bug Identification and Tracking: Report, track, and manage defects identified through automation testing to ensure quick resolution.

13. Improve Test Coverage: Identify gaps in test coverage and develop additional test scripts to improve test comprehensiveness.

14. Automation Documentation: Create and maintain detailed documentation for test automation processes, scripts, and frameworks.

15. Quality Assurance: Ensure that all automated testing activities meet quality standards, contributing to delivering a high-quality software product.

16. Stakeholder Communication: Regularly update project stakeholders about automation progress, risks, and areas for improvement.


REQUIRED KNOWLEDGE

1. Automation Tools Expertise: Proficiency in tools like Playwright and Allure Report, and their integration with CI/CD pipelines.

2. Programming Languages: Strong knowledge of languages such as C# (on .NET) and test frameworks like xUnit.

3. Version Control: Experience using Git for script management and collaboration.

4. Test Automation Frameworks: Ability to design scalable, reusable frameworks for different types of tests (functional, integration, etc.).

5. Leadership and Mentoring: Lead and mentor automation teams, ensuring adherence to best practices and continuous improvement.

6. Problem-Solving: Strong troubleshooting and analytical skills to identify and resolve automation issues quickly.

7. Collaboration and Communication: Excellent communication skills for working with cross-functional teams and presenting test results.

8. Time Management: Ability to estimate, prioritize, and manage automation tasks to meet project deadlines.

9. Quality Focus: Strong commitment to improving software quality, test coverage, and automation efficiency.


Skills: xUnit, Allure Report, Playwright, C#
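The Page Object Model listed among the primary skills keeps selectors and page actions out of the tests themselves. A minimal framework-agnostic sketch (Python is used here for brevity, though the role itself targets Playwright with C#/xUnit; all class and selector names are hypothetical):

```python
class FakeDriver:
    """Stand-in for a real browser driver (a Playwright page, Selenium driver, ...)."""
    def __init__(self):
        self.fields = {}
        self.last_clicked = None

    def fill(self, selector, value):
        self.fields[selector] = value

    def click(self, selector):
        self.last_clicked = selector


class LoginPage:
    """Page object: selectors and user actions live here, not in the tests."""
    USER = "#username"
    PASS = "#password"
    SUBMIT = "#submit"

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.fill(self.USER, user)
        self.driver.fill(self.PASS, password)
        self.driver.click(self.SUBMIT)


driver = FakeDriver()
LoginPage(driver).login("alice", "s3cret")
print(driver.fields)        # {'#username': 'alice', '#password': 's3cret'}
print(driver.last_clicked)  # #submit
```

When a selector changes, only the page object is edited; every test that calls `login()` keeps working unchanged, which is the main payoff of the pattern.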

CFRA

at CFRA

4 candid answers
2 recruiters
Bisman Gill
Posted by Bisman Gill
Remote only
3yrs+
Upto ₹15L / yr (Varies)
Amazon Web Services (AWS)
SQL
Selenium
Appium
Cypress
+3 more

The Quality Engineer is responsible for planning, developing, and executing tests for CFRA’s financial software. Responsibilities include designing and implementing tests, debugging, and defining corrective actions. The role plays an important part in our company’s product development process. Our ideal candidate will conduct tests to ensure software runs efficiently and meets client needs while remaining cost-effective.

You will be part of the CFRA Data Collection Team, responsible for collecting, processing, and publishing financial market data for internal and external stakeholders. The team uses a contemporary stack in the AWS Cloud to design, build, and maintain a robust data architecture, data engineering pipelines, and large-scale data systems. You will be responsible for verifying and validating all data quality and completeness parameters for the automated (ETL) pipeline processes, both new and existing.

Key Responsibilities

  • Review requirements, specifications and technical design documents to provide timely and meaningful feedback
  • Create detailed, comprehensive and well-structured test plans and test cases
  • Estimate, prioritize, plan and coordinate testing activities
  • Identify, record, document thoroughly and track bugs
  • Develop and apply testing processes for new and existing products to meet client needs
  • Liaise with internal teams to identify system requirements and develop testing plans
  • Investigate the causes of non-conforming software and train users to implement solutions
  • Stay up-to-date with new testing tools and test strategies

Desired Skills

  • Proven work experience in software development and quality assurance
  • Strong knowledge of software QA methodologies, tools and processes
  • Experience in writing clear, concise and comprehensive test plans and test cases
  • Hands-on experience with automated testing tools
  • Acute attention to detail
  • Experience working in an Agile/Scrum development process
  • Excellent collaboration skills

 Technical Skills

  • Proficient with SQL, and capable of developing queries for testing
  • Familiarity with Python, especially for scripting tests
  • Familiarity with Cloud Technology and working with remote servers
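Developing SQL queries for testing, as the skills above describe, typically means encoding data-quality rules as queries whose results a script asserts on. A minimal sketch, with sqlite3 standing in for the real warehouse (the prices table and its rules are hypothetical):

```python
import sqlite3

# Hypothetical market-data table loaded by an ETL pipeline.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prices (ticker TEXT, trade_date TEXT, close REAL)")
conn.executemany("INSERT INTO prices VALUES (?, ?, ?)", [
    ("AAPL", "2024-01-02", 185.64),
    ("AAPL", "2024-01-03", 184.25),
    ("MSFT", "2024-01-02", None),   # a completeness defect the checks should catch
])

def count(sql):
    return conn.execute(sql).fetchone()[0]

# Data-quality rules expressed as plain SQL queries.
null_closes = count("SELECT COUNT(*) FROM prices WHERE close IS NULL")
dupes = count("""SELECT COUNT(*) FROM (
    SELECT ticker, trade_date FROM prices
    GROUP BY ticker, trade_date HAVING COUNT(*) > 1)""")

print(null_closes)  # 1 -> one incomplete row to log as a defect
print(dupes)        # 0 -> no duplicate (ticker, trade_date) pairs
```

In practice the same queries would run against the pipeline's target database, with failures reported through the team's defect-tracking tool.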

