50+ SQL Jobs in India
Apply to 50+ SQL Jobs on CutShort.io. Find your next job, effortlessly. Browse SQL Jobs and apply today!
Hiring for Lead Data Engineer
Exp: 6–10 yrs
Edu: Any Graduate
Work Location: Noida (WFO)
Skills:
Team handling experience
Advanced SQL and PySpark
Data Engineering concepts (Data Warehouse (DW), Data Lake, OLTP vs OLAP, etc.)
API development experience (preferably in Python)
Familiarity with Docker and Kubernetes
Experience with Airflow and DBT
Exposure to Hudi, Iceberg, or Delta Lake
Strong AWS project experience
Hiring for Data Engineer - AWS
Exp: 3–6 yrs
Edu: BE/B.Tech
Work Location: Noida (WFO)
Skills:
Data Engineering, PySpark, SQL, AWS, Data Pipelines, Airflow, Hadoop
We are looking for a detail-oriented **QA / Software Tester (Fresher)** to join our development team. You will be responsible for testing applications, identifying bugs, and ensuring software quality before release. This role is ideal for fresh graduates who want to start a career in software testing and quality assurance.
Key Responsibilities
1. Test web and mobile applications for functionality and usability.
2. Identify, document, and track software defects.
3. Execute manual test cases and report results.
4. Work with developers to fix issues and verify fixes.
5. Perform regression and functional testing.
6. Ensure the product meets quality standards before release.
7. Assist in preparing test documentation and reports.
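Responsibilities such as executing test cases and performing regression checks can also be expressed in code once a fresher moves toward automation. Below is a minimal illustrative sketch using Python's built-in unittest module; the validate_login function is a hypothetical stand-in for the application under test, not part of any real product here.

```python
import unittest

def validate_login(username: str, password: str) -> bool:
    """Hypothetical function under test: accepts any non-empty
    username with a password of at least 8 characters."""
    return bool(username) and len(password) >= 8

class LoginFunctionalTests(unittest.TestCase):
    # Functional test: valid credentials should be accepted.
    def test_valid_credentials(self):
        self.assertTrue(validate_login("asha", "s3cretpass"))

    # Negative test: short passwords should be rejected.
    def test_short_password_rejected(self):
        self.assertFalse(validate_login("asha", "short"))

    # Regression test: an empty username must never pass.
    def test_empty_username_rejected(self):
        self.assertFalse(validate_login("", "s3cretpass"))

if __name__ == "__main__":
    unittest.main(exit=False, verbosity=2)
```

Each test method mirrors one row of a manual test-case sheet: a precondition, an action, and an expected result.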
Required Skills
1. Basic understanding of software testing concepts.
2. Knowledge of manual testing.
3. Basic understanding of SDLC and STLC.
4. Familiarity with web or mobile applications.
5. Basic knowledge of SQL or databases (optional).
6. Good analytical and problem-solving skills.
7. Attention to detail.
Preferred (Optional)
1. Knowledge of any testing tool (Selenium, JIRA, etc.).
2. Understanding of automation testing basics.
3. Internship or academic project experience.
Location: Remote / Chennai
Experience: 0–2 Years (Freshers Welcome)
Education: B.E / B.Tech / BCA / MCA / Any Computer-related Degree
Apply here: https://connectsblue.com/jobs/738/qa-software-tester-fresher-at-bluepms-software-solutions-pvt-ltd
As a Senior Data Engineer, you will be responsible for building and delivering a Lakehouse-based data pipeline. This is a hands-on role focused on implementing real-time and batch data ingestion, processing, and delivery workflows, while ensuring strong monitoring, observability, and data quality across the entire pipeline.
Must-Have Skills
- 3+ years of hands-on experience building large-scale data pipelines
- Strong experience with Spark Streaming, AWS Glue, and EMR for real-time and batch processing
- Proficiency in PySpark/Python, including building Kafka producers for data ingestion
- Experience working with Confluent Kafka and Spark Streaming for ingestion from on-premise sources
- Solid understanding of AWS services including:
- S3
- Redshift
- Glue
- CloudWatch
- Secrets Manager
- Experience working with Medallion Architecture and hybrid data destinations (e.g., Redshift + on-prem Oracle)
- Ability to implement monitoring dashboards and observability using tools like CloudWatch or Datadog
- Strong SQL skills for data validation and job-level metrics development
- Experience building alerting mechanisms for pipeline failures and performance issues
- Strong collaboration and communication skills
- Proven ownership mindset — driving deliverables from design to deployment
- Experience mentoring junior engineers, conducting code reviews, and guiding best practices
- AWS Certified Data Engineer – Associate certification (preferred)
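The "SQL skills for data validation and job-level metrics development" requirement above can be sketched briefly. The example below uses Python's built-in sqlite3 as a stand-in for the warehouse; the bronze/silver table names echo the Medallion layers mentioned above but are invented for illustration.

```python
import sqlite3

# In-memory database standing in for the warehouse; table and column
# names here are illustrative, not from any real pipeline.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE bronze_orders (order_id INTEGER, amount REAL);
    INSERT INTO bronze_orders VALUES (1, 10.0), (2, NULL), (3, 7.5);
    CREATE TABLE silver_orders (order_id INTEGER, amount REAL);
    INSERT INTO silver_orders VALUES (1, 10.0), (3, 7.5);
""")

def job_metrics(conn):
    """Compute simple job-level metrics: source/target row counts,
    rows dropped, and NULL-amount count in the source layer."""
    src = conn.execute("SELECT COUNT(*) FROM bronze_orders").fetchone()[0]
    tgt = conn.execute("SELECT COUNT(*) FROM silver_orders").fetchone()[0]
    nulls = conn.execute(
        "SELECT COUNT(*) FROM bronze_orders WHERE amount IS NULL"
    ).fetchone()[0]
    return {"source_rows": src, "target_rows": tgt,
            "rows_dropped": src - tgt, "null_amounts": nulls}

metrics = job_metrics(conn)
print(metrics)
# {'source_rows': 3, 'target_rows': 2, 'rows_dropped': 1, 'null_amounts': 1}
```

In a real pipeline these counts would be emitted to CloudWatch or Datadog and alerted on, per the monitoring requirements above.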
Good-to-Have Skills
- Experience with orchestration tools such as Apache Airflow or AWS Step Functions
- Exposure to Big Data ecosystem tools:
- Sqoop
- HDFS
- Hive
- NiFi
- Exposure to Terraform for infrastructure automation
- Familiarity with CI/CD pipelines for data workflows
We are hiring an Associate Technical Architect with strong expertise in AWS-based Data Platforms to design scalable data lakes, warehouses, and enterprise data pipelines while working with global teams.
Key Responsibilities
- Design and implement scalable data warehouse, data lake, and lakehouse architectures on AWS
- Build resilient and modular data pipelines using native AWS services
- Architect cloud-based data platforms and evaluate service trade-offs
- Optimize large-scale data processing and query performance
- Collaborate with global cross-functional teams (Engineering, QA, PMs, Stakeholders)
- Communicate technical roadmap, risks, and mitigation strategies
Must-Have Skills
- 8+ years of experience in AWS Data Engineering / Data Architecture
- Hands-on experience with AWS services:
- Amazon S3
- AWS Glue
- AWS Lambda
- Amazon EMR
- AWS Kinesis (Streams & Firehose)
- AWS Step Functions / MWAA
- Amazon Redshift (Spectrum & Serverless)
- Amazon Athena
- Amazon RDS
- AWS Lake Formation
- AWS DMS, EventBridge, SNS, SQS
- Strong programming skills in Python & PySpark
- Advanced SQL with query optimization & performance tuning
- Deep understanding of:
- MPP databases
- Partitioning & indexing strategies
- Data modeling (Dimensional, Normalized, Lakehouse)
- Experience building resilient ETL/data pipelines
- Knowledge of AWS fundamentals:
- Security
- Networking
- Disaster Recovery
- Scalability & resilience
- Experience with on-prem → AWS migrations
- AWS Certification (Solution Architect Associate / Data Engineer Associate)
Good-to-Have Skills
- Domain experience: FSI / Retail / CPG
- Data governance & virtualization tools:
- Collibra
- Denodo
- QuickSight / Power BI / Tableau
- Exposure to:
- Terraform (IaC)
- CI/CD pipelines
- SSIS
- Apache NiFi, Hive, HDFS, Sqoop
- Data Mesh architecture
- Experience with NoSQL databases:
- DynamoDB
- MongoDB
- DocumentDB
Soft Skills
- Strong problem-solving and analytical mindset
- Excellent communication and stakeholder management skills
- Ability to translate technical concepts into business outcomes
- Experience working with distributed/global teams
JOB DESCRIPTION – FULL STACK DEVELOPER
Location: Bangalore
Key Responsibilities:
Establish processes, SLAs, and escalation protocols for the support & maintenance of web applications
Manage stakeholders with effective communication and collaborate with cross-functional teams to address issues and maintain business continuity.
Design, implement, unit test, and build business applications using React, React Native, .NET Core, .NET 8, and Azure/AWS, leveraging an agile methodology and the latest tech such as agentic AI and GitHub Copilot.
Facilitate scrum ceremonies including sprint planning, retrospectives, reviews, and daily stand-ups
Facilitate discussion, assessment of alternatives or different approaches, decision making, and conflict resolution within the development team
Develop and administer CI/CD pipelines in cloud-hosted Git repositories, and source control artifacts via Git in alignment with common branching strategies and workflows
Assist Software Designer/Implementers with the creation of detailed software design specifications
Participate in the system specification review process to ensure system requirements can be translated into valid software architecture
Integrate internal and external product designs into a cohesive user experience
Identify and keep track of metrics that indicate how software is performing
Handle technical and non-technical queries from the development team and stakeholders
Ensure that all development practices follow best practices and any relevant policies / procedures
Other Duties
Maintain project reporting including dashboards, status reports, roadmaps, burndown, velocity, and resource utilization.
Own the technical solution and ensure all technical aspects are implemented as designed.
Partner with the customer success team and aid in triaging and troubleshooting customer support issues spanning across a range of software components, infrastructure, integrations, and services, some of which target 24/7/365 availability
Flexibility to work in rotational shifts
Required Qualifications
Previous experience leading full-stack technology projects with scrum teams and stakeholder management
BTech or MTech in computer science or a related field
3–5 years of experience.
Required Knowledge, Skills and Abilities:
Proficiency in .NET Core/.NET 8, React, React Native, Redux, Material UI, Bootstrap, TypeScript, SCSS, microservices, EF, LINQ, SQL, Azure/AWS, CI/CD, Agile, agentic AI, GitHub Copilot
Azure DevOps, design systems, micro frontends, data science
Stakeholder management & excellent communication skills.
Must have skills
React - 3 years
React Native - 3 years
Redux - 1 year
Material UI - 1 year
TypeScript - 1 year
Bootstrap - 1 year
Microservices - 2 years
SQL - 1 year
Azure - 1 year
Nice to have skills
.NET Core - 3 years
.NET 8 - 3 years
AWS - 1 year
LINQ - 1 year
Strong Senior Backend Engineer profiles
Mandatory (Experience 1) – Must have 5+ years of hands-on Backend Engineering experience building scalable, production-grade systems
Mandatory (Experience 2) – Must have strong backend development experience using one or more frameworks (FastAPI / Django (Python), Spring (Java), Express (Node.js)).
Mandatory (Experience 3) – Must have deep understanding of relevant libraries, tools, and best practices within the chosen backend framework
Mandatory (Experience 4) – Must have strong experience with databases, including SQL and NoSQL, along with efficient data modeling and performance optimization
Mandatory (Experience 5) – Must have experience designing, building, and maintaining APIs, services, and backend systems, including system design and clean code practices
Mandatory (Domain) – Experience with financial systems, billing platforms, or fintech applications is highly preferred (fintech background is a strong plus)
Mandatory (Company) – Must have worked in product companies / startups, preferably Series A to Series D
Mandatory (Education) – Candidates from Tier-1 engineering institutes (IITs, BITS) are highly preferred
We are looking for an experienced Data Engineer with strong expertise in AWS, DBT, Databricks, and Apache Airflow to join our growing data engineering team.
Immediate joiners preferred
Role Overview
The ideal candidate will design, develop, and maintain scalable data pipelines and data platforms to support analytics and business intelligence initiatives.
Key Responsibilities
- Design and build scalable data pipelines using AWS, Databricks, DBT, and Airflow.
- Develop and optimize ETL/ELT workflows for large-scale data processing.
- Implement data transformation models using DBT.
- Orchestrate workflows using Apache Airflow.
- Work with Databricks for big data processing and analytics.
- Ensure data quality, reliability, and performance optimization.
- Collaborate with data analysts, engineers, and business teams.
Required Skills
- Strong experience with AWS data services
- Hands-on experience with Databricks
- Experience in DBT (Data Build Tool)
- Workflow orchestration using Apache Airflow
- Strong SQL and Python skills
- Experience in data warehousing and ETL pipelines
Description
We are currently hiring for the position of Data Scientist/ Senior Machine Learning Engineer (6–7 years’ experience).
Please find the detailed Job Description attached for your reference. We are looking for candidates with strong experience in:
- Machine Learning model development
- Scalable data pipeline development (ETL/ELT)
- Python and SQL
- Cloud platforms such as Azure/AWS/Databricks
- ML deployment environments (SageMaker, Azure ML, etc.)
Kindly note:
- Location: Pune (Work From Office)
- Immediate joiners preferred
While sharing profiles, please ensure the following details are included:
- Current CTC
- Expected CTC
- Notice Period
- Current Location
- Confirmation on Pune WFO comfort
Must have skills
Machine Learning - 6 years
Python - 6 years
ETL (Extract, Transform, Load) - 6 years
SQL - 6 years
Azure - 6 years
💼 Job Title: Full Stack Developer (full time experienced only*)
🏢 Company: SDS Softwares
💻 Location: Work from Home
💸 Salary range: ₹7,000 - ₹18,000 per month (based on knowledge and interview)
🕛 Shift Timings: 12 PM to 9 PM (5 days working )
About the role: As a Full Stack Developer, you will work on both the front-end and back-end of web applications. You will be responsible for developing user-friendly interfaces and maintaining the overall functionality of our projects.
⚜️ Key Responsibilities:
- Collaborate with cross-functional teams to define, design, and ship new features.
- Develop and maintain high-quality web applications (frontend + backend).
- Troubleshoot and debug applications to ensure peak performance.
- Participate in code reviews and contribute to the team’s knowledge base.
⚜️ Required Skills:
- Proficiency in HTML, CSS, JavaScript, and React.js for front-end development. ✅
- Understanding of server-side languages such as Node.js. ✅
- Familiarity with database technologies such as MySQL, MongoDB, or PostgreSQL. ✅
- Basic knowledge of version control systems, particularly Git.
- Strong problem-solving skills and attention to detail.
- Excellent communication skills and a team-oriented mindset.
💠 Qualifications:
- Recent graduates or individuals with internship experience (1 year to 2 years) in software development.
- Must have a personal laptop and stable internet connection.
- Ability to join immediately is preferred.
If you are passionate about coding and eager to learn, we would love to hear from you. 👍

Business Intelligence & Digital Consulting company
Description
JOB DESCRIPTION – SENIOR ANALYST – DATA SCIENTIST
Key Responsibilities
Work with business stakeholders and cross-functional SMEs to deeply understand business context and key business questions
Advanced statistical/programming skills in Python and data querying languages (e.g., SQL, Hadoop/Hive, Scala)
Solid understanding of time-series forecasting techniques
Good hands-on skills in both feature engineering and hyperparameter optimization
Able to write clean and tested code that can be maintained by other software engineers
Able to clearly summarize and communicate data analysis assumptions and results
Able to craft effective data pipelines to transform your analyses from offline to production systems
Self-motivated and a proactive problem solver who can work independently and in teams
Connects both externally and internally to understand industry trends, technology advances, and outstanding processes or solutions
Is collaborative and engages at both strategic and tactical levels; able to influence without authority, handle complex issues, and implement positive change
Work on multiple pillars of AI including cognitive engineering, conversational bots, and data science
Ensure that solutions exhibit high levels of performance, security, scalability, maintainability, repeatability, appropriate reusability, and reliability upon deployment
Provide guidance and leadership to more junior data scientists, managing processes and flow of work, vetting designs, and mentoring team members to realize their full potential
Lead discussions at peer review and use interpersonal skills to positively influence decision making
Provide subject matter expertise in machine learning techniques, tools, and concepts; make impactful contributions to internal discussions on emerging practices
Facilitate cross-geography sharing of new ideas, learnings, and best practices
What We Are Looking For
Required Qualifications
Master's degree in a quantitative field such as Data Science, Statistics, or Applied Mathematics, or Bachelor's degree in engineering, computer science, or a related field.
4–6 years of total work experience as a data scientist or in an analytical role, with at least 2–3 years of experience in time-series forecasting
A combination of business focus, strong analytical and problem-solving skills, and programming knowledge to be able to quickly cycle hypotheses through the discovery phase of a project
Strong experience in time-series forecasting and demand planning
Advanced skills with statistical/programming software (e.g., R, Python) and data querying languages (e.g., SQL, Hadoop/Hive, Scala)
Good hands-on skills in both feature engineering and hyperparameter optimization
Experience producing high-quality code, tests, and documentation
Understanding of descriptive and exploratory statistics, predictive modelling, evaluation metrics, decision trees, machine learning algorithms, optimization and forecasting techniques, and/or deep learning methodologies
Proficiency in statistical concepts and ML algorithms
Ability to lead, manage, build, and deliver customer business results through data scientists or a professional services team
Ability to share ideas in a compelling manner, and to clearly summarize and communicate data analysis assumptions and results
Self-motivated and a proactive problem solver who can work independently and in teams
Outstanding verbal and written communication skills with the ability to effectively advocate technical solutions to engineering and business teams
Desired Qualifications
Experience working in one or multiple supply chain functions (e.g., procurement, planning, manufacturing, quality, logistics) is strongly preferred
Experience in applying AI/ML within a CPG or Healthcare business environment is strongly preferred
Experience in creating CI/CD pipelines for deployment using Jenkins
Experience implementing an MLOps framework along with an understanding of data security
Implementation of ML models
Exposure to visualization packages and the Azure tech stack.
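As a flavour of the time-series forecasting skill this role emphasizes, here is a deliberately simple baseline sketch in pure Python: a rolling-mean forecast plus MAPE evaluation. Real demand-planning work would use far richer models; everything here (function names, the sample demand series) is illustrative.

```python
from statistics import mean

def moving_average_forecast(history, window=3, horizon=2):
    """Forecast the next `horizon` points as the rolling mean of the
    last `window` observations (a common naive baseline in demand
    planning)."""
    series = list(history)
    forecasts = []
    for _ in range(horizon):
        nxt = mean(series[-window:])
        forecasts.append(nxt)
        series.append(nxt)
    return forecasts

def mape(actual, predicted):
    """Mean absolute percentage error, a standard forecast-evaluation
    metric (lower is better)."""
    return mean(abs((a - p) / a) for a, p in zip(actual, predicted)) * 100

demand = [100, 102, 98, 101, 99, 103]  # hypothetical monthly demand
fc = moving_average_forecast(demand, window=3, horizon=2)
print(fc)  # [101.0, 101.0]
```

A baseline like this is exactly what a stronger model (exponential smoothing, ARIMA, gradient boosting) must beat on MAPE before going to production.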
Must have skills
Python - 2 years
Data Science - 4 years
SQL - 2 years
Machine Learning - 2 years
Nice to have skills
Data Analysis - 4 years
Time Series Forecasting - 2 years
Demand Planning - 2 years
Hadoop - 2 years
Statistical concepts - 2 years
Supply chain functions - 2 years
Senior Full Stack Developer – Job Description
Job Overview
Surety Seven Technologies Pvt Ltd is looking for an experienced and highly skilled Senior Full Stack Developer with strong expertise in Next.js, Node.js, and React.js. The ideal candidate will lead architecture decisions, build scalable applications, guide development teams, and drive technical excellence across projects.
This role requires strong ownership, leadership capability, and hands-on coding expertise in both frontend and backend technologies.
Key Responsibilities
- Lead the design and architecture of scalable full-stack applications
- Develop, maintain, and optimize web applications using Next.js, React.js, and Node.js
- Build robust RESTful APIs and backend services
- Ensure high performance, security, and responsiveness of applications
- Work closely with Product, Design, and QA teams to deliver high-quality features
- Conduct code reviews and maintain coding standards & best practices
- Mentor and guide junior and mid-level developers
- Manage CI/CD pipelines and deployment processes
- Troubleshoot complex production issues and provide solutions
- Contribute to technical documentation and system design discussions
Required Skills & Qualifications
- 5–8 years of experience in Full Stack Development
- Strong hands-on experience with Next.js, React.js, and Node.js
- Deep knowledge of JavaScript (ES6+), HTML5, CSS3
- Experience with MongoDB / MySQL / PostgreSQL
- Strong understanding of REST APIs, authentication (JWT/OAuth), and API security
- Experience with Git, CI/CD tools, and deployment on cloud platforms (AWS, Azure, or similar)
- Understanding of microservices architecture (preferred)
- Strong problem-solving and debugging skills
- Experience leading technical modules or teams
About the role
Applix is seeking a highly skilled Senior Power BI Developer to join our Hyderabad office on a full-time, work-from-office basis. In this role, you will work directly with Caterpillar’s global analytics and GCIO BI Services teams to design, develop, and maintain enterprise-grade Power BI reports, dashboards, scorecards, and advanced data visualizations. You will operate as a member of a Project/Scrum team within Caterpillar’s technology environment, engaging with business partners and internal support teams to provide data visualization development services for a wide variety of projects and business needs.
The ideal candidate combines deep Power BI expertise with strong backend data engineering skills, and can champion BI COE standards while partnering closely with data scientists, business analysts, and IT professionals across Caterpillar’s global operations. A minimum 5-hour daily overlap with US Central Time is required to ensure seamless collaboration with onshore stakeholders and end users.
Key responsibilities
- Design and develop enterprise-grade Power BI dashboards, reports, and scorecards aligned to business needs.
- Implement BI COE standards, governance, security (RLS), and best practices across BI tools and environments.
- Build and optimize data models, DAX calculations, SQL queries, and data transformation pipelines.
- Enhance performance using aggregation, incremental refresh, storage modes, and query optimization techniques.
- Collaborate with business stakeholders, data engineers, and data scientists to deliver actionable insights.
- Support documentation, training, troubleshooting, and continuous improvement initiatives.
- Drive advanced analytics adoption, CI/CD practices, and mentor junior team members.
Required Qualifications
- Bachelor’s or Master’s degree in Computer Science, Information Technology, Data Science, Industrial Engineering, or a related field.
- 5+ years of hands-on experience in Power BI development, including reports, dashboards, enterprise scorecards, and paginated reports.
- Expert-level proficiency in Power BI Desktop, Power BI Service, Power BI Report Server, and Power BI Report Builder.
- Expert working knowledge of DAX, Power Query (M language), and data modeling best practices (star schema, snowflake schema, dimensional modeling).
- Strong backend skills with SQL Server, Azure SQL Database, Azure Synapse Analytics, or Snowflake – including writing complex T-SQL queries, stored procedures, CTEs, and window functions.
- 3+ years of experience in relational database design, data modeling, and structured query language (SQL).
- Hands-on experience with Azure Data Factory (ADF), Azure Data Lake, or similar ETL/ELT tools.
- Experience working in Agile/Scrum methodology, with tools like Azure DevOps, Jira, or ServiceNow.
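The T-SQL requirement above (complex queries, CTEs, and window functions) can be sketched briefly. The example below runs the same pattern in SQLite via Python's sqlite3 for portability; the sales table and its data are invented for illustration, and the identical CTE-plus-window-function structure applies in T-SQL.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, product TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('North', 'A', 120), ('North', 'B', 200),
        ('South', 'A', 300), ('South', 'B', 150);
""")

# The CTE materializes a ranked view; the window function ranks
# products within each region by sales amount.
query = """
WITH ranked AS (
    SELECT region, product, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
)
SELECT region, product FROM ranked WHERE rnk = 1 ORDER BY region;
"""
top_sellers = conn.execute(query).fetchall()
print(top_sellers)  # [('North', 'B'), ('South', 'A')]
```

The same "top N per group" pattern is a staple of Power BI backend work, e.g. pre-aggregating data before it reaches a DAX model.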
Caterpillar-Specific Experience (Strongly Preferred)
- Prior experience within Caterpillar’s BI ecosystem, GCIO BI Services, and governance frameworks.
- Familiarity with multi-tool BI environments (Power BI, Tableau, ThoughtSpot, Cognos, BOBJ).
- Exposure to Caterpillar’s Azure cloud infrastructure, data lakes, and enterprise platforms.
- Understanding of BI COE standards, data governance, naming conventions, and security protocols.
- Domain experience in manufacturing, heavy equipment, construction, or mining industries.
- Experience managing complex, enterprise-grade BI applications integrating multiple data sources.
Preferred Qualifications
- Microsoft PL-300 certification or equivalent.
- Experience with Microsoft Fabric, Azure Databricks, Snowflake/Snowpark.
- Working knowledge of Python or R for advanced analytics.
- Experience with Microsoft Power Platform (Power Apps, Power Automate).
- Knowledge of SSAS Tabular models and XMLA endpoints.
- Experience implementing CI/CD for Power BI using Azure DevOps.
- Familiarity with ETL tools (SnapLogic, SSIS).
- Prior consulting or client-facing delivery experience.
What we offer
- Opportunity to work on high-impact analytics projects for Caterpillar Inc. - a Fortune 100 global leader with $67B+ in annual revenue.
- Direct engagement with Caterpillar’s GCIO BI Services organization and US-based leadership teams.
- Collaborative, innovation-driven work culture at Applix’s Hyderabad office with a team focused on enterprise BI excellence.
- Competitive compensation and benefits package aligned with market standards.
- Career growth with exposure to cutting-edge Microsoft data technologies, Snowflake, and enterprise-scale BI solutions.
- Learning and development support, including Microsoft certification sponsorship (PL-300, DP-500, etc.).
- Opportunity to contribute to Caterpillar’s BI Centre of Excellence standards and shape analytics best practices.
Job Title: Tech Lead
Location: Gachibowli, Hyderabad
Required Skills/Experience:
• 6+ years of experience in designing and developing enterprise and/or consumer-facing applications using technologies and frameworks like JavaScript, Node.js, ReactJS, Angular, SCSS, CSS, and React Native.
• 2+ years of experience in leading teams (guiding, designing, and tracking tasks) and taking responsibility for delivering projects as per agreed schedules.
• Hands-on experience with SQL and NoSQL databases.
• Hands-on experience working in Linux OS environments.
• Strong debugging, troubleshooting, and problem-resolution skills.
• Experience in developing responsive and scalable web applications.
• Good communication skills (verbal and written) to effectively interact with customers and internal teams.
• Ability and interest in learning new technologies and adapting to evolving technical requirements.
• Experience working in the complete product development lifecycle (prototyping, development, hardening, testing, and deployment).
• Exposure to AI/ML concepts and ability to integrate AI-based features into applications.
• Experience using AI tools such as ChatGPT, GitHub Copilot, Gemini, or similar tools for improving development productivity, automation, and documentation.
Additional Skills/Experience:
• Working experience with Python and NoSQL databases such as MongoDB and Cassandra.
• Exposure to AI, Machine Learning (ML), Natural Language Processing (NLP), and Predictive Analytics domains.
• Familiarity with modern AI frameworks or APIs and experience integrating AI-powered capabilities into applications is a plus.
• Eagerness to participate in product functional design and user experience discussions.
• Familiarity with internationalization (i18n) and the latest trends in UI/UX design.
• Experience implementing payment gateways applicable across different countries.
• Experience with CI/CD pipelines and tools such as Jenkins, Nginx, and related DevOps practices.
Educational Qualification:
• B.Tech / M.Tech in Computer Science Engineering (CSE), Information Technology (IT), Electronics & Communication Engineering (ECE), Artificial Intelligence (AI), Machine Learning (ML), or Data Science (DS) from a recognized university.
We are looking for an experienced Power BI developer with 7+ years of experience to join our Business Intelligence team. The ideal candidate will be responsible for transforming raw data into actionable insights using Microsoft Power BI. This role encompasses developing, maintaining, and optimizing interactive dashboards, reports, and data models to support strategic decision-making across the organization.
Key Responsibilities:
- Understand business requirements in the BI context and design data models to convert raw data to meaningful insights.
- Create complex DAX queries and functions in Power BI.
- Create dashboards and visual interactive reports using Power BI. Deploy creative visual interaction tools for different metrics.
- Apply proven expertise across the Microsoft Power Platform, including Power Apps, Power Automate, Microsoft Dataverse, AI Builder, etc.
- Provide architecture recommendations to manage Power BI workspaces.
- Design, develop, and deploy Power BI scripts and perform efficient detailed analysis.
- Create charts and document data with algorithms, parameters, models, and relations explanations.
- Make technical changes to existing BI systems in order to enhance their functioning.
Required Qualifications & Skills:
- 7+ years of relevant experience with the above job duties for an intermediate Power BI developer.
- Bachelor's or master's degree in computer science or related fields.
- Certification in MS Power BI and Power Apps Suite is needed.
- Good creative and communication skills – Ability to influence and recommend visualizations to the senior leadership teams.
- Ability to create complex SQL queries joining multiple tables is required.
- Understanding of JavaScript, CSS, and other JavaScript libraries is preferred.
Job Title: QA Tester – FinTech (Manual + Automation Testing)
Location: Bangalore, India
Job Type: Full-Time
Experience Required: 3 Years
Industry: FinTech / Financial Services
Function: Quality Assurance / Software Testing
About the Role:
We are looking for a skilled QA Tester with 3 years of experience in both manual and automation testing, ideally in the FinTech domain. The candidate will work closely with development and product teams to ensure that our financial applications meet the highest standards of quality, performance, and security.
Key Responsibilities:
- Analyze business and functional requirements for financial products and translate them into test scenarios.
- Design, write, and execute manual test cases for new features, enhancements, and bug fixes.
- Develop and maintain automated test scripts using tools such as Selenium, TestNG, or similar frameworks.
- Conduct API testing using Postman, Rest Assured, or similar tools.
- Perform functional, regression, integration, and system testing across web and mobile platforms.
- Work in an Agile/Scrum environment and actively participate in sprint planning, stand-ups, and retrospectives.
- Log and track defects using JIRA or a similar defect management tool.
- Collaborate with developers, BAs, and DevOps teams to improve quality across the SDLC.
- Ensure test coverage for critical fintech workflows like transactions, KYC, lending, payments, and compliance.
- Assist in setting up CI/CD pipelines for automated test execution using tools like Jenkins, GitLab CI, etc.
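The API-testing responsibility above can be sketched end to end with only the Python standard library: spin up a stub HTTP endpoint, call it, and assert on status and payload. The /balance route and its JSON shape are invented for illustration; in practice the role would use Postman or Rest Assured against real services.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# A stub REST endpoint standing in for the application under test.
class StubAPI(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/balance":
            body = json.dumps({"account": "123", "balance": 250.0}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # silence request logging during tests
        pass

server = HTTPServer(("127.0.0.1", 0), StubAPI)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}"

# The "test case": call the endpoint and assert on status and payload.
with urllib.request.urlopen(f"{base}/balance") as resp:
    assert resp.status == 200
    payload = json.loads(resp.read())
assert payload["balance"] == 250.0
server.shutdown()
print("API test passed")
```

The same assertions (status code, headers, body fields) are what a Rest Assured or Postman test would encode, and what a Jenkins or GitLab CI stage would run on every commit.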
Required Skills and Experience:
- 3+ years of hands-on experience in manual and automation testing.
- Solid understanding of QA methodologies, STLC, and SDLC.
- Experience in testing FinTech applications such as digital wallets, online banking, investment platforms, etc.
- Strong experience with Selenium WebDriver, TestNG, Postman, and JIRA.
- Knowledge of API testing, including RESTful services.
- Familiarity with SQL to validate data in databases.
- Understanding of CI/CD processes and basic scripting for automation integration.
- Good problem-solving skills and attention to detail.
- Excellent communication and documentation skills.
Preferred Qualifications:
- Exposure to financial compliance and regulatory testing (e.g., PCI DSS, AML/KYC).
- Experience with mobile app testing (iOS/Android).
- Working knowledge of test management tools like TestRail, Zephyr, or Xray.
- Performance testing experience (e.g., JMeter, LoadRunner) is a plus.
- Basic knowledge of version control systems (e.g., Git).
About the role
We are seeking a seasoned Backend Tech Lead with deep expertise in Golang and Python to lead our backend team. The ideal candidate has 6+ years of experience in backend technologies and 2–3 years of proven engineering mentoring experience, having successfully scaled systems and shipped B2C applications in collaboration with product teams.
Responsibilities
Technical & Product Delivery
● Oversee design and development of backend systems operating at 10K+ RPM scale.
● Guide the team in building transactional systems (payments, orders, etc.) and behavioral systems (analytics, personalization, engagement tracking).
● Partner with product managers to scope, prioritize, and release B2C product features and applications.
● Ensure architectural best practices, high-quality code standards, and robust testing practices.
● Own delivery of projects end-to-end with a focus on scalability, reliability, and business impact.
Operational Excellence
● Champion observability, monitoring, and reliability across backend services.
● Continuously improve system performance, scalability, and resilience.
● Streamline development workflows and engineering processes for speed and quality.
Requirements
● Experience:
7+ years of professional experience in backend technologies.
2–3 years as a Tech Lead driving delivery.
● Technical Skills:
Strong hands-on expertise in Golang and Python.
Proven track record with high-scale systems (≥10K RPM).
Solid understanding of distributed systems, APIs, SQL/NoSQL databases, and cloud platforms.
● Leadership Skills:
Demonstrated success in managing teams through 2–3 appraisal cycles.
Strong experience working with product managers to deliver consumer-facing applications.
● Excellent communication and stakeholder management abilities.
Nice-to-Have
● Familiarity with containerization and orchestration (Docker, Kubernetes).
● Experience with observability tools (Prometheus, Grafana, OpenTelemetry).
● Previous leadership experience in B2C product companies operating at scale.
What We Offer
● Opportunity to lead and shape a backend engineering team building at scale.
● A culture of ownership, innovation, and continuous learning.
● Competitive compensation, benefits, and career growth opportunities.
Hiring for Manual Test Engineer
Exp: 6 - 8 yrs
Permanent Remote
Notice Period : Immediate - 15 days
Skills:
· Strong background in manual testing with excellent attention to detail
· Experience in AI-driven testing, including practical usage of AI in the testing process
· Ability to generate test cases, test scenarios, and test data using AI testing tools
· Hands-on experience with AI testing tools such as Testim and BrowserStack
· Knowledge of advanced features like self-healing tests in AI-based automation tools
· Good understanding of database testing, including writing and executing SQL queries.
Job Summary:
We are seeking a highly skilled and self-driven Java Backend Developer with strong experience in designing and deploying scalable microservices using Spring Boot and Azure Cloud. The ideal candidate will have hands-on expertise in modern Java development, containerization, messaging systems like Kafka, and knowledge of CI/CD and DevOps practices.
Key Responsibilities:
- Design, develop, and deploy microservices using Spring Boot on Azure cloud platforms.
- Implement and maintain RESTful APIs, ensuring high performance and scalability.
- Work with Java 11+ features including Streams, Functional Programming, and Collections framework.
- Develop and manage Docker containers, enabling efficient development and deployment pipelines.
- Integrate messaging services like Apache Kafka into microservice architectures.
- Design and maintain data models using PostgreSQL or other SQL databases.
- Implement unit testing using JUnit and mocking frameworks to ensure code quality.
- Develop and execute API automation tests using Cucumber or similar tools.
- Collaborate with QA, DevOps, and other teams for seamless CI/CD integration and deployment pipelines.
- Work with Kubernetes for orchestrating containerized services.
- Utilize Couchbase or similar NoSQL technologies when necessary.
- Participate in code reviews, design discussions, and contribute to best practices and standards.
Required Skills & Qualifications:
- Strong experience in Java (11 or above) and Spring Boot framework.
- Solid understanding of microservices architecture and deployment on Azure.
- Hands-on experience with Docker, and exposure to Kubernetes.
- Proficiency in Kafka, with real-world project experience.
- Working knowledge of PostgreSQL (or any SQL DB) and data modeling principles.
- Experience in writing unit tests using JUnit and mocking tools.
- Experience with Cucumber or similar frameworks for API automation testing.
- Exposure to CI/CD tools, DevOps processes, and Git-based workflows.
Nice to Have:
- Azure certifications (e.g., Azure Developer Associate)
- Familiarity with Couchbase or other NoSQL databases.
- Familiarity with other cloud providers (AWS, GCP)
- Knowledge of observability tools (Prometheus, Grafana, ELK)
Soft Skills:
- Strong problem-solving and analytical skills.
- Excellent verbal and written communication.
- Ability to work in an agile environment and contribute to continuous improvement.
Why Join Us:
- Work on cutting-edge microservice architectures
- Strong learning and development culture
- Opportunity to innovate and influence technical decisions
- Collaborative and inclusive work environment
AccioJob is conducting a Walk-In Hiring Drive with a Global Intelligence Company for the position of Java Full Stack Developer.
To apply, register and select your slot here: https://go.acciojob.com/8rUwyD
Required Skills: Java, DSA, SQL
Eligibility:
Degree: BTech./BE, MTech./ME, BCA, MCA, BSc., MSc
Branch: Electrical/Other electrical related branches, Computer Science/CSE/Other CS related branch, IT
Graduation Year: 2025, 2024, 2023, 2022, 2021, 2020, 2019, 2018
Work Details:
Work Location: Pune (Onsite)
CTC: ₹3.5 LPA
Evaluation Process:
Round 1: Offline Assessment at AccioJob Pune Centre
Further Rounds (for shortlisted candidates only):
Resume Evaluation, Telephonic Screening, Programming Test, Technical Interview 1, Technical Interview 2, HR Discussion
Important Note: Bring your laptop & earphones for the test.
Register here: https://go.acciojob.com/8rUwyD
🚀 Hiring: .NET Developer at Deqode
⭐ Experience: 4+ Years
📍 Location: Mumbai and Bangalore
⭐ Work Mode:- Hybrid
⏱️ Notice Period: Immediate Joiners
(Only immediate joiners & candidates serving notice period)
🚀 Hiring: .NET Developer
We are looking for a skilled .NET Developer to join our growing team. The ideal candidate should have strong experience in developing, testing, and maintaining applications using the .NET framework.
🎗️ Key Responsibilities:
✅ Develop and maintain web applications using .NET / .NET Core
✅ Write clean, scalable, and efficient code
✅ Troubleshoot, debug, and upgrade existing applications
✅ Work with databases and APIs for application integration
💫 Requirements:
✅ Experience with C#, ASP.NET, .NET Core
✅ Knowledge of SQL Server
✅ Familiarity with REST APIs
✅ Understanding of HTML, CSS, JavaScript
✅ Strong problem-solving and communication skills
Location: Mumbai, Maharashtra, India
Sector: Technology, Information & Media
Company Size: 500 - 1,000 Employees
Employment: Full-Time, Permanent
Experience: 10 - 14 Years (Engineering Leadership)
Level: Engineering Manager / Group EM
ABOUT THIS MANDATE :
Recruiting Bond has been exclusively retained by one of India's most prominent and well-established digital platform organisations operating at the intersection of Technology, Information, and Media to identify and place an exceptional Engineering Manager who can lead engineering teams through an enterprise-wide AI adoption and digital transformation agenda.
This is a high-impact, hands-on leadership role at the nexus of people, product, and technology. The organisation is executing one of the most ambitious AI transformation programmes in its sector, and this Engineering Manager will be a core driver of that change. You will lead multiple squads, own engineering delivery end-to-end, embed AI tooling and practices into the team's DNA, and shape the engineering culture of tomorrow.
We are seeking leaders who code when it matters, who build systems and teams with equal conviction, and who view AI not as a trend but as a fundamental shift in how great software is built.
THE OPPORTUNITY AT A GLANCE :
AI-First Engineering Culture :
- Own AI adoption across your squads - from LLM tooling integration to automation-first delivery workflows. Make AI a default, not an afterthought.
Hands-On Engineering Leadership :
- Stay close to the code. Lead architecture reviews, unblock engineers, and set the technical bar - not just the management agenda.
People & Org Builder :
- Grow engineers into leaders. Build squads of 6–15 across functions. Drive hiring, career frameworks, and a culture of psychological safety.
KEY RESPONSIBILITIES :
1. Hands-On Technical Engagement :
- Remain deeply embedded in the technical work: participate in design reviews, architecture decisions, and critical code reviews
- Set and uphold the engineering quality bar : performance benchmarks, security standards, test coverage, and release quality
- Provide technical direction on backend platform strategy, API design, service decomposition, and data architecture
- Identify and resolve systemic technical debt and architectural risks across team-owned services
- Unblock engineers by diving into complex problems: debugging, pair programming, and system analysis when it matters
- Own key technical decisions in collaboration with Tech Leads and Principal Engineers; balance pragmatism with long-term sustainability
2. AI Adoption, Integration & Transformation (2026 Mandate) :
- Define and execute the team's AI adoption roadmap - from developer tooling to product-facing AI features
- Champion the integration of GenAI tools (GitHub Copilot, Cursor, Claude, ChatGPT) across the full engineering workflow: coding, testing, documentation, incident response
- Embed LLM-powered capabilities into the product : recommendation engines, intelligent search, conversational interfaces, content generation, and predictive systems
- Lead evaluation and adoption of AI-assisted SDLC practices : automated code review, AI-generated test suites, intelligent observability, and anomaly detection
- Partner with Data Science and ML Platform teams to productionise ML models with robust MLOps pipelines
- Build team literacy in prompt engineering, RAG (Retrieval-Augmented Generation), and AI agent frameworks
- Create an experimentation culture : run structured AI pilots, measure productivity impact, and scale what works
- Stay ahead of the AI tooling landscape and advise senior leadership on strategic AI investments and engineering implications
3. People Leadership & Team Development :
- Lead, manage, and grow squads of 6 - 15 engineers across seniority levels (L2 through L6 / Junior through Staff)
- Conduct structured 1:1s, career growth conversations, and development planning with every direct report
- Design and execute personalised AI upskilling programmes; ensure every engineer develops practical AI fluency by end of 2026
- Build and maintain a high-performance team culture : clarity of ownership, accountability, fast feedback loops, and psychological safety
- Drive performance management fairly and rigorously: recognise top performers, manage underperformance constructively
- Lead technical hiring end-to-end : define job requirements, conduct bar-raising interviews, and make data-driven hire decisions
- Contribute to engineering career frameworks and level definitions in partnership with the VP / Director of Engineering
4. Engineering Delivery & Execution Excellence :
- Own end-to-end delivery for multiple product squads, from planning and scoping through production release and post-launch stability
- Implement and refine agile delivery frameworks (Scrum, Kanban, Shape Up) calibrated to squad needs and product cadence
- Drive predictable delivery : maintain healthy sprint velocity, manage WIP limits, and ensure dependency resolution across teams.
- Establish and own engineering KPIs : DORA metrics (deployment frequency, lead time, MTTR, change failure rate), uptime SLOs, and velocity trends
- Lead incident management : build blameless post-mortem culture, own RCA processes, and drive systemic reliability improvements
- Balance technical debt repayment with feature velocity; negotiate prioritisation transparently with Product leadership
5. Strategic Leadership & Cross-Functional Influence :
- Serve as the primary engineering partner for Product, Design, Data, and Business stakeholders; translate ambiguity into executable engineering plans
- Participate in quarterly roadmap planning, capacity forecasting, and OKR definition for engineering teams
- Represent engineering in leadership forums; articulate technical constraints, risks, and opportunities in business terms
- Contribute to org-wide engineering strategy : platform investments, build-vs-buy decisions, and shared infrastructure priorities
- Build relationships across geographies (Mumbai HQ + distributed teams) to maintain alignment and delivery cohesion
- Act as a culture carrier and ambassador for engineering excellence, innovation, and responsible AI use
AI TRANSFORMATION LEADERSHIP 2026 EXPECTATIONS :
In 2026, Engineering Managers at this organisation are expected to be active architects of AI transformation, not passive observers. The following outlines the specific AI leadership expectations for this role:
AI Developer Productivity
- Drive measurable uplift in developer velocity through AI tooling adoption. Target : 30%+ reduction in code review cycle time and 40%+ increase in test coverage automation by Q3 2026.
LLM & GenAI Product Features
- Own delivery of GenAI-powered product capabilities : intelligent content, semantic search, personalisation, and conversational UX in production, at scale.
AI-Augmented Observability
- Implement AI-driven monitoring and anomaly detection pipelines. Reduce MTTR by leveraging predictive alerting, intelligent runbooks, and auto-remediation scripts.
Team AI Fluency :
- Build mandatory AI literacy across all engineering levels.
- Every engineer understands prompt engineering basics, AI ethics guardrails, and responsible AI deployment practices.
Responsible AI Governance :
- Partner with Security, Legal, and Data Privacy to ensure all AI deployments meet compliance standards, bias mitigation requirements, and explainability benchmarks.
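As a simplified illustration of the anomaly-detection expectation above, the sketch below flags latency outliers with a z-score test. This is a toy stand-in: a production pipeline would model seasonality and trend rather than a single global threshold, and the 2.5-sigma cutoff here is an arbitrary assumption.

```python
import statistics

def latency_anomalies(samples_ms, threshold=2.5):
    """Flag samples more than `threshold` population standard deviations
    from the mean: a toy stand-in for predictive alerting."""
    mean = statistics.fmean(samples_ms)
    stdev = statistics.pstdev(samples_ms)
    if stdev == 0:
        return []  # perfectly flat signal, nothing to flag
    return [s for s in samples_ms if abs(s - mean) / stdev > threshold]

# Mostly ~100 ms latencies with one obvious spike.
readings = [98, 102, 101, 99, 100, 103, 97, 100, 950]
print(latency_anomalies(readings))  # [950]
```

An alerting hook would feed each flagged sample into the paging or runbook system instead of printing it.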
TECHNOLOGY STACK & DOMAIN FAMILIARITY REQUIRED :
- Languages: Java / Go / Python / Node.js / PHP / Rust (must be hands-on in at least 2)
- Cloud: AWS / GCP / Azure (multi-cloud exposure strongly preferred)
- AI & GenAI: OpenAI / Anthropic / Gemini APIs / LangChain / LlamaIndex / RAG / Vector DBs / GitHub Copilot / Cursor / Hugging Face
- Containers: Docker / Kubernetes / Helm / Service Mesh (Istio / Linkerd)
- Databases: PostgreSQL / MongoDB / Redis / Cassandra / Elasticsearch / Pinecone (Vector DB)
- Messaging: Apache Kafka / RabbitMQ / AWS SQS/SNS / Google Pub/Sub
- MLOps & DataOps: MLflow / Kubeflow / SageMaker / Vertex AI / Airflow / dbt
- Observability: Datadog / Prometheus / Grafana / OpenTelemetry / Jaeger / ELK Stack
- CI/CD & IaC: GitHub Actions / ArgoCD / Jenkins / Terraform / Ansible / Backstage (IDP)
QUALIFICATIONS & CANDIDATE PROFILE :
Education :
- B.E. / B.Tech or M.E. / M.Tech from a Tier-I or Tier-II Institution - CS, IS, ECE, AI/ML streams strongly preferred
- Demonstrated engineering depth and leadership impact may complement institution pedigree
Experience :
- 10 to 14 years of progressive engineering experience, with at least 3 years in a formal Engineering Manager or equivalent people-leadership role
- Proven track record of managing and scaling engineering teams (6–15+ engineers) in a fast-growing SaaS or digital product environment
- Hands-on backend engineering background must be able to read, write, and critique production code
- Direct experience driving AI/ML feature delivery or AI tooling adoption within engineering organisations
- Exposure across start-up, mid-size, and large-scale product organisations preferred; adaptability is a core requirement
- Strong CS fundamentals: distributed systems, algorithms, system design, and software architecture
- Demonstrated career stability: a minimum of 2 years of average tenure per organisation.
The Ideal Engineering Manager in 2026 :
- Leads with context, not control; empowers engineers while maintaining accountability and quality
- Is fluent in both people language and technical language; switches registers naturally with engineers and executives alike
- Sees AI as a force multiplier for the team, not a threat. Actively experiments with and advocates for AI tooling
- Measures success by team outcomes, not personal output. Takes pride in what the team ships, not what they build alone
- Creates feedback loops obsessively: between product and engineering, between seniors and juniors, between metrics and decisions
- Has strong opinions, loosely held; brings conviction to discussions but updates on evidence
- Invests in engineering excellence as seriously as delivery velocity; knows that quality and speed are not opposites
WHY THIS ROLE STANDS APART :
AI Transformation at Scale :
- Lead one of the most significant AI adoption programmes in India's digital media sector.
- Your decisions will shape how hundreds of engineers work in 2026 and beyond.
Hands-On & Strategic Balance :
- A rare EM role that actively encourages technical depth.
- Stay close to the code while owning the people agenda - the best of both worlds.
Established Platform, Real Scale :
- 500–1,000 employees, proven product-market fit, and the org maturity to execute.
- This is not a greenfield startup gamble; it is a serious company with serious ambition.
Clear Leadership Growth Path :
- A visible, direct path toward Director / VP of Engineering.
- Senior leadership is invested in growing its next generation of technology executives.
Job opportunity for Developer -Python Full Stack with Siemens at Bangalore.
Interview Process:
1st round of interview - F2F (in-Person)-Technical
2nd round of interview – F2F /Virtual Interview - Technical
3rd round of interview – Virtual Interview – Technical + HR
Job Title / Designation: Developer - Python Full Stack
Employment Type: Full Time, Permanent
Location: Bangalore
Experience: 3-5 Years
Job Description: Developer - Python Full Stack
We are looking for a Python full-stack expert with 5+ years of proven experience developing automation solutions in Linux-based environments. You should be capable of developing Python-based web applications or automation solutions, with excellent knowledge of database handling and decent knowledge of Kubernetes (K8s)-based deployment environments.
Required Skills:
- Solid experience in Python back-end technology
- Sound experience in web application development
- Decent knowledge and experience in UI development using JavaScript, React/Angular or related tech stack.
- Strong understanding of software design patterns and testing principles
- Ability to learn and adapt to working with multiple programming languages.
- Experience with Docker, ArgoCD, Kubernetes, and Terraform
- Understanding of ETL processes to extract data from different data sources is a plus.
- Proven experience in Linux development environments using Python.
- Excellent knowledge in interacting with database systems (SQL, NoSQL) and webservices (REST)
- Experienced in establishing an optimized CI / CD environment relevant to the project.
- Good knowledge of repository management tools like Git, Bitbucket, etc.
- Excellent debugging skills/strategies.
- Excellent communication skills
- Experienced in working in an Agile environment.
Nice to have
- Good knowledge of the Eclipse IDE; experience developing add-ons/plugins on the Eclipse platform.
- Knowledge of 93K Semiconductor test platforms
- Good know-how of agile management tools like Jira, Azure DevOps.
- Good knowledge of RHEL
- Knowledge of JIRA administration
To design, build, and optimize scalable data infrastructure and pipelines that enable efficient data collection, transformation, and analysis across the organization. The Senior Data Engineer will play a key role in driving data architecture decisions, ensuring data quality and availability, and empowering analytics, product, and engineering teams with reliable, well-structured data to support business growth and strategic decision-making.
Responsibilities:
• Develop and maintain SQL and NoSQL databases, ensuring high performance, scalability, and reliability.
• Collaborate with the API team and Data Science team to build robust data pipelines and automations.
• Work closely with stakeholders to understand database requirements and provide technical solutions.
• Optimize database queries and performance tuning to enhance overall system efficiency.
• Implement and maintain data security measures, including access controls and encryption.
• Monitor database systems and troubleshoot issues proactively to ensure uninterrupted service.
• Develop and enforce data quality standards and processes to maintain data integrity.
• Create and maintain documentation for database architecture, processes, and procedures.
• Stay updated with the latest database technologies and best practices to drive continuous improvement.
• Expertise in SQL queries and stored procedures, with the ability to optimize and fine-tune complex queries for performance and efficiency.
• Experience with monitoring and visualization tools such as Grafana to monitor database performance and health.
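The query-optimization responsibilities above can be grounded with a small runnable sketch: using SQLite (standing in for a production database), adding an index flips the plan for a selective filter from a full table scan to an index search. The orders schema and index name are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

query = "SELECT SUM(total) FROM ORDERS WHERE customer_id = ?"

# Without an index, the planner must scan the whole table.
before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchone()[-1]

conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# With the index, the same query becomes an index search.
after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchone()[-1]

print(before)  # e.g. 'SCAN orders'
print(after)   # e.g. 'SEARCH orders USING INDEX idx_orders_customer (customer_id=?)'
```

The same before/after workflow (inspect plan, add index or rewrite, re-inspect) applies in PostgreSQL or MySQL via their `EXPLAIN` equivalents.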
Requirements:
• 4+ years of experience in data engineering, with a focus on large-scale data systems.
• Proven experience designing data models and access patterns across SQL and NoSQL ecosystems.
• Hands-on experience with technologies like PostgreSQL, DynamoDB, S3, GraphQL, or vector databases.
• Proficient in SQL stored procedures, with extensive expertise in MySQL schema design, query optimization, and resolvers, along with hands-on experience in building and maintaining data warehouses.
• Strong programming skills in Python or JavaScript, with the ability to write efficient, maintainable code.
• Familiarity with distributed systems, data partitioning, and consistency models.
• Familiarity with observability stacks (Prometheus, Grafana, OpenTelemetry) and debugging production bottlenecks.
• Deep understanding of cloud infrastructure (preferably AWS), including networking, IAM, and cost optimization.
• Prior experience building multi-tenant systems with strict performance and isolation guarantees.
• Excellent communication and collaboration skills to influence cross-functional technical decisions.
NOW HIRING · WORLD-CLASS TALENT Backend Tech Lead (Senior Level Engineering Leadership)
Placed by Recruiting Bond on behalf of a Confidential Digital Platform Leader
📍Location: Bengaluru, India (Hybrid / On-Site)
🏢Sector: Technology, Information & Media
👥Company Size: 500 – 1,000 Employees
💼Employment: Full-Time, Permanent
🎯Experience: 6 – 9 Years (Backend Engineering)
🚀 Level: Tech Lead
ABOUT THIS MANDATE
Recruiting Bond has been exclusively retained by one of India's most well-established digital platform organisations — a company operating at the intersection of Technology, Information, and Media — to identify and place a world-class Backend Tech Lead who can drive a transformational engineering agenda at scale.
This is not an ordinary role. The organisation is executing a high-stakes, large-scale modernisation of its backend infrastructure — migrating from legacy monolithic systems to resilient, cloud-native, AI-augmented distributed architectures that serve millions of concurrent users. The person in this seat will be a core pillar of that transformation.
We are looking exclusively for the top 1% — engineers who think in systems, own outcomes, and lead by example.
THE OPPORTUNITY AT A GLANCE
🏗️ Architecture Ownership
Drive system design decisions across the entire backend platform. Shape the future of distributed, fault-tolerant architecture.
🤖 AI-Augmented Engineering
Embed GenAI and LLM tooling directly into the SDLC. Champion automation-first development practices across squads.
🎓 Engineering Leadership
Mentor and grow the next generation of backend engineers. Lead hiring, reviews, and cross-functional technical alignment.
KEY RESPONSIBILITIES
1. Architecture & Platform Modernisation
- Lead the full migration of legacy monolithic systems to a scalable, cloud-native microservices architecture
- Design and own distributed, fault-tolerant backend systems with sub-millisecond SLO targets
- Architect API-first and event-driven platforms using async messaging patterns (Kafka, Pub/Sub, SQS)
- Resolve systemic performance bottlenecks, concurrency conflicts, and scalability ceilings
- Establish backend design standards, coding guidelines, and architectural review processes
2. Distributed Systems Engineering (Production-Grade)
- Design and implement Webhook reliability frameworks with intelligent retry and exponential backoff strategies
- Build idempotent, versioned APIs with enterprise-grade rate limiting and throttling controls
- Implement circuit breakers, bulkheads, and resilience patterns using Resilience4j / Hystrix or equivalents
- Engineer Dead-Letter Queue (DLQ) strategies and event reprocessing pipelines with guaranteed delivery semantics
- Apply Saga orchestration and choreography patterns for distributed transaction integrity
- Execute zero-downtime deployments and canary release strategies with rollback capability
- Design and enforce multi-region disaster recovery and business continuity protocols
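The webhook-reliability and DLQ bullets above describe a standard pattern; a minimal sketch, assuming a caller-supplied delivery function, looks like the following. The sleep function is injected so the backoff schedule is visible and testable; names and defaults are illustrative.

```python
import time

def deliver_with_retry(send, event, max_attempts=4, base_delay=0.5,
                       dead_letters=None, sleep=time.sleep):
    """Attempt delivery with exponential backoff; after the final failure,
    park the event in `dead_letters` (a stand-in for a real DLQ) so a
    reprocessing pipeline can pick it up later."""
    for attempt in range(max_attempts):
        try:
            send(event)
            return True
        except Exception:
            if attempt < max_attempts - 1:
                sleep(base_delay * (2 ** attempt))  # 0.5s, 1s, 2s, ...
    if dead_letters is not None:
        dead_letters.append(event)
    return False

# Demo: an endpoint that fails twice, then recovers.
attempts = {"n": 0}
def flaky(event):
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("endpoint unavailable")

dlq = []
ok = deliver_with_retry(flaky, {"event_id": "evt_1"}, dead_letters=dlq,
                        sleep=lambda d: None)  # skip real sleeps in the demo
print(ok, dlq)  # True []
```

Production versions add jitter to the delay, persist attempt counts, and require idempotent handlers so redelivery is safe, which is why the same section pairs retries with idempotent, versioned APIs.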
3. AI-Driven Engineering Practices
- Champion LLM and GenAI adoption as first-class tooling across the software development lifecycle
- Apply prompt engineering techniques for automated code generation, review, and documentation workflows
- Utilise AI-assisted debugging, root cause analysis, and predictive performance optimisation
- Build automation-first pipelines that reduce toil and accelerate delivery velocity
- Evaluate and integrate emerging AI developer tools into the engineering ecosystem
4. Engineering Leadership & Culture
- Own backend platforms end-to-end with full accountability across development, stability, and performance
- Actively mentor, coach, and elevate engineers at all levels (L3–L6) through structured 1:1s and code reviews
- Drive and lead technical hiring — from designing assessments to final hire decisions
- Partner with Product, Data, DevOps, and Security stakeholders to align engineering with business objectives
- Represent the engineering org in cross-functional roadmap planning and architecture decision reviews
- Foster a culture of technical excellence, psychological safety, and high-velocity delivery
TECHNOLOGY STACK (HANDS-ON PROFICIENCY REQUIRED)
Languages: Java (primary) · Go · Python · Node.js · PHP · Rust
Cloud: AWS · GCP · Azure (Multi-cloud exposure preferred)
Containers: Docker · Kubernetes · Helm · Service Mesh (Istio / Linkerd)
Databases: PostgreSQL · MySQL · MongoDB · Cassandra · Redis · Elasticsearch
Messaging: Apache Kafka · RabbitMQ · AWS SQS/SNS · Google Pub/Sub
Observability: Datadog · Prometheus · Grafana · OpenTelemetry · Jaeger · ELK Stack
CI/CD & IaC: GitHub Actions · Jenkins · ArgoCD · Terraform · Ansible
AI & GenAI: OpenAI / Claude APIs · LangChain · RAG Pipelines · GitHub Copilot · Cursor
QUALIFICATIONS & CANDIDATE PROFILE
Education
- B.E. / B.Tech or M.E. / M.Tech from a Tier-I or Tier-II Institution — CS, IS, ECE, AI/ML streams strongly preferred
- Exceptional real-world engineering track record may be considered in lieu of institution pedigree
Experience
- 6 to 9 years of progressive backend engineering experience with demonstrable ownership and impact
- Proven track record of shipping and scaling production SaaS / Product systems at significant user load
- Exposure to and success within start-up, mid-size, and large-scale product organisations — the full spectrum
- Strong computer science fundamentals: algorithms, data structures, distributed systems theory, OS internals
- Demonstrated career stability — minimum 2 years average tenure per organisation
The Ideal Candidate Exemplifies
- System-level thinking with an ability to hold context across code, architecture, product, and business
- An ownership mindset — no task is 'not my job'; outcomes and quality are personal commitments
- Strong written and verbal communication skills for asynchronous, cross-functional collaboration
- Intellectual curiosity: actively follows engineering trends, contributes to the community (OSS, blogs, talks)
- Bias for automation, observability, and engineering efficiency at every level
- A mentor's instinct — genuine desire to grow others and raise the capability of the team around them
WHY THIS ROLE STANDS APART
✅ Transformational Scope
Lead platform modernisation at scale. Your architectural choices will define systems serving millions of users for years.
✅ AI-Forward Engineering Culture
Be at the forefront of AI-augmented development. This org invests in tools and practices that make great engineers exceptional.
✅ Established, Stable Platform
Join a company with 500–1,000 employees, proven product-market fit, and the resources to execute on a serious technical vision.
✅ Career-Defining Leadership
Operate with strategic influence, direct access to senior leadership, and a clear path toward Principal / Staff / VP Engineering.
HOW TO APPLY
This search is being managed exclusively by Recruiting Bond
Submit your application with an updated resume
Only shortlisted candidates will be contacted. All applications are treated with the strictest confidentiality.
⚡ We move fast — qualified candidates can expect a response within 48–72 business hours.
Recruiting Bond | Bengaluru, Karnataka, India | 2026
An L2 Technical Support Engineer with Python knowledge is responsible for handling escalated, more complex technical issues that the Level 1 (L1) support team cannot resolve. Your primary goal is to perform deep-dive analysis, troubleshooting, and problem resolution to minimize customer downtime and ensure system stability.
Python is a key skill, used for scripting, automation, debugging, and data analysis in this role.
Key Responsibilities
- Advanced Troubleshooting & Incident Management:
- Serve as the escalation point for complex technical issues (often involving software bugs, system integrations, backend services, and APIs) that L1 support cannot resolve.
- Diagnose, analyze, and resolve problems, often requiring in-depth log analysis, code review, and database querying.
- Own the technical resolution of incidents end-to-end, adhering strictly to established Service Level Agreements (SLAs).
- Participate in on-call rotation for critical (P1) incident support outside of regular business hours.
- Python-Specific Tasks:
- Develop and maintain Python scripts for automation of repetitive support tasks, system health checks, and data manipulation.
- Use Python for debugging and troubleshooting by analyzing application code, API responses, or data pipeline issues.
- Write ad-hoc scripts to extract, analyze, or modify data in databases for diagnostic or resolution purposes.
- Potentially apply basic-to-intermediate code fixes in Python applications in collaboration with development teams.
- Collaboration and Escalation:
- Collaborate closely with L3 Support, Software Engineers, DevOps, and Product Teams to report bugs, propose permanent fixes, and provide comprehensive investigation details.
- Escalate issues that require significant product changes or deeper engineering expertise to the L3 team, providing clear, detailed documentation of all steps taken.
- Documentation and Process Improvement:
- Conduct Root Cause Analysis (RCA) for major incidents, documenting the cause, resolution, and preventative actions.
- Create and maintain a Knowledge Base (KB), runbooks, and Standard Operating Procedures (SOPs) for recurring issues to empower L1 and enable customer self-service.
- Proactively identify technical deficiencies in processes and systems and recommend improvements to enhance service quality.
- Customer Communication:
- Maintain professional, clear, and timely communication with customers, explaining complex technical issues and resolutions in an understandable manner.
Required Technical Skills
- Programming/Scripting:
- Strong proficiency in Python (for scripting, automation, debugging, and data manipulation).
- Experience with other scripting languages like Bash or Shell
- Databases:
- Proficiency in SQL for complex querying, debugging data flow issues, and data extraction.
- Application/Web Technologies:
- Understanding of API concepts (RESTful/SOAP) and experience troubleshooting them using tools like Postman or curl.
- Knowledge of application architectures (e.g., microservices, SOA) is a plus.
- Monitoring & Tools:
- Experience with support ticketing systems (e.g., JIRA, ServiceNow).
- Familiarity with log aggregation and monitoring tools (e.g., Splunk, the ELK Stack/Kibana, Grafana).
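As an illustration of the Python scripting expected in this kind of support role (the log format, service names, and sample lines below are invented for the sketch, not taken from the posting), a minimal helper might scan application logs and summarize error counts per service:

```python
import re
from collections import Counter

# Assumed log format for this sketch: "[LEVEL] service: message"
LOG_LINE = re.compile(r"^\[(?P<level>[A-Z]+)\]\s+(?P<service>\S+):\s+(?P<msg>.*)$")

def summarize_errors(lines):
    """Count ERROR entries per service from simple '[LEVEL] service: message' lines."""
    counts = Counter()
    for line in lines:
        m = LOG_LINE.match(line.strip())
        if m and m.group("level") == "ERROR":
            counts[m.group("service")] += 1
    return dict(counts)

sample = [
    "[INFO] billing: started",
    "[ERROR] billing: timeout calling payments API",
    "[ERROR] auth: token refresh failed",
    "[ERROR] billing: retry exhausted",
]
print(summarize_errors(sample))  # {'billing': 2, 'auth': 1}
```

In practice the same pattern extends naturally to health checks and ticket triage: the script becomes a scheduled job that flags services whose error count crosses a threshold.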
At BigThinkCode, our technology solves complex problems. We are looking for a talented Data Engineer to join our Data team in Chennai.
Please find our job description below; if interested, apply or reply with your profile to connect and discuss.
Company: BigThinkCode Technologies
URL: https://www.bigthinkcode.com/
Work location: Chennai (work from office)
Experience required: 3-5 years
Joining time: Immediate – 4 weeks
Work Mode: Work from office (Hybrid)
About the role: We are looking for a skilled Data Engineer with hands-on expertise in Dagster orchestration, modern data pipeline development, and Medallion architecture implementation. The ideal candidate will design, build, and optimize scalable data pipelines, with strong SQL proficiency and data-modelling expertise.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines using Dagster.
- Build and manage Dagster components such as:
  - Ops / Assets
  - Schedules
  - Sensors
  - Jobs
  - Resource definitions
- Implement and maintain Medallion Architecture (Bronze, Silver, Gold layers).
- Write optimized and production-grade SQL scripts for transformations and data validation.
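As a conceptual sketch of the Medallion layering and production-grade SQL this role calls for (table names, columns, and sample rows are hypothetical, and Python's built-in sqlite3 stands in for the actual warehouse), the Bronze → Silver → Gold flow might look like:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Bronze: raw ingested events, stored as-is (duplicates and bad rows included).
cur.execute("CREATE TABLE bronze_orders (order_id TEXT, amount TEXT, country TEXT)")
cur.executemany(
    "INSERT INTO bronze_orders VALUES (?, ?, ?)",
    [("o1", "100", "IN"), ("o1", "100", "IN"),
     ("o2", "not_a_number", "IN"), ("o3", "250", "US")],
)

# Silver: deduplicated, typed, validated rows.
cur.execute("""
    CREATE TABLE silver_orders AS
    SELECT DISTINCT order_id, CAST(amount AS REAL) AS amount, country
    FROM bronze_orders
    WHERE amount GLOB '[0-9]*'
""")

# Gold: business-level aggregate, built with a CTE as a validation step.
cur.execute("""
    CREATE TABLE gold_revenue_by_country AS
    WITH valid AS (SELECT * FROM silver_orders)
    SELECT country, SUM(amount) AS revenue, COUNT(*) AS orders
    FROM valid GROUP BY country
""")

print(sorted(cur.execute("SELECT * FROM gold_revenue_by_country").fetchall()))
```

In a real Dagster deployment each layer would typically be a separate asset so that schedules and sensors can rebuild downstream layers when upstream data lands.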
Must Have
- 3+ years of experience in Data Engineering.
- Strong hands-on experience with Dagster and workflow orchestration.
- Solid understanding of data pipeline design patterns.
- Experience implementing Medallion Architecture.
- Advanced SQL skills (complex joins, CTEs, performance tuning).
- Experience working with GCP cloud data platform.
Why Join Us:
- Collaborative work environment.
- Exposure to modern tools and scalable application architectures.
- Medical cover for employee and eligible dependents.
- Tax beneficial salary structure.
- Comprehensive leave policy.
- Competency development training programs.
Data Scientist or Senior Machine Learning Engineer
We are currently hiring for the position of Data Scientist/ Senior Machine Learning Engineer (6–7 years' experience).
Please find the detailed Job Description attached for your reference.
We are looking for candidates with strong experience in:
- Machine Learning model development
- Scalable data pipeline development (ETL/ELT)
- Python and SQL
- Cloud platforms such as Azure/AWS/Databricks
- ML deployment environments (SageMaker, Azure ML, etc.)
Kindly note:
- Location: Pune (Work from Office)
- Immediate joiners preferred
While sharing profiles, please ensure the following details are included:
- Current CTC
- Expected CTC
- Notice Period
- Current Location
- Confirmation on Pune WFO comfort
Must have Skills
- Machine Learning - 6 Years
- Python - 6 Years
- ETL (Extract, Transform, Load) - 6 Years
- SQL - 6 Years
- Azure - 6 Years
Request you to share relevant profiles at the earliest. Looking forward to your support.
Role Overview
We are looking for a hands-on engineering leader who can own technical design and drive end-to-end development of scalable, high-quality systems. This role requires strong architectural depth, coding excellence, and the ability to mentor engineers while building production-grade applications in a fast-paced agile environment.
You will lead by example — designing systems, writing clean code, solving complex problems, and ensuring engineering best practices across the stack.
Key Responsibilities
- Lead technical design and architecture discussions (HLD & LLD).
- Build scalable, modular, and testable systems with strong engineering fundamentals.
- Own complex features end-to-end — design, development, testing, and optimization.
- Write high-quality, production-ready code with strong unit test coverage.
- Ensure clean code practices (SOLID principles, modular design, reusability).
- Drive engineering quality within CI/CD environments.
- Debug and resolve complex issues across distributed systems and APIs.
- Mentor engineers and elevate overall code quality standards.
- Collaborate effectively within agile teams and maintain delivery velocity.
Core Technical Requirements
- 8+ years of hands-on software development experience.
- Strong proficiency in:
- Java
- Node.js
- Angular (6+)
- JavaScript / TypeScript
- SQL & MongoDB
- Deep understanding of system design, architecture patterns, and scalable application development.
- Strong debugging capabilities across:
- Distributed services
- API integrations
- UI state management
- Database query performance
- Experience working in CI/CD-driven engineering environments.
GenAI & AI Stack Expertise
- Hands-on experience with GenAI frameworks and LLM integrations.
- Familiarity with:
- LangChain ecosystem
- Hugging Face
- Prompt chaining & orchestration
- Understanding of AI cost optimization strategies.
- Ability to debug AI pipelines and optimize model interactions.
Engineering Expectations
- Strong ownership mindset.
- Ability to design independently and lead technical direction.
- Exceptional problem-solving and debugging skills.
- High attention to detail.
- Comfortable working in fast-paced agile/scrum setups.
- Strong communication and collaboration skills.
- Ability to mentor and guide other engineers.
Educational Qualification
- Bachelor’s degree in Computer Science / Engineering / related field, or
- Master’s degree in Computer Science / Computer Applications
- Exploratory tester with 2–3 years of experience in software testing.
- The candidate should be an expert in GUI and functional testing of web applications.
- Good communication is a must; should be capable of collaborating with cross-functional teams.
- Should be self-driven and capable of handling responsibilities independently.
- Should have good knowledge of SQL and Jira
- Strong proficiency in Microsoft Excel is required for test analysis and reporting.
- Should be able to understand application architecture to effectively design and execute test scenarios.
- Experience with Playwright automation is an added advantage but not mandatory.
5+ years of experience as a Senior Analytics or Data Engineer building pipelines, developing data models, and improving BI infrastructure, ideally at a SaaS company.
Core Stack: Expert-level knowledge of SQL and Python.
Platform Expertise: Expert-level knowledge in Snowflake, Azure, Fivetran, and dbt.
Orchestration: Hands-on expertise with at least one orchestration framework such as Airflow, Prefect, or Dagster.
Technical Skills: Solid understanding of data modeling concepts, specifically star schemas and normalized vs. denormalized structures.
Workflow Development: Experience building ELT/ETL workflows and integrating APIs or webhooks.
Analytical Mindset: Proven ability to translate ambiguous business questions into structured analyses.
Soft Skills: Excellent communication skills with the ability to articulate technical problems to non-technical audiences.
Agility: Comfortable working in an agile environment.
Education: Bachelor's Degree in Engineering,
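To make the star-schema expectation above concrete, here is a small illustrative sketch (fact and dimension names are invented, and Python's stdlib sqlite3 stands in for Snowflake): storage stays normalized into a narrow fact table plus dimensions, while the BI query denormalizes at read time via joins.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables: descriptive attributes, normalized out of the fact table.
cur.execute("CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, region TEXT)")
cur.execute("CREATE TABLE dim_product  (product_id  INTEGER PRIMARY KEY, category TEXT)")
# Fact table: narrow rows of foreign keys and measures.
cur.execute("CREATE TABLE fact_sales (customer_id INTEGER, product_id INTEGER, amount REAL)")

cur.executemany("INSERT INTO dim_customer VALUES (?, ?)", [(1, "EMEA"), (2, "APAC")])
cur.executemany("INSERT INTO dim_product  VALUES (?, ?)", [(10, "SaaS"), (20, "Services")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 10, 500.0), (1, 20, 200.0), (2, 10, 300.0)])

# A typical BI query: join the fact to its dimensions and aggregate.
rows = cur.execute("""
    SELECT c.region, p.category, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_customer c ON c.customer_id = f.customer_id
    JOIN dim_product  p ON p.product_id  = f.product_id
    GROUP BY c.region, p.category
    ORDER BY c.region, p.category
""").fetchall()
print(rows)
```

The trade-off this models: normalized storage avoids update anomalies in the dimensions, while the denormalized query result is what a BI tool like a dashboard actually consumes.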
We are seeking a skilled Java Developer with hands-on experience in Java and Spark to build scalable data processing solutions. You'll contribute to high-performance data pipelines and analytics platforms in a dynamic Agile environment.
Key Responsibilities
- Design and develop Java applications integrated with Apache Spark for ETL processes, data transformations, and analytics.
- Build and optimize Spark jobs (Spark SQL, DataFrames, Streaming) for large-scale data processing.
- Collaborate with data engineers and analysts to implement robust data workflows.
- Write clean, maintainable Java code following best practices (Spring Boot, Microservices preferred).
- Perform code reviews, unit testing, and contribute to CI/CD pipelines.
- Troubleshoot and optimize Spark performance for production workloads.
- Document technical solutions and mentor junior developers.
Required Skills & Qualifications
- 4-7 years of hands-on Java development experience.
- Strong expertise in Apache Spark (Spark Core, Spark SQL, PySpark basics).
- Proficiency in Java 8/11+ with multithreading and collections frameworks.
- Experience with data processing (ETL, data pipelines, big data).
- Familiarity with build tools (Maven/Gradle) and version control (Git).
- Strong problem-solving skills; availability to work from Bangalore.
- Excellent communication skills for cross-team collaboration.
Good to Have
- Experience with Snowflake for cloud data warehousing.
- Knowledge of DBT (Data Build Tool) for analytics engineering.
- Python scripting for data manipulation and automation.
- Exposure to AWS/GCP/Azure cloud platforms.
- Familiarity with Kafka, Airflow, or containerization (Docker/Kubernetes).
Dot Net Full Stack Developer
Job Overview
We are seeking a skilled .NET Developer who can design, develop, and maintain both conventional .NET applications and modern cloud-ready solutions. The ideal candidate will have expertise in Microsoft SQL Server, Azure DevOps CI/CD, Azure AD-based SSO, and integration with enterprise applications using MuleSoft APIs. The role also involves modernizing legacy applications, migrating to Azure Cloud, and building responsive web applications using Razor Pages, Bootstrap, and jQuery, as well as modern alternatives like Blazor, Tailwind CSS, and React/Angular.
Responsibilities:
- Develop and maintain .NET Framework (4.x) and .NET 9 applications.
- Build responsive web applications using Razor Pages, Bootstrap v5.3.3, and jQuery 3.7.1.
- Document functionality through reverse engineering and communication with other developers; draw architecture diagrams and maintain application documentation.
- Design and optimize SQL Server schemas, stored procedures, and queries.
- Integrate .NET applications with enterprise systems via MuleSoft APIs.
- Implement Single Sign-On (SSO) using Azure Active Directory.
- Design and maintain CI/CD pipelines using Azure DevOps.
- Migrate legacy .NET Framework apps to .NET 9 and deploy to Azure.
- Implement containerization using Docker and orchestration with Kubernetes.
- Ensure application security, scalability, and performance optimization.
- Collaborate with architects, QA, and business teams in agile environments.
- Develop and enhance software products used mainly in Europe; the ability to provide support during CET-timezone hours is a must.
Required Framework & Technologies
- .NET Framework (4.x) and .NET 9
- C# programming language
- ASP.NET MVC, ASP.NET Core, Razor Pages
- Bootstrap CSS Framework v5.3.3
- jQuery 3.7.1
- Modern Alternatives: Blazor (Server/WebAssembly), Tailwind CSS, React, Angular
- Entity Framework Core, LINQ, Dapper
- Microsoft SQL Server (T-SQL, Stored Procedures, Performance Tuning)
- MuleSoft API Integration
- Azure Active Directory (SSO, OAuth, JWT)
- Azure DevOps (CI/CD pipelines, Release Management)
- Git, YAML pipelines
- Azure App Services, Azure Functions, Azure Kubernetes Service (AKS)
- Docker, Kubernetes
- Application Insights, Azure Monitor
Preferred Qualifications
- Bachelor's degree in Computer Science, Information Technology, or related field.
- Strong proficiency in C# and .NET technologies including .NET 9.
- Experience with Razor Pages, Bootstrap, and jQuery for front-end development.
- Familiarity with modern alternatives like Blazor, Tailwind CSS, and React/Angular.
- Hands-on experience with Azure DevOps and CI/CD pipelines.
- Knowledge of Azure AD authentication and SSO implementation.
- Experience in integrating applications using MuleSoft APIs.
- Familiarity with cloud migration strategies and Azure services.
Experience
- 3+ years of experience in .NET application development.
- 2+ years of experience in Azure Cloud ecosystem and DevOps.
- Experience in migrating legacy applications to modern .NET platforms.
- Experience in containerization and orchestration (Docker, Kubernetes).
What is in it for you?
- Opportunity to work on a technically challenging and impactful product.
- Joining a values-driven, employee-centric organization that prioritizes well-being.
- Being part of a growing start-up setting new standards for employee experience while delivering breakthrough digital products.
- Exposure to an international distributed work environment with industry-leading clients.
- A Hybrid-first setup, giving you the flexibility to work from anywhere for 40 percent of the week.
- First-hand experience working directly with large client organizations, solving meaningful challenges (not in an “outsourced” model).
- Collaborative and supportive team environment that values empathy and camaraderie.
- Professional development and continuous learning opportunities.
- Competitive salary package and a strong emphasis on work-life balance.
Must have Skills
- .Net - 3 Years
- DevOps - 2 Years
- C Sharp - 3 Years
- .NET 9 - 3 Years
- Razor Pages - 2 Years
- ASP.NET Core - 2 Years
- ASP.NET MVC - 2 Years
- Blazor - 3 Years
- Azure DevOps - 2 Years
- Docker - 2 Years
- Kubernetes - 2 Years
- Microsoft SQL Server - 2 Years
- YAML - 3 Years
- Azure Monitor - 2 Years
- CI/CD pipeline - 2 Years
We are looking for an Integration Engineer to assist our rapidly growing customer base. As part of our integration team, you will be the primary point of contact for all integrations. You will be responsible for helping our clients integrate with OneFin APIs, configuring our system for clients, and providing ongoing help to resolve any issues.
Responsibilities
- Understand and explain APIs to clients. Help clients integrate OneFin APIs. Research and identify solutions to issues during integration.
- Escalate unresolved issues to appropriate internal teams (e.g., software developers).
- Become a product expert for clients.
- Configure OneFin system for customized usage by clients. Identify and write internal and external technical articles or knowledge-base entries, like typical troubleshooting steps, workarounds, or best practices, how-to guides etc.
- Automate solution of common issues using Python.
- Help live clients resolve issues and coordinate with the development team for issue resolution.
Requirements and Qualifications:
- Strong verbal and written communication skills.
- Experience in writing code in Python.
- Understanding of web-based systems.
- Proficient in understanding and writing JSON.
- Experience in SQL databases.
- Experience working with REST APIs.
- Excellent analytical skills, passion for pinning down technical issues, and solving problems.
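As a minimal sketch of the JSON and REST skills listed above (the payload shape, field names, and error text are hypothetical, not OneFin's actual API), a common integration task is parsing a response body and surfacing the failed items for escalation:

```python
import json

# A hypothetical REST response body, as an integration client might receive it.
response_body = """
{
  "status": "partial_failure",
  "results": [
    {"loan_id": "L-101", "ok": true},
    {"loan_id": "L-102", "ok": false, "error": "missing KYC document"}
  ]
}
"""

def failed_items(body: str):
    """Return (id, error) pairs for entries that failed, e.g. for an escalation note."""
    data = json.loads(body)
    return [(r["loan_id"], r.get("error", "unknown"))
            for r in data.get("results", []) if not r["ok"]]

print(failed_items(response_body))  # [('L-102', 'missing KYC document')]
```

Automating checks like this in Python is exactly the "automate solution of common issues" responsibility: the same parser can run against many client payloads before anything is escalated to developers.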
Job Title: Senior Full-stack Developer (Python, React)
Location: Hyderabad, India (On-site Only)
Employment Type: Full-Time
Work Mode: Office-Based; Remote or Hybrid Not Allowed
Role Summary
We are looking for a skilled Senior Fullstack Developer with expertise in Django (Python), React, RESTful APIs, GraphQL, microservices architecture, Redis, and AWS services (SNS, SQS, etc.). The ideal candidate will be responsible for designing, developing, and maintaining scalable backend systems and APIs to support dynamic frontend applications and services.
Required Skillset:
- 9+ years of professional experience writing production-grade software, including experience leading the design of complex systems.
- Strong expertise in Python (Django or equivalent frameworks) and REST API development.
- Solid experience with frontend frameworks such as React and TypeScript.
- Strong understanding of relational databases (MySQL or PostgreSQL preferred).
- Experience with CI/CD pipelines, containerization (Docker), and orchestration (Kubernetes).
- Hands-on experience with cloud infrastructure (AWS preferred).
- Proven experience debugging complex production issues and improving observability.
Preferred Skillset:
- Experience in enterprise SaaS or B2B systems with multi-tenancy, authentication (OAuth, SSO, SAML), and data partitioning; exposure to Kafka or RabbitMQ and microservices.
- Knowledge of event-driven architecture, A/B testing frameworks, and analytics pipelines.
- Familiarity with accessibility standards, best practices, and Agile/Scrum methodologies.
- Exposure to the Open edX ecosystem or open-source contributions in education tech.
- Demonstrated history of technical mentorship, team leadership, or cross-team collaboration.
Tech Stack:
- Backend: Python (Django), Celery and Redis for asynchronous workflows, REST APIs
- Frontend: React, TypeScript, SCSS
- Data: MySQL, Snowflake, Elasticsearch
- DevOps/Cloud: Docker, Kubernetes, GitHub Actions, AWS
- Monitoring: Datadog
- Collaboration Tools: GitHub, Jira, Slack, Segment
Primary Responsibilities:
- Lead, guide, and mentor a team of Python/Django engineers, offering hands-on technical support and direction.
- Architect, design, and deliver secure, scalable, and high-performing web applications.
- Manage the complete software development lifecycle, including requirements gathering, system design, development, testing, deployment, and post-launch maintenance.
- Ensure compliance with coding standards, architectural patterns, and established development best practices.
- Collaborate with product teams, QA, UI/UX, and other stakeholders to ensure timely and high-quality product releases.
- Perform detailed code reviews, optimize system performance, and resolve production-level issues.
- Drive engineering improvements such as automation, CI/CD implementation, and modernization of outdated systems.
- Create and maintain technical documentation while providing regular updates to leadership and stakeholders.

A real-time Customer Data Platform and cross-channel marketing automation suite delivers superior experiences that result in increased revenue for some of the largest enterprises in the world.
Key Responsibilities:
- Design and develop backend components and sub-systems for large-scale platforms under guidance from senior engineers.
- Contribute to building and evolving the next-generation customer data platform.
- Write clean, efficient, and well-tested code with a focus on scalability and performance.
- Explore and experiment with modern technologies, especially open-source frameworks, and build small prototypes or proofs of concept.
- Use AI-assisted development tools to accelerate coding, testing, debugging, and learning while adhering to engineering best practices.
- Participate in code reviews, design discussions, and continuous improvement of the platform.
Qualifications:
- 0–2 years of experience (or strong academic/project background) in backend development with Java.
- Good fundamentals in algorithms, data structures, and basic performance optimizations.
- Bachelor’s or Master’s degree in Computer Science or IT (B.E / B.Tech / M.Tech / M.S) from premier institutes.
Technical Skill Set:
- Strong aptitude and analytical skills with emphasis on problem solving and clean coding.
- Working knowledge of SQL and NoSQL databases.
- Familiarity with unit testing frameworks and writing testable code is a plus.
- Basic understanding of distributed systems, messaging, or streaming platforms is a bonus.
AI-Assisted Engineering (LLM-Era Skills):
- Familiarity with modern AI coding tools such as Cursor, Claude Code, Codex, Windsurf, Opencode, or similar.
- Ability to use AI tools for code generation, refactoring, test creation, and learning new systems responsibly.
- Willingness to learn how to combine human judgment with AI assistance for high-quality engineering outcomes.
Soft Skills & Nice to Have
- Appreciation for technology and its ability to create real business value, especially in data and marketing platforms.
- Clear written and verbal communication skills.
- Strong ownership mindset and ability to execute in fast-paced environments.
- Prior internship or startup experience is a plus.
Description
SRE Engineer
Role Overview
As a Site Reliability Engineer, you will play a critical role in ensuring the availability and performance of our customer-facing platform. You will work closely with DevOps, DBA, and Development teams to provision and maintain infrastructure, deploy and monitor our applications, and automate workflows. Your contributions will have a direct impact on customer satisfaction and overall experience.
Responsibilities and Deliverables
• Manage, monitor, and maintain highly available systems (Windows and Linux)
• Analyze metrics and trends to ensure rapid scalability.
• Address routine service requests while identifying ways to automate and simplify.
• Create infrastructure as code using Terraform, ARM Templates, or CloudFormation.
• Maintain data backups and disaster recovery plans.
• Design and deploy CI/CD pipelines using GitHub Actions, Octopus, Ansible, Jenkins, Azure DevOps.
• Adhere to security best practices through all stages of the software development lifecycle
• Follow and champion ITIL best practices and standards.
• Become a resource for emerging and existing cloud technologies with a focus on AWS.
Organizational Alignment
• Reports to the Senior SRE Manager
• This role involves close collaboration with DevOps, DBA, and security teams.
Technical Proficiencies
• Hands-on experience with AWS is a must-have.
• Proficiency in analyzing application, IIS, system, and security logs, as well as CloudTrail events.
• Practical experience with CI/CD tools such as GitHub Actions, Jenkins, Octopus
• Experience with observability tools such as New Relic, Application Insights, AppDynamics, or DataDog.
• Experience maintaining and administering Windows, Linux, and Kubernetes.
• Experience in automation using scripting languages such as Bash, PowerShell, or Python.
• Configuration management experience using Ansible, Terraform, Azure Automation runbooks, or similar.
• Experience with SQL Server database maintenance and administration is preferred.
• Good Understanding of networking (VNET, subnet, private link, VNET peering).
• Familiarity with cloud concepts including certificates, OAuth, Azure AD, ASE, ASP, AKS, Azure Apps, Load Balancers, Application Gateway, Firewall, API Management, SQL Server, and databases on Azure.
Experience
• 7+ years of experience in SRE or System Administration role
• Demonstrated ability building and supporting high availability Windows/Linux servers, with emphasis on the WISA stack (Windows/IIS/SQL Server/ASP.net)
• 3+ years of experience working with cloud technologies including AWS, Azure.
• 1+ years of experience working with container technology including Docker and Kubernetes.
• Comfortable using Scrum, Kanban, or Lean methodologies.
Education
• Bachelor’s Degree or College Diploma in Computer Science, Information Systems, or equivalent experience.
Additional Job Details:
• Working hours: 2:00 PM / 3:00 PM to 11:30 PM IST
• Interview process: 3 technical rounds
• Work model: 3 days per week work from office
Role & Responsibilities:
Develop and deliver defect-free, web-based applications using C#, ASP.NET, and Oracle as per the specifications provided by the Business Analysts.
- Read and understand the functional and technical specifications, and gain a complete understanding of the work before commencing it
- Design, develop, and unit test applications in accordance with established standards.
- Adhering to high-quality development principles while delivering solutions on-time.
- Adhere to the Quality Management Standards established in the organization.
- Understand the SDLC process defined in the organization and follow it without any deviation
- Providing third-level support for tickets raised by the business users
- Analyzing and resolving technical and application-logic-related problems
- Ensure high performance in the application by developing efficient code
Ideal Candidate:
- Strong .NET Senior Software Engineer Profile
- Must have 5+ years of hands-on development experience with C#.NET, ASP.NET, ADO.NET.
- Must have 3+ years of experience in web application development using HTML, CSS, JavaScript/jQuery
- Must have strong experience in Writing Complex SQL Queries, Stored Procedures, Functions using Oracle / SQL Server.
- Must have experience in designing, developing, and unit testing applications with SDLC compliance
- Experience with AJAX, Crystal Reports, and front-end validations using JavaScript/jQuery.
- ME/MTech (CS) or BE/BTech (CS).
Role & Responsibilities:
- Design, develop, and unit test applications in accordance with established standards.
- Preparing reports, manuals and other documentation on the status, operation and maintenance of software.
- Analyzing and resolving technical and application problems
- Adhering to high-quality development principles while delivering solutions on-time
- Providing third-level support to business users.
- Compliance of process and quality management standards
- Understanding and implementation of SDLC process
Ideal Candidate:
- Strong Senior Angular Developer Profiles.
- Must have 6+ years of experience in frontend development, with at least 4+ years in Angular 8+.
- Must have strong proficiency in JavaScript, TypeScript, HTML5, and CSS3.
- Must have strong test-driven development experience and proficiency in unit testing frameworks such as Jasmine, Karma, NUnit, Selenium.
- Must have strong experience in database technologies (MySQL / SQL Server / Oracle)
- Considering candidates from South India only.
- Must have 2+ years of experience with Web APIs, Entity Framework, and LINQ queries.
- Experience in .NET Core framework, OOP, and C# APIs.
- Product Companies
- B.Tech./M.Tech in Computer Science (or related field).
Position: Full Stack Developer ( PHP Codeigniter)
Company : Mayura Consultancy Services
Experience: 2 yrs
Location : Bangalore
Skill: HTML, CSS, Bootstrap, JavaScript, AJAX, jQuery, PHP, and CodeIgniter (CI)
Work Location: Work From Home(WFH)
Apply: Please apply for the job opening using the URL below, based on your skill set. Once you complete the application form, we will review your profile.
Website:
https://www.mayuraconsultancy.com/careers/mcs-full-stack-web-developer-opening?r=jlp
Responsibilities
Develop and maintain web and backend components using Python, Node.js, and Zoho tools
Design and implement custom workflows and automations in Zoho
Perform code reviews to maintain quality standards and best practices
Debug and resolve technical issues promptly
Collaborate with teams to gather and analyze requirements for effective solutions
Write clean, maintainable, and well-documented code
Manage and optimize databases to support changing business needs
Contribute individually while mentoring and supporting team members
Adapt quickly to a fast-paced environment and meet expectations within the first month
Selection Process
1. HR Screening: Review of qualifications and experience
2. Online Technical Assessment: Test coding and problem-solving skills
3. Technical Interview: Assess expertise in web development, Python, Node.js, APIs, and Zoho
4. Leadership Evaluation: Evaluate team collaboration and leadership abilities
5. Management Interview: Discuss cultural fit and career opportunities
6. Offer Discussion: Finalize compensation and role specifics
Experience Required
2-4 years of relevant experience as a Zoho Developer
Proven ability to work as a self-starter and contribute individually
Strong technical and interpersonal skills to support team members effectively
Power BI Analyst – EdTech (UAE Market)
📍 Location: Bangalore (Onsite)
🕔 Working Days: 5 Days
🏢 Industry: EdTech – Professional Training & Certification Programs
🌍 Market Focus: UAE
About Us – Learners Point
Learners Point Academy is a leading professional training institute in the UAE, empowering working professionals and organizations through globally recognised certification programs such as CMA, PMP, ACCA, CIA, and other corporate training solutions.
With a strong presence in the UAE market, we specialise in career-focused education, enterprise workforce development, and high-impact learning solutions designed to drive measurable professional growth.
As we expand our analytics capabilities, we are looking for a skilled Power BI Analyst to support business intelligence and data-driven decision-making across our Professional Training Programs.
Role Overview
The Power BI Analyst will be responsible for transforming business, learner, and sales data into actionable dashboards and reports that enhance performance tracking, learner engagement, and revenue optimisation.
Key Responsibilities
- Design, develop, and maintain interactive dashboards using Microsoft Power BI
- Develop advanced reports using DAX, data modelling, and Power Query
- Analyze training program performance (enrollments, retention, completion rates, revenue)
- Build KPI dashboards for:
  - Sales & Reactivation Team
  - Academic & Training Team
  - Leadership & Management
- Extract and manage data using SQL from databases and CRM systems
- Automate reporting processes and ensure data accuracy
- Translate business requirements into technical BI solutions
- Present insights through clear and compelling data storytelling
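Although the dashboards themselves would be built in Power BI with DAX measures, the underlying KPI logic can be sketched in plain Python (the enrollment records, programs, and field names below are invented for illustration):

```python
# Hypothetical enrollment records: (learner, program, completed, revenue).
enrollments = [
    ("a@example.com", "PMP", True, 1200.0),
    ("b@example.com", "PMP", False, 1200.0),
    ("c@example.com", "CMA", True, 1500.0),
    ("d@example.com", "PMP", True, 1200.0),
]

def program_kpis(rows):
    """Per-program completion rate and revenue, as a dashboard measure would compute."""
    kpis = {}
    for _, program, completed, revenue in rows:
        k = kpis.setdefault(program, {"enrolled": 0, "completed": 0, "revenue": 0.0})
        k["enrolled"] += 1
        k["completed"] += int(completed)
        k["revenue"] += revenue
    for k in kpis.values():
        k["completion_rate"] = round(k["completed"] / k["enrolled"], 2)
    return kpis

print(program_kpis(enrollments))
```

The equivalent in Power BI would be a DAX measure dividing completed enrollments by total enrollments per program; validating that logic in a script first is a common way to check dashboard numbers against the source data.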
Required Technical Skills
- Strong experience in Power BI (Desktop & Service)
- Proficiency in:
  - DAX (Measures, Time Intelligence)
  - Data Modeling (Star & Snowflake Schema)
  - Power Query (ETL)
- Good knowledge of SQL
- Advanced Excel (Pivot Tables, Power Pivot, Lookup Functions)
- Experience integrating data from CRM, LMS, or ERP systems
Industry-Specific Requirements (EdTech Focus)
- Understanding of:
  - Learner engagement metrics
  - Course completion & drop-off analysis
  - Revenue per program
  - Student retention analytics
- Experience working with Professional Certification Programs is an added advantage
- Familiarity with UAE market reporting standards preferred
Preferred Skills
- Exposure to Azure Data Services
- Dashboard design best practices
- Ability to manage large datasets
- Strong analytical mindset with business understanding
Soft Skills
- Strong communication & stakeholder management skills
- Business-oriented thinking
- Problem-solving mindset
- Attention to detail
Experience & Qualification
- Bachelor’s Degree in Computer Science, Data Analytics, Statistics, or related field
- 2–8 years of experience as a Power BI / Data Analyst (EdTech preferred)
- UAE or GCC market exposure is a plus
About the Role:
We are seeking a highly skilled Integration Specialist / Full Stack Developer with strong experience in .NET Core, API integrations, and modern front-end development. The ideal candidate will build and integrate scalable web and mobile applications, manage end-to-end delivery, and ensure smooth data exchange across platforms.
Key Responsibilities:
- Design, develop, and maintain backend APIs using .NET Core / C#.
- Build and integrate REST and SOAP-based services (JSON, XML, OAuth2, JWT, API Key).
- Implement file-based integrations (Flat file, CSV, Excel, XML, JSON) and manage FTP/SFTP transfers.
- Work with databases such as MSSQL, PostgreSQL, Oracle, and SQLite — including writing queries, stored procedures, and using ADO.NET.
- Handle data serialization/deserialization using Newtonsoft.Json or System.Text.Json.
- Implement robust error handling and logging with Serilog, NLog, or log4net.
- Automate and schedule processes using Quartz.NET, Hangfire, or Windows Task Scheduler.
- Manage version control and CI/CD pipelines via Git and Azure DevOps.
- Develop front-end interfaces with React and React Native ensuring responsive, modular UI.
- Implement offline-first functionality for mobile apps (sync logic, caching, etc.).
- Collaborate with cross-functional teams or independently handle full project ownership.
- Utilize AI-assisted development tools (e.g., Cursor, GitHub Copilot, Claude Code) to enhance productivity.
- Apply integration best practices including middleware, API gateways, and optionally message queues (MSMQ, RabbitMQ).
- Ensure scalability, security, and performance in all deliverables.
Key Skills & Technologies:
- Backend: .NET Core, C#, REST/SOAP APIs, WCF, ADO.NET
- Frontend: React, React Native
- Databases: MSSQL, PostgreSQL, Oracle, SQLite
- Tools: Git, Azure DevOps, Hangfire, Quartz.NET, Serilog/NLog
- Integration: JSON, XML, CSV, FTP/SFTP, OAuth2, JWT
- DevOps: CI/CD automation, deployment pipelines
- Optional: Middleware, API Gateway, Message Queues (MSMQ, RabbitMQ)
Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- Minimum 5 years of hands-on experience in software development and integration.
- Proven expertise in designing and implementing scalable applications.
- Strong analytical and problem-solving skills with a proactive approach.
Nice to Have:
- Experience with cloud services (Azure, AWS, GCP).
- Knowledge of containerization tools like Docker or Kubernetes.
- Familiarity with mobile deployment workflows and app store publishing.
About Kanerika:
Kanerika Inc. is a premier global software products and services firm that specializes in providing innovative solutions and services for data-driven enterprises. Our focus is to empower businesses to achieve their digital transformation goals and maximize their business impact through the effective use of data and AI.
We leverage cutting-edge technologies in data analytics, data governance, AI-ML, GenAI/ LLM and industry best practices to deliver custom solutions that help organizations optimize their operations, enhance customer experiences, and drive growth.
Awards and Recognitions:
Kanerika has won several awards over the years, including:
1. Best Place to Work 2023 by Great Place to Work®
2. Top 10 Most Recommended RPA Start-Ups in 2022 by RPA Today
3. NASSCOM Emerge 50 Award in 2014
4. Frost & Sullivan India 2021 Technology Innovation Award for its Kompass composable solution architecture
5. Kanerika has also been recognized for its commitment to customer privacy and data security, having achieved ISO 27701, SOC2, and GDPR compliances.
Working for us:
Kanerika is rated 4.6/5 on Glassdoor, for many good reasons. We truly value our employees' growth, well-being, and diversity, and people’s experiences bear this out. At Kanerika, we offer a host of enticing benefits that create an environment where you can thrive both personally and professionally. From our inclusive hiring practices and mandatory training on creating a safe work environment to our flexible working hours and generous parental leave, we prioritize the well-being and success of our employees.
Our commitment to professional development is evident through our mentorship programs, job training initiatives, and support for professional certifications. Additionally, our company-sponsored outings and various time-off benefits ensure a healthy work-life balance. Join us at Kanerika and become part of a vibrant and diverse community where your talents are recognized, your growth is nurtured, and your contributions make a real impact. See the benefits section below for the perks you’ll get while working for Kanerika.
Role Responsibilities:
Your responsibilities will include, but are not limited to, the following:
- Design, develop, and implement modern data pipelines, data models, and ETL/ELT processes.
- Architect and optimize data lake and warehouse solutions using Microsoft Fabric, Databricks, or Snowflake.
- Enable business analytics and self-service reporting through Power BI and other visualization tools.
- Collaborate with data scientists, analysts, and business users to deliver reliable and high-performance data solutions.
- Implement and enforce best practices for data governance, data quality, and security.
- Mentor and guide junior data engineers; establish coding and design standards.
- Evaluate emerging technologies and tools to continuously improve the data ecosystem.
Required Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related field.
- 7-10 years of experience in data engineering or data platform development.
- Strong hands-on experience in SQL, Snowflake, Python, and Airflow.
- Solid understanding of data modeling, data governance, security, and CI/CD practices.
Preferred Qualifications:
- Experience in leading a team
- Familiarity with data modeling techniques and practices for Power BI.
- Knowledge of Azure Databricks or other data processing frameworks.
- Knowledge of Microsoft Fabric or other Cloud Platforms.
What we need:
· B.Tech in Computer Science or equivalent.
Why join us?
- Work with a passionate and innovative team in a fast-paced, growth-oriented environment.
- Gain hands-on experience in content marketing with exposure to real-world projects.
- Opportunity to learn from experienced professionals and enhance your marketing skills.
- Contribute to exciting initiatives and make an impact from day one.
- Competitive stipend and potential for growth within the company.
- Recognized for excellence in data and AI solutions with industry awards and accolades.
Employee Benefits:
1. Culture:
- Open Door Policy: Encourages open communication and accessibility to management.
- Open Office Floor Plan: Fosters a collaborative and interactive work environment.
- Flexible Working Hours: Allows employees to have flexibility in their work schedules.
- Employee Referral Bonus: Rewards employees for referring qualified candidates.
- Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.
2. Inclusivity and Diversity:
- Hiring practices that promote diversity: Ensures a diverse and inclusive workforce.
- Mandatory POSH training: Promotes a safe and respectful work environment.
3. Health Insurance and Wellness Benefits:
- GMC and Term Insurance: Offers medical coverage and financial protection.
- Health Insurance: Provides coverage for medical expenses.
- Disability Insurance: Offers financial support in case of disability.
4. Child Care & Parental Leave Benefits:
- Company-sponsored family events: Creates opportunities for employees and their families to bond.
- Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child.
- Family Medical Leave: Offers leave for employees to take care of family members' medical needs.
5. Perks and Time-Off Benefits:
- Company-sponsored outings: Organizes recreational activities for employees.
- Gratuity: Provides a monetary benefit as a token of appreciation.
- Provident Fund: Helps employees save for retirement.
- Generous PTO: Offers more than the industry standard for paid time off.
- Paid sick days: Allows employees to take paid time off when they are unwell.
- Paid holidays: Gives employees paid time off for designated holidays.
- Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.
6. Professional Development Benefits:
- L&D with FLEX- Enterprise Learning Repository: Provides access to a learning repository for professional development.
- Mentorship Program: Offers guidance and support from experienced professionals.
- Job Training: Provides training to enhance job-related skills.
- Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
- Promote from Within: Encourages internal growth and advancement opportunities.
JOB DESCRIPTION:
Location: Pune, Mumbai, Bangalore
Mode of Work: 3 days from office
* Python : Strong expertise in data workflows and automation
* Pandas: For detailed data analysis and validation
* SQL: Querying and performing operations on Delta tables
* AWS Cloud: Compute and storage services
* OOP concepts
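The skills above combine OOP with data validation in Python workflows. As a hedged sketch of what that looks like in practice, here is a minimal rule-based row validator; the class and rule names (`Validator`, `RangeRule`) and the sample rows are illustrative, not from any specific codebase:

```python
# Minimal sketch of an OOP-style data validation step in a Python data
# workflow. Names and thresholds are made up for illustration.

class RangeRule:
    """Flags rows where a numeric column falls outside [lo, hi]."""
    def __init__(self, column, lo, hi):
        self.column, self.lo, self.hi = column, lo, hi

    def check(self, row):
        return self.lo <= row[self.column] <= self.hi

class Validator:
    """Applies a list of rules to each row and collects failures."""
    def __init__(self, rules):
        self.rules = rules

    def run(self, rows):
        failures = []
        for i, row in enumerate(rows):
            for rule in self.rules:
                if not rule.check(row):
                    failures.append((i, rule.column))
        return failures

rows = [
    {"amount": 120.0, "age": 34},
    {"amount": -5.0, "age": 34},   # negative amount should fail
    {"amount": 80.0, "age": 210},  # implausible age should fail
]
validator = Validator([RangeRule("amount", 0, 1_000_000), RangeRule("age", 18, 100)])
print(validator.run(rows))  # [(1, 'amount'), (2, 'age')]
```

The same rule objects can be reused across pandas DataFrames or Delta-table extracts by iterating over records, which keeps validation logic in one place.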
Strong Senior Backend Engineer profiles
Mandatory (Experience 1) – Must have 5+ years of hands-on Backend Engineering experience building scalable, production-grade systems
Mandatory (Experience 2) – Must have strong backend development experience using one or more frameworks (FastAPI / Django (Python), Spring (Java), Express (Node.js)).
Mandatory (Experience 3) – Must have deep understanding of relevant libraries, tools, and best practices within the chosen backend framework
Mandatory (Experience 4) – Must have strong experience with databases, including SQL and NoSQL, along with efficient data modeling and performance optimization
Mandatory (Experience 5) – Must have experience designing, building, and maintaining APIs, services, and backend systems, including system design and clean code practices
Mandatory (Domain) – Experience with financial systems, billing platforms, or fintech applications is highly preferred (fintech background is a strong plus)
Mandatory (Company) – Must have worked in product companies / startups, preferably Series A to Series D
Mandatory (Education) – Candidates from Tier-1 engineering institutes (IITs, BITS) are highly preferred
Qualification: B.Tech CS (2025 graduates only)
Joining: Immediate Joiner
Job Type: Trainee
Work Mode: Remote
Working Days: Monday to Friday
Shift (Rotational – based on project need):
· 5:00 PM – 2:00 AM IST
· 6:00 PM – 3:00 AM IST
Job Summary
ARDEM is seeking highly motivated Technology Interns from Tier 1 colleges who are passionate about software development and eager to work with modern Microsoft technologies. This role is ideal for freshers who want hands-on experience in building scalable web applications while maintaining a healthy work-life balance through remote work opportunities.
Eligibility & Qualifications
- Education:
- B.Tech (Computer Science) / M.Tech (Computer Science)
- Tier 1 colleges preferred
- Experience Level: Fresher
- Communication: Excellent English communication skills (verbal & written)
Skills Required
Technical & Development Skills:
· Basic understanding of AI / Machine Learning concepts
· Exposure to AWS (deployment or cloud fundamentals)
· PHP development
· WordPress development and customization
· JavaScript (ES5 / ES6+)
· jQuery
· AJAX calls and asynchronous handling
· Event handling
· HTML5 & CSS3
· Client-side form validation
Work Environment & Tools
- Comfortable working in a remote setup
- Familiarity with collaboration and remote access tools
Additional Requirements (Work-from-Home Setup)
This opportunity promotes a healthy work-life balance with remote work flexibility. Candidates must have the following minimum infrastructure:
- System: Laptop or Desktop (Windows-based)
- Operating System: Windows
- Screen Size: Minimum 14 inches
- Screen Resolution: Full HD (1920 × 1080)
- Processor: Intel i5 or higher
- RAM: Minimum 8 GB (Mandatory)
- Software: AnyDesk
- Internet Speed: 100 Mbps or higher
About ARDEM
ARDEM is a leading Business Process Outsourcing (BPO) and Business Process Automation (BPA) service provider. With over 20 years of experience, ARDEM has consistently delivered high-quality outsourcing and automation services to clients across the USA and Canada. We are growing rapidly and continuously innovating to improve our services. Our goal is to strive for excellence and become the best Business Process Outsourcing and Business Process Automation company for our customers.
We are seeking a talented AI/ML Engineer with strong hands-on experience in Generative AI and Large Language Models (LLMs) to join our Business Intelligence team. The role involves designing, developing, and deploying advanced AI/ML and GenAI-driven solutions to unlock business insights and enhance data-driven decision-making.
Key Responsibilities:
• Collaborate with business analysts and stakeholders to identify AI/ML and Generative AI use cases.
• Design and implement ML models for predictive analytics, segmentation, anomaly detection, and forecasting.
• Develop and deploy Generative AI solutions using LLMs (GPT, LLaMA, Mistral, etc.).
• Build and maintain Retrieval-Augmented Generation (RAG) pipelines and semantic search systems.
• Work with vector databases (FAISS, Pinecone, ChromaDB) for embedding storage and retrieval.
• Develop end-to-end AI/ML pipelines from data preprocessing to deployment.
• Integrate AI/ML and GenAI solutions into BI dashboards and reporting tools.
• Optimize models for performance, scalability, and reliability.
• Maintain documentation and promote knowledge sharing within the team.
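The RAG and semantic-search responsibilities above boil down to ranking stored documents by embedding similarity to a query. A toy sketch of that retrieval step follows; in a real pipeline the vectors would come from an embedding model and live in a vector store (FAISS, Pinecone, ChromaDB), whereas here they are hand-written 3-d vectors purely for illustration:

```python
import math

# Toy sketch of the retrieval step in a RAG pipeline: rank documents by
# cosine similarity of their embeddings to a query embedding. All vectors
# and document texts below are made up.

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# (document text, pretend embedding)
index = [
    ("Q3 revenue grew 12% year on year.", [0.9, 0.1, 0.0]),
    ("The office cafeteria reopens Monday.", [0.0, 0.2, 0.9]),
    ("Gross margin improved on lower costs.", [0.8, 0.3, 0.1]),
]

def retrieve(query_vec, k=2):
    scored = sorted(index, key=lambda d: cosine(query_vec, d[1]), reverse=True)
    return [text for text, _ in scored[:k]]

# A query embedding close to the "finance" documents ranks both finance
# sentences above the cafeteria note:
context = retrieve([1.0, 0.2, 0.0])
print(context)
```

The retrieved `context` is what gets stitched into the LLM prompt; production systems replace the linear scan with an approximate-nearest-neighbour index for scale.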
Mandatory Requirements:
• 4+ years of relevant experience as an AI/ML Engineer.
• Hands-on experience in Generative AI and Large Language Models (LLMs) – Mandatory.
• Experience implementing RAG pipelines and prompt engineering techniques.
• Strong programming skills in Python.
• Experience with ML frameworks (TensorFlow, PyTorch, scikit-learn).
• Experience with vector databases (FAISS, Pinecone, ChromaDB).
• Strong understanding of SQL and database systems.
• Experience integrating AI solutions into BI tools (Power BI, Tableau).
• Strong analytical, problem-solving, and communication skills.
Good to Have:
• Experience with cloud platforms (AWS, Azure, GCP).
• Experience with Docker or Kubernetes.
• Exposure to NLP, computer vision, or deep learning use cases.
• Experience in MLOps and CI/CD pipelines.
Job Description: Data Analyst
About Miror
Miror is India’s leading FemTech platform transforming how women experience peri-menopause and menopause. In just a year, we’ve built India’s largest menopause-focused WhatsApp community, partnered with the National Health Mission and the Indian Menopause Society, and launched category-defining nutraceutical products and digital health services. Our app blends science and technology—offering personalized care pathways, symptom tracking, diagnostic links, games, AI-powered chat, expert consultations, and more. We're proud recipients of the Innovation in Menopause Care award at the Global Women’s Health Innovation Conference 2024 and are rapidly scaling toward our $1B+ vision. Learn more: miror.in
Role Overview
We’re looking for a Data Analyst who is excited to work at the intersection of data, technology, and women’s wellness. You'll be instrumental in helping us understand user behaviour, community engagement, campaign performance, and product usage across platforms — including app, web, and WhatsApp.
You’ll also have opportunities to collaborate on AI-powered features such as chatbots and personalized recommendations. Experience with GenAI or NLP is a plus but not a requirement.
Key Responsibilities
· Clean, transform, and analyse data from multiple sources (SQL databases, CSVs, APIs).
· Build dashboards and reports to track KPIs, user behaviour, and marketing performance.
· Collaborate with product, marketing, and customer teams to uncover actionable insights.
· Support experiments, A/B testing, and cohort analysis to drive growth and retention.
· Assist in documentation and communication of findings to technical and non-technical teams.
· Work with the data team to enhance personalization and AI features (optional).
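The A/B-testing responsibility above typically comes down to comparing conversion rates between two variants and checking whether the difference is statistically meaningful. Here is a hedged, stdlib-only sketch of a two-proportion z-test; the function name and the conversion counts are illustrative:

```python
import math

# Sketch of a two-proportion z-test for an A/B experiment, the kind of
# check behind "support experiments and A/B testing". Numbers are made up.

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z statistic comparing conversion rates of variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)           # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Variant B converts 260/2000 (13%) vs A's 200/2000 (10%):
z = two_proportion_z(200, 2000, 260, 2000)

# Two-sided p-value from the standard normal CDF, via math.erf:
p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
print(round(z, 2), round(p_value, 4))
```

With these made-up counts the difference is significant at the usual 5% level; in practice the same computation is available via `statsmodels` or `scipy`, but the stdlib version makes the mechanics explicit.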
Required Qualifications
· Bachelor’s degree in Data Science, Statistics, Computer Science, or a related field.
· 2 – 4 years of experience in data analysis or business intelligence.
· Strong hands-on experience with SQL and Python (pandas, NumPy, matplotlib).
· Familiarity with data visualization tools (Streamlit, Tableau, Metabase, Power BI, etc.)
· Ability to translate complex data into simple visual stories and clear recommendations.
· Strong attention to detail and a mindset for experimentation.
Preferred (Not Mandatory)
· Exposure to GenAI, LLMs (e.g., OpenAI, HuggingFace), or NLP concepts.
· Experience working with healthcare, wellness, or e-commerce datasets.
· Familiarity with REST APIs, JSON structures, or chatbot systems.
· Interest in building tools that impact women’s health and wellness.
Why Join Us?
· Be part of a high-growth startup tackling a real need in women’s healthcare.
· Work with a passionate, purpose-driven team.
· Opportunity to grow into GenAI/ML-focused roles as we scale.
· Competitive salary and career progression
Job Description: Data Analyst Intern
Location: On-site, Bangalore
Duration: 6 months (Full-time)
About us:
- Optimo Capital is a newly established NBFC founded by Prashant Pitti, who is also a co-founder of EaseMyTrip (a billion-dollar listed startup that grew profitably without any funding).
- Our mission is to serve the underserved MSME businesses with their credit needs in India. With less than 15% of MSMEs having access to formal credit, we aim to bridge this credit gap through a phygital model (physical branches + digital decision-making). As a technology and data-first company, tech lovers and data enthusiasts play a crucial role in building the analytics & tech at Optimo that helps the company thrive.
What we offer:
- Join our dynamic startup team and play a crucial role in core data analytics projects involving credit risk, lending strategy, credit features analytics, collections, and portfolio management.
- The analytics team at Optimo works closely with the Credit & Risk departments, helping them make data-backed decisions.
- This is an exceptional opportunity to learn, grow, and make a significant impact in a fast-paced startup environment.
- We believe that the freedom and accountability to make decisions in analytics and technology brings out the best in you and helps us build the best for the company.
- This environment offers you a steep learning curve and an opportunity to experience the direct impact of your analytics contributions. Along with this, we offer industry-standard compensation.
What we look for:
- We are looking for individuals with a strong analytical mindset, high levels of initiative and ownership, the ability to drive tasks independently, clear communication, and comfort working across teams.
- We value not only your skills but also your attitude and hunger to learn, grow, lead, and thrive, both individually and as part of a team.
- We encourage you to take on challenges, bring in new ideas, implement them, and build the best analytics systems.
Key Responsibilities:
- Conduct analytical deep-dives such as funnel analysis, cohort tracking, branch-wise performance reviews, TAT analysis, portfolio diagnostics, and credit risk analytics that lead to clear actions.
- Work closely with stakeholders to convert business questions into measurable analyses and clearly communicated outputs.
- Support digital underwriting initiatives, including assisting in the development and analysis of underwriting APIs that enable decisioning on borrower eligibility (“whom to lend”) and exposure sizing (“how much to lend”).
- Develop and maintain periodic MIS and KPI reporting for key business functions (e.g., pipeline, disbursals, TAT, conversion, collections performance, portfolio trends).
- Use Python (pandas, numpy) to clean, transform, and analyse datasets; automate recurring reports and data workflows.
- Perform basic scripting to support data validation, extraction, and lightweight automation.
Required Skills and Qualifications:
- Strong proficiency in Excel, including pivots, lookup functions, data cleaning, and structured analysis.
- Strong working knowledge of SQL, including joins, aggregations, CTEs, and window functions.
- Proficiency in Python for data analysis (pandas, numpy); ability to write clean, maintainable scripts/notebooks.
- Strong logical reasoning and attention to detail, including the ability to identify errors and validate results rigorously.
- Ability to work with ambiguous requirements and imperfect datasets while maintaining output quality.
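The SQL skills listed above (joins, aggregations, CTEs, window functions) can be exercised end to end with SQLite from Python's standard library. A small sketch, assuming a Python build bundling SQLite 3.25+ (standard in recent releases, and required for window functions); the table and column names are invented:

```python
import sqlite3

# Runnable sketch of a join + aggregation inside a CTE, ranked with a
# window function, using an in-memory SQLite database. Schema is made up.

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE branches (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE loans (branch_id INTEGER, amount REAL);
    INSERT INTO branches VALUES (1, 'Chennai'), (2, 'Madurai');
    INSERT INTO loans VALUES (1, 100000), (1, 50000), (2, 250000);
""")

# CTE totals disbursals per branch; ROW_NUMBER() ranks branches by volume.
rows = conn.execute("""
    WITH totals AS (
        SELECT b.name, SUM(l.amount) AS disbursed
        FROM branches b JOIN loans l ON l.branch_id = b.id
        GROUP BY b.name
    )
    SELECT name, disbursed,
           ROW_NUMBER() OVER (ORDER BY disbursed DESC) AS rn
    FROM totals
""").fetchall()
print(rows)  # [('Madurai', 250000.0, 1), ('Chennai', 150000.0, 2)]
```

The same pattern (CTE for the aggregate, window function for the ranking) carries over directly to production warehouses.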
Preferred (Good to Have):
- REST APIs: A fundamental understanding of APIs and previous experience or projects related to API development/integrations.
- Familiarity with basic AWS tools/services (S3, Lambda, EC2, Glue jobs).
- Experience with Git and basic engineering practices.
- Any experience with the lending/finance industry.