50+ SQL Jobs in India
Like us, you'll be deeply committed to delivering impactful outcomes for customers.
- 7+ years of demonstrated ability to develop resilient, high-performance, and scalable code tailored to application usage demands.
- Ability to lead by example with hands-on development while managing project timelines and deliverables. Experience in agile methodologies and practices, including sprint planning and execution, to drive team performance and project success.
- Deep expertise in Node.js, with experience in building and maintaining complex, production-grade RESTful APIs and backend services.
- Experience writing batch/cron jobs using Python and Shell scripting.
- Experience in web application development using JavaScript and JavaScript libraries.
- Have a basic understanding of TypeScript, JavaScript, HTML, CSS, JSON, and REST-based applications.
- Experience/Familiarity with RDBMS and NoSQL Database technologies like MySQL, MongoDB, Redis, ElasticSearch and other similar databases.
- Understanding of code versioning tools such as Git.
- Understanding of building applications deployed on the cloud using Google Cloud Platform (GCP) or Amazon Web Services (AWS).
- Experienced in JS-based build/package tools like Grunt, Gulp, Bower, and Webpack.
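The batch/cron-job requirement above can be illustrated with a minimal Python sketch; the CSV schema and function name are hypothetical, and a real job would add logging and error handling:

```python
import csv
import io
from datetime import date

def summarize_orders(csv_text):
    """Aggregate an orders CSV (hypothetical schema: order_id,amount)
    into a one-line daily summary, the kind of routine a nightly
    cron job might produce."""
    reader = csv.DictReader(io.StringIO(csv_text))
    amounts = [float(row["amount"]) for row in reader]
    return {
        "run_date": date.today().isoformat(),
        "orders": len(amounts),
        "total": round(sum(amounts), 2),
    }

sample = "order_id,amount\n1,10.50\n2,4.25\n"
print(summarize_orders(sample))
```

A crontab entry such as `0 2 * * * python summarize_orders.py` could schedule a script like this nightly.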
Key Responsibilities:
- Application Development: Design and implement both client-side and server-side architecture using JavaScript frameworks and back-end technologies like Golang.
- Database Management: Develop and maintain relational and non-relational databases (MySQL, PostgreSQL, MongoDB) and optimize database queries and schema design.
- API Development: Build and maintain RESTful APIs and/or GraphQL services to integrate with front-end applications and third-party services.
- Code Quality & Performance: Write clean, maintainable code and implement best practices for scalability, performance, and security.
- Testing & Debugging: Perform testing and debugging to ensure the stability and reliability of applications across different environments and devices.
- Collaboration: Work closely with product managers, designers, and DevOps engineers to deliver features aligned with business goals.
- Documentation: Create and maintain documentation for code, systems, and application architecture to ensure knowledge transfer and team alignment.
Requirements:
- Experience: 1+ years of backend development in a microservices ecosystem, with proven experience in front-end and back-end frameworks.
- 1+ years of experience with Golang is mandatory.
- Problem-Solving & DSA: Strong analytical skills and attention to detail.
- Front-End Skills: Proficiency in JavaScript and modern front-end frameworks (React, Angular, Vue.js) and familiarity with HTML/CSS.
- Back-End Skills: Experience with server-side languages and frameworks like Node.js, Express, Python or GoLang.
- Database Knowledge: Strong knowledge of relational databases (MySQL, PostgreSQL) and NoSQL databases (MongoDB).
- API Development: Hands-on experience with RESTful API design and integration; GraphQL is a plus.
- DevOps Understanding: Familiarity with cloud platforms (AWS, Azure, GCP) and containerization (Docker, Kubernetes) is a bonus.
- Soft Skills: Excellent problem-solving skills, teamwork, and strong communication abilities.
Nice-to-Have:
- UI/UX Sensibility: Understanding of responsive design and user experience principles.
- CI/CD Knowledge: Familiarity with CI/CD tools and workflows (Jenkins, GitLab CI).
- Security Awareness: Basic understanding of web security standards and best practices.
We are seeking a highly skilled and experienced Senior Full Stack Developer with 8+ years of experience to join our dynamic team. The ideal candidate will have a strong background in both front-end and back-end development, with expertise in .NET, Angular, TypeScript, Azure, SQL Server, Agile methodologies, and Design Patterns. Experience with DocuSign is a plus.
Responsibilities:
- Design, develop, and maintain web applications using .NET, Angular, and TypeScript.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Implement and maintain cloud-based solutions using Azure.
- Develop and optimize SQL Server databases.
- Follow Agile methodologies to manage project tasks and deliverables.
- Apply design patterns and best practices to ensure high-quality, maintainable code.
- Troubleshoot and resolve software defects and issues.
- Mentor and guide junior developers.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Full Stack Developer or similar role.
- Strong proficiency in .NET, Angular, and TypeScript.
- Experience with Azure cloud services.
- Proficient in SQL Server and database design.
- Familiarity with Agile methodologies and practices.
- Solid understanding of design patterns and software architecture principles.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork abilities.
- Experience with DocuSign is a plus.
What You’ll Be Doing:
● Own the architecture and roadmap for scalable, secure, and high-quality data pipelines and platforms.
● Lead and mentor a team of data engineers while establishing engineering best practices, coding standards, and governance models.
● Design and implement high-performance ETL/ELT pipelines using modern Big Data technologies for diverse internal and external data sources.
● Drive modernization initiatives including re-architecting legacy systems to support next-generation data products, ML workloads, and analytics use cases.
● Partner with Product, Engineering, and Business teams to translate requirements into robust technical solutions that align with organizational priorities.
● Champion data quality, monitoring, metadata management, and observability across the ecosystem.
● Lead initiatives to improve cost efficiency, data delivery SLAs, automation, and infrastructure scalability.
● Provide technical leadership on data modeling, orchestration, CI/CD for data workflows, and cloud-based architecture improvements.
Qualifications:
● Bachelor's degree in Engineering, Computer Science, or relevant field.
● 8+ years of relevant and recent experience in a Data Engineer role.
● 5+ years of recent experience with Apache Spark and a solid understanding of the fundamentals.
● Deep understanding of Big Data concepts and distributed systems.
● Demonstrated ability to design, review, and optimize scalable data architectures across ingestion.
● Strong coding skills in Scala and Python, with the ability to quickly switch between them with ease.
● Advanced working SQL knowledge and experience working with a variety of relational databases such as Postgres and/or MySQL.
● Cloud experience with Databricks.
● Strong understanding of Delta Lake architecture and working with Parquet, JSON, CSV, and similar formats.
● Experience establishing and enforcing data engineering best practices, including CI/CD for data, orchestration and automation, and metadata management.
● Comfortable working in an Agile environment.
● Machine Learning knowledge is a plus.
● Demonstrated ability to operate independently, take ownership of deliverables, and lead technical decisions.
● Excellent written and verbal communication skills in English.
● Experience supporting and working with cross-functional teams in a dynamic environment.
REPORTING: This position will report to the Sr. Technical Manager or Director of Engineering as assigned by Management.
EMPLOYMENT TYPE: Full-Time, Permanent
SHIFT TIMINGS: 10:00 AM - 07:00 PM IST
About CoverSelf: We are an InsurTech start-up based out of Bangalore, with a focus on Healthcare. CoverSelf empowers healthcare insurance companies with a truly NEXT-GEN cloud-native, holistic & customizable platform preventing and adapting to the ever-evolving claims & payment inaccuracies. Reduce complexity and administrative costs with a unified healthcare dedicated platform.
Qualifications:
- Must have a Bachelor’s degree in Computer Science or equivalent.
- Must have at least 5 years’ experience as an SDET.
- At least 1 year of leadership experience or experience managing a team.
Responsibilities:
- Design, develop and execute automation scripts using open-source tools.
- Troubleshoot errors and streamline testing procedures.
- Writing and executing detailed test plans, test design & test cases covering feature, integration, regression, certification, system level testing as well as release validation in production.
- Identify, analyze and create detailed records of problems that appear during testing, such as software defects, bugs, functionality issues, and output errors, and work directly with software developers to find solutions and develop retesting procedures.
- Good time-management skills and commitment to meet deadlines.
- Stay up-to-date with new testing tools and test strategies.
- Driving technical projects and providing leadership in an innovative and fast-paced environment.
Requirements:
- Experience in automation (API and UI) as well as manual testing of web applications.
- Experience with frameworks like Playwright, Selenium WebDriver, Robot Framework, or REST Assured.
- Must be proficient in performance-testing tools like K6, Gatling, or JMeter.
- Must be proficient in Core Java / TypeScript and Java 17.
- Experience in JUnit 5.
- Good to have: TypeScript experience.
- Good to have: RPA experience using Java or other tools like Robot Framework / Automation Anywhere.
- Experience in SQL databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB).
- Good understanding of software and systems architecture.
- Well acquainted with Agile methodology, the Software Development Life Cycle (SDLC), the Software Test Life Cycle (STLC), and the Automation Test Life Cycle.
- Strong experience in testing REST-based components, back-end services, databases, and microservices.
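API testing of the kind listed above often reduces to contract checks on responses. The posting's tools are Java-based (REST Assured, JUnit 5); this sketch shows the idea in Python, with a hypothetical `/users/{id}` payload and field names chosen purely for illustration:

```python
import json

def validate_user_response(body_text):
    """Contract check for a hypothetical GET /users/{id} response:
    required fields present and of the expected types."""
    body = json.loads(body_text)
    errors = []
    if not isinstance(body.get("id"), int):
        errors.append("id must be an integer")
    if not isinstance(body.get("email"), str) or "@" not in body.get("email", ""):
        errors.append("email must be a valid-looking string")
    return errors

good = '{"id": 7, "email": "a@b.com"}'
bad = '{"id": "7", "email": "nope"}'
print(validate_user_response(good), validate_user_response(bad))
```

In a real suite each check would be an individual test case asserting an empty error list against a live or mocked endpoint.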
Work Location: Jayanagar - Bangalore.
Work Mode: Work from Office.
Benefits: Best in the Industry Compensation, Friendly & Flexible Leave Policy, Health Benefits, Certifications & Courses Reimbursements, Chance to be part of rapidly growing start-up & the next success story, and many more.
Additional Information: At CoverSelf, we are creating a global workplace that enables everyone to find their true potential, purpose, and passion irrespective of their background, gender, race, sexual orientation, religion and ethnicity. We are committed to providing equal opportunity for all and believe that diversity in the workplace creates a more vibrant, richer work environment that advances the goals of our employees, communities and the business.
Interested: https://coverself.keka.com/careers/jobdetails/73871
🚀 Hiring: QA Engineer (Manual + Automation)
⭐ Experience: 3+ Years
📍 Location: Bangalore
⭐ Work Mode: Hybrid
⏱️ Notice Period: Immediate Joiners
(Only immediate joiners & candidates serving notice period)
💫 About the Role:
We’re looking for a skilled QA Engineer. You’ll ensure product quality through manual and automated testing across web, mobile, and APIs — working with tools and technologies like Postman, Playwright, Appium, Rest Assured, GCP/AWS, and React/Next.js.
Key Responsibilities:
✅ Develop & maintain automated tests using Cucumber, Playwright, Pytest, etc.
✅ Perform API testing using Postman.
✅ Work on cloud platforms (GCP/AWS) and CI/CD (Jenkins).
✅ Test web & mobile apps (Appium, BrowserStack, LambdaTest).
✅ Collaborate with developers to ensure seamless releases.
Must-Have Skills:
✅ API Testing (Postman)
✅ Cloud (GCP / AWS)
✅ Frontend understanding (React / Next.js)
✅ Strong SQL & Git skills
✅ Familiarity with OpenAI APIs
Role: Full-Time
Work Location: Bangalore (Client Location – LeadSquared)
Address: 2nd & 3rd Floor, Omega, Embassy Tech Square, Marathahalli - Sarjapur Outer Ring Rd, Kaverappa Layout, Kadubeesanahalli, Bellandur, Bengaluru, Karnataka – 560103
Interview Process: Test and Technical Discussion (Face-to-Face at Client Location)
Work Mode: 4 Days Work from Office
Preference: Local Bangalore Candidates
Responsibilities & Skills Required
- 4–6 years of experience in building Web Applications & APIs
- Proficient in C#
- Web Framework: React.js
- API Framework: .NET Core
- Database: MySQL or SQL Server
- Strong knowledge of multi-threading and asynchronous programming
- Experience with Cloud Platforms: AWS, GCP, or Azure
- Strong SQL programming skills with experience in handling large datasets (millions of records)
- Ability to write clean, maintainable, and scalable code
- Sound understanding of scalable web application design principles
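The multi-threading and asynchronous-programming requirement above is language-agnostic; here is a minimal sketch of the async pattern in Python's asyncio (the posting's stack is C#/.NET, where the analogue is `async`/`await` over Tasks). All names are illustrative:

```python
import asyncio

async def fetch(name, delay):
    # Simulated I/O-bound call (e.g., an API or database query).
    await asyncio.sleep(delay)
    return f"{name}:done"

async def main():
    # Run both calls concurrently; gather preserves argument order,
    # so total wall time is roughly the slowest call, not the sum.
    return await asyncio.gather(
        fetch("orders", 0.01),
        fetch("inventory", 0.02),
    )

print(asyncio.run(main()))
```

The same shape applies to fan-out API calls in a C# controller via `Task.WhenAll`.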
Job Summary:
We are looking for technically skilled and customer-oriented SME Voice – Technical Support Associates to provide voice-based support to enterprise clients. The role involves real-time troubleshooting of complex issues across servers, networks, cloud platforms (Azure), databases, and more. Strong communication and problem-solving skills are essential.
Key Responsibilities:
- Provide technical voice support to B2B (enterprise) customers.
- Troubleshoot and resolve issues related to:
- SQL, DNS, VPN, Server Support (Windows/Linux)
- Networking (TCP/IP, routing, firewalls)
- Cloud Services – especially Microsoft Azure
- Application and system-level issues
- Assist with technical configurations and product usage.
- Accurately document cases and escalate unresolved issues.
- Ensure timely resolution while meeting SLAs and quality standards.
Required Skills & Qualifications:
- 2.5 to 5 years in technical support (voice-based, B2B preferred)
Proficiency in:
- SQL, DNS, VPN, Server Support
- Networking, Microsoft Azure
- Basic understanding of coding/scripting
- Strong troubleshooting and communication skills
- Ability to work in a 24x7 rotational shift environment
Hiring for Java Automation Tester
Exp : 7 - 10 yrs
Edu : BE /B.Tech
Work Location : Hyderabad WFO
Notice Period : Immediate - 15 days
F2F Interview
Working Hours : 12 - 9 pm
Skills :
Experience in backend API automation testing using REST Assured or an equivalent API automation testing tool.
Proficient in BDD frameworks like Cucumber.
Strong knowledge of OOP concepts and data structures.
Experience with relational databases like SQL Server
Experience in PCF and GCP; MongoDB; Jira, Confluence, Bitbucket, Bamboo; Agile methodologies.
As an L3 Data Scientist, you’ll work alongside experienced engineers and data scientists to solve real-world problems using machine learning (ML) and generative AI (GenAI). Beyond classical data science tasks, you’ll contribute to building and fine-tuning large language model (LLM)-based applications, such as chatbots, copilots, and automation workflows.
Key Responsibilities
- Collaborate with business stakeholders to translate problem statements into data science tasks.
- Perform data collection, cleaning, feature engineering, and exploratory data analysis (EDA).
- Build and evaluate ML models using Python and libraries such as scikit-learn and XGBoost.
- Support the development of LLM-powered workflows like RAG (Retrieval-Augmented Generation), prompt engineering, and fine-tuning for use cases including summarization, Q&A, and task automation.
- Contribute to GenAI application development using frameworks like LangChain, OpenAI APIs, or similar ecosystems.
- Work with engineers to integrate models into applications, build/test APIs, and monitor performance post-deployment.
- Maintain reproducible notebooks, pipelines, and documentation for ML and LLM experiments.
- Stay updated on advancements in ML, NLP, and GenAI, and share insights with the team.
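To make the RAG workflow mentioned above concrete: retrieval ranks candidate documents by similarity to the query before a model generates an answer. The sketch below uses bag-of-words cosine similarity purely for illustration; production systems use learned embeddings and a vector store, and all names here are hypothetical:

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    common = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in common)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    """Rank documents by similarity to the query: the 'R' in RAG."""
    q = Counter(query.lower().split())
    scored = [(cosine(q, Counter(d.lower().split())), d) for d in docs]
    scored.sort(key=lambda s: s[0], reverse=True)
    return [d for _, d in scored[:k]]

docs = [
    "refund policy for damaged items",
    "shipping times for international orders",
]
print(retrieve("how do refunds work for a damaged item", docs))
```

The retrieved passages would then be concatenated into the LLM prompt for grounded generation.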
Required Skills & Qualifications
- Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, Statistics, or a related field.
- 6–9 years of experience in data science, ML, or AI (projects and internships included).
- Proficiency in Python with experience in libraries like pandas, NumPy, scikit-learn, and matplotlib.
- Basic exposure to LLMs (e.g., OpenAI, Cohere, Mistral, Hugging Face) or a strong interest with the ability to learn quickly.
- Familiarity with SQL and structured data handling.
- Understanding of NLP fundamentals and vector-based retrieval techniques (a plus).
- Strong communication, problem-solving skills, and a proactive attitude.
Nice-to-Have (Not Mandatory)
- Experience with GenAI prototyping using LangChain, LlamaIndex, or similar frameworks.
- Knowledge of REST APIs and model integration into backend systems.
- Familiarity with cloud platforms (AWS/GCP/Azure), Docker, or Git.
Shift: 2:00 PM – 11:00 PM IST
Experience: 6+ years of hands-on Data Engineering experience
About the Role:
We are looking for experienced Data Engineers who can design, build, and optimize large-scale data pipelines. This role is for individual contributors who love coding, problem-solving, and working with cutting-edge big data technologies.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines for ETL processes using Spark (PySpark/Scala).
- Optimize data flow and automate existing processes for improved performance.
- Collaborate with cross-functional teams — Data Science, Analytics, and Product — to support data infrastructure needs.
- Work on data quality, reliability, and performance improvements.
- Handle large datasets from multiple sources (structured & unstructured).
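The shape of such an ETL stage (filter malformed records, then aggregate) can be sketched in plain Python; in PySpark the equivalent would be a `filter` followed by `groupBy().sum()`. Field names here are hypothetical:

```python
def clean_and_aggregate(rows):
    """Plain-Python analogue of a Spark filter/groupBy stage:
    drop malformed rows, then total amounts per source system."""
    totals = {}
    for row in rows:
        src, amount = row.get("source"), row.get("amount")
        if src is None or amount is None:
            continue  # data-quality filter: skip malformed records
        totals[src] = totals.get(src, 0.0) + float(amount)
    return totals

rows = [
    {"source": "crm", "amount": "10"},
    {"source": "crm", "amount": "5"},
    {"source": "web", "amount": "2"},
    {"source": None, "amount": "99"},   # malformed: dropped by the filter
]
print(clean_and_aggregate(rows))
```

At Spark scale the same logic runs partitioned across executors, which is why keeping the transformation pure and per-record matters.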
Required Skills & Experience:
- 6+ years of experience as a Data Engineer working in big data environments.
- Strong programming experience in Scala and/or PySpark.
- Hands-on experience with Apache Spark, Databricks, and cloud-based data architectures.
- Solid knowledge of SQL and relational databases (Postgres, MySQL, etc.).
- Experience with file formats like Parquet, Delta Tables, CSV, JSON.
- Comfortable working in a Linux shell environment.
- Strong communication and problem-solving skills.
Good to Have:
- Experience in machine learning data pipelines.
- Working knowledge of Agile environments and CI/CD data workflows.
Pay: ₹70,000.00 - ₹90,000.00 per month
Job description:
Name of the College: KGiSL Institute of Technology
College Profile: The main objective of KGiSL Institute of Technology is to provide industry-embedded education and to mold students for leadership in industry, government, and educational institutions; to advance the knowledge base of the engineering professions; and to influence the future directions of engineering education and practice. The ability to connect to future challenges and deliver industry-ready human resources is a strength that KGiSL Educational Institutions have progressively excelled at. Industry-readiness of its students is what will eventually elevate an institution to star status and competitiveness in the job market. The choice of such an institution will depend on its proximity to industry, the relevance of its learning programme to real-time industry, and the active connection that a student will have with industry professionals.
Job Title: Assistant Professor / Associate Professor
Departments:
● CSE
Qualification:
● ME/M.Tech/Ph.D (Ph.D mandatory for Associate Professor)
Experience:
● Experience - 9-10 Years
Key Responsibilities:
1. Teaching & Learning:
Deliver high-quality lectures and laboratory sessions in core and advanced areas of Computer Science & Engineering.
Prepare lesson plans, teaching materials, and assessment tools as per the approved curriculum.
Adopt innovative teaching methodologies, including ICT-enabled learning and outcome-based education (OBE).
2. Research & Publications:
Conduct independent and collaborative research in areas of specialization.
Publish research papers in peer-reviewed journals and present in reputed conferences.
Eligibility & Qualifications (As per AICTE/UGC Norms):
Educational Qualification: Ph.D. in Computer Science & Engineering or relevant discipline.
Experience: Minimum of 9 years teaching/research/industry experience, with at least 3 years at the level of Assistant Professor.
Research: Minimum of 7 publications in refereed journals as per UGC-CARE list and at least one Ph.D. degree awarded or ongoing under supervision.
Other Requirements:
- Good academic record throughout.
- Proven ability to attract research funding.
- Strong communication and interpersonal skills.
- Work Location: KGiSL Campus
- Employment Type: Full-time / Permanent
- Joining time: immediately
Job Type: Full-time
Benefits:
- Health insurance
- Life insurance
- Provident Fund
Work Location: In person
About Fundly
- Fundly is building a retailer-centric Pharma Supply Chain platform and Marketplace for over 10 million pharma retailers in India
- Founded by experienced industry professionals with cumulative experience of 30+ years
- Has grown to 60+ people in 12 cities in less than 2 years
- Monthly disbursement of INR 50 Cr
- Raised venture capital of USD 5M so far from Accel Partners, one of the biggest VC funds in India
Opportunity at Fundly
- Building a retailer centric ecosystem in Pharma Supply Chain
- Fast growing: 3000+ retailers, 36,000 transactions, and 200+ Cr disbursement in the last 2 years
- Technology-first and customer-first fintech organization
- Be an early team member, visible and influence the product and technology roadmap
- Be a leader and own responsibility and accountability
Responsibilities
- Be hands-on and ship good-quality code fast
- Execute and deploy technical solutions
- Understand existing code, maintain and improve it
- Control Technical Debt
- Ensure healthy software engineering practices like planning, estimation, documentation, code review
Qualifications
- 3+ years of hands-on experience in Java, Spring Boot, Spring MVC, Hibernate, and Play
- Hands-on experience with SQL and NoSQL databases like Postgres, MongoDB, ElasticSearch, and Redis
Looking for an experienced Python Developer with a minimum of eight years of experience in Python and its related web frameworks.
Strong knowledge of web frameworks like Flask, Django, or FastAPI.
Experience with relational databases such as PostgreSQL, MySQL, or similar.
Familiarity with version control systems like Git.
Sr Software Engineer
Company Summary :
As the recognized global standard for project-based businesses, Deltek delivers software and information solutions to help organizations achieve their purpose. Our market leadership stems from the work of our diverse employees who are united by a passion for learning, growing and making a difference.
At Deltek, we take immense pride in creating a balanced, values-driven environment, where every employee feels included and empowered to do their best work. Our employees put our core values into action daily, creating a one-of-a-kind culture that has been recognized globally. Thanks to our incredible team, Deltek has been named one of America's Best Midsize Employers by Forbes, a Best Place to Work by Glassdoor, a Top Workplace by The Washington Post and a Best Place to Work in Asia by World HRD Congress. www.deltek.com
Business Summary :
The Deltek Engineering and Technology team builds best-in-class solutions to delight customers and meet their business needs. We are laser-focused on software design, development, innovation and quality. Our team of experts has the talent, skills and values to deliver products and services that are easy to use, reliable, sustainable and competitive. If you're looking for a safe environment where ideas are welcome, growth is supported and questions are encouraged – consider joining us as we explore the limitless opportunities of the software industry.
Position Responsibilities :
About the Role
We are looking for a skilled and motivated Senior Software Developer to join our team responsible for developing and maintaining a robust ERP solution used by approximately 400 customers and more than 30,000 users worldwide. The system is built using C# (.NET Core), leverages SQL Server for data management, and is hosted in the Microsoft Azure cloud.
This role offers the opportunity to work on a mission-critical product, contribute to architectural decisions, and help shape the future of our cloud-native ERP platform.
Key Responsibilities
- Design, develop, and maintain features and modules within the ERP system using C# (.NET Core)
- Optimize and manage SQL Server database interactions for performance and scalability
- Collaborate with cross-functional teams, including QA, DevOps, Product Management, and Support
- Participate in code reviews, architecture discussions, and technical planning
- Contribute to the adoption and improvement of CI/CD pipelines and cloud deployment practices
- Troubleshoot and resolve complex technical issues across the stack
- Ensure code quality, maintainability, and adherence to best practices
- Stay current with emerging technologies and recommend improvements where applicable
Qualifications
- Curiosity, passion, teamwork, and initiative
- Strong experience with C# and .NET Core in enterprise application development
- Solid understanding of SQL Server, including query optimization and schema design
- Experience with Azure cloud services (App Services, Azure SQL, Storage, etc.)
- Ability to utilize agentic AI as a development support, with a critical thinking attitude
- Familiarity with agile development methodologies and DevOps practices
- Ability to work independently and collaboratively in a fast-paced environment
- Excellent problem-solving and communication skills
- Master's degree in Computer Science or equivalent; 5+ years of relevant work experience
- Experience with ERP systems or other complex business applications is a plus
What We Offer
- A chance to work on a product that directly impacts thousands of users worldwide
- A collaborative and supportive engineering culture
- Opportunities for professional growth and technical leadership
- Competitive salary and benefits package
Job Title: Senior Tableau Developer
Location: Gurgaon
Experience: 4+ Years in retail domain
Salary: Negotiable
Job Summary:
We need a Senior Tableau Developer with a minimum of 4 years of experience to join our BI team. The ideal candidate will be responsible for designing, developing, and deploying business intelligence solutions using Tableau.
Key Responsibilities:
· Design and develop interactive and insightful Tableau dashboards and visualizations.
· Optimize dashboards for performance and usability.
· Work with SQL and data warehouses (Snowflake) to fetch and prepare clean data sets.
· Gather and analyse business requirements, translate them into functional and technical specifications.
· Collaborate with cross-functional teams to understand business KPIs and reporting needs.
· Conduct unit testing and resolve data or performance issues.
· Strong understanding of data visualization principles and best practices.
Tech. Skills Required:
· Proficient in Tableau Desktop (dashboard development, storyboards)
· Strong command of SQL (joins, subqueries, CTEs, aggregation)
· Experience with large data sets and complex queries
· Experience working on any Data warehouse (Snowflake, Redshift)
· Excellent analytical and problem-solving skills.
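As a concrete instance of the SQL skills listed above (CTEs, aggregation), here is a small example run against an in-memory SQLite database standing in for the warehouse (Snowflake/Redshift in the posting); the table and data are invented for illustration:

```python
import sqlite3

# In-memory database standing in for the warehouse.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE sales (region TEXT, amount REAL);
INSERT INTO sales VALUES ('North', 100), ('North', 50), ('South', 70);
""")

# A CTE plus aggregation: the SQL shapes a Tableau data source
# typically needs before visualization.
query = """
WITH region_totals AS (
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
)
SELECT region, total FROM region_totals ORDER BY total DESC;
"""
rows = con.execute(query).fetchall()
print(rows)
```

Pushing aggregation into the warehouse like this, rather than into Tableau calculated fields, is one common dashboard-performance lever.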
Mail your updated resume with current salary details to:
Email: jobs[at]glansolutions[dot]com
Satish: 88O 27 49 743

Position: Full Stack Developer (PHP CodeIgniter)
Company: Mayura Consultancy Services
Experience: 2 yrs
Location: Bangalore
Skills: HTML, CSS, Bootstrap, JavaScript, AJAX, jQuery, PHP, and CodeIgniter (CI)
Work Location: Work From Home (WFH)
Website : https://www.mayuraconsultancy.com/
Requirements :
- Prior experience in Full Stack Development using PHP CodeIgniter
Perks of Working with MCS :
- Contribute to Innovative Solutions: Join a dynamic team at the forefront of software development, contributing to innovative projects and shaping the technological solutions of the organization.
- Work with Clients from across the Globe: Collaborate with clients from around the world, gaining exposure to diverse cultures and industries, and contributing to the development of solutions that address the unique needs and challenges of global businesses.
- Complete Work From Home Opportunity: Enjoy the flexibility of working entirely from the comfort of your home, empowering you to manage your schedule and achieve a better work-life balance while coding innovative solutions for MCS.
- Opportunity to Work on Projects Developing from Scratch: Engage in projects from inception to completion, working on solutions developed from scratch and having the opportunity to make a significant impact on the design, architecture, and functionality of the final product.
- Diverse Projects: Be involved in a variety of development projects, including web applications, mobile apps, e-commerce platforms, and more, allowing you to showcase your versatility as a Full Stack Developer and expand your portfolio.
Joining MCS as a Full Stack Developer opens the door to a world where your technical skills can shine and grow, all while enjoying a supportive and dynamic work environment. We're not just building solutions; we're building the future—and you can be a key part of that journey.
Company Name – Wissen Technology
Location : Pune / Bangalore / Mumbai (Based on candidate preference)
Work mode: Hybrid
Experience: 5+ years
Job Description
Wissen Technology is seeking an experienced C# .NET Developer to build and maintain applications related to streaming market data. This role involves developing message-based C#/.NET applications to process, normalize, and summarize large volumes of market data efficiently. The candidate should have a strong foundation in Microsoft .NET technologies and experience working with message-driven, event-based architecture. Knowledge of capital markets and equity market data is highly desirable.
Responsibilities
- Design, develop, and maintain message-based C#/.NET applications for processing real-time and batch market data feeds.
- Build robust routines to download and process data from AWS S3 buckets on a frequent schedule.
- Implement daily data summarization and data normalization routines.
- Collaborate with business analysts, data providers, and other developers to deliver high-quality, scalable market data solutions.
- Troubleshoot and optimize market data pipelines to ensure low latency and high reliability.
- Contribute to documentation, code reviews, and team knowledge sharing.
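A daily data-summarization routine of the kind described might reduce a stream of ticks to per-symbol open/high/low/close. A minimal sketch follows, in Python with hypothetical symbols (the posting's actual stack is C#/.NET):

```python
from collections import defaultdict

def summarize_ticks(ticks):
    """Daily-summary sketch: reduce a stream of (symbol, price) ticks,
    assumed to be in arrival order, to open/high/low/close per symbol."""
    by_symbol = defaultdict(list)
    for symbol, price in ticks:
        by_symbol[symbol].append(price)
    return {
        s: {"open": p[0], "high": max(p), "low": min(p), "close": p[-1]}
        for s, p in by_symbol.items()
    }

ticks = [("ABC", 10.0), ("ABC", 10.5), ("XYZ", 99.0), ("ABC", 9.8)]
print(summarize_ticks(ticks))
```

In the message-based architecture described, the same reduction would run incrementally as messages arrive rather than over a complete batch.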
Required Skills and Experience
- 5+ years of professional experience programming in C# and Microsoft .NET framework.
- Strong understanding of message-based and real-time programming architectures.
- Experience working with AWS services, specifically S3, for data retrieval and processing.
- Experience with SQL and Microsoft SQL Server.
- Familiarity with Equity market data, FX, Futures & Options, and capital markets concepts.
- Excellent interpersonal and communication skills.
- Highly motivated, curious, and analytical mindset with the ability to work well both independently and in a team environment.
Education
- Bachelor’s degree in Computer Science, Engineering, or a related technical field.
Job Title: PHP Coordinator / Laravel Developer
Experience: 4+ Years
Work Mode: Work From Home (WFH)
Working Days: 5 Days
Job Description:
We are looking for an experienced PHP Coordinator / Laravel Developer to join our team. The ideal candidate should have strong expertise in PHP and Laravel framework, along with the ability to coordinate and manage development tasks effectively.
Key Responsibilities:
- Develop, test, and maintain web applications using PHP and Laravel.
- Coordinate with team members to ensure timely project delivery.
- Write clean, secure, and efficient code.
- Troubleshoot, debug, and optimize existing applications.
- Collaborate with stakeholders to gather and analyze requirements.
Required Skills:
- Strong experience in PHP and Laravel framework.
- Good understanding of MySQL and RESTful APIs.
- Familiarity with front-end technologies (HTML, CSS, JavaScript).
- Excellent communication and coordination skills.
- Ability to work independently in a remote environment.
Location - Noida
Duration - 6 Months
Experience – 7-10 Years
In this role, one shall be responsible for developing and supporting applications using Java technology framework.
• Provide support throughout the application lifecycle using the standard DT (Digital Technology) defined support environment comprised of process and middleware support.
• Ensure application incident management, ticket routing, and resolution of issues.
• Provide resolution of application issues as per DT support process within the given SLA.
• Update application code & database objects/data, as applicable, to production server only after UAT and approval from DT. Follow change process for Production deployment.
• Perform comprehensive testing (functionality testing and peer-to-peer testing) based on the defined DT testing deliverables and templates, to verify that issues are completely fixed.
• Raise tickets if any issues are reported by mail / phone / in person.
• All incidents to be tracked through Issue Tracker/Problem Management Ticket till Root Cause Corrective Action (RCCA) is identified/issue is fixed.
• All fixes to be followed by testing; DT Owner needs to sign off for the production deployment.
• Deliver improvements and changes with quality and on time (per defined SLAs). Requirement priority to be defined by DT.
• Carry out security testing of all the applications which are requested by DT for SAST/DAST to identify any vulnerability in code. The report for the same needs to be submitted to the DT SPOC.
• Review performance metrics reports and trends with DT Leaders on a Quarterly basis and as requested.
• Document and update the issue-resolution knowledge base on a timely basis.
• Enthusiastically identify and eliminate problems and bugs in existing applications, with adequate approvals from DT.
• Provide support for the planned outages during on/off work hours
• Analyze break-fix tickets, by application, on a monthly basis for root causes, along with recommended preventive actions and a plan to implement them.
• Upon receiving approval, plan and implement the preventive actions on time.
• Carry out development in existing applications in accordance with the DT SDLC process, with all corresponding documentation.
• Applications categorized as Export Control (EC) must be supported by an EC-eligible person.
• Development and support of the interfaces between the applications.
• RCCA documentation for each unplanned outage
• Incident and request ticket completions with corresponding ServiceNow updates
• SDLC documentation updates after application changes are implemented
• Professional, timely and accurate user communications
• User documentation
• Quality assurance and testing for all changes.
Qualifications we seek in you!
Minimum qualifications
• BE/B Tech/ MCA/BCA
• Excellent written and verbal communication skills
• Java/J2EE, Spring Boot, Web Services (SOAP/REST), Oracle PL/SQL
• Technical professional with expertise in Java development and implementation.
• Excellent customer-facing skills, including conducting compelling technical briefings and demonstrations. The person should have the technical curiosity to implement new technologies and articulate the solutions to the customer.
Preferred Qualifications
• Must be eligible to work with Export Controlled (EC) data; the resource should therefore be a Green Card (GC) holder or US citizen.
• Knowledge on Design concepts.
• Ability to lead and manage a team independently.
• Good communication and coordination skills.
• Ability to work collectively as a team; experience handling a team.
• Language: Java/J2EE, Servlets, Java Beans, GE Support Central
• Web Scripting Language: JSP, JavaScript, HTML, DHTML, XML, AJAX, HTML 5, AngularJS, jQuery
• Framework: Struts (Spring, Hibernate, or other framework upon mutual agreement with DT), Casper
• IDE: NetBeans 7.x or above, Eclipse
• Database and Language: Oracle 12c and above, SQL, PL/SQL (including Packages, Stored Procedures, Functions, Triggers), PostgreSQL, MySQL, MS SQL
• Operating System: Windows, Unix, and Linux
• Internet Browser: Microsoft Internet Explorer 11.0 and above
• Code Repository: CVS (Open GE) or Git
• Experience in ITIL processes
• Excellent planning, coordination, and interaction skills, and the ability to work as a team.
• Good analytical, problem-solving and critical thinking skills.
• Detail oriented and organized, able to track multiple tasks simultaneously
• Proficient in Agile and DevOps environments
• Knowledge of software and application design and architecture
• Experience in software development and coding in various languages (C#, .NET, Java etc.)
• Hands on experience in Agile & CI/CD toolsets such as Rally, Git, Jenkins, Robot Framework, etc
• Understanding of software development lifecycle (SDLC)
- Bachelor’s degree in Computer Science/Engineering or equivalent experience.
- 5+ years of professional experience in Node.js, Angular, AWS, and MongoDB, Postgres, or MySQL.
- Strong experience with MEAN stack development.
- Proficiency in SQL and NoSQL databases (MySQL, PostgreSQL, MongoDB).
- Solid understanding of AWS services for application hosting and scaling.
Who We Are:
Increff is the most preferred retail SaaS partner, solving complex inventory management and supply chain challenges for retailers seeking to revolutionize their supply chains both technologically and operationally.
What We Offer:
Our core focus is on providing innovative retail tech solutions, including merchandising and omnichannel inventory management. These solutions are meticulously designed to cater to the specific needs of brands and retailers, empowering them to thrive in the dynamic marketplace. We offer innovative technology, comprehensive support, and a dynamic environment for career growth.
Our Vision:
To be the most admired retail technology company.
Role Overview:
We’re looking for a passionate Software Engineer to design, develop and install software solutions. Ideally, the candidate should be able to build high-quality, innovative and fully performing software in compliance with coding standards and technical design. Software engineers must be skilled in development, writing code, and documenting functionality.
Responsibilities:
- Executing full life-cycle software development
- Writing well designed, testable, efficient code
- Producing specifications and determining operational feasibility
- Integrating software components into fully functional software systems
- Developing software verification plans and quality assurance procedures
- Documenting and maintaining software functionality
- Tailoring and deploying software tools, processes and metrics
- Serving as a subject matter expert
- Complying with project plans and industry standards
Requirements
- Bachelor’s degree in Engineering, Computer Science, or a related field.
- Proven work experience as a software engineer or software developer.
- Proficiency in Python and Java for backend development, with experience in object-oriented programming concepts.
- Strong hands-on skills in SQL and working with relational databases, including query optimization and ORM frameworks (e.g., Hibernate, JPA2).
- Advanced knowledge of data structures, algorithms, and core system design fundamentals.
- Demonstrated experience with at least one web application framework such as Flask or Spring.
- Sound understanding of software engineering best practices, with familiarity in test-driven development and software development methodologies.
- Experience with cloud platforms and Big Data concepts is a plus (e.g., Apache Spark, Trino, Azure, Azure Synapse Analytics).
- Good knowledge of statistical concepts and data analysis is advantageous.
- Strong problem-solving skills, aptitude, and the ability to take ownership and accountability of deliverables.
- Excellent communication skills and the ability to document requirements, designs, and specifications effectively.
Our Culture:
At Increff, we take great pride in fostering an open, flexible, and collaborative workplace. Our culture empowers employees to innovate, build fulfilling careers, and enjoy their work. Moreover, we strongly encourage the development of leadership skills from within the organization. Our commitment to transparency ensures that at every level, individuals have the autonomy to initiate, take ownership of projects, and successfully execute them.
Shift: 9PM IST to 6 AM IST
Experience: 8 + Years
Requirements
We are seeking a skilled and experienced Data Integration Specialist with over 5 years of experience in designing and developing data solutions using Oracle Data Integrator (ODI). The ideal candidate will have strong expertise in data modeling, ETL/ELT processes, and SQL, along with exposure to Python scripting for API-based data ingestion.
Key Responsibilities:
Design, develop, and maintain functions and stored procedures using Oracle Data Integrator (ODI).
Create and document data warehouse schemas, including fact and dimension tables, based on business requirements.
Develop and execute SQL scripts for table creation and collaborate with Database Administrators (DBAs) for deployment.
Analyze various data sources to identify relationships and align them with Business Requirements Documentation (BRD).
Design and implement Extract, Load, Transform (ELT) processes to load data from source systems into staging and target environments.
Validate and profile data using Structured Query Language (SQL) and other analytical tools to ensure data accuracy and completeness.
Apply best practices in data governance, including query optimization, metadata management, and data quality monitoring.
Demonstrate strong data modeling skills to support scalable and efficient data architecture.
Utilize Python to automate data collection from APIs, enhancing integration workflows and enabling real-time data ingestion.
Investigate and resolve data quality issues through detailed analysis and root cause identification.
Communicate effectively with stakeholders through strong written, verbal, and analytical skills.
Exhibit excellent problem-solving and research capabilities in a fast-paced, data-driven environment.
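The Python API-ingestion responsibility above might look like the following sketch; `fetch_page` is a hypothetical stand-in for the real HTTP client, since the source API is not specified:

```python
def ingest_paginated(fetch_page, page_size=100):
    """Collect all records from a paginated API.

    `fetch_page(offset, limit)` returns a list of records and an
    empty list once the source is exhausted; the loop advances the
    offset by however many records actually came back.
    """
    records, offset = [], 0
    while True:
        page = fetch_page(offset, page_size)
        if not page:
            break
        records.extend(page)
        offset += len(page)
    return records
```

The collected records would then be staged and loaded through the ODI ELT mappings described above.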
Tech Stack / Requirements:
- Experience required: at least 1–2 years
- Candidates must be from an IT engineering background (B.E./B.Tech in Information Technology, Computer Science, or related fields), B.Sc. IT, or BCA.
- Strong understanding of JavaScript
- Experience with React Native / Expo
- Familiarity with SQL
- Exposure to REST APIs integration
- Fast learner with strong problem-solving & debugging skills
Responsibilities:
- Build & improve mobile app features using React Native / Expo
- Develop and maintain web features using React.js / Next.js
- Integrate APIs and ensure seamless user experiences across platforms
- Collaborate with backend & design teams for end-to-end development
- Debug & optimize performance across mobile and web
- Write clean, maintainable code and ship to production regularly
- Work closely with the founding team / CTO and contribute to product launches
Growth: Performance-based growth with significant hikes possible in the same or upcoming months.
Job Title : Informatica Cloud Developer / Migration Specialist
Experience : 6 to 10 Years
Location : Remote
Notice Period : Immediate
Job Summary :
We are looking for an experienced Informatica Cloud Developer with strong expertise in Informatica IDMC/IICS and experience in migrating from PowerCenter to Cloud.
The candidate will be responsible for designing, developing, and maintaining ETL workflows, data warehouses, and performing data integration across multiple systems.
Mandatory Skills :
Informatica IICS/IDMC, Informatica PowerCenter, ETL Development, SQL, Data Migration (PowerCenter to IICS), and Performance Tuning.
Key Responsibilities :
- Design, develop, and maintain ETL processes using Informatica IICS/IDMC.
- Work on migration projects from Informatica PowerCenter to IICS Cloud.
- Troubleshoot and resolve issues related to mappings, mapping tasks, and taskflows.
- Analyze business requirements and translate them into technical specifications.
- Conduct unit testing, performance tuning, and ensure data quality.
- Collaborate with cross-functional teams for data integration and reporting needs.
- Prepare and maintain technical documentation.
Required Skills :
- 4 to 5 years of hands-on experience in Informatica Cloud (IICS/IDMC).
- Strong experience with Informatica PowerCenter.
- Proficiency in SQL and data warehouse concepts.
- Good understanding of ETL performance tuning and debugging.
- Excellent communication and problem-solving skills.
We’re seeking a highly experienced and certified Salesforce Marketing Cloud (SFMC) Subject Matter Expert (SME) to lead strategic initiatives and bridge the gap between business stakeholders and technical teams. This role requires strong communication skills and the ability to translate business needs into scalable solutions. Experience with the financial services industry in the Asian market is an added advantage.
Key Responsibilities:
- Serve as the primary liaison between marketing stakeholders and technical teams to ensure seamless campaign execution.
- Architect, guide and support scalable solutions using SFMC modules like Journey Builder, Email Studio, Automation Studio, and Contact Builder.
- Translate marketing goals into technical specifications and actionable workflows.
- Lead integration efforts with CRM systems, data platforms, and third-party platforms.
- Design and optimize customer journeys, segmentation strategies, and real-time personalization.
- Ensure data governance, privacy compliance, and platform security best practices.
- Conduct workshops, demos, and training sessions to drive platform adoption and maturity.
- Stay ahead of Salesforce releases and innovations, especially within Marketing Cloud and Einstein AI.
Requirements:
- 8–12 years of hands-on experience in Salesforce, with a strong focus on Marketing Cloud.
- Salesforce certifications such as Marketing Cloud Consultant, Email Specialist, or Architect.
- Strong understanding of marketing automation, customer journeys, and campaign analytics.
- Proficiency in AMPscript, SQL, SSJS, and data modeling within SFMC.
- Experience with API integrations, SDKs, and event tracking across web and mobile platforms.
- Familiarity with tools like Jira, Confluence, Tableau CRM, and CDP platforms is a plus.
- Familiarity with other similar marketing platforms like Mo Engage is an added advantage.
Who we are:
Kanerika Inc. is a premier global software products and services firm that specializes in providing innovative solutions and services for data-driven enterprises. Our focus is to empower businesses to achieve their digital transformation goals and maximize their business impact through the effective use of data and AI.
We leverage cutting-edge technologies in data analytics, data governance, AI-ML, GenAI/ LLM and industry best practices to deliver custom solutions that help organizations optimize their operations, enhance customer experiences, and drive growth.
Awards and Recognitions:
Kanerika has won several awards over the years, including:
1. Best Place to Work 2023 by Great Place to Work®
2. Top 10 Most Recommended RPA Start-Ups in 2022 by RPA Today
3. NASSCOM Emerge 50 Award in 2014
4. Frost & Sullivan India 2021 Technology Innovation Award for its Kompass composable solution architecture
5. Kanerika has also been recognized for its commitment to customer privacy and data security, having achieved ISO 27701, SOC2, and GDPR compliances.
Working for us:
Kanerika is rated 4.6/5 on Glassdoor, for many good reasons. We truly value our employees' growth, well-being, and diversity, and people’s experiences bear this out. At Kanerika, we offer a host of enticing benefits that create an environment where you can thrive both personally and professionally. From our inclusive hiring practices and mandatory training on creating a safe work environment to our flexible working hours and generous parental leave, we prioritize the well-being and success of our employees.
Our commitment to professional development is evident through our mentorship programs, job training initiatives, and support for professional certifications. Additionally, our company-sponsored outings and various time-off benefits ensure a healthy work-life balance. Join us at Kanerika and become part of a vibrant and diverse community where your talents are recognized, your growth is nurtured, and your contributions make a real impact. See the benefits section below for the perks you’ll get while working for Kanerika.
About the Role:
We are looking for a highly skilled Full Stack .NET Developer with strong hands-on experience in C#, .NET Core, ASP.NET Core, Web API, and Microservices Architecture. The ideal candidate is proficient in developing scalable and high-performing applications using SQL Server, NoSQL databases, and Entity Framework (v6+), with excellent troubleshooting, problem-solving, and communication skills, and the ability to collaborate effectively with cross-functional and international teams, including US counterparts.
Technical Skills:
- Programming Languages: C#, TypeScript, JavaScript
- Frameworks & Technologies: .NET Core, ASP.NET Core, Web API, Angular (v10+), Entity Framework (v6+), Microservices Architecture
- Databases: SQL Server, NoSQL
- Cloud Platform: Microsoft Azure
- Design & Architecture: OOPs Concepts, Design Patterns, Reusable Libraries, Microservices Implementation
- Front-End Development: Angular Material, HTML5, CSS3, Responsive UI Development
- Additional Skills: Excellent troubleshooting abilities, strong communication (verbal & written), and effective collaboration with US counterparts
What You’ll Bring:
- Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent work experience.
- 6+ years of experience
- Proven experience delivering high-quality web applications.
Mandatory Skills:
- Strong hands-on experience with C#, SQL Server, OOP concepts, and microservices architecture.
- Solid experience with .NET Core, ASP.NET Core, Web API, SQL, NoSQL, Entity Framework 6 or above, Azure, and applying design patterns. Strong proficiency in the Angular framework (v10+ preferred) and TypeScript, and a solid understanding of HTML5, CSS3, and JavaScript.
- Skill in writing reusable libraries and experience with Angular Material or other UI component libraries.
- Excellent communication skills, both oral and written.
- Excellent troubleshooting skills and the ability to communicate clearly with US counterparts.
Preferred Skills (Nice to Have):
- Self-starter with solid analytical and problem-solving skills. Willingness to work extra hours to meet deliverables.
- Understanding of Agile/Scrum Methodologies.
- Exposure to cloud platforms like AWS/Azure.
Employee Benefits:
1. Culture:
- Open Door Policy: Encourages open communication and accessibility to management.
- Open Office Floor Plan: Fosters a collaborative and interactive work environment.
- Flexible Working Hours: Allows employees to have flexibility in their work schedules.
- Employee Referral Bonus: Rewards employees for referring qualified candidates.
- Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.
2. Inclusivity and Diversity:
- Hiring practices that promote diversity: Ensures a diverse and inclusive workforce.
- Mandatory POSH training: Promotes a safe and respectful work environment.
3. Health Insurance and Wellness Benefits:
- GMC and Term Insurance: Offers medical coverage and financial protection.
- Health Insurance: Provides coverage for medical expenses.
- Disability Insurance: Offers financial support in case of disability.
4. Child Care & Parental Leave Benefits:
- Company-sponsored family events: Creates opportunities for employees and their families to bond.
- Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child.
- Family Medical Leave: Offers leave for employees to take care of family members' medical needs.
5. Perks and Time-Off Benefits:
- Company-sponsored outings: Organizes recreational activities for employees.
- Gratuity: Provides a monetary benefit as a token of appreciation.
- Provident Fund: Helps employees save for retirement.
- Generous PTO: Offers more than the industry standard for paid time off.
- Paid sick days: Allows employees to take paid time off when they are unwell.
- Paid holidays: Gives employees paid time off for designated holidays.
- Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.
6. Professional Development Benefits:
- L&D with FLEX- Enterprise Learning Repository: Provides access to a learning repository for professional development.
- Mentorship Program: Offers guidance and support from experienced professionals.
- Job Training: Provides training to enhance job-related skills.
- Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
- Promote from Within: Encourages internal growth and advancement opportunities.
Job Title: Mid-Level .NET Developer (Agile/SCRUM)
Location: Mohali, Bangalore, Pune, Navi Mumbai, Chennai, Hyderabad, Panchkula, Gurugram (Delhi NCR), Dehradun
Night Shift from 6:30 pm to 3:30 am IST
Experience: 5+ Years
Job Summary:
We are seeking a proactive and detail-oriented Mid-Level .NET Developer to join our dynamic team. The ideal candidate will be responsible for designing, developing, and maintaining high-quality applications using Microsoft technologies with a strong emphasis on .NET Core, C#, Web API, and modern front-end frameworks. You will collaborate with cross-functional teams in an Agile/SCRUM environment and participate in the full software development lifecycle—from requirements gathering to deployment—while ensuring adherence to best coding and delivery practices.
Key Responsibilities:
- Design, develop, and maintain applications using C#, .NET, .NET Core, MVC, and databases such as SQL Server, PostgreSQL, and MongoDB.
- Create responsive and interactive user interfaces using JavaScript, TypeScript, Angular, HTML, and CSS.
- Develop and integrate RESTful APIs for multi-tier, distributed systems.
- Participate actively in Agile/SCRUM ceremonies, including sprint planning, daily stand-ups, and retrospectives.
- Write clean, efficient, and maintainable code following industry best practices.
- Conduct code reviews to ensure high-quality and consistent deliverables.
- Assist in configuring and maintaining CI/CD pipelines (Jenkins or similar tools).
- Troubleshoot, debug, and resolve application issues effectively.
- Collaborate with QA and product teams to validate requirements and ensure smooth delivery.
- Support release planning and deployment activities.
Required Skills & Qualifications:
- 4–6 years of professional experience in .NET development.
- Strong proficiency in C#, .NET Core, MVC, and relational databases such as SQL Server.
- Working knowledge of NoSQL databases like MongoDB.
- Solid understanding of JavaScript/TypeScript and the Angular framework.
- Experience in developing and integrating RESTful APIs.
- Familiarity with Agile/SCRUM methodologies.
- Basic knowledge of CI/CD pipelines and Git version control.
- Hands-on experience with AWS cloud services.
- Strong analytical, problem-solving, and debugging skills.
- Excellent communication and collaboration skills.
Preferred / Nice-to-Have Skills:
- Advanced experience with AWS services.
- Knowledge of Kubernetes or other container orchestration platforms.
- Familiarity with IIS web server configuration and management.
- Experience in the healthcare domain.
- Exposure to AI-assisted code development tools (e.g., GitHub Copilot, ChatGPT).
- Experience with application security and code quality tools such as Snyk or SonarQube.
- Strong understanding of SOLID principles and clean architecture patterns.
Technical Proficiencies:
- ASP.NET Core, ASP.NET MVC
- C#, Entity Framework, Razor Pages
- SQL Server, MongoDB
- REST API, jQuery, AJAX
- HTML, CSS, JavaScript, TypeScript, Angular
- Azure Services, Azure Functions, AWS
- Visual Studio
- CI/CD, Git
Mandatory Criteria
- Looking for candidates from Bangalore with a notice period of 20 days or less.
- Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
- 6+ years of experience in quality assurance or software testing, with at least 3 years focused on test automation and 2+ years in a leadership or senior role.
- Solid knowledge of SQL.
- Experience in architecting and implementing automated testing frameworks; strong in programming/scripting languages such as Python, Java, or JavaScript for automation development.
- Expertise with automation tools like Selenium, Playwright, Appium, or RestAssured, and integrating them into CI/CD workflows.
- Proven leadership skills, including mentoring junior engineers and managing team deliverables in an agile environment.
- Experience with test management tools (e.g., TestRail, qTest) and defect tracking systems (e.g., Jira).
- Deep understanding of testing best practices, including functional, regression, performance, and security testing.
- Ability to analyze system architecture and identify key areas for test coverage and risk mitigation.
- Experience with containerization technologies (Docker, Kubernetes) and cloud platforms (AWS preferred). Understanding of performance, security, and load testing tools (e.g., JMeter, OWASP ZAP).
- Familiarity with observability and monitoring tools (e.g., ELK Stack, Datadog, Prometheus, Grafana) for test environment analysis.
If interested kindly share your updated resume on 82008 31681
As an L1/L2 Data Scientist, you’ll work alongside experienced engineers and data scientists to solve real-world problems using machine learning (ML) and generative AI (GenAI). Beyond classical data science tasks, you’ll contribute to building and fine-tuning large language model (LLM)-based applications, such as chatbots, copilots, and automation workflows.
Key Responsibilities
- Collaborate with business stakeholders to translate problem statements into data science tasks.
- Perform data collection, cleaning, feature engineering, and exploratory data analysis (EDA).
- Build and evaluate ML models using Python and libraries such as scikit-learn and XGBoost.
- Support the development of LLM-powered workflows like RAG (Retrieval-Augmented Generation), prompt engineering, and fine-tuning for use cases including summarization, Q&A, and task automation.
- Contribute to GenAI application development using frameworks like LangChain, OpenAI APIs, or similar ecosystems.
- Work with engineers to integrate models into applications, build/test APIs, and monitor performance post-deployment.
- Maintain reproducible notebooks, pipelines, and documentation for ML and LLM experiments.
- Stay updated on advancements in ML, NLP, and GenAI, and share insights with the team.
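At its simplest, the RAG-style workflow mentioned above retrieves the passages most similar to a query and assembles a grounded prompt for the LLM. The sketch below uses toy vectors in place of real embeddings, and the prompt template is purely illustrative:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, corpus, top_k=2):
    """Rank (text, vector) pairs by similarity to the query vector."""
    ranked = sorted(corpus, key=lambda d: cosine(query_vec, d[1]), reverse=True)
    return [text for text, _ in ranked[:top_k]]

def build_prompt(question, passages):
    """Assemble a context-grounded prompt for the LLM call."""
    context = "\n".join(f"- {p}" for p in passages)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"
```

Production systems would swap the toy vectors for embeddings from a model provider and the sorted scan for a vector index, but the retrieve-then-prompt shape is the same.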
Required Skills & Qualifications
- Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, Statistics, or a related field.
- 2.5–5 years of experience in data science, ML, or AI (projects and internships included).
- Proficiency in Python with experience in libraries like pandas, NumPy, scikit-learn, and matplotlib.
- Basic exposure to LLMs (e.g., OpenAI, Cohere, Mistral, Hugging Face) or strong interest with the ability to learn quickly.
- Familiarity with SQL and structured data handling.
- Understanding of NLP fundamentals and vector-based retrieval techniques (a plus).
- Strong communication, problem-solving skills, and a proactive attitude.
Nice-to-Have (Not Mandatory)
- Experience with GenAI prototyping using LangChain, LlamaIndex, or similar frameworks.
- Knowledge of REST APIs and model integration into backend systems.
- Familiarity with cloud platforms (AWS/GCP/Azure), Docker, or Git.
Senior BI Engineer
📍 Location: BHive Bangalore, India
💼 Department: Technology – Data
🚀 Experience Level: Senior
About the Role
Mumzworld is scaling fast, and data is central to our success.
We are looking for a Senior BI Analyst to join our data team and help drive business insights, storytelling, and decision-making through high-quality analytics and reporting.
This is a high-impact role, reporting directly to the Head of Data, and working cross-functionally with business teams, product managers, and engineers to transform data into actionable intelligence.
You will not only deliver robust dashboards and deep-dive analyses but also help define metrics, improve data literacy across the company, and ensure that our analytics layer is consistent, trusted, and scalable.
You will also leverage Generative AI tools to boost productivity, streamline reporting workflows, and enhance collaboration.
Key Responsibilities
- Deliver Impactful Insights
Build, maintain, and optimize dashboards, reports, and deep-dive analyses to support critical business decisions.
- Create Impact Dashboards with Storytelling
Design dashboards that not only show data but tell a clear story — highlighting insights, business implications, and actions.
- Define and Standardize Metrics
Work closely with stakeholders to define KPIs and ensure consistency in metrics across business domains.
- Model for Analytics Needs
Collaborate with data engineers and modelers to ensure data structures (in bronze, silver, gold layers) are optimized for reporting and analytical needs following medallion architecture principles.
- Drive Business Conversations
Translate business questions into clear analytical deliverables; turn data findings into actionable insights and business narratives.
- Ensure Data Quality & Trust
Validate data accuracy and drive initiatives to improve data reliability across all reports and dashboards.
- Leverage Gen AI for Productivity
Promote the use of Generative AI tools to accelerate report building, automate documentation, and summarize insights faster.
- Advance Data-Driven Culture
Conduct workshops, share knowledge, and uplift the organization’s data literacy and self-service capabilities.
- Support Data Science Projects
Partner with the data science team to ensure BI foundations support predictive models, experimentation, and advanced analytics initiatives.
What We’re Looking For
- 5+ years of experience in business intelligence, data analysis, or analytics roles.
- Expertise in BI tools like Looker, Power BI, or Tableau.
- Strong SQL skills to extract, transform, and analyze large datasets.
- Solid understanding of data modeling principles (especially star schema and medallion architecture).
- Understanding of incremental load vs full load strategies, and a passion for building your own scheduled queries and data refreshes.
- Experience working with cloud data warehouses like BigQuery or Snowflake.
- Strong business acumen with the ability to translate data into business value.
- Experience building self-service datasets, semantic layers, or LookML models is a plus.
- Excellent communication skills — able to present complex findings in a clear and actionable way.
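To make the incremental-vs-full-load distinction concrete, here is a minimal watermark-based incremental load sketched with SQLite (schema, column names, and watermark values are illustrative assumptions, not part of this role):

```python
import sqlite3

# A full load would reload every source row each run; an incremental load
# upserts only rows changed since the last watermark.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE source (id INTEGER PRIMARY KEY, val TEXT, updated_at INTEGER)")
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, val TEXT, updated_at INTEGER)")
conn.executemany("INSERT INTO source VALUES (?, ?, ?)",
                 [(1, "a", 100), (2, "b", 200), (3, "c", 300)])

def incremental_load(conn, watermark):
    """Upsert only rows with updated_at past the previous watermark."""
    rows = conn.execute(
        "SELECT id, val, updated_at FROM source WHERE updated_at > ?", (watermark,)
    ).fetchall()
    conn.executemany(
        "INSERT INTO target VALUES (?, ?, ?) "
        "ON CONFLICT(id) DO UPDATE SET val = excluded.val, updated_at = excluded.updated_at",
        rows,
    )
    # Advance the watermark to the newest change seen.
    return max((r[2] for r in rows), default=watermark)

wm = incremental_load(conn, 150)  # picks up only ids 2 and 3
```

A scheduled query in BigQuery or Snowflake follows the same pattern: persist the watermark between runs and merge only the delta.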
Nice-to-Have
📅 E-commerce experience – Familiarity with product, customer, and order data structures.
🤖 Experience with dbt Cloud and modern analytics stacks.
📈 Exposure to data governance, catalogs, or metadata management tools.
🖥️ Experience with GitHub for version control in analytics workflows.
🌐 Understanding of privacy, GDPR, and data compliance best practices.
Who You Are – Soft Skills & Personality
We’re looking for a curious, business-savvy, and detail-driven BI analyst who thrives in a fast-paced, collaborative environment.
👤 Structured & Organized – You design with clarity, document your work well, and maintain consistency across reports and analyses.
💬 Collaborative Communicator – You work well with both technical and business stakeholders.
🚀 Results-Oriented – You focus on delivering impactful insights that drive business outcomes.
🧬 Analytical & Curious – You dig deep into data, question assumptions, and continuously look for improvement opportunities.
🤝 Team Player – You support, teach, and uplift others across the organization.
If you’re passionate about turning data into impact, scaling analytics, and building a data-driven culture, we’d love to hear from you.
Bachelor’s degree or equivalent in Computer Science
Minimum 1–3 years of experience in Python and backend technologies
Development experience with programming languages like C/C++, Erlang, Elixir.
Experience in Python / Go is a must
Experience in building and consuming REST APIs and socket-based services
Working knowledge of RabbitMQ and Kafka
Knowledge of both SQL and NoSQL databases, along with Redis and Elasticsearch.
Working knowledge of network communication protocols: TCP, UDP, HTTP, etc.
Roles and Responsibilities:
Participate in the entire application lifecycle, focusing on coding and debugging
Write clean code to develop functional web applications
Troubleshoot and debug applications
Perform UI tests to optimize performance
Manage cutting-edge technologies to improve legacy applications
Collaborate with Front-end developers to integrate user-facing elements with server-side logic
Gather and address technical and design requirements
Provide training and support to internal teams
Build reusable code and libraries for future use
Liaise with developers, designers, and system administrators to identify new features
Follow emerging technologies
Optimize multi-producer, multi-consumer systems
Develop fault-tolerant application designs
About HealthAsyst:
HealthAsyst® is an IT services and product company. It is a leading provider of IT services to some of the largest healthcare IT vendors in the United States. We bring the value of cutting-edge technology through our deep expertise in product engineering, custom software development, testing, large-scale healthcare IT implementation and integrations, ongoing maintenance and support, BI & analytics, and remote monitoring platforms. As a true partner, we help our customers navigate a complex regulatory landscape, deal with cost pressures, and offer high-quality services. As a healthcare transformation agent, we enable innovation in technology and accelerate problem-solving while delivering unmatched cost benefits to healthcare technology companies. HealthAsyst is now Great Place to Work-Certified™, and our product has been consistently recognized for high customer credibility by Gartner, Capterra, and others.
Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 12+ years of overall experience in IT, with a minimum of 4 years in an Architect role specifically architecting web applications.
- Deep expertise in the Microsoft .NET ecosystem, including .NET Core/.NET 5+, C#, ASP.NET Core, and Web API.
- Extensive experience in designing and implementing microservices architectures and distributed systems.
- Strong proficiency in SQL Server, including database design, T-SQL, query optimization, and performance tuning.
- Proven experience with architectural patterns such as MVC, MVVM, and clean architecture.
- Solid understanding of object-oriented programming (OOP) principles and design patterns (GoF, enterprise patterns).
- Experience with RESTful API design and development.
- Familiarity with version control systems (e.g., Git).
- Excellent analytical, problem-solving, and decision-making skills.
- Strong communication, presentation, and interpersonal skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences.
Preferred Skills (Nice to Have):
- Hands-on experience with cloud platforms (Azure, AWS, or GCP) and related services.
- Exposure to containerization technologies (Docker) and orchestration tools (Kubernetes).
- Experience with message brokers (e.g., RabbitMQ, Kafka, Azure Service Bus).
- Familiarity with modern frontend frameworks (Angular, React, Vue.js).
- Knowledge of DevOps tools and practices (e.g., Azure DevOps, Jenkins, GitLab CI).
- Experience with Agile methodologies (Scrum, Kanban).
- Relevant Microsoft certifications (e.g., Microsoft Certified: Azure Solutions Architect Expert).
Job Overview
We are looking for a Senior Analyst who has led teams and managed system operations.
Key Responsibilities
- Lead and mentor a team of analysts to drive high-quality execution.
- Design, write, and optimize SQL queries to derive actionable insights.
- Manage, monitor, and enhance Payment Governance Systems for accuracy and efficiency.
- Work cross-functionally with Finance, Tech, and Operations teams to maintain data integrity.
- Build and automate dashboards/reports to track key metrics and system performance.
- Identify anomalies and lead root cause analysis for payment-related issues.
- Define and document processes, SOPs, and governance protocols.
- Ensure compliance with internal control frameworks and audit readiness.
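A reconciliation-style anomaly check of the kind described above can be sketched in SQL (the schema is hypothetical and SQLite is used only for brevity):

```python
import sqlite3

# Flag payments where the internal ledger and the gateway report disagree,
# or where the gateway feed is missing a payment entirely.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE ledger  (payment_id TEXT PRIMARY KEY, amount REAL);
CREATE TABLE gateway (payment_id TEXT PRIMARY KEY, amount REAL);
INSERT INTO ledger  VALUES ('p1', 100.0), ('p2', 50.0), ('p3', 75.0);
INSERT INTO gateway VALUES ('p1', 100.0), ('p2', 55.0);
""")

mismatches = conn.execute("""
    SELECT l.payment_id, l.amount AS ledger_amt, g.amount AS gateway_amt
    FROM ledger l
    LEFT JOIN gateway g USING (payment_id)
    WHERE g.amount IS NULL OR l.amount <> g.amount
    ORDER BY l.payment_id
""").fetchall()
# p2 disagrees on amount; p3 is missing from the gateway feed.
```

Each flagged row then becomes a candidate for root cause analysis: a mismatched amount suggests a fee or currency issue, a missing row suggests a dropped settlement record.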
Requirements
We require candidates with the following qualifications:
- 3–5 years of experience in analytics, data systems, or operations.
- Proven track record of leading small to mid-size teams.
- Strong command over SQL and data querying techniques.
- Experience with payment systems, reconciliation, or financial data platforms.
- Analytical mindset with problem-solving abilities.
- Ability to work in a fast-paced, cross-functional environment.
- Excellent communication and stakeholder management skills.
Job Description: Senior Software Engineer – C# (RMS – Risk Management Systems)
**Location:** [Insert Location]
**Job Type:** [Full-time / Contract]
**Experience:** 4–8 years
**Domain:** Capital Markets / Risk Management / Trading Applications
Job Description:
We are looking for an experienced Senior Software Engineer with deep expertise in C# and distributed systems to design and maintain mission-critical Risk Management Systems (RMS) used in trading environments. The role requires a strong understanding of real-time order flow, risk checks, queue management, and multi-threaded processing.
Key Responsibilities:
RMS Development:
· Design, develop, and optimize real-time RMS components using C# and .NET Framework (4.0/4.7.2).
· Implement rule-based and exposure-based pre-trade and post-trade risk checks.
· Develop in-memory data structures to handle millions of order and trade records efficiently.
· Build high-throughput queues and modules to handle burst loads during market open and spikes.
· Debug multi-threaded modules and ensure accurate and timely risk validation.
· Build alerting, threshold evaluation, and notification modules for risk violations.
· Collaborate with product and trading teams to translate risk rules into executable modules.
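The exposure-based pre-trade check described above can be sketched as follows. The role itself is C#/.NET; Python is used here only for brevity, and the limit values and class names are invented for illustration:

```python
# Sketch of an exposure-based pre-trade risk check: reject any order that
# would push a client's notional exposure past its configured limit.
class PreTradeRiskChecker:
    def __init__(self, max_exposure_per_client):
        self.max_exposure = max_exposure_per_client
        self.exposure = {}  # client_id -> current notional exposure

    def check_order(self, client_id, qty, price):
        """Return True and book the exposure if the order passes the check."""
        notional = qty * price
        current = self.exposure.get(client_id, 0.0)
        if current + notional > self.max_exposure:
            return False  # risk violation: order rejected pre-trade
        self.exposure[client_id] = current + notional
        return True

rms = PreTradeRiskChecker(max_exposure_per_client=1_000_000)
ok1 = rms.check_order("C1", 1000, 500)  # 500k notional: accepted
ok2 = rms.check_order("C1", 2000, 500)  # would reach 1.5m: rejected
```

A production RMS would make this check lock-free or finely locked for multi-threaded order flow, and would net exposure across buys and sells; the sketch shows only the gating logic.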
Tools & Technologies:
· Version control: Git or TFS.
· Database: SQL Server or in-memory cache (Redis) for real-time exposure tracking.
· Experience with messaging systems or queues (e.g., MSMQ, ZeroMQ, Kafka) preferred.
· Proficiency with AI-powered tools such as GitHub Copilot and ChatGPT.
· Prompt engineering skills to utilize AI for test case generation, debugging, and optimization.
Domain Knowledge (Must-Have):
· Strong understanding of capital markets, especially equity and derivative segments.
· Working knowledge of Order Management Systems (OMS), RMS policies, and market behavior.
· Experience with exchange protocols (e.g., FIX, TCP) and market data processing.
· Ability to handle peak load conditions and large-scale order bursts.
Preferred Qualifications:
· Bachelor’s or Master’s degree in Computer Science, Engineering, or related field.
· Prior experience working on RMS or surveillance systems in broking or exchange domain.
· Familiarity with trading APIs and pre-trade/post-trade workflows.
Role & responsibilities
- Develop and maintain server-side applications using Go Lang.
- Design and implement scalable, secure, and maintainable RESTful APIs and microservices.
- Collaborate with front-end developers to integrate user-facing elements with server-side logic
- Optimize applications for performance, reliability, and scalability.
- Write clean, efficient, and reusable code that adheres to best practices.
Preferred candidate profile
- Minimum 5 years of working experience in Go (Golang) development.
- Proven experience in developing RESTful APIs and microservices.
- Familiarity with cloud platforms like AWS, GCP, or Azure.
- Familiarity with CI/CD pipelines and DevOps practices
🚀 Hiring: Python Developer at Deqode
⭐ Experience: 2+ Years
📍 Location: Indore
⭐ Work Mode: Work From Office
⏱️ Notice Period: Immediate Joiners
(Only immediate joiners & candidates serving notice period)
⭐ Must-Have Skills:-
✅ 2+ years of professional experience as a Python Developer
✅ Proficient in Django or FastAPI
✅ Hands-on with MongoDB & PostgreSQL
✅ Strong understanding of REST APIs & Git
About Moative
Moative, an Applied AI company, designs and builds transformative AI solutions for traditional industries in energy, utilities, healthcare & life sciences, and more. Through Moative Labs, we build AI micro-products and launch AI startups with partners in vertical markets that align with our theses.
Our Past: We have built and sold two companies, one of which was an AI company. Our founders and leaders are Math PhDs, Ivy League alumni, ex-Googlers, and successful entrepreneurs.
Our Team: Our team of 20+ employees consists of data scientists, AI/ML engineers, and mathematicians (including Ph.D.s) from top engineering and research institutes such as the IITs, IISc, CERN, and UZH. Our team includes academicians, IBM Research Fellows, and former founders.
Work you’ll do
As a Data Engineer, you will work on data architecture, large-scale processing systems, and data flow management. You will build and maintain optimal data architecture and data pipelines, assemble large, complex data sets, and ensure that data is readily available to data scientists, analysts, and other users. In close collaboration with ML engineers, data scientists, and domain experts, you’ll deliver robust, production-grade solutions that directly impact business outcomes. Ultimately, you will be responsible for developing and implementing systems that optimize the organization’s data use and data quality.
Responsibilities
- Create and maintain optimal data architecture and data pipelines on cloud infrastructure (such as AWS/ Azure/ GCP)
- Assemble large, complex data sets that meet functional / non-functional business requirements
- Identify, design, and implement internal process improvements
- Build the pipeline infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
- Support development of analytics that utilize the data pipeline to provide actionable insights into key business metrics
- Work with stakeholders to assist with data-related technical issues and support their data infrastructure needs
Who you are
You are a passionate and results-oriented engineer who understands the importance of data architecture and data quality to impact solution development, enhance products, and ultimately improve business applications. You thrive in dynamic environments and are comfortable navigating ambiguity. You possess a strong sense of ownership and are eager to take initiative, advocating for your technical decisions while remaining open to feedback and collaboration.
You have experience in developing and deploying data pipelines to support real-world applications. You have a good understanding of data structures and are excellent at writing clean, efficient code to extract, create and manage large data sets for analytical uses. You have the ability to conduct regular testing and debugging to ensure optimal data pipeline performance. You are excited at the possibility of contributing to intelligent applications that can directly impact business services and make a positive difference to users.
Skills & Requirements
- 3+ years of hands-on experience as a data engineer, data architect or similar role, with a good understanding of data structures and data engineering.
- Solid knowledge of cloud infrastructure and data-related services on AWS (EC2, EMR, RDS, Redshift) and/or Azure.
- Advanced knowledge of SQL, including writing complex queries, stored procedures, views, etc.
- Strong experience with data pipeline and workflow management tools (such as Luigi, Airflow).
- Experience with common relational SQL, NoSQL and Graph databases.
- Strong experience with scripting languages: Python, PySpark, Scala, etc.
- Practical experience with basic DevOps concepts: CI/CD, containerization (Docker, Kubernetes), etc.
- Experience with big data tools (Spark, Kafka, etc) and stream processing.
- Excellent communication skills to collaborate with colleagues from both technical and business backgrounds, discuss and convey ideas and findings effectively.
- Ability to analyze complex problems, think critically for troubleshooting and develop robust data solutions.
- Ability to identify and tackle issues efficiently and proactively, conduct thorough research and collaborate to find long-term, scalable solutions.
Working at Moative
Moative is a young company, but we believe strongly in thinking long-term while acting with urgency. Our ethos is rooted in innovation, efficiency, and high-quality outcomes. We believe the future of work is AI-augmented and boundaryless. Here are some of our guiding principles:
- Think in decades. Act in hours. As an independent company, our moat is time. While our decisions are for the long-term horizon, our execution will be fast – measured in hours and days, not weeks and months.
- Own the canvas. Throw yourself in to build, fix or improve – anything that isn’t done right, irrespective of who did it. Be selfish about improving across the organization – because once the rot sets in, we waste years in surgery and recovery.
- Use data or don’t use data. Use data where you ought to but not as a ‘cover-my-back’ political tool. Be capable of making decisions with partial or limited data. Get better at intuition and pattern-matching. Whichever way you go, be mostly right about it.
- Avoid work about work. Process creep happens unless we constantly question it. We are deliberate about which rituals we commit to, because rituals take time away from the actual work. We truly believe that a meeting that could be an email should be an email, and you don't need the person with the highest title to say that out loud.
- High revenue per person. We work backwards from this metric. Our default is to automate instead of hiring. We multi-skill our people to own more outcomes than hiring someone who has less to do. We don’t like squatting and hoarding that comes in the form of hiring for growth. High revenue per person comes from high quality work from everyone. We demand it.
If this role and our work is of interest to you, please apply. We encourage you to apply even if you believe you do not meet all the requirements listed above.
That said, you should demonstrate that you are in the 90th percentile or above. This may mean that you have studied at top-notch institutions, won intellectually demanding competitions, built something of your own, or been rated an outstanding performer by your current or previous employers.
The position is based out of Chennai. Our work currently involves significant in-person collaboration and we expect you to work out of our offices in Chennai.

Mandatory Skills
- Backend: Java, Spring Boot
- Frontend: Angular
- Database: Oracle / SQL
- Node.js
Job Description
Contribute to all software-development life-cycle phases including: domain and non-domain problem analysis, solution requirement gathering and analysis, solution design, implementation, code review, source-code control, source building deployment, validation, QA support, and production support.
Essential Duties and Responsibilities
• Maintain and enhance multi-tier messaging application suites (Java EE, Spring Framework, WAS, Oracle, DB2, MQ)
• Build and maintain IRIS4Health middle-tier message applications (IRIS Interop/Cache; Java, Drools, Kafka, RESTful, MLLP, SQL)
• Build and maintain the multi-tier Clinical Toxicology application (Angular, Java EE, Spring Framework, WAS, RHOS, Cache, SQL)
• Maintain the stat-tracking application (two-tier Delphi, MySQL)
• Maintain and enhance the Cytogenetics three-tier application (Java EE, WAS, DB2, Oracle, SQL)
• Maintain and enhance the Fibrosure application (Java EE, WAS, Derby)
• Define, develop, validate, and release software products via agile processes for small and large projects
• Support applications and people via Kanban processes
• Collaborate with laboratory users to analyze problems, design and implement solutions for enterprise systems
• Provide support and troubleshooting of production systems according to an on-call schedule
• Document problem analysis, solution design, implementations, and system support guidelines
• Coach and train team members across lab system organizations to support and develop Java applications
• Communicate effectively and constructively with developers, QA, business analysts, and system users
• Design and depict via UML relational DB table models, object-oriented class models, messaging models, configuration models
• Understand, document, support, and improve inherited code and processes
• Help document knowledge and discovery with peer developers
Minimum Requirements
• Solid Java EE experience (Servlets, JMS, JSP, EJB, JCA, and JPA) development and support
• Solid InterSystems Cache/IRIS for Health development and support
• A minimum of 1 year of JPA/ORM (Hibernate), JUnit, XML/XSD, and JSON experience, or equivalent
• Solid SQL (and optionally PLSQL) experience
• Experience with Oracle DB including explain plan and or other query optimization techniques/tools
• Excellent verbal and written communication skills
• Strong UML modeling, ER and OO design, and data-normalization techniques
• Strong code-factoring philosophies and techniques
• Eclipse or NetBeans (or equivalent) IDE
• Strong understanding of client/server design, and smart recognition of separation-of-concern like functional behavior versus non-functional performance
Desired Requirements
• Java EE, Angular
• InterSystems Cache and/or IRIS for Health
• Spring Framework
• Modern deployment architectures using containers, API Gateways, load balancers, and AWS cloud based environments
• WebSphere or WebLogic, RHOS
• RESTful Web Services
• JMS interfacing, Apache Kafka, and IBM MQ
• Node.js/NPM, Bootstrap, or similar frameworks
• Git/BitBucket (git flow), Maven, Nexus, UCD, Jira (Kanban and SCRUM), agile workflow
• Unix shell script, DOS script
• SQL (optionally PLSQL)
• Design patterns
• HTML5, CSS3, and TypeScript development
• Ability to transform specific domain requirements into generalized technical requirements, and design and implement abstract solutions that are understandable and scalable in performance and reuse
• HL7 and/or Healthcare and/or Clinical Toxicology
• Oracle, MySQL, Derby DB
Key Responsibilities:
- Perform comprehensive Functional and Integration Testing across Oracle modules and connected systems.
- Conduct detailed End-to-End (E2E) Testing to ensure business processes function seamlessly across applications.
- Collaborate with cross-functional teams, including Business Analysts, Developers, and Automation teams, to validate business requirements and deliver high-quality releases.
- Identify, document, and track functional defects, ensuring timely closure and root cause analysis.
- Execute and validate SQL queries for backend data verification and cross-system data consistency checks.
- Participate in regression cycles and support continuous improvement initiatives through data-driven analysis.
Required Skills & Competencies:
- Strong knowledge of Functional Testing processes and methodologies.
- Good to have: Oracle Fusion knowledge
- Solid understanding of Integration Flows between Oracle and peripheral systems.
- Proven ability in E2E Testing, including scenario design, execution, and defect management.
- Excellent Analytical and Logical Reasoning skills with attention to detail.
- Hands-on experience with SQL for data validation and analysis.
- Effective communication, documentation, and coordination skills.
Preferred Qualifications:
- Exposure to automation-assisted functional testing and cross-platform data validation.
- Experience in identifying test optimization opportunities and improving testing efficiency.
About the role:
The SDE 2 - Backend will work as part of the Digitization and Automation team to help Sun King design, develop, and implement intelligent, tech-enabled solutions to a wide variety of our business problems. We are looking for candidates with an affinity for technology and automation, curiosity about advancements in products, and strong coding skills for our in-house software development team.
What you will be expected to do:
- Design and build applications/systems based on wireframes and product requirements documents
- Design and develop conceptual and physical data models to meet application requirements.
- Identify and correct bottlenecks/bugs according to operational requirements
- Focus on scalability, performance, service robustness, and cost trade-offs.
- Create prototypes and proof-of-concepts for iterative development.
- Take complete ownership of projects (end to end) and their development cycle
- Mentoring and guiding team members
- Unit test code for robustness, including edge cases, usability and general reliability
- Integrate existing tools and business systems (in-house tools or business tools like ticketing software and communication tools) with external services
- Coordinate with the Product Manager, development team & business analysts
You might be a strong candidate if you have/are:
- Development experience: 3 – 5 years
- Very strong in problem-solving, data structures, and algorithms
- Deep knowledge of OOPS concepts and programming skills in Core Java and Spring Boot Framework
- Strong Experience in SQL
- Experience in web service development and integration (SOAP, REST, JSON, XML)
- Understanding of code versioning tools (e.g., git)
- Experience in Agile/Scrum development process and tools
- Experience in Microservice architecture
- Hands-on experience in AWS RDS, EC2, S3 and deployments
Good to have:
- Knowledge on messaging systems RabbitMQ, Kafka.
- Knowledge of Python
- Container-based application deployment (Docker or equivalent)
- Willing to learn new technologies and implement them in products
What Sun King offers:
- Professional growth in a dynamic, rapidly expanding, high-social-impact industry
- An open-minded, collaborative culture made up of enthusiastic colleagues who are driven by the challenge of innovation towards profound impact on people and the planet.
- A truly multicultural experience: you will have the chance to work with and learn from people from different geographies, nationalities, and backgrounds.
- Structured, tailored learning and development programs that help you become a better leader, manager, and professional through the Sun King Center for Leadership.
About Sun King
Sun King is a leading off-grid solar energy company providing affordable, reliable electricity to 1.8 billion people without grid access. Operating across Africa and Asia, Sun King has connected over 20 million homes, adding 200,000 homes monthly.
Through a ‘pay-as-you-go’ model, customers make small daily payments (as low as $0.11) via mobile money or cash, eventually owning their solar equipment and saving on costly kerosene or diesel. To date, Sun King products have saved customers over $4 billion.
With 28,000 field agents and embedded electronics that regulate usage based on payments, Sun King ensures seamless energy access. Its products range from home lighting and phone charging systems to solar inverters capable of powering high-energy appliances.
Sun King is expanding into clean cooking, electric mobility, and entertainment while serving a wide range of income segments.
The company employs 2,800 staff across 12 countries, with women representing 44% of the workforce, and expertise spanning product design, data science, logistics, sales, software, and operations.
We are looking for a highly skilled Sr. Big Data Engineer with 3-5 years of experience in building large-scale data pipelines, real-time streaming solutions, and batch/stream processing systems. The ideal candidate should be proficient in Spark, Kafka, Python, and AWS Big Data services, with hands-on experience in implementing CDC (Change Data Capture) pipelines and integrating multiple data sources and sinks.
Responsibilities
- Design, develop, and optimize batch and streaming data pipelines using Apache Spark and Python.
- Build and maintain real-time data ingestion pipelines leveraging Kafka and AWS Kinesis.
- Implement CDC (Change Data Capture) pipelines using Kafka Connect, Debezium or similar frameworks.
- Integrate data from multiple sources and sinks (databases, APIs, message queues, file systems, cloud storage).
- Work with AWS Big Data ecosystem: Glue, EMR, Kinesis, Athena, S3, Lambda, Step Functions.
- Ensure pipeline scalability, reliability, and performance tuning of Spark jobs and EMR clusters.
- Develop data transformation and ETL workflows in AWS Glue and manage schema evolution.
- Collaborate with data scientists, analysts, and product teams to deliver reliable and high-quality data solutions.
- Implement monitoring, logging, and alerting for critical data pipelines.
- Follow best practices for data security, compliance, and cost optimization in cloud environments.
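To illustrate the CDC flow, here is a minimal sketch that applies Debezium-style change events (the "op"/"before"/"after" envelope convention) to an in-memory table. The payloads are invented; a real pipeline would consume these events from Kafka topics populated by Kafka Connect:

```python
import json

def apply_cdc_event(table, raw_event):
    """Apply one Debezium-style change event to a dict keyed by row id."""
    event = json.loads(raw_event)
    op, before, after = event["op"], event.get("before"), event.get("after")
    if op in ("c", "r", "u"):   # create, snapshot read, update -> upsert
        table[after["id"]] = after
    elif op == "d":             # delete -> remove the row
        table.pop(before["id"], None)
    return table

table = {}
apply_cdc_event(table, '{"op": "c", "after": {"id": 1, "name": "alice"}}')
apply_cdc_event(table, '{"op": "u", "before": {"id": 1, "name": "alice"}, "after": {"id": 1, "name": "bob"}}')
apply_cdc_event(table, '{"op": "d", "before": {"id": 1, "name": "bob"}}')
# table is empty again after create -> update -> delete
```

The same upsert/delete semantics are what a Spark or Glue sink implements when writing CDC streams into Delta, Iceberg, or Hudi tables.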
Required Skills & Experience
- Programming: Strong proficiency in Python (PySpark, data frameworks, automation).
- Big Data Processing: Hands-on experience with Apache Spark (batch & streaming).
- Messaging & Streaming: Proficient in Kafka (brokers, topics, partitions, consumer groups) and AWS Kinesis.
- CDC Pipelines: Experience with Debezium / Kafka Connect / custom CDC frameworks.
- AWS Services: AWS Glue, EMR, S3, Athena, Lambda, IAM, CloudWatch.
- ETL/ELT Workflows: Strong knowledge of data ingestion, transformation, partitioning, schema management.
- Databases: Experience with relational databases (MySQL, Postgres, Oracle) and NoSQL (MongoDB, DynamoDB, Cassandra).
- Data Formats: JSON, Parquet, Avro, ORC, Delta/Iceberg/Hudi.
- Version Control & CI/CD: Git, GitHub/GitLab, Jenkins, or CodePipeline.
- Monitoring/Logging: CloudWatch, Prometheus, ELK/Opensearch.
- Containers & Orchestration (nice-to-have): Docker, Kubernetes, Airflow/Step Functions for workflow orchestration.
Preferred Qualifications
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
- Experience in large-scale data lake / lakehouse architectures.
- Knowledge of data warehousing concepts and query optimization.
- Familiarity with data governance, lineage, and cataloging tools (Glue Data Catalog, Apache Atlas).
- Exposure to ML/AI data pipelines is a plus.
Tools & Technologies (must-have exposure)
- Big Data & Processing: Apache Spark, PySpark, AWS EMR, AWS Glue
- Streaming & Messaging: Apache Kafka, Kafka Connect, Debezium, AWS Kinesis
- Cloud & Storage: AWS (S3, Athena, Lambda, IAM, CloudWatch)
- Programming & Scripting: Python, SQL, Bash
- Orchestration: Airflow / Step Functions
- Version Control & CI/CD: Git, Jenkins/CodePipeline
- Data Formats: Parquet, Avro, ORC, JSON, Delta, Iceberg, Hudi
CTC: 15 LPA to 21 LPA
Exp: 5 to 8 Years
Mandatory
- Strong Behavioral Data Analyst Profiles
- Mandatory (Experience 1): Minimum 4+ years of experience in user analytics or behavioural data analysis, focusing on user app and web journeys
- Mandatory (Experience 2): Experience in analyzing clickstream and behavioral data using tools such as Google Analytics, Mixpanel, CleverTap, or Firebase
- Mandatory (Skills 1): Hands-on experience in A/B testing, including hypothesis design, experimentation, and result interpretation.
- Mandatory (Skills 2): Strong analytical ability to identify behavioral patterns, anomalies, funnel drop-offs, and engagement trends from large datasets.
- Mandatory (Skills 3): Hands-on proficiency in SQL, Excel, and data visualization tools such as Tableau or Power BI for dashboard creation and data storytelling.
- Mandatory (Skills 4): Basic understanding of UX principles and customer journey mapping, collaborating effectively with UX/CX teams
- Mandatory (Company): B2C product companies (fintech or e-commerce organizations with large user-behavior datasets are a plus)
- Mandatory (Note): We are not looking for pure data analysts, but business/product/user analysts
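As a concrete example of A/B test result interpretation, a two-proportion z-test on conversion counts can be computed with the standard library (all the numbers below are illustrative):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test comparing conversion rates of variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)           # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

z, p = two_proportion_z(conv_a=200, n_a=10_000, conv_b=260, n_b=10_000)
# Here the uplift from 2.0% to 2.6% yields p < 0.05, i.e. significant.
```

In practice tools like Mixpanel or CleverTap run this test for you; the sketch shows the interpretation step, which is to state the hypothesis up front and only read the uplift as real when the p-value clears the pre-agreed threshold.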
Ideal Candidate:
- Bachelor’s or Master’s degree in a relevant field (e.g., UX Design, Human-Computer Interaction, Computer Science, Marketing).
- 5+ years of experience in CX/UX roles, preferably in a B2C environment.
- Proficiency in analytics tools (Google Analytics, CleverTap, Medallia, Hotjar, etc.).
- Strong understanding of wireframing and prototyping tools (Figma, XMind, etc.).
- Excellent communication and collaboration skills.
- Proven experience in managing cross-functional teams and projects.
- Strong background in data analytics and data-driven decision-making.
- Expert understanding of user experience and user-centered design approaches
- Detail-oriented, with the drive to continuously learn, adapt, and evolve
- Experience creating and measuring the success and impact of your CX designs
- Knowledge of testing tools like Maze, UsabilityHub, or UserZoom would be a plus
- Experienced in designing responsive websites as well as mobile apps
- Understanding of iOS and Android design guidelines
- Passion for great customer-focused design, a purposeful aesthetic sense, and generating simple solutions to complex problems.
- Excellent communication skills to present your work and ideas to the leadership team.
If interested, kindly share your updated resume on 82008 31681.
Key Responsibilities
- Design, develop, and maintain scalable microservices and RESTful APIs using Python (Flask, FastAPI, or Django).
- Architect data models for SQL and NoSQL databases (PostgreSQL, ClickHouse, MongoDB, DynamoDB) to optimize performance and reliability.
- Implement efficient and secure data access layers, caching, and indexing strategies.
- Collaborate closely with product and frontend teams to deliver seamless user experiences.
- Build responsive UI components using HTML, CSS, JavaScript, and frameworks like React or Angular.
- Ensure system reliability, observability, and fault tolerance across services.
- Lead code reviews, mentor junior engineers, and promote engineering best practices.
- Contribute to DevOps and CI/CD workflows for smooth deployments and testing automation.
Required Skills & Experience
- 10+ years of professional software development experience.
- Strong proficiency in Python, with deep understanding of OOP, asynchronous programming, and performance optimization.
- Proven expertise in building FastAPI-based microservice architectures.
- Solid understanding of SQL and NoSQL data modeling, query optimization, and schema design.
- Excellent hands-on frontend proficiency with HTML, CSS, JavaScript, and a modern framework (React, Angular, or Vue).
- Experience working with cloud platforms (AWS, GCP, or Azure) and containerized deployments (Docker, Kubernetes).
- Familiarity with distributed systems, event-driven architectures, and messaging queues (Kafka, RabbitMQ).
- Excellent problem-solving, communication, and system design skills.
Profile: Big Data Engineer (System Design)
Experience: 5+ years
Location: Bangalore
Work Mode: Hybrid
About the Role
We're looking for an experienced Big Data Engineer with system design expertise to architect and build scalable data pipelines and optimize big data solutions.
Key Responsibilities
- Design, develop, and maintain data pipelines and ETL processes using Python, Hive, and Spark
- Architect scalable big data solutions with strong system design principles
- Build and optimize workflows using Apache Airflow
- Implement data modeling, integration, and warehousing solutions
- Collaborate with cross-functional teams to deliver data solutions
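The pipelines in this role run on Hive, Spark, and Airflow, but the extract-transform-load pattern they orchestrate can be sketched in a few lines of standard-library Python; the `raw_events` and `user_totals` tables below are hypothetical examples, not part of any real system:

```python
import sqlite3

# Toy stand-in for one pipeline stage: extract raw events, transform
# (aggregate per user), and load into a warehouse-style summary table.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE raw_events (user_id INTEGER, amount REAL)")
db.executemany("INSERT INTO raw_events VALUES (?, ?)",
               [(1, 10.0), (1, 5.0), (2, 7.5)])

def run_etl(conn: sqlite3.Connection) -> None:
    conn.execute("CREATE TABLE IF NOT EXISTS user_totals "
                 "(user_id INTEGER PRIMARY KEY, total REAL)")
    rows = conn.execute(  # extract + transform in one SQL aggregation
        "SELECT user_id, SUM(amount) FROM raw_events GROUP BY user_id").fetchall()
    conn.executemany(  # load (upsert, so re-running the job is safe)
        "INSERT INTO user_totals VALUES (?, ?) "
        "ON CONFLICT(user_id) DO UPDATE SET total = excluded.total", rows)

run_etl(db)
print(db.execute("SELECT total FROM user_totals WHERE user_id = 1").fetchone())
```

The upsert makes the load step idempotent, the same property a scheduler like Airflow relies on when it retries a failed task.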
Must-Have Skills
- 5+ years as a Data Engineer with Python, Hive, and Spark
- Strong hands-on experience with Java
- Advanced SQL and Hadoop experience
- Expertise in Apache Airflow
- Strong understanding of data modeling, integration, and warehousing
- Experience with relational databases (PostgreSQL, MySQL)
- System design knowledge
- Excellent problem-solving and communication skills
Good to Have
- Docker and containerization experience
- Knowledge of Apache Beam, Apache Flink, or similar frameworks
- Cloud platform experience.
🚀 Hiring: PL/SQL Developer
⭐ Experience: 5+ Years
📍 Location: Pune
⭐ Work Mode: Hybrid
⏱️ Notice Period: Immediate Joiners
(Only immediate joiners & candidates serving notice period)
What We’re Looking For:
☑️ Hands-on PL/SQL developer with strong database and scripting skills, ready to work onsite and collaborate with cross-functional financial domain teams.
Key Skills:
✅ Must Have: PL/SQL, SQL, Databases, Unix/Linux & Shell Scripting
✅ Nice to Have: DevOps tools (Jenkins, Artifactory, Docker, Kubernetes), AWS/Cloud, basic Python, AML/Fraud/Financial domain, Actimize (AIS/RCM/UDM)
Key Responsibilities
- Develop and maintain Python-based applications.
- Design and optimize SQL queries and databases.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Write clean, maintainable, and efficient code.
- Troubleshoot and debug applications.
- Participate in code reviews and contribute to team knowledge sharing.
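"Design and optimize SQL queries" often starts with reading the query plan. The sketch below is an illustrative example using an in-memory SQLite database (the `orders` table and index name are invented for the demo); the same before/after-index check applies to any SQL engine's `EXPLAIN` output:

```python
import sqlite3

# Illustrative only: show how adding an index changes the query plan.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
conn.executemany("INSERT INTO orders (customer, total) VALUES (?, ?)",
                 [("acme", 10.0), ("acme", 20.0), ("globex", 5.0)])

def plan(sql: str) -> str:
    # Ask the query planner how it would execute the statement.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(total) FROM orders WHERE customer = 'acme'"
plan_before = plan(query)   # full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
plan_after = plan(query)    # now searches via the index
print(plan_before)
print(plan_after)
```

On a three-row table the index changes nothing in practice, but on millions of rows the scan-versus-search difference in the plan is exactly what query optimization work targets.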
Qualifications and Required Skills
- Strong proficiency in Python programming.
- Experience with SQL and database management.
- Experience with web frameworks such as Django or Flask.
- Knowledge of front-end technologies like HTML, CSS, and JavaScript.
- Familiarity with version control systems like Git.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork skills.
Good to Have Skills
- Experience with cloud platforms like AWS or Azure.
- Knowledge of containerization technologies like Docker.
- Familiarity with continuous integration and continuous deployment (CI/CD) pipelines.
Role: Lead Java Developer
Work Location: Chennai, Pune
Experience: 8+ years
Hybrid (3 days office and 2 days home)
Type: Fulltime
Skill Set: Java + Spring Boot + SQL + Microservices + DevOps
Job Responsibilities:
Design, develop, and maintain high-quality software applications using Java and Spring Boot.
Develop and maintain RESTful APIs to support various business requirements.
Write and execute unit tests using TestNG to ensure code quality and reliability.
Work with NoSQL databases to design and implement data storage solutions.
Collaborate with cross-functional teams in an Agile environment to deliver high-quality software solutions.
Utilize Git for version control and collaborate with team members on code reviews and merge requests.
Troubleshoot and resolve software defects and issues in a timely manner.
Continuously improve software development processes and practices.
Description:
8+ years of professional experience in backend development using Java, including experience leading a team.
Strong expertise in Spring Boot, Apache Camel, Hibernate, JPA, and REST API design
Hands-on experience with PostgreSQL, MySQL, or other SQL-based databases
Working knowledge of AWS cloud services (EC2, S3, RDS, etc.)
Experience in DevOps activities.
Proficiency in using Docker for containerization and deployment.
Strong understanding of object-oriented programming, multithreading, and performance tuning
Self-driven and capable of working independently with minimal supervision
Role Summary:
We are seeking experienced Application Support Engineers to join our client-facing support team. The ideal candidate will
be the first point of contact for client issues, ensuring timely resolution, clear communication, and high customer satisfaction
in a fast-paced trading environment.
Key Responsibilities:
• Act as the primary contact for clients reporting issues related to trading applications and platforms.
• Log, track, and monitor issues using internal tools and ensure resolution within defined TAT (Turnaround Time).
• Liaise with development, QA, infrastructure, and other internal teams to drive issue resolution.
• Provide clear and timely updates to clients and stakeholders regarding issue status and resolution.
• Maintain comprehensive logs of incidents, escalations, and fixes for future reference and audits.
• Offer appropriate and effective resolutions for client queries on functionality, performance, and usage.
• Communicate proactively with clients about upcoming product features, enhancements, or changes.
• Build and maintain strong relationships with clients through regular, value-added interactions.
• Collaborate in conducting UAT, release validations, and production deployment verifications.
• Assist in root cause analysis and post-incident reviews to prevent recurrences.
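Tracking resolution within a defined TAT usually means computing turnaround times from incident logs. The snippet below is a hypothetical illustration; the log format, ticket IDs, and the `tat_minutes` helper are invented for the example and do not reflect any real trading platform's tooling:

```python
import re
from datetime import datetime

# Hypothetical incident-log format:
# <opened ISO timestamp> | <resolved ISO timestamp> | <ticket id>
LOG_LINES = [
    "2024-01-05T09:15:00 | 2024-01-05T09:45:00 | TCK-101",
    "2024-01-05T10:00:00 | 2024-01-05T12:30:00 | TCK-102",
]

LINE_RE = re.compile(r"^(\S+) \| (\S+) \| (\S+)$")

def tat_minutes(line: str) -> tuple[str, float]:
    """Return (ticket_id, turnaround time in minutes) for one log line."""
    opened, resolved, ticket = LINE_RE.match(line).groups()
    delta = datetime.fromisoformat(resolved) - datetime.fromisoformat(opened)
    return ticket, delta.total_seconds() / 60

for line in LOG_LINES:
    print(tat_minutes(line))  # e.g. ('TCK-101', 30.0)
```

The same parse-then-aggregate approach underlies the SQL and log-analysis skills the posting asks for, just applied over much larger log volumes.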
Required Skills & Qualifications:
• Bachelor's degree in Computer Science, IT, or related field.
• 2+ years in Application/Technical Support, preferably in the broking/trading domain.
• Sound understanding of capital markets – Equity, F&O, Currency, Commodities.
• Strong technical troubleshooting skills – Linux/Unix, SQL, log analysis.
• Familiarity with trading systems, RMS, OMS, APIs (REST/FIX), and order lifecycle.
• Excellent communication and interpersonal skills for effective client interaction.
• Ability to work under pressure during trading hours and manage multiple priorities.
• Customer-centric mindset with a focus on relationship building and problem-solving.