50+ Remote SQL Jobs in India
Apply to 50+ Remote SQL Jobs on CutShort.io. Find your next job, effortlessly. Browse SQL Jobs and apply today!
About Hudson Data
At Hudson Data, we view AI as both an art and a science. Our cross-functional teams — spanning business leaders, data scientists, and engineers — blend AI/ML and Big Data technologies to solve real-world business challenges. We harness predictive analytics to uncover new revenue opportunities, optimize operational efficiency, and enable data-driven transformation for our clients.
Beyond traditional AI/ML consulting, we actively collaborate with academic and industry partners to stay at the forefront of innovation. Alongside delivering projects for Fortune 500 clients, we also develop proprietary AI/ML products addressing diverse industry challenges.
Headquartered in New Delhi, India, with an office in New York, USA, Hudson Data operates globally, driving excellence in data science, analytics, and artificial intelligence.
⸻
About the Role
We are seeking a Data Analyst & Modeling Specialist with a passion for leveraging AI, machine learning, and cloud analytics to improve business processes, enhance decision-making, and drive innovation. You’ll play a key role in transforming raw data into insights, building predictive models, and delivering data-driven strategies that have real business impact.
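As a rough illustration of the predictive-modeling work this role involves, here is a minimal scikit-learn sketch; the synthetic dataset and the gradient-boosting model are placeholders rather than a description of Hudson Data's actual stack.

```python
# Minimal sketch of a predictive-modeling workflow (illustrative only).
# Synthetic data and the gradient-boosting model are placeholders.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

# Stand-in for a cleaned, feature-engineered business dataset.
X, y = make_classification(n_samples=2_000, n_features=20, random_state=42)

model = GradientBoostingClassifier(random_state=42)

# 5-fold cross-validation gives a quick read on generalization before tuning.
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"Mean ROC AUC: {scores.mean():.3f} (+/- {scores.std():.3f})")
```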
⸻
Key Responsibilities
1. Data Collection & Management
• Gather and integrate data from multiple sources including databases, APIs, spreadsheets, and cloud warehouses.
• Design and maintain ETL pipelines ensuring data accuracy, scalability, and availability.
• Utilize any major cloud platform (Google Cloud, AWS, or Azure) for data storage, processing, and analytics workflows.
• Collaborate with engineering teams to define data governance, lineage, and security standards.
2. Data Cleaning & Preprocessing
• Clean, transform, and organize large datasets using Python (pandas, NumPy) and SQL.
• Handle missing data, duplicates, and outliers while ensuring consistency and quality.
• Automate data preparation using Linux scripting, Airflow, or cloud-native schedulers.
3. Data Analysis & Insights
• Perform exploratory data analysis (EDA) to identify key trends, correlations, and drivers.
• Apply statistical techniques such as regression, time-series analysis, and hypothesis testing.
• Use Excel (including pivot tables) and BI tools (Tableau, Power BI, Looker, or Google Data Studio) to develop insightful reports and dashboards.
• Present findings and recommendations to cross-functional stakeholders in a clear and actionable manner.
4. Predictive Modeling & Machine Learning
• Build and optimize predictive and classification models using scikit-learn, XGBoost, LightGBM, TensorFlow, Keras, and H2O.ai.
• Perform feature engineering, model tuning, and cross-validation for performance optimization.
• Deploy and manage ML models using Vertex AI (GCP), AWS SageMaker, or Azure ML Studio.
• Continuously monitor, evaluate, and retrain models to ensure business relevance.
5. Reporting & Visualization
• Develop interactive dashboards and automated reports for performance tracking.
• Use pivot tables, KPIs, and data visualizations to simplify complex analytical findings.
• Communicate insights effectively through clear data storytelling.
6. Collaboration & Communication
• Partner with business, engineering, and product teams to define analytical goals and success metrics.
• Translate complex data and model results into actionable insights for decision-makers.
• Advocate for data-driven culture and support data literacy across teams.
7. Continuous Improvement & Innovation
• Stay current with emerging trends in AI, ML, data visualization, and cloud technologies.
• Identify opportunities for process optimization, automation, and innovation.
• Contribute to internal R&D and AI product development initiatives.
⸻
Required Skills & Qualifications
Technical Skills
• Programming: Proficient in Python (pandas, NumPy, scikit-learn, XGBoost, LightGBM, TensorFlow, Keras, H2O.ai).
• Databases & Querying: Advanced SQL skills; experience with BigQuery, Redshift, or Azure Synapse is a plus.
• Cloud Expertise: Hands-on experience with one or more major platforms — Google Cloud, AWS, or Azure.
• Visualization & Reporting: Skilled in Tableau, Power BI, Looker, or Excel (pivot tables, data modeling).
• Data Engineering: Familiarity with ETL tools (Airflow, dbt, or similar).
• Operating Systems: Strong proficiency with Linux/Unix for scripting and automation.
Soft Skills
• Strong analytical, problem-solving, and critical-thinking abilities.
• Excellent communication and presentation skills, including data storytelling.
• Curiosity and creativity in exploring and interpreting data.
• Collaborative mindset, capable of working in cross-functional and fast-paced environments.
⸻
Education & Certifications
• Bachelor’s degree in Data Science, Computer Science, Statistics, Mathematics, or a related field.
• Master’s degree in Data Analytics, Machine Learning, or Business Intelligence preferred.
• Relevant certifications are highly valued:
• Google Cloud Professional Data Engineer
• AWS Certified Data Analytics – Specialty
• Microsoft Certified: Azure Data Scientist Associate
• TensorFlow Developer Certificate
⸻
Why Join Hudson Data
At Hudson Data, you’ll be part of a dynamic, innovative, and globally connected team that uses cutting-edge tools — from AI and ML frameworks to cloud-based analytics platforms — to solve meaningful problems. You’ll have the opportunity to grow, experiment, and make a tangible impact in a culture that values creativity, precision, and collaboration.
If interested, please share your resume with ayushi.dwivedi at cloudsufi.com.
Note: This role is remote but requires a quarterly visit to the Noida office (one week per quarter). If you are comfortable with that, please share your resume.
Data Engineer
Position Type: Full-time
About Us
CLOUDSUFI, a Google Cloud Premier Partner, is a leading global provider of data-driven digital transformation for cloud-based enterprises. With a global presence and a focus on Software & Platforms, Life Sciences and Healthcare, Retail, CPG, Financial Services, and Supply Chain, CLOUDSUFI is positioned to meet customers where they are in their data monetization journey.
Job Summary
We are seeking a highly skilled and motivated Data Engineer to join our Development POD for the Integration Project. The ideal candidate will be responsible for designing, building, and maintaining robust data pipelines to ingest, clean, transform, and integrate diverse public datasets into our knowledge graph. This role requires a strong understanding of Google Cloud Platform (GCP) services, data engineering best practices, and a commitment to data quality and scalability.
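For illustration only, here is a minimal sketch of one ingestion step of the kind described above, assuming the pandas, google-cloud-storage, and google-cloud-bigquery libraries; the bucket, object, table, and column names are hypothetical placeholders, and a real pipeline would add the validation and schema mapping described below.

```python
# Illustrative only: tiny GCS -> pandas -> BigQuery ingestion step.
# Bucket, object, and table names are hypothetical placeholders.
import io

import pandas as pd
from google.cloud import bigquery, storage

def ingest_csv_to_bigquery(bucket_name: str, blob_name: str, table_id: str) -> None:
    # Pull the raw CSV from Cloud Storage.
    blob = storage.Client().bucket(bucket_name).blob(blob_name)
    df = pd.read_csv(io.BytesIO(blob.download_as_bytes()))

    # Basic cleaning before the data reaches the warehouse
    # (the "id" column is an assumed example field).
    df = df.drop_duplicates().dropna(subset=["id"])

    # Load the cleaned frame into BigQuery; .result() waits for completion.
    bigquery.Client().load_table_from_dataframe(df, table_id).result()

ingest_csv_to_bigquery("example-raw-data", "public/source.csv", "project.dataset.source_table")
```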
Key Responsibilities
ETL Development: Design, develop, and optimize data ingestion, cleaning, and transformation pipelines for various data sources (e.g., CSV, API, XLS, JSON, SDMX) using Google Cloud Platform services (Cloud Run, Dataflow) and Python.
Schema Mapping & Modeling: Work with LLM-based auto-schematization tools to map source data to our schema.org vocabulary, defining appropriate Statistical Variables (SVs) and generating MCF/TMCF files.
Entity Resolution & ID Generation: Implement processes for accurately matching new entities with existing IDs or generating unique, standardized IDs for new entities.
Knowledge Graph Integration: Integrate transformed data into the Knowledge Graph, ensuring proper versioning and adherence to existing standards.
API Development: Develop and enhance REST and SPARQL APIs via Apigee to enable efficient access to integrated data for internal and external stakeholders.
Data Validation & Quality Assurance: Implement comprehensive data validation and quality checks (statistical, schema, anomaly detection) to ensure data integrity, accuracy, and freshness. Troubleshoot and resolve data import errors.
Automation & Optimization: Collaborate with the Automation POD to leverage and integrate intelligent assets for data identification, profiling, cleaning, schema mapping, and validation, aiming for significant reduction in manual effort.
Collaboration: Work closely with cross-functional teams, including Managed Service POD, Automation POD, and relevant stakeholders.
Qualifications and Skills
Education: Bachelor's or Master's degree in Computer Science, Data Engineering, Information Technology, or a related quantitative field.
Experience: 3+ years of proven experience as a Data Engineer, with a strong portfolio of successfully implemented data pipelines.
Programming Languages: Proficiency in Python for data manipulation, scripting, and pipeline development.
Cloud Platforms and Tools: Expertise in Google Cloud Platform (GCP) services, including Cloud Storage, Cloud SQL, Cloud Run, Dataflow, Pub/Sub, BigQuery, and Apigee. Proficiency with Git-based version control.
Core Competencies:
Must Have - SQL, Python, BigQuery, GCP Dataflow / Apache Beam, Google Cloud Storage (GCS)
Must Have - Proven ability in comprehensive data wrangling, cleaning, and transforming complex datasets from various formats (e.g., API, CSV, XLS, JSON)
Secondary Skills - SPARQL, Schema.org, Apigee, CI/CD (Cloud Build), GCP, Cloud Data Fusion, Data Modeling
Solid understanding of data modeling, schema design, and knowledge graph concepts (e.g., Schema.org, RDF, SPARQL, JSON-LD).
Experience with data validation techniques and tools.
Familiarity with CI/CD practices and the ability to work in an Agile framework.
Strong problem-solving skills and keen attention to detail.
Preferred Qualifications:
Experience with LLM-based tools or concepts for data automation (e.g., auto-schematization).
Familiarity with similar large-scale public dataset integration initiatives.
Experience with multilingual data integration.
If interested, please send your resume to ayushi.dwivedi at cloudsufi.com.
The candidate must currently be located in Bangalore (client office visits are required) and must be open to visiting the Noida office for one week each quarter.
Our Values
We are a passionate and empathetic team that prioritizes human values. Our purpose is to elevate the quality of lives for our family, customers, partners and the community.
Equal Opportunity Statement
CLOUDSUFI is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified candidates receive consideration for employment without regard to race, colour, religion, gender, gender identity or expression, sexual orientation and national origin status. We provide equal opportunities in employment, advancement, and all other areas of our workplace. Please explore more at https://www.cloudsufi.com/

Position: Full Stack Developer (PHP CodeIgniter)
Company: Mayura Consultancy Services
Experience: 2 years
Location: Bangalore
Skills: HTML, CSS, Bootstrap, JavaScript, Ajax, jQuery, PHP, and CodeIgniter (CI)
Work Location: Work From Home (WFH)
Apply: Please apply for the job opening using the URL below, based on your skill set. Once you complete the application form, we will review your profile.
Website:
https://www.mayuraconsultancy.com/careers/mcs-full-stack-web-developer-opening?r=jlp
Requirements :
- Prior experience in Full Stack Development using PHP Codeigniter
Perks of Working with MCS :
- Contribute to Innovative Solutions: Join a dynamic team at the forefront of software development, contributing to innovative projects and shaping the technological solutions of the organization.
- Work with Clients from across the Globe: Collaborate with clients from around the world, gaining exposure to diverse cultures and industries, and contributing to the development of solutions that address the unique needs and challenges of global businesses.
- Complete Work From Home Opportunity: Enjoy the flexibility of working entirely from the comfort of your home, empowering you to manage your schedule and achieve a better work-life balance while coding innovative solutions for MCS.
- Opportunity to Work on Projects Developing from Scratch: Engage in projects from inception to completion, working on solutions developed from scratch and having the opportunity to make a significant impact on the design, architecture, and functionality of the final product.
- Diverse Projects: Be involved in a variety of development projects, including web applications, mobile apps, e-commerce platforms, and more, allowing you to showcase your versatility as a Full Stack Developer and expand your portfolio.
Joining MCS as a Full Stack Developer opens the door to a world where your technical skills can shine and grow, all while enjoying a supportive and dynamic work environment. We're not just building solutions; we're building the future—and you can be a key part of that journey.
Role - Dynamics 365 Data Migration Engineer/Developer
Experience level: 5+ years
Location: Remote
Prior experience in Dynamics 365 data migration projects.
Knowledge of SSIS, Azure Fabric, and Azure Data Factory.
Good understanding of Dataverse data structure and integration patterns.
Proficiency in SQL for data extraction and transformation.
Experience in preparing data mapping and migration documentation.
Collaborate with functional teams for data validation and reconciliation.
Prepare data mapping documents and ensure accurate transformation.
We are seeking a Node.js Developer to build and maintain backend systems for University ERP, Examination Management, and LMS platforms, ensuring secure, scalable, and high-performance applications.
Key Responsibilities
Develop backend services using Node.js & Express.js
Build APIs for exam workflows, results, LMS modules, and ERP integrations
Manage databases (MongoDB / MySQL)
Implement role-based access, data security, and performance optimization
Integrate third-party services (payments, notifications, proctoring)
Collaborate with product, QA, and implementation teams
Required Skills
Node.js, Express.js, JavaScript
Database: MongoDB / SQL
API security (JWT, OAuth)
Experience in ERP / Exam / LMS systems preferred
About the Role
Hudson Data is looking for a Senior / Mid-Level SQL Engineer to design, build, optimize, and manage our data platforms. This role requires strong hands-on expertise in SQL, Google Cloud Platform (GCP), and Linux to support high-performance, scalable data solutions.
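For flavor, here is a minimal sketch of the kind of SQL work this role describes, run on BigQuery from Python with the google-cloud-bigquery client; the dataset, table, and column names are hypothetical.

```python
# Illustrative only: a parameterized analytical query on BigQuery.
# Dataset, table, and column names are hypothetical.
import datetime

from google.cloud import bigquery

client = bigquery.Client()

# Window function keeps only the latest order per customer.
sql = """
    SELECT customer_id, order_total, order_date
    FROM `project.dataset.orders`
    WHERE order_date >= @min_date
    QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_date DESC) = 1
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("min_date", "DATE", datetime.date(2024, 1, 1))
    ]
)

for row in client.query(sql, job_config=job_config).result():
    print(row.customer_id, row.order_total)
```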
We are also hiring Python Programmers / Software Developers / Front-End and Back-End Engineers.
Key Responsibilities:
- Develop and optimize complex SQL queries, views, and stored procedures
- Build and maintain data pipelines and ETL workflows on GCP (e.g., BigQuery, Cloud SQL)
- Manage database performance, monitoring, and troubleshooting
- Work extensively in Linux environments for deployments and automation
- Partner with data, product, and engineering teams on data initiatives
Required Skills & Qualifications
Must-Have Skills (Essential)
- Expert-level GCP experience (mandatory)
- Strong Linux / shell scripting skills (mandatory)
Nice to Have
- Experience with data warehousing and ETL frameworks
- Python / scripting for automation
- Performance tuning and query optimization experience
Soft Skills
- Strong analytical, problem-solving, and critical-thinking abilities.
- Excellent communication and presentation skills, including data storytelling.
- Curiosity and creativity in exploring and interpreting data.
- Collaborative mindset, capable of working in cross-functional and fast-paced environments.
Education & Certifications
- Bachelor's degree in Data Science, Computer Science, Statistics, Mathematics, or a related field.
- Master's degree in Data Analytics, Machine Learning, or Business Intelligence preferred.
⸻
Why Join Hudson Data
At Hudson Data, you'll be part of a dynamic, innovative, and globally connected team that uses cutting-edge tools, from AI and ML frameworks to cloud-based analytics platforms, to solve meaningful problems. You'll have the opportunity to grow, experiment, and make a tangible impact in a culture that values creativity, precision, and collaboration.
The Power BI Intern will assist the analytics team in using Microsoft Power BI to create interactive dashboards and reports. Working with actual datasets to assist well-informed business decision-making, this position provides practical exposure to data analysis, visualization, and business intelligence techniques.
Job Description
DesignByte Studio is looking for a passionate MERN Stack Developer who enjoys building real world products and learning through hands on work. This role is ideal for someone at the start of their career who wants to grow by working on live projects with a small and focused team.
You will work closely with designers and product members to build modern web applications and improve existing features. The role is fully remote and outcome focused.
Responsibilities
Build and maintain web applications using MongoDB, Express, React and Node.js
Develop clean, reusable and scalable frontend components
Integrate APIs and work with backend logic and databases
Fix bugs, improve performance and maintain code quality
Collaborate with the team on feature planning and implementation
Required Skills
Good understanding of JavaScript fundamentals
Basic to intermediate knowledge of React and Node.js
Familiarity with MongoDB and REST APIs
Understanding of HTML, CSS and modern frontend practices
Basic knowledge of Git and version control
Good to Have
Experience with Next.js or Tailwind CSS
Basic understanding of authentication and database design
Any personal or academic projects using the MERN stack
What We Offer
Remote work environment
Opportunity to work on real products and client projects
Learning focused culture with mentorship
Career growth based on performance and skills
Experience required is 0 to 2 years. Salary range is ₹2L to ₹5L per year.
We are seeking a highly skilled software developer with proven experience in developing and scaling education ERP solutions. The ideal candidate should have strong expertise in Node.js or PHP (Laravel), MySQL, and MongoDB, along with hands-on experience in implementing ERP modules such as HR, Exams, Inventory, Learning Management System (LMS), Admissions, Fee Management, and Finance.
Key Responsibilities
Design, develop, and maintain scalable Education ERP modules.
Work on end-to-end ERP features, including HR, exams, inventory, LMS, admissions, fees, and finance.
Build and optimize REST APIs/GraphQL services and ensure seamless integrations.
Optimize system performance, scalability, and security for high-volume ERP usage.
Conduct code reviews, enforce coding standards, and mentor junior developers.
Stay updated with emerging technologies and recommend improvements for ERP solutions.
Required Skills & Qualifications
Strong expertise in Node.js and PHP (Laravel, Core PHP).
Proficiency with MySQL, MongoDB, and PostgreSQL (database design & optimization).
Frontend knowledge: JavaScript, jQuery, HTML, CSS (React/Vue preferred).
Experience with REST APIs, GraphQL, and third-party integrations (payment gateways, SMS, and email).
Hands-on with Git/GitHub, Docker, and CI/CD pipelines.
Familiarity with cloud platforms (AWS, Azure, GCP) is a plus.
4+ years of professional development experience, with a minimum of 2 years in ERP systems.
Preferred Experience
Prior work in the education ERP domain.
Deep knowledge of HR, Exam, Inventory, LMS, Admissions, Fees & Finance modules.
Exposure to high-traffic enterprise applications.
Strong leadership, mentoring, and problem-solving abilities
Benefit:
Permanent Work From Home
Hi Connections! 👋 Welcome to 2026! 🎉
Starting the new year with an exciting opportunity!
Deqode IS HIRING! 💻
Hiring: .Net Developer
⭐ Experience: 4+ Years
⭐ Work Mode: Remote
⏱️ Notice Period: Immediate Joiners
(Only immediate joiners & candidates serving notice period)
🔧 Role Overview
We are looking for passionate .NET Developers to design, develop, and maintain scalable microservices for enterprise-grade applications. You’ll work closely with cross-functional teams and clients on high-performance, cloud-native solutions.
🛠️ Key Responsibilities
✅ Build and maintain scalable .NET microservices
✅ Develop secure, high-quality RESTful Web APIs
✅ Write unit and integration tests to ensure code quality
✅ Optimize performance and implement caching strategies
💫 Must-Have Skills
✅ 4+ years of experience with .NET Core / .NET 5+ & C#
✅ Strong hands-on experience with ASP.NET Core Web API & EF Core
✅ REST API development & middleware implementation
✅ Solid understanding of SOLID principles & design patterns
✅ Unit testing experience (xUnit, NUnit, MSTest, Moq)
Java Tech Lead (5–6 Years Experience)
About the Role
We are seeking a highly skilled Java Tech Lead with 5–6 years of hands-on experience in backend engineering, architecture design, and leading development teams.
The ideal candidate will combine strong technical expertise in Java frameworks with a deep understanding of system design, scalability, and performance optimization.
This role involves technical leadership, code reviews, and architectural decision-making for complex enterprise systems — with occasional exposure to analytics-driven and Python-based components.
Key Responsibilities
- Architect, design, and develop scalable backend systems using Java (Quarkus, Spring Boot, Spring, Java EE).
- Own the architecture — ensure modular, extensible, and high-performance service design.
- Lead and mentor a team of developers; conduct code reviews, enforce best practices, and ensure high code quality.
- Collaborate with cross-functional teams (frontend, DevOps, product, data) to deliver integrated, end-to-end solutions.
- Design and optimize database schemas (MySQL, PostgreSQL) and ensure efficient query performance.
- Implement and maintain microservices and distributed systems with strong fault tolerance and observability.
- Drive the adoption of modern development workflows — Git branching strategy, CI/CD, and code quality automation.
- Analyze system performance bottlenecks, implement monitoring, and ensure smooth production deployments.
- Contribute to architecture reviews, technical documentation, and design discussions.
- Occasionally contribute to Python-based analytics modules or automation scripts.
- Work with AWS cloud services (EC2, S3, RDS, Lambda) for deployment, scaling, and infrastructure automation.
Required Skills & Qualifications
- 5–6 years of professional experience in backend application development using Java.
- Strong proficiency in Java frameworks: Quarkus, Spring Boot, Spring, Java EE.
- Proven experience in architecture design, system decomposition, and microservices design principles.
- Solid understanding of object-oriented design (OOD), design patterns, and SOLID principles.
- Strong experience with relational databases (MySQL, PostgreSQL) and query optimization.
- Good understanding of event-driven systems, RESTful APIs, and asynchronous processing.
- Proficiency in Git for version control and team collaboration.
- Strong analytical and debugging skills; ability to diagnose complex production issues.
Good to Have
- Hands-on experience with Python for data processing or analytics integrations.
- Familiarity with AWS cloud architecture and cost optimization practices.
- Experience with CI/CD pipelines (GitHub Actions, Jenkins, GitLab CI).
- Knowledge of Docker/Kubernetes for containerized deployments.
- Exposure to NoSQL databases (MongoDB, DynamoDB, Cassandra).
- Experience with message queues (Kafka, RabbitMQ, or AWS SQS).
- Understanding of system scalability, caching (Redis/Memcached), and observability stacks (Prometheus, Grafana, ELK).
Soft Skills
- Strong leadership, mentoring, and communication skills.
- Proven ability to drive technical decisions and balance short-term delivery with long-term architectural health.
- Collaborative mindset — works closely with product, design, and operations teams.
- Passion for clean architecture, high performance, and continuous improvement.
- Self-driven with a strong sense of ownership and accountability.
About Sun King
Sun King is the world’s leading off-grid solar energy company, providing affordable solar solutions to the 1.8 billion people without reliable access to electricity. By combining product design, fintech, and field operations, Sun King has connected over 20 million homes to solar power across Africa and Asia, adding more than 200,000 new homes each month. Through ‘pay-as-you-go’ financing, customers make small payments to eventually own their solar systems, saving money and reducing reliance on harmful energy sources like kerosene.
Sun King employs 2,800 staff across 12 countries, with expertise in product design, data science, logistics, customer service, and more. The company is expanding its product range to include clean cooking, electric mobility, and entertainment solutions, all while supporting a diverse workforce — with women making up 44% of the team.
About the role:
The role involves designing, executing, and maintaining robust functional, regression, and integration testing to ensure product quality and reliability, along with thorough defect tracking, analysis, and resolution. The individual will develop and maintain UI and API automation frameworks to improve test coverage, minimize manual effort, and enhance release efficiency. Close collaboration with development teams is expected to reproduce issues, validate fixes, and ensure high-quality releases. The role also includes integrating automated tests into CI/CD pipelines, supporting production issue analysis, and verifying hotfixes in live environments. Additionally, the candidate will actively participate in requirement and design reviews to ensure testability and clarity, maintain comprehensive QA documentation, and continuously improve testing frameworks, tools, and overall QA processes.
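As a small, hedged illustration of the API-level automation this role describes, here is a pytest-style check using the requests library; the endpoint URL and expected fields are hypothetical.

```python
# Minimal automated API check (illustrative only).
# The endpoint URL and expected fields are hypothetical.
import requests

BASE_URL = "https://api.example.com"  # placeholder environment URL

def test_get_order_returns_expected_schema():
    resp = requests.get(f"{BASE_URL}/orders/123", timeout=10)

    # Functional check: the service answers successfully.
    assert resp.status_code == 200

    # Lightweight schema/data validation on the payload.
    body = resp.json()
    assert {"id", "status", "amount"} <= body.keys()
    assert body["amount"] >= 0
```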
What you will be expected to do:
- Design, execute, and maintain test cases, test plans, and test scripts for functional, regression, and integration testing.
- Identify software defects, document them clearly, and track them through to closure.
- Analyze bugs and provide detailed insights to help developers understand root causes.
- Partner closely with the development team to reproduce issues, validate fixes, and ensure overall product quality.
- Develop, maintain, and improve automated test suites (API/UI) to enhance test coverage, reduce manual effort, and improve release confidence.
- Work with CI/CD pipelines to integrate automated tests into the deployment workflow.
- Validate production issues, support troubleshooting, and verify hotfixes in real-time environments.
- Recommend improvements in product performance, usability, and reliability based on test findings.
- Participate in requirement and design reviews to ensure clarity, completeness, and testability.
- Benchmark against competitor products and suggest enhancements based on industry trends.
- Maintain detailed test documentation, including test results, defect logs, and release readiness assessments.
- Continuously improve QA processes, automation frameworks, and testing methodologies.
You might be a strong candidate if you have/are:
- Bachelor’s Degree in Computer Science, Information Technology, or a related field.
- 2+ years of hands-on experience in software testing (manual + exposure to automation).
- Strong understanding of QA methodologies, testing types, and best practices.
- Experience in designing and executing test cases, test plans, and regression suites.
- Exposure to automation tools/frameworks such as Selenium, Playwright, Cypress, TestNG, JUnit, or similar.
- Basic programming or scripting knowledge (Java/Python preferred).
- Good understanding of SQL for backend and data validation testing.
- Familiarity with API testing tools such as Postman or RestAssured.
- Experience with defect tracking and test management tools (Jira, TestRail, etc.).
- Strong analytical and debugging skills with the ability to identify root causes.
- Ability to work effectively in Agile/Scrum environments and partner with developers, product, and DevOps teams.
- Strong ownership mindset — having contributed to high-quality, near bug-free releases.
Good to have:
- Exceptional attention to detail and a strong focus on product quality.
- Experience with performance, load, or security testing (JMeter, Gatling, OWASP tools, etc.).
- Exposure to advanced automation frameworks or building automation scripts from scratch.
- Familiarity with CI/CD pipelines and integrating automated tests.
- Experience working with observability tools like Grafana, Kibana, and Prometheus for production verification.
- Good understanding of microservices, distributed systems, or cloud platforms.
What Sun King offers:
- Professional growth in a dynamic, rapidly expanding, high-social-impact industry
- An open-minded, collaborative culture made up of enthusiastic colleagues who are driven by the challenge of innovation towards profound impact on people and the planet.
- A truly multicultural experience: you will have the chance to work with and learn from people from different geographies, nationalities, and backgrounds.
- Structured, tailored learning and development programs that help you become a better leader, manager, and professional through the Sun King Center for Leadership.
Role: Lead Data Engineer Core
Responsibilities:
- Lead end-to-end design, development, and delivery of complex cloud-based data pipelines.
- Collaborate with architects and stakeholders to translate business requirements into technical data solutions.
- Ensure scalability, reliability, and performance of data systems across environments.
- Provide mentorship and technical leadership to data engineering teams.
- Define and enforce best practices for data modeling, transformation, and governance.
- Optimize data ingestion and transformation frameworks for efficiency and cost management.
- Contribute to data architecture design and review sessions across projects.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 8+ years of experience in data engineering with proven leadership in designing cloud-native data systems.
- Strong expertise in Python, SQL, Apache Spark, and at least one cloud platform (Azure, AWS, or GCP).
- Experience with Big Data, Data Lake, Delta Lake, and Lakehouse architectures.
- Proficiency in one or more database technologies (e.g., PostgreSQL, Redshift, Snowflake, and NoSQL databases).
- Ability to recommend and implement scalable data pipelines.
Preferred Qualifications:
- Cloud certification (AWS, Azure, or GCP).
- Experience with Databricks, Snowflake, or Terraform.
- Familiarity with data governance, lineage, and observability tools.
- Strong collaboration skills and ability to influence data-driven decisions across teams.
Java Angular Fullstack Developer
Job Description:
Technical Lead – Full Stack
Experience: 8–12 years (strong candidates: roughly 50% Java, 50% Angular)
Location – remote
PF number is mandatory.
Tech Stack: Java, Spring Boot, Microservices, Angular, SQL
Focus: Hands-on coding, solution design, team leadership, delivery ownership
Must-Have Skills (Depth)
Java (8+): Streams, concurrency, collections, JVM internals (GC), exception handling.
Spring Boot: Security, Actuator, Data/JPA, Feign/RestTemplate, validation, profiles, configuration management.
Microservices: API design, service discovery, resilience patterns (Hystrix/Resilience4j), messaging (Kafka/RabbitMQ) optional.
React: Hooks, component lifecycle, state management, error boundaries, testing (Jest/RTL).
SQL: Joins, aggregations, indexing, query optimization, transaction isolation, schema design.
Testing: JUnit/Mockito for backend; Jest/RTL/Cypress for frontend.
DevOps: Git, CI/CD, containers (Docker), familiarity with deployment environments.
Role: Full-Time, Long-Term
Required: Python, SQL
Preferred: Experience with financial or crypto data
OVERVIEW
We are seeking a data engineer to join as a core member of our technical team. This is a long-term position for someone who wants to build robust, production-grade data infrastructure and grow with a small, focused team. You will own the data layer that feeds our machine learning pipeline—from ingestion and validation through transformation, storage, and delivery.
The ideal candidate is meticulous about data quality, thinks deeply about failure modes, and builds systems that run reliably without constant attention. You understand that downstream ML models are only as good as the data they consume.
CORE TECHNICAL REQUIREMENTS
Python (Required): Professional-level proficiency. You write clean, maintainable code for data pipelines—not throwaway scripts. Comfortable with Pandas, NumPy, and their performance characteristics. You know when to use Python versus push computation to the database.
SQL (Required): Advanced SQL skills. Complex queries, query optimization, schema design, execution plans. PostgreSQL experience strongly preferred. You think about indexing, partitioning, and query performance as second nature.
Data Pipeline Design (Required): You build pipelines that handle real-world messiness gracefully. You understand idempotency, exactly-once semantics, backfill strategies, and incremental versus full recomputation tradeoffs. You design for failure—what happens when an upstream source is late, returns malformed data, or goes down entirely. Experience with workflow orchestration required: Airflow, Prefect, Dagster, or similar.
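A minimal orchestration sketch, assuming Apache Airflow (any of the tools named above would do); the DAG, source, and partition names are hypothetical. The point it illustrates is idempotency: each run is keyed to its logical date, so retries and backfills converge on the same end state instead of duplicating rows.

```python
# Minimal Airflow sketch of an idempotent, backfill-friendly daily load.
# DAG, source, and partition names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def load_partition(ds: str, **_) -> None:
    # Idempotent pattern: delete-then-insert (or MERGE) for the run's logical
    # date, so retries and backfills overwrite the same partition.
    print(f"Replacing warehouse partition for {ds} from the upstream source")

with DAG(
    dag_id="daily_prices_ingest",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=True,  # enables historical backfill, one run per missed day
) as dag:
    PythonOperator(task_id="load_partition", python_callable=load_partition)
```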
Data Quality (Required): You treat data quality as a first-class concern. You implement validation checks, anomaly detection, and monitoring. You know the difference between data that is missing versus data that should not exist. You build systems that catch problems before they propagate downstream.
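Below is a small, illustrative data-quality gate in pandas; the column names and thresholds are hypothetical and would in practice be driven by each source's contract.

```python
# Illustrative data-quality gate. Column names and thresholds are hypothetical;
# assumes "ts" is a tz-aware UTC timestamp column.
import pandas as pd

def validate(df: pd.DataFrame, freshness_limit: pd.Timedelta = pd.Timedelta("1D")) -> list[str]:
    problems = []

    # Completeness: required fields must not be missing.
    if df["price"].isna().any():
        problems.append("missing prices")

    # Validity: values that should not exist at all.
    if (df["price"] < 0).any():
        problems.append("negative prices")

    # Freshness: the newest timestamp must be recent enough.
    if pd.Timestamp.now(tz="UTC") - df["ts"].max() > freshness_limit:
        problems.append("stale data")

    return problems  # an empty list means the batch may propagate downstream
```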
WHAT YOU WILL BUILD
Data Ingestion: Pipelines pulling from diverse sources—crypto exchanges, traditional market feeds, on-chain data, alternative data. Handling rate limits, API quirks, authentication, and source-specific idiosyncrasies.
Data Validation: Checks ensuring completeness, consistency, and correctness. Schema validation, range checks, freshness monitoring, cross-source reconciliation.
Transformation Layer: Converting raw data into clean, analysis-ready formats. Time series alignment, handling different frequencies and timezones, managing gaps.
Storage and Access: Schema design optimized for both write patterns (ingestion) and read patterns (ML training, feature computation). Data lifecycle and retention management.
Monitoring and Alerting: Observability into pipeline health. Knowing when something breaks before it affects downstream systems.
DOMAIN EXPERIENCE
Preference for candidates with experience in financial or crypto data—understanding market data conventions, exchange-specific quirks, and point-in-time correctness. You know why look-ahead bias is dangerous and how to prevent it.
Time series data at scale—hundreds of symbols with years of history, multiple frequencies, derived features. You understand temporal joins, windowed computations, and time-aligned data challenges.
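A brief pandas sketch of a point-in-time (as-of) join plus a windowed feature, with made-up symbols and timestamps; the backward-looking join is exactly what guards against the look-ahead bias mentioned above.

```python
# Point-in-time (as-of) join and a rolling window feature (illustrative data).
import pandas as pd

quotes = pd.DataFrame({
    "ts": pd.to_datetime(["2024-01-01 09:00", "2024-01-01 09:05", "2024-01-01 09:10"]),
    "symbol": ["BTC", "BTC", "BTC"],
    "bid": [42000.0, 42050.0, 41990.0],
})
trades = pd.DataFrame({
    "ts": pd.to_datetime(["2024-01-01 09:03", "2024-01-01 09:11"]),
    "symbol": ["BTC", "BTC"],
    "qty": [0.5, 1.2],
})

# As-of join: each trade sees only the latest quote at or before its own time.
joined = pd.merge_asof(trades.sort_values("ts"), quotes.sort_values("ts"),
                       on="ts", by="symbol", direction="backward")

# Windowed computation: rolling mean of the bid over the prior 10 minutes.
quotes["bid_10min_mean"] = quotes.rolling("10min", on="ts")["bid"].mean()
print(joined)
```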
High-dimensional feature stores—we work with hundreds of thousands of derived features. Experience managing, versioning, and serving large feature sets is valuable.
ENGINEERING STANDARDS
Reliability: Pipelines run unattended. Failures are graceful with clear errors, not silent corruption. Recovery is straightforward.
Reproducibility: Same inputs and code version produce identical outputs. You version schemas, track lineage, and can reconstruct historical states.
Documentation: Schemas, data dictionaries, pipeline dependencies, operational runbooks. Others can understand and maintain your systems.
Testing: You write tests for pipelines—validation logic, transformation correctness, edge cases. Untested pipelines are broken pipelines waiting to happen.
TECHNICAL ENVIRONMENT
PostgreSQL, Python, workflow orchestration (flexible on tool), cloud infrastructure (GCP preferred but flexible), Git.
WHAT WE ARE LOOKING FOR
Attention to Detail: You notice when something is slightly off and investigate rather than ignore.
Defensive Thinking: You assume sources will send bad data, APIs will fail, schemas will change. You build accordingly.
Self-Direction: You identify problems, propose solutions, and execute without waiting to be told.
Long-Term Orientation: You build systems you will maintain for years.
Communication: You document clearly, explain data issues to non-engineers, and surface problems early.
EDUCATION
University degree in a quantitative/technical field preferred: Computer Science, Mathematics, Statistics, Engineering. Equivalent demonstrated expertise also considered.
TO APPLY
Include: (1) CV/resume, (2) Brief description of a data pipeline you built and maintained, (3) Links to relevant work if available, (4) Availability and timezone.
Job Description
We are looking for motivated IT professionals with at least one year of industry experience. The ideal candidate should have hands-on experience in AWS, Azure, AI, or Cloud technologies, or should be enthusiastic and ready to upskill and shift to new and emerging technologies. This role is primarily remote; however, candidates may be required to visit the office occasionally for meetings or project needs.
Key Requirements
- Minimum 1 year of experience in the IT industry
- Exposure to AWS / Azure / AI / Cloud platforms (any one or more)
- Willingness to learn and adapt to new technologies
- Strong problem-solving and communication skills
- Ability to work independently in a remote setup
- Must have a proper work-from-home environment (laptop, stable internet, quiet workspace)
Education Qualification
- B.Tech / BE / MCA / M.Sc (IT) / equivalent
The Senior Software Developer is responsible for development of CFRA’s report generation framework using a modern technology stack: Python on AWS cloud infrastructure, SQL, and Web technologies. This is an opportunity to make an impact on both the team and the organization by being part of the design and development of a new customer-facing report generation framework that will serve as the foundation for all future report development at CFRA.
The ideal candidate has a passion for solving business problems with technology and can effectively communicate business and technical needs to stakeholders. We are looking for candidates that value collaboration with colleagues and having an immediate, tangible impact for a leading global independent financial insights and data company.
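For illustration, here is a minimal Python Lambda handler behind Amazon API Gateway (proxy integration); the path parameter and payload are hypothetical, and this is a sketch of the stack described above, not CFRA's actual framework.

```python
# Minimal Lambda handler for an API Gateway proxy integration (illustrative).
# The "report_id" path parameter and the payload shape are hypothetical.
import json

def lambda_handler(event, context):
    # Proxy integration delivers path/query parameters in the event object.
    report_id = (event.get("pathParameters") or {}).get("report_id", "unknown")

    payload = {"report_id": report_id, "status": "generated"}

    # Proxy integrations expect statusCode/headers/body in the response.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(payload),
    }
```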
Key Responsibilities
- Analyst Workflows: Design and development of CFRA’s integrated content publishing platform using a proprietary 3rd party editorial and publishing platform for integrated digital publishing.
- Designing and Developing APIs: Design and development of robust, scalable, and secure APIs on AWS, considering factors like performance, reliability, and cost-efficiency.
- AWS Service Integration: Integrate APIs with various AWS services such as AWS Lambda, Amazon API Gateway, Amazon SQS, Amazon SNS, AWS Glue, and others, to build comprehensive and efficient solutions.
- Performance Optimization: Identify and implement optimizations to improve performance, scalability, and efficiency, leveraging AWS services and tools.
- Security and Compliance: Ensure APIs are developed following best security practices, including authentication, authorization, encryption, and compliance with relevant standards and regulations.
- Monitoring and Logging: Implement monitoring and logging solutions for APIs using AWS CloudWatch, AWS X-Ray, or similar tools, to ensure availability, performance, and reliability.
- Continuous Integration and Deployment (CI/CD): Establish and maintain CI/CD pipelines for API development, automating testing, deployment, and monitoring processes on AWS.
- Documentation and Training: Create and maintain comprehensive documentation for internal and external users, and provide training and support to developers and stakeholders.
- Team Collaboration: Collaborate effectively with cross-functional teams, including product managers, designers, and other developers, to deliver high-quality solutions that meet business requirements.
- Problem Solving: Participate in troubleshooting efforts, identifying root causes and implementing solutions to ensure system stability and performance.
- Stay Updated: Stay updated with the latest trends, tools, and technologies related to development on AWS, and continuously improve your skills and knowledge.
Desired Skills and Experience
- Development: 5+ years of extensive experience in designing, developing, and deploying using modern technologies, with a focus on scalability, performance, and security.
- AWS Services: Proficiency in using AWS services such as AWS Lambda, Amazon API Gateway, Amazon SQS, Amazon SNS, Amazon SES, Amazon RDS, Amazon DynamoDB, and others, to build and deploy API solutions.
- Programming Languages: Proficiency in programming languages commonly used for development, such as Python, Node.js, or others, as well as experience with serverless frameworks on AWS.
- Architecture Design: Ability to design scalable and resilient API architectures using microservices, serverless, or other modern architectural patterns, considering factors like performance, reliability, and cost-efficiency.
- Security: Strong understanding of security principles and best practices, including authentication, authorization, encryption, and compliance with standards like OAuth, OpenID Connect, and AWS IAM.
- DevOps Practices: Familiarity with DevOps practices and tools, including CI/CD pipelines, infrastructure as code (IaC), and automated testing, to ensure efficient and reliable deployment on AWS.
- Problem-solving Skills: Excellent problem-solving skills, with the ability to troubleshoot complex issues, identify root causes, and implement effective solutions to ensure the stability and performance.
- Communication Skills: Strong communication skills, with the ability to effectively communicate technical concepts to both technical and non-technical stakeholders, and collaborate with cross-functional teams.
- Agile Methodologies: Experience working in Agile development environments, following practices like Scrum or Kanban, and ability to adapt to changing requirements and priorities.
- Continuous Learning: A commitment to continuous learning and staying updated with the latest trends, tools, and technologies related to development and AWS services.
- Bachelor's Degree: A bachelor's degree in Computer Science, Software Engineering, or a related field is often preferred, although equivalent experience and certifications can also be valuable.
The Lead Software Developer is responsible for development of CFRA’s report generation framework using a modern technology stack: Python on AWS cloud infrastructure, SQL, and Web technologies. This is an opportunity to make an impact on both the team and the organization by being part of the design and development of a new customer-facing report generation framework that will serve as the foundation for all future report development at CFRA.
The ideal candidate has a passion for solving business problems with technology and can effectively communicate business and technical needs to stakeholders. We are looking for candidates that value collaboration with colleagues and having an immediate, tangible impact for a leading global independent financial insights and data company.
Key Responsibilities
- Analyst Workflows: Lead the design and development of CFRA’s integrated content publishing platform using a proprietary 3rd party editorial and publishing platform for integrated digital publishing.
- Designing and Developing APIs: Lead the design and development of robust, scalable, and secure APIs on AWS, considering factors like performance, reliability, and cost-efficiency.
- Architecture Planning: Collaborate with architects and stakeholders to define architecture, including API gateway, microservices, and serverless components, ensuring alignment with business goals and AWS best practices.
- Technical Leadership: Provide technical guidance and leadership to the development team, ensuring adherence to coding standards, best practices, and AWS guidelines.
- AWS Service Integration: Integrate APIs with various AWS services such as AWS Lambda, Amazon API Gateway, Amazon SQS, Amazon SNS, AWS Glue, and others, to build comprehensive and efficient solutions.
- Performance Optimization: Identify and implement optimizations to improve performance, scalability, and efficiency, leveraging AWS services and tools.
- Security and Compliance: Ensure APIs are developed following best security practices, including authentication, authorization, encryption, and compliance with relevant standards and regulations.
- Monitoring and Logging: Implement monitoring and logging solutions for APIs using AWS CloudWatch, AWS X-Ray, or similar tools, to ensure availability, performance, and reliability.
- Continuous Integration and Deployment (CI/CD): Establish and maintain CI/CD pipelines for API development, automating testing, deployment, and monitoring processes on AWS.
- Documentation and Training: Create and maintain comprehensive documentation for internal and external users, and provide training and support to developers and stakeholders.
- Team Collaboration: Collaborate effectively with cross-functional teams, including product managers, designers, and other developers, to deliver high-quality solutions that meet business requirements.
- Problem Solving: Lead troubleshooting efforts, identifying root causes and implementing solutions to ensure system stability and performance.
- Stay Updated: Stay updated with the latest trends, tools, and technologies related to development on AWS, and continuously improve your skills and knowledge.
Desired Skills and Experience
- Development: 10+ years of extensive experience in designing, developing, and deploying using modern technologies, with a focus on scalability, performance, and security.
- AWS Services: Strong proficiency in using AWS services such as AWS Lambda, Amazon API Gateway, Amazon SQS, Amazon SNS, Amazon SES, Amazon RDS, Amazon DynamoDB, and others, to build and deploy API solutions.
- Programming Languages: Proficiency in programming languages commonly used for development, such as Python, Node.js, or others, as well as experience with serverless frameworks on AWS.
- Architecture Design: Ability to design scalable and resilient API architectures using microservices, serverless, or other modern architectural patterns, considering factors like performance, reliability, and cost-efficiency.
- Security: Strong understanding of security principles and best practices, including authentication, authorization, encryption, and compliance with standards like OAuth, OpenID Connect, and AWS IAM.
- DevOps Practices: Familiarity with DevOps practices and tools, including CI/CD pipelines, infrastructure as code (IaC), and automated testing, to ensure efficient and reliable deployment on AWS.
- Problem-solving Skills: Excellent problem-solving skills, with the ability to troubleshoot complex issues, identify root causes, and implement effective solutions to ensure the stability and performance.
- Team Leadership: Experience leading and mentoring a team of developers, providing technical guidance, code reviews, and fostering a collaborative and innovative environment.
- Communication Skills: Strong communication skills, with the ability to effectively communicate technical concepts to both technical and non-technical stakeholders, and collaborate with cross-functional teams.
- Agile Methodologies: Experience working in Agile development environments, following practices like Scrum or Kanban, and ability to adapt to changing requirements and priorities.
- Continuous Learning: A commitment to continuous learning and staying updated with the latest trends, tools, and technologies related to development and AWS services.
- Bachelor's Degree: A bachelor's degree in Computer Science, Software Engineering, or a related field is often preferred, although equivalent experience and certifications can also be valuable.
1. Job Responsibilities:
Backend Development (.NET)
- Design and implement ASP.NET Core WebAPIs
- Design and implement background jobs using Azure Function Apps
- Optimize performance for long-running operations, ensuring high concurrency and system stability.
- Develop efficient and scalable task scheduling solutions to execute periodic tasks
Frontend Development (React)
- Build high-performance, maintainable React applications and optimize component rendering.
- Continuously improve front-end performance using best practices
Deployment & Operations
- Deploy React applications on Azure platforms (Azure Web Apps), ensuring smooth and reliable delivery.
- Collaborate with DevOps teams to enhance CI/CD pipelines and improve deployment efficiency.
2. Job Requirements:
Tech Stack:
- Backend: ASP.NET Core Web API, C#
- Frontend: React, JavaScript/TypeScript, Redux or other state management libraries
- Azure: Function Apps, Web Apps, Logic Apps
- Database: Cosmos DB, SQL Server
- Strong knowledge of asynchronous programming, performance optimization, and task scheduling
- Proficiency in React performance optimization techniques, understanding of virtual DOM and component lifecycle.
- Experience with cloud deployment, preferably Azure App Service or Azure Static Web Apps.
- Familiarity with Git and CI/CD workflows, with strong coding standards.
3. Project Background:
Mission: Transform Microsoft Cloud customers into fans by delivering exceptional support and engagement.
- Scope:
- Customer reliability engineering
- Advanced cloud engineering and supportability
- Business management and operations
- Product and platform orchestration
- Activities:
- Technical skilling programs
- AI strategy for customer experience
- Handling escalations and service reliability issues
4. Project Highlights:
React.js, ASP.NET Core Web API, Azure Function Apps, Cosmos DB
Job Title: Sr. Frontend Developer (JavaScript)
Location: Remote Only
Experience Required: 4–6 years
Salary Range: 7L – 10L per year
About the Role:
We are looking for an experienced Sr. Frontend Developer with strong expertise in JavaScript to join our remote team. The ideal candidate will have 4–6 years of hands-on experience in frontend development, with a focus on building responsive, high-performance web applications. You will work closely with cross-functional teams to design, develop, and implement user-facing features that align with business goals and enhance user experience.
Key Responsibilities:
- Develop and maintain scalable, reusable frontend components and applications using modern JavaScript frameworks and libraries.
- Collaborate with UI/UX designers, product managers, and backend developers to deliver seamless user experiences.
- Optimize applications for maximum speed, scalability, and accessibility.
- Write clean, modular, and well-documented code following best practices.
- Participate in code reviews, sprint planning, and agile development processes.
- Troubleshoot, debug, and resolve frontend-related issues.
- Stay updated with emerging frontend technologies and industry trends.
Must-Have Skills:
- JavaScript (ES6+)
- React.js
- React Native
- Node.js
- SQL
Nice-to-Have Skills:
- Experience with state management libraries (Redux, Context API, etc.)
- Familiarity with testing frameworks (Jest, Cypress, React Testing Library)
- Knowledge of frontend build tools (Webpack, Babel, NPM/Yarn)
- Understanding of RESTful APIs and GraphQL
- Experience with version control systems (Git)
- Familiarity with CI/CD pipelines and deployment processes
Qualifications:
- 4–6 years of professional frontend development experience.
- Proven track record of delivering high-quality, production-ready applications.
- Strong understanding of responsive design, cross-browser compatibility, and web performance optimization.
- Excellent problem-solving skills and attention to detail.
- Ability to work independently in a remote environment and communicate effectively with distributed teams.
What We Offer:
- Competitive salary within the range of 7L – 10L per year.
- Fully remote work flexibility.
- Opportunity to work on innovative projects with a talented and supportive team.
- Professional growth and skill development opportunities.
Job Summary
We are seeking an experienced Databricks Developer with strong skills in PySpark, SQL, and Python, and hands-on experience deploying data solutions on AWS (preferred) or Azure. The role involves designing, developing, and optimizing scalable data pipelines and analytics workflows on the Databricks platform.
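Below is a minimal PySpark sketch of the kind of pipeline described above, assuming a runtime where Delta Lake is available (on Databricks, the `spark` session is already provided); paths and column names are hypothetical.

```python
# Minimal PySpark + Delta Lake sketch (illustrative only).
# Paths and column names are hypothetical; assumes Delta Lake is configured.
from pyspark.sql import SparkSession, functions as F

# On Databricks, `spark` already exists; elsewhere, build a session explicitly.
spark = SparkSession.builder.getOrCreate()

raw = spark.read.json("/mnt/raw/orders/")  # ingest semi-structured source data

cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

# Write as a partitioned Delta table; Delta adds ACID guarantees and time travel.
(cleaned.write.format("delta")
        .mode("overwrite")
        .partitionBy("order_date")
        .save("/mnt/curated/orders_delta"))
```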
Key Responsibilities
- Develop and optimize ETL/ELT pipelines using Databricks and PySpark.
- Build scalable data workflows on AWS (EC2, S3, Glue, Lambda, IAM) or Azure (ADF, ADLS, Synapse).
- Implement and manage Delta Lake (ACID, schema evolution, time travel).
- Write efficient, complex SQL for transformation and analytics.
- Build and support batch and streaming ingestion (Kafka, Kinesis, EventHub).
- Optimize Databricks clusters, jobs, notebooks, and PySpark performance.
- Collaborate with cross-functional teams to deliver reliable data solutions.
- Ensure data governance, security, and compliance.
- Troubleshoot pipelines and support CI/CD deployments.
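To illustrate the kind of pipeline work described above, here is a minimal sketch of a Delta Lake upsert in PySpark; the path, table, and column names are hypothetical and the real approach will depend on the project.

```python
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.appName("orders_upsert").getOrCreate()

# Read the day's raw orders from the lake (path and schema are hypothetical)
updates = (
    spark.read.format("json")
    .load("s3://raw-bucket/orders/2025-01-01/")
    .withColumn("ingested_at", F.current_timestamp())
)

# Merge into the curated Delta table, relying on Delta's ACID guarantees
target = DeltaTable.forName(spark, "curated.orders")
(
    target.alias("t")
    .merge(updates.alias("s"), "t.order_id = s.order_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```

On Databricks a merge like this would typically run as a scheduled job or workflow task, with Delta providing schema evolution and time travel on the target table.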
Required Skills & Experience
- 4–8 years in Data Engineering / Big Data development.
- Strong hands-on experience with Databricks (clusters, jobs, workflows).
- Advanced PySpark and strong Python skills.
- Expert-level SQL (complex queries, window functions).
- Practical experience with AWS (preferred) or Azure cloud services.
- Experience with Delta Lake, Parquet, and data lake architectures.
- Familiarity with CI/CD tools (GitHub Actions, Azure DevOps, Jenkins).
- Good understanding of data modeling, optimization, and distributed systems.
JOB TITLE: Associate Full Stack Developer (SDE-2)
LOCATION: Remote/Hybrid.
A LITTLE BIT ABOUT THE ROLE:
As a Full Stack Developer, you will be responsible for developing digital systems that deliver optimal end-to-end solutions to our business needs. The work will cover all aspects of software delivery, including working with staff, vendors, and outsourced contributors to build, release and maintain the product.
Fountane operates a scrum-based Agile delivery cycle, and you will be working within this. You will work with product owners, user experience, test, infrastructure, and operations professionals to build the most effective solutions.
WHAT YOU WILL BE DOING:
- Full-stack development on a multinational team on various products across different technologies and industries.
- Optimize the development process and identify continuing improvements.
- Monitor technology landscape, assess and introduce new technology. Own and communicate development processes and standards.
- The job title does not define or limit your duties, and you may be required to carry out other work within your abilities from time to time at our request. We reserve the right to introduce changes in line with technological developments which may impact your job duties or methods of working.
WHAT YOU WILL NEED TO BE GREAT IN THIS ROLE:
- Minimum of 2+ years of full-stack development experience, combining back-end and front-end work to build fast, reliable web and/or mobile applications.
- Experience with web frameworks (e.g., React, Angular, or Vue) and/or mobile development (e.g., native iOS/Android, NativeScript, React Native).
- Proficient in at least one JavaScript framework or runtime such as React, Node.js, Angular (2.x), or jQuery.
- Ability to optimize product development by leveraging software development processes.
- Bachelor's degree, or equivalent work experience (minimum six years). Candidates with an Associate's degree must have a minimum of 4 years of work experience.
- Fountane's current technology stack driving our digital products includes React.js, Node.js, React Native, Angular, Firebase, Bootstrap, MongoDB, Express, Hasura, GraphQL, Amazon Web Services (AWS), and Google Cloud Platform.
SOFT SKILLS:
- Collaboration - Ability to work in teams across the world
- Adaptability - Situations can be unexpected, and you need to adapt quickly
- Open-mindedness - Expect to see things outside the ordinary
LIFE AT FOUNTANE:
- Fountane offers an environment where all members are supported, challenged, recognized & given opportunities to grow to their fullest potential.
- Competitive pay
- Health insurance
- Individual/team bonuses
- Employee stock ownership plan
- Fun/challenging variety of projects/industries
- Flexible workplace policy - remote/physical
- Flat organization - no micromanagement
- Individual contribution - set your deadlines
- Above all, a culture that helps you grow exponentially.
Qualifications - No bachelor's degree required. Good communication skills are a must!
A LITTLE BIT ABOUT THE COMPANY:
Established in 2017, Fountane Inc is a Ventures Lab incubating and investing in new competitive technology businesses from scratch. Thus far, we’ve created half a dozen multi-million valuation companies in the US and a handful of sister ventures for large corporations, including Target, US Ventures, and Imprint Engine.
We’re a team of 80 strong from around the world who are radically open-minded, believe in excellence, and respect one another.
Summary:
We are seeking a highly skilled Python Backend Developer with proven expertise in FastAPI to join our team as a full-time contractor for 12 months. The ideal candidate will have 5+ years of experience in backend development, a strong understanding of API design, and the ability to deliver scalable, secure solutions. Knowledge of front-end technologies is an added advantage. Immediate joiners are preferred. This role requires full-time commitment—please apply only if you are not engaged in other projects.
Job Type:
Full-Time Contractor (12 months)
Location:
Remote / On-site (Jaipur preferred, as per project needs)
Experience:
5+ years in backend development
Key Responsibilities:
- Design, develop, and maintain robust backend services using Python and FastAPI.
- Implement and manage Prisma ORM for database operations.
- Build scalable APIs and integrate with SQL databases and third-party services.
- Deploy and manage backend services using Azure Function Apps and Microsoft Azure Cloud.
- Collaborate with front-end developers and other team members to deliver high-quality web applications.
- Ensure application performance, security, and reliability.
- Participate in code reviews, testing, and deployment processes.
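As a minimal sketch of the FastAPI work outlined above (the endpoint and model names are hypothetical, and a real service would sit on a SQL database via Prisma rather than an in-memory dict):

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="orders-service")  # service name is hypothetical

class Order(BaseModel):
    order_id: int
    amount: float
    currency: str = "INR"

# In-memory store stands in for the SQL / Prisma layer in this sketch
_ORDERS: dict[int, Order] = {}

@app.post("/orders", status_code=201)
def create_order(order: Order) -> Order:
    if order.order_id in _ORDERS:
        raise HTTPException(status_code=409, detail="order already exists")
    _ORDERS[order.order_id] = order
    return order

@app.get("/orders/{order_id}")
def get_order(order_id: int) -> Order:
    order = _ORDERS.get(order_id)
    if order is None:
        raise HTTPException(status_code=404, detail="order not found")
    return order
```

Locally this runs with `uvicorn main:app --reload`; in this role a comparable service could be deployed behind Azure Function Apps or another Azure-hosted runtime.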
Required Skills:
- Expertise in Python backend development with strong experience in FastAPI.
- Solid understanding of RESTful API design and implementation.
- Proficiency in SQL databases and ORM tools (preferably Prisma).
- Hands-on experience with Microsoft Azure Cloud and Azure Function Apps.
- Familiarity with CI/CD pipelines and containerization (Docker).
- Knowledge of cloud architecture best practices.
Added Advantage:
- Front-end development knowledge (React, Angular, or similar frameworks).
- Exposure to AWS/GCP cloud platforms.
- Experience with NoSQL databases.
Eligibility:
- Minimum 5 years of professional experience in backend development.
- Available for full-time engagement.
- Please refrain from applying if you are currently engaged in other projects; we require dedicated availability.
Role: Senior Data Engineer (Azure)
Experience: 5+ Years
Location: Anywhere in India
Work Mode: Remote
Notice Period: Immediate joiners or candidates serving their notice period
Key Responsibilities:
- Data processing on Azure using ADF, Streaming Analytics, Event Hubs, Azure Databricks, Data Migration Services, and Data Pipelines
- Provisioning, configuring, and developing Azure solutions (ADB, ADF, ADW, etc.)
- Designing and implementing scalable data models and migration strategies
- Working on distributed big data batch or streaming pipelines (Kafka or similar)
- Developing data integration & transformation solutions for structured and unstructured data
- Collaborating with cross-functional teams for performance tuning and optimization
- Monitoring data workflows and ensuring compliance with governance and quality standards
- Driving continuous improvement through automation and DevOps practices
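As an illustrative sketch of the streaming ingestion described above (the broker, topic, schema, and table names are hypothetical; Event Hubs exposes a Kafka-compatible endpoint, so the same pattern applies):

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("device_readings_stream").getOrCreate()

schema = (
    StructType()
    .add("device_id", StringType())
    .add("reading", DoubleType())
    .add("event_time", TimestampType())
)

# Read the raw stream (Kafka here; an Event Hub's Kafka endpoint works the same way)
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "device-readings")
    .load()
)

# Parse the JSON payload and land it in a bronze Delta table with checkpointing
parsed = (
    raw.select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    parsed.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/device-readings")
    .outputMode("append")
    .toTable("bronze.device_readings")
)
query.awaitTermination()
```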
Mandatory Skills & Experience:
- 5–10 years of experience as a Data Engineer
- Strong proficiency in Azure Databricks, PySpark, Python, SQL, and Azure Data Factory
- Experience in Data Modelling, Data Migration, and Data Warehousing
- Good understanding of database structure principles and schema design
- Hands-on experience with MS SQL Server, Oracle, or similar RDBMS platforms
- Experience with DevOps tools (Azure DevOps, Jenkins, Airflow, Azure Monitor) — good to have
- Knowledge of distributed data processing and real-time streaming (Kafka/Event Hub)
- Familiarity with visualization tools like Power BI or Tableau
- Strong analytical, problem-solving, and debugging skills
- Self-motivated, detail-oriented, and capable of managing priorities effectively
Position Overview: The Lead Software Architect - Python & Data Engineering is a senior technical leadership role responsible for designing and owning end-to-end architecture for data-intensive, AI/ML, and analytics platforms, while mentoring developers and ensuring technical excellence across the organization.
Key Responsibilities:
- Design end-to-end software architecture for data-intensive applications, AI/ML pipelines, and analytics platforms
- Evaluate trade-offs between competing technical approaches
- Define data models, API approach, and integration patterns across systems
- Create technical specifications and architecture documentation
- Lead by example through production-grade Python code and mentor developers on engineering fundamentals
- Conduct design and code reviews focused on architectural soundness
- Establish engineering standards, coding practices, and design patterns for the team
- Translate business requirements into technical architecture
- Collaborate with data scientists, analysts, and other teams to design integrated solutions
- Whiteboard and defend system design and architectural choices
- Take responsibility for system performance, reliability, and maintainability
- Identify and resolve architectural bottlenecks proactively
Required Skills:
- 8+ years of experience in software architecture and development
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
- Strong foundations in data structures, algorithms, and computational complexity
- Experience in system design for scale, including caching strategies, load balancing, and asynchronous processing
- 6+ years of Python development experience
- Deep knowledge of Django, Flask, or FastAPI
- Expert understanding of Python internals including GIL and memory management
- Experience with RESTful API design and event-driven architectures (Kafka, RabbitMQ)
- Proficiency in data processing frameworks such as Pandas, Apache Spark, and Airflow
- Strong SQL optimization and database design experience (PostgreSQL, MySQL, MongoDB)
- Experience with AWS, GCP, or Azure cloud platforms
- Knowledge of containerization (Docker) and orchestration (Kubernetes)
- Hands-on experience designing CI/CD pipelines
Preferred (Bonus) Skills:
- Experience deploying ML models to production (MLOps, model serving, monitoring)
- Understanding of ML system design including feature stores and model versioning
- Familiarity with ML frameworks such as scikit-learn, TensorFlow, and PyTorch
- Open-source contributions or technical blogging demonstrating architectural depth
- Experience with modern front-end frameworks for full-stack perspective
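To ground the data-processing and orchestration skills listed above, a minimal Airflow DAG chaining two Pandas steps might look like the following sketch (the DAG id, file paths, and schedule are hypothetical):

```python
from datetime import datetime, timedelta

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Pull raw events from a hypothetical landing area
    df = pd.read_parquet("/data/raw/events.parquet")
    df.to_parquet("/data/staging/events.parquet")

def transform():
    # Aggregate per user and write the curated output
    df = pd.read_parquet("/data/staging/events.parquet")
    daily = df.groupby("user_id", as_index=False)["amount"].sum()
    daily.to_parquet("/data/curated/daily_amounts.parquet")

with DAG(
    dag_id="daily_events_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task
```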
Experience: 3+ years (Backend/Full-Stack)
Note: You will be the 3rd engineer on the team. If you are comfortable with Java and Spring Boot plus cloud, then you will easily be able to pick up the following stack.
Key Requirements —
- Primary Stack: Experience with .NET
- Cloud: Solid understanding of cloud platforms (preferably Azure)
- Frontend/DevOps: Familiarity with React and DevOps practices
- Architecture: Strong grasp of microservices
- Technical Skills: Basic proficiency in scripting, databases, and Git
Compensation: competitive salary, based on experience and fit
Job Description:
Technical Lead – Full Stack
Experience: 8–12 years (strong candidates split roughly 50% Java / 50% React)
Location – Bangalore/Hyderabad
Interview Levels – 3 Rounds
Tech Stack: Java, Spring Boot, Microservices, React, SQL
Focus: Hands-on coding, solution design, team leadership, delivery ownership
Must-Have Skills (Depth)
Java (8+): Streams, concurrency, collections, JVM internals (GC), exception handling.
Spring Boot: Security, Actuator, Data/JPA, Feign/RestTemplate, validation, profiles, configuration management.
Microservices: API design, service discovery, resilience patterns (Hystrix/Resilience4j), messaging (Kafka/RabbitMQ) optional.
React: Hooks, component lifecycle, state management, error boundaries, testing (Jest/RTL).
SQL: Joins, aggregations, indexing, query optimization, transaction isolation, schema design.
Testing: JUnit/Mockito for backend; Jest/RTL/Cypress for frontend.
DevOps: Git, CI/CD, containers (Docker), familiarity with deployment environments.
We are seeking a highly skilled Power Platform Developer with deep expertise in designing, developing, and deploying solutions using Microsoft Power Platform. The ideal candidate will have strong knowledge of Power Apps, Power Automate, Power BI, Power Pages, and Dataverse, along with integration capabilities across Microsoft 365, Azure, and third-party systems.
Key Responsibilities
- Solution Development:
- Design and build custom applications using Power Apps (Canvas & Model-Driven).
- Develop automated workflows using Power Automate for business process optimization.
- Create interactive dashboards and reports using Power BI for data visualization and analytics.
- Configure and manage Dataverse for secure data storage and modelling.
- Develop and maintain Power Pages for external-facing portals.
- Integration & Customization:
- Integrate Power Platform solutions with Microsoft 365, Dynamics 365, Azure services, and external APIs.
- Implement custom connectors and leverage Power Platform SDK for advanced scenarios.
- Utilize Azure Functions, Logic Apps, and REST APIs for extended functionality.
- Governance & Security:
- Apply best practices for environment management, ALM (Application Lifecycle Management), and solution deployment.
- Ensure compliance with security, data governance, and licensing guidelines.
- Implement role-based access control and manage user permissions.
- Performance & Optimization:
- Monitor and optimize app performance, workflow efficiency, and data refresh strategies.
- Troubleshoot and resolve technical issues promptly.
- Collaboration & Documentation:
- Work closely with business stakeholders to gather requirements and translate them into technical solutions.
- Document architecture, workflows, and processes for maintainability.
Required Skills & Qualifications
- Technical Expertise:
- Strong proficiency in Power Apps (Canvas & Model-Driven), Power Automate, Power BI, Power Pages, and Dataverse.
- Experience with Microsoft 365, Dynamics 365, and Azure services.
- Knowledge of JavaScript, TypeScript, C#, .NET, and Power Fx for custom development.
- Familiarity with SQL, DAX, and data modeling.
- Additional Skills:
- Understanding of ALM practices, solution packaging, and deployment pipelines.
- Experience with Git, Azure DevOps, or similar tools for version control and CI/CD.
- Strong problem-solving and analytical skills.
- Certifications (Preferred):
- Microsoft Certified: Power Platform Developer Associate.
- Microsoft Certified: Power Platform Solution Architect Expert.
Soft Skills
- Excellent communication and collaboration skills.
- Ability to work in agile environments and manage multiple priorities.
- Strong documentation and presentation abilities.
We are looking for an enthusiastic and dynamic individual to join Upland India as a Senior Software Engineer I (Backend) for our Panviva product. The individual will work with our global development team.
What would you do?
- Develop, review, test, and maintain application code
- Collaborate with other developers and the product team to fulfil objectives
- Troubleshoot and diagnose issues
- Take the lead on tasks as needed
- Jump in and help the team deliver features when required
What are we looking for?
Experience
- 5+ years of experience in designing and implementing application architecture
- Back-end developer who enjoys solving problems
- Demonstrated experience with the .NET ecosystem (.NET Framework, ASP.NET, .NET Core) & SQL server
- Experience in building cloud-native applications (Azure)
- Must be skilled at writing quality, scalable, maintainable, testable code
Leadership Skills
- Strong communication skills
- Ability to mentor/lead junior developers
Primary Skills: The candidate must possess the following primary skills:
- Strong Back-end developer who enjoys solving problems
- Solid experience with .NET Core, SQL Server, and .NET design patterns: a strong understanding of OOP principles, .NET-specific implementations (DI, CQRS, Repository, and similar patterns), SOLID architecture principles, unit testing tools, and debugging techniques
- Applying patterns to improve scalability and reduce technical debt
- Experience with refactoring legacy codebases using design patterns
- Real-World Problem Solving
- Ability to analyze a problem and choose the most suitable design pattern
- Experience balancing performance, readability, and maintainability
- Experience building modern, scalable, reliable applications on the MS Azure cloud including services such as:
- App Services
- Azure Service Bus/ Event Hubs
- Azure API Management Service
- Azure Bot Service
- Function/Logic Apps
- Azure Key Vault & Azure App Configuration
- Cosmos DB, MongoDB
- Azure Search
- Azure Cognitive Services
Understanding Agile Methodology and Tool Familiarity
- Solid understanding of Agile development processes, including sprint planning, daily stand-ups, retrospectives, and backlog grooming
- Familiarity with Agile tools such as JIRA for tracking tasks, managing workflows, and collaborating across teams
- Experience working in cross-functional Agile teams and contributing to iterative development cycles
Secondary Skills: It would be advantageous if the candidate also has the following secondary skills:
- Experience with front-end technologies: React, jQuery, JavaScript, HTML, and CSS frameworks
- APM tools - hands-on experience with tools such as Grafana, New Relic, CloudWatch, etc.
- Basic Understanding of AI models
- Python
About Upland
Upland Software (Nasdaq: UPLD) helps global businesses accelerate digital transformation with a powerful cloud software library that provides choice, flexibility, and value. Upland India is a fully owned subsidiary of Upland Software and headquartered in Bangalore. We are a remote-first company. Interviews and on-boarding are conducted virtually.
About Ven Analytics
At Ven Analytics, we don’t just crunch numbers — we decode them to uncover insights that drive real business impact. We’re a data-driven analytics company that partners with high-growth startups and enterprises to build powerful data products, business intelligence systems, and scalable reporting solutions. With a focus on innovation, collaboration, and continuous learning, we empower our teams to solve real-world business problems using the power of data.
Role Overview
We’re looking for a Power BI Data Analyst who is not just proficient in tools but passionate about building insightful, scalable, and high-performing dashboards. The ideal candidate should have strong fundamentals in data modeling, a flair for storytelling through data, and the technical skills to implement robust data solutions using Power BI, Python, and SQL.
Key Responsibilities
- Technical Expertise: Develop scalable, accurate, and maintainable data models using Power BI, with a clear understanding of Data Modeling, DAX, Power Query, and visualization principles.
- Programming Proficiency: Use SQL and Python for complex data manipulation, automation, and analysis.
- Business Problem Translation: Collaborate with stakeholders to convert business problems into structured data-centric solutions considering performance, scalability, and commercial goals.
- Hypothesis Development: Break down complex use-cases into testable hypotheses and define relevant datasets required for evaluation.
- Solution Design: Create wireframes, proof-of-concepts (POC), and final dashboards in line with business requirements.
- Dashboard Quality: Ensure dashboards meet high standards of data accuracy, visual clarity, performance, and support SLAs.
- Performance Optimization: Continuously enhance user experience by improving performance, maintainability, and scalability of Power BI solutions.
- Troubleshooting & Support: Quick resolution of access, latency, and data issues as per defined SLAs.
- Power BI Development: Use Power BI Desktop for report building and the Power BI Service for distribution.
- Backend Development: Develop optimized SQL queries that are easy to consume, maintain, and debug.
- Version Control: Maintain strict control of versions by tracking CRs and bug fixes, and ensure Prod and Dev dashboards are kept properly maintained.
- Client Servicing: Engage with clients to understand their data needs, gather requirements, present insights, and ensure timely, clear communication throughout project cycles.
- Team Management: Lead and mentor a small team by assigning tasks, reviewing work quality, guiding technical problem-solving, and ensuring timely delivery of dashboards and reports.
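To illustrate the Python side of this work, a small preparation step that shapes raw exports into a tidy table before it is modelled in Power BI might look like this sketch (file and column names are hypothetical):

```python
import pandas as pd

# Load raw exports (files and columns are hypothetical)
transactions = pd.read_csv("transactions.csv", parse_dates=["txn_date"])
customers = pd.read_csv("customers.csv")

# Clean and enrich before the data reaches the Power BI model
fact = (
    transactions
    .dropna(subset=["customer_id", "amount"])
    .merge(customers[["customer_id", "segment"]], on="customer_id", how="left")
    .assign(month=lambda d: d["txn_date"].dt.to_period("M").dt.to_timestamp())
)

# Pre-aggregate a monthly summary so the report does not scan raw rows
monthly = (
    fact.groupby(["month", "segment"], as_index=False)
    .agg(revenue=("amount", "sum"), orders=("amount", "count"))
)
monthly.to_csv("monthly_revenue_by_segment.csv", index=False)
```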
Must-Have Skills
- Strong experience building robust data models in Power BI
- Hands-on expertise with DAX (complex measures and calculated columns)
- Proficiency in M Language (Power Query) beyond drag-and-drop UI
- Clear understanding of data visualization best practices (less fluff, more insight)
- Solid grasp of SQL and Python for data processing
- Strong analytical thinking and ability to craft compelling data stories
- Client Servicing Background.
Good-to-Have (Bonus Points)
- Experience using DAX Studio and Tabular Editor
- Prior work in a high-volume data processing production environment
- Exposure to modern CI/CD practices or version control with BI tools
Why Join Ven Analytics?
- Be part of a fast-growing startup that puts data at the heart of every decision.
- Opportunity to work on high-impact, real-world business challenges.
- Collaborative, transparent, and learning-oriented work environment.
- Flexible work culture and focus on career development.
Candidates must know the M365 collaboration environment: SharePoint Online, MS Teams, Exchange Online, Entra, and Purview. We need a developer with a strong understanding of data structures and problem-solving, plus SQL, PowerShell, MS Teams app development, Python, Visual Basic, C#, JavaScript, Java, HTML, PHP, and C.
A strong understanding of the development lifecycle is needed, along with debugging skills, time management, and business acumen; a positive attitude and openness to continual growth are a must.
The ability to code appropriate solutions will be tested in the interview.
- Knowledge of a wide variety of Generative AI models
- Conceptual understanding of how large language models work
- Proficiency in coding languages for data manipulation (e.g., SQL) and machine learning & AI development (e.g., Python)
- Experience with dashboarding tools such as Power BI and Tableau (beneficial but not essential)
We are looking for highly experienced Senior Java Developers who can architect, design, and deliver high-performance enterprise applications using Spring Boot and Microservices. The role requires a strong understanding of distributed systems, scalability, and data consistency.
About Forbes Advisor
Forbes Digital Marketing Inc. is a high-growth digital media and technology company dedicated to helping consumers make confident, informed decisions about their money, health, careers, and everyday life.
We do this by combining data-driven content, rigorous product comparisons, and user-first design — all built on top of a modern, scalable platform. Our global teams bring deep expertise across journalism, product, performance marketing, data, and analytics.
The Role
We’re hiring a Data Scientist to help us unlock growth through advanced analytics and machine learning. This role sits at the intersection of marketing performance, product optimization, and decision science.
You’ll partner closely with Paid Media, Product, and Engineering to build models, generate insight, and influence how we acquire, retain, and monetize users. From campaign ROI to user segmentation and funnel optimization, your work will directly shape how we grow. This role is ideal for someone who thrives on business impact, communicates clearly, and wants to build reusable, production-ready insights — not just run one-off analyses.
What You’ll Do
Marketing & Revenue Modelling
• Own end-to-end modelling of LTV, user segmentation, retention, and marketing efficiency to inform media optimization and value attribution.
• Collaborate with Paid Media and RevOps to optimize SEM performance, predict high-value cohorts, and power strategic bidding and targeting.
Product & Growth Analytics
• Work closely with Product Insights and General Managers (GMs) to define core metrics, KPIs, and success frameworks for new launches and features.
• Conduct deep-dive analysis of user behaviour, funnel performance, and product engagement to uncover actionable insights.
• Monitor and explain changes in key product metrics, identifying root causes and business impact.
• Work closely with Data Engineering to design and maintain scalable data pipelines that support machine learning workflows, model retraining, and real-time inference.
Predictive Modelling & Machine Learning
• Build predictive models for conversion, churn, revenue, and engagement using regression, classification, or time-series approaches.
• Identify opportunities for prescriptive analytics and automation in key product and marketing workflows.
• Support development of reusable ML pipelines for production-scale use cases in product recommendation, lead scoring, and SEM planning.
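As an indicative sketch of the modelling work above (the feature names and data source are hypothetical), a baseline churn classifier might start as simply as:

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Load a user-level training table (columns are hypothetical)
df = pd.read_csv("user_features.csv")
features = ["sessions_30d", "days_since_last_visit", "pages_per_session", "ltv_to_date"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["churned"], test_size=0.2, random_state=42, stratify=df["churned"]
)

# A simple, explainable baseline before trying gradient boosting or time-series models
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000, class_weight="balanced")),
])
model.fit(X_train, y_train)

auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"holdout ROC-AUC: {auc:.3f}")
```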
Collaboration & Communication
• Present insights and recommendations to a variety of stakeholders — from ICs to executives — in a clear and compelling manner.
• Translate business needs into data problems, and complex findings into strategic action plans.
• Work cross-functionally with Engineering, Product, BI, and Marketing to deliver and deploy your work.
What You’ll Bring
Minimum Qualifications
• Bachelor’s degree in a quantitative field (Mathematics, Statistics, CS, Engineering, etc.).
• 4+ years in data science, growth analytics, or decision science roles.
• Strong SQL and Python skills (Pandas, Scikit-learn, NumPy).
• Hands-on experience with Tableau, Looker, or similar BI tools.
• Familiarity with LTV modelling, retention curves, cohort analysis, and media attribution.
• Experience with GA4, Google Ads, Meta, or other performance marketing platforms.
• Clear communication skills and a track record of turning data into decisions.
Nice to Have
• Experience with BigQuery and Google Cloud Platform (or equivalent).
• Familiarity with affiliate or lead-gen business models.
• Exposure to NLP, LLMs, embeddings, or agent-based analytics.
• Ability to contribute to model deployment workflows (e.g., using Vertex AI, Airflow, or Composer).
Why Join Us?
• Remote-first and flexible — work from anywhere in India with global exposure.
• Monthly long weekends (every third Friday off).
• Generous wellness stipends and parental leave.
• A collaborative team where your voice is heard and your work drives real impact.
• Opportunity to help shape the future of data science at one of the world’s most trusted brands.
Data Engineer – Validation & Quality
Responsibilities
- Build rule-based and statistical validation frameworks using Pandas / NumPy.
- Implement contradiction detection, reconciliation, and anomaly flagging.
- Design and compute confidence metrics for each evidence record.
- Automate schema compliance, sampling, and checksum verification across data sources.
- Collaborate with the Kernel to embed validation results into every output artifact.
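As an indicative sketch of such a framework (column names and thresholds are hypothetical), a Pandas-based validation pass might look like:

```python
import pandas as pd

def validate_evidence(df: pd.DataFrame) -> pd.DataFrame:
    """Attach rule-based and statistical quality flags plus a confidence score.

    Column names (record_id, amount, source_total) are hypothetical.
    """
    out = df.copy()

    # Rule-based checks: required fields present, amounts non-negative
    out["missing_fields"] = out[["record_id", "amount"]].isna().any(axis=1)
    out["negative_amount"] = out["amount"] < 0

    # Reconciliation: flag records whose amount contradicts the source total
    out["reconciliation_gap"] = (out["amount"] - out["source_total"]).abs() > 0.01

    # Statistical anomaly flag: robust z-score on amount
    median = out["amount"].median()
    mad = (out["amount"] - median).abs().median() or 1.0
    out["anomaly"] = ((out["amount"] - median).abs() / (1.4826 * mad)) > 3.5

    # Confidence metric: fraction of checks passed for each record
    checks = ["missing_fields", "negative_amount", "reconciliation_gap", "anomaly"]
    out["confidence"] = 1.0 - out[checks].mean(axis=1)
    return out
```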
Requirements
- 5+ years in data engineering, data quality, or MLOps validation.
- Strong SQL optimization and ETL background.
- Familiarity with data lineage, DQ frameworks, and regulatory standards (SOC 2 / GDPR).
Bidgely is seeking an outstanding and deeply technical Principal Engineer / Sr. Principal Engineer / Architect to lead the architecture and evolution of our next-generation data and platform infrastructure. This is a senior IC role for someone who loves solving complex problems at scale, thrives in high-ownership environments, and influences engineering direction across teams.
You will be instrumental in designing scalable and resilient platform components that can handle trillions of data points, integrate machine learning pipelines, and support advanced energy analytics. As we evolve our systems for the future of clean energy, you will play a critical role in shaping the platform that powers all Bidgely products.
Responsibilities
- Architect & Design: Lead the end-to-end architecture of core platform components – from ingestion pipelines to ML orchestration and serving layers. Architect for scale (200Bn+ daily data points), performance, and flexibility.
- Technical Leadership: Act as a thought leader and trusted advisor for engineering teams. Review designs, guide critical decisions, and set high standards for software engineering excellence.
- Platform Evolution: Define and evolve the platform’s vision, making key choices in data processing, storage, orchestration, and cloud-native patterns.
- Mentorship: Coach senior engineers and staff on architecture, engineering best practices, and system thinking. Foster a culture of engineering excellence and continuous improvement.
- Innovation & Research: Evaluate and experiment with emerging technologies (e.g., event-driven architectures, AI infrastructure, new cloud-native tools) to stay ahead of the curve.
- Cross-functional Collaboration: Partner with Engineering Managers, Product Managers, and Data Scientists to align platform capabilities with product needs.
- Non-functional Leadership: Ensure systems are secure, observable, resilient, performant, and cost-efficient. Drive excellence in areas like compliance, DevSecOps, and cloud cost optimization.
- GenAI Integration: Explore and drive adoption of Generative AI to enhance developer productivity, platform intelligence, and automation of repetitive engineering tasks.
Requirements:
- 8+ years of experience in backend/platform architecture roles, ideally with experience at scale.
- Deep expertise in distributed systems, data engineering stacks (Kafka, Spark, HDFS, NoSQL DBs like Cassandra/ElasticSearch), and cloud-native infrastructure (AWS, GCP, or Azure).
- Proven ability to architect high-throughput, low-latency systems with batch + real-time processing.
- Experience designing and implementing DAG-based data processing and orchestration systems.
- Proficient in Java (Spring Boot, REST), and comfortable with infrastructure-as-code and CI/CD practices.
- Strong understanding of non-functional areas: security, scalability, observability, and compliance.
- Exceptional problem-solving skills and a data-driven approach to decision-making.
- Excellent communication and collaboration skills with the ability to influence at all levels.
- Prior experience working in a SaaS environment is a strong plus.
- Experience with GenAI tools or frameworks (e.g., LLMs, embedding models, prompt engineering, RAG, Copilot-like integrations) to accelerate engineering workflows or enhance platform intelligence is highly desirable.
Role Overview
We are seeking a ServiceNow Product Owner with deep expertise in ServiceNow modules (CSM, ITSM, HRSD) and strong scripting and data-handling skills.
This role focuses on translating real enterprise workflows into structured, data-driven AI training tasks, helping improve reasoning and understanding within AI systems. It is not a platform configuration or app development role — instead, it blends functional ServiceNow knowledge, prompt engineering, and data design to build the next generation of intelligent enterprise models.
Key Responsibilities
· Define decision frameworks and realistic scenarios for AI reinforcement learning based on ServiceNow workflows.
· Design scenario-driven tasks mirroring ServiceNow processes like case handling, SLA tracking, and IT incident management.
· Develop and validate structured data tasks in JSON, ensuring accuracy and clarity.
· Write natural language instructions aligned with ServiceNow’s business logic and workflows.
· Use SQL queries for validation and quality checks of task data.
· Apply prompt engineering techniques to guide model reasoning.
· Collaborate with peers to expand and document cross-domain scenarios (CSM, ITSM, HRSD).
· Create and maintain documentation of scenario patterns and best practices.
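For illustration, a structured incident-handling task of the kind described above might be authored and sanity-checked like this (the field names and schema are hypothetical and not a ServiceNow API):

```python
import json

# A hypothetical training task mirroring an ITSM incident workflow
task = {
    "scenario_id": "itsm-incident-0001",
    "module": "ITSM",
    "instruction": (
        "A P2 incident on the payroll service has breached 75% of its SLA. "
        "Decide whether to escalate, and justify the decision."
    ),
    "context": {
        "priority": "P2",
        "sla_elapsed_pct": 78,
        "assignment_group": "HR-Apps",
        "reassignment_count": 2,
    },
    "expected_output": {
        "action": "escalate",
        "reason": "SLA breach risk with repeated reassignments",
    },
}

REQUIRED_KEYS = {"scenario_id", "module", "instruction", "context", "expected_output"}

def validate_task(t: dict) -> None:
    # Basic structural checks before the task enters the training set
    missing = REQUIRED_KEYS - t.keys()
    if missing:
        raise ValueError(f"task missing keys: {missing}")
    if t["module"] not in {"ITSM", "CSM", "HRSD"}:
        raise ValueError("module must be one of ITSM, CSM, HRSD")

validate_task(task)
print(json.dumps(task, indent=2))
```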
Required Experience
· 4–6 years of experience with ServiceNow (CSM, ITSM, HRSD).
· Deep understanding of cases, incidents, requests, SLAs, and knowledge management processes.
· Proven ability to design realistic enterprise scenarios mapping to ServiceNow operations.
· Exposure to AI model training workflows or structured data design is a plus.
Preferred Qualifications
· ServiceNow Certified System Administrator (CSA)
· ServiceNow Certified Implementation Specialist (CIS-ITSM / CSM / HRSD)
· Exposure to AI/ML workflows or model training data preparation.
· Excellent written and verbal communication skills, with client-facing experience.
Mandatory Skills: Scripting (JavaScript, Glide Script), JSON Handling, SQL, ServiceNow Modules (ITSM, CSM, HRSD), and Prompt Engineering.
Role Overview
We are seeking a Junior Developer with 1–3 years’ experience and strong foundations in Python, databases, and AI technologies. The ideal candidate will support the development of AI-powered solutions, focusing on LLM integration, prompt engineering, and database-driven workflows. This is a hands-on role with opportunities to learn and grow into advanced AI engineering responsibilities.
Key Responsibilities
- Develop, test, and maintain Python-based applications and APIs.
- Design and optimize prompts for Large Language Models (LLMs) to improve accuracy and performance.
- Work with JSON-based data structures for request/response handling.
- Integrate and manage PostgreSQL (pgSQL) databases, including writing queries and handling data pipelines.
- Collaborate with the product and AI teams to implement new features.
- Debug, troubleshoot, and optimize performance of applications and workflows.
- Stay updated on advancements in LLMs, AI frameworks, and generative AI tools.
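As a small, hedged sketch of the prompt-and-parse loop described above (the fields, prompt wording, and the `call_llm` helper are hypothetical; the actual LLM client is abstracted away):

```python
import json
from typing import Any

PROMPT_TEMPLATE = """You are a permitting assistant.
Extract the fields below from the applicant's message and reply with JSON only.
Fields: permit_type (string), address (string), urgent (boolean).

Message:
{message}
"""

def build_prompt(message: str) -> str:
    # Keep the instruction and the expected JSON shape explicit in the prompt
    return PROMPT_TEMPLATE.format(message=message)

def parse_response(raw: str) -> dict[str, Any]:
    """Validate the model's JSON reply before it touches the database."""
    data = json.loads(raw)
    required = {"permit_type", "address", "urgent"}
    missing = required - data.keys()
    if missing:
        raise ValueError(f"LLM response missing fields: {missing}")
    if not isinstance(data["urgent"], bool):
        raise ValueError("'urgent' must be a boolean")
    return data

# call_llm is a stand-in for whichever LLM client the team uses:
# raw = call_llm(build_prompt(user_message))
# record = parse_response(raw)
```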
Required Skills & Qualifications
- Strong knowledge of Python (scripting, APIs, data handling).
- Basic understanding of Large Language Models (LLMs) and prompt engineering techniques.
- Experience with JSON data parsing and transformations.
- Familiarity with PostgreSQL or other relational databases.
- Ability to write clean, maintainable, and well-documented code.
- Strong problem-solving skills and eagerness to learn.
- Bachelor’s degree in Computer Science, Engineering, or related field (or equivalent practical experience).
Nice-to-Have (Preferred)
- Exposure to AI/ML frameworks (e.g., LangChain, Hugging Face, OpenAI APIs).
- Experience working in startups or fast-paced environments.
- Familiarity with version control (Git/GitHub) and cloud platforms (AWS, GCP, or Azure).
What We Offer
- Opportunity to work on cutting-edge AI applications in permitting & compliance.
- Collaborative, growth-focused, and innovation-driven work culture.
- Mentorship and learning opportunities in AI/LLM development.
- Competitive compensation with performance-based growth.
We are seeking a highly skilled and experienced Senior Full Stack Developer with 8+ years of experience to join our dynamic team. The ideal candidate will have a strong background in both front-end and back-end development, with expertise in .NET, Angular, TypeScript, Azure, SQL Server, Agile methodologies, and design patterns. Experience with DocuSign is a plus.
Responsibilities:
- Design, develop, and maintain web applications using .NET, Angular, and TypeScript.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Implement and maintain cloud-based solutions using Azure.
- Develop and optimize SQL Server databases.
- Follow Agile methodologies to manage project tasks and deliverables.
- Apply design patterns and best practices to ensure high-quality, maintainable code.
- Troubleshoot and resolve software defects and issues.
- Mentor and guide junior developers.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Full Stack Developer or similar role.
- Strong proficiency in .NET, Angular, and TypeScript.
- Experience with Azure cloud services.
- Proficient in SQL Server and database design.
- Familiarity with Agile methodologies and practices.
- Solid understanding of design patterns and software architecture principles.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork abilities.
- Experience with DocuSign is a plus.
What You’ll Be Doing:
● Own the architecture and roadmap for scalable, secure, and high-quality data pipelines and platforms.
● Lead and mentor a team of data engineers while establishing engineering best practices, coding standards, and governance models.
● Design and implement high-performance ETL/ELT pipelines using modern Big Data technologies for diverse internal and external data sources.
● Drive modernization initiatives including re-architecting legacy systems to support next-generation data products, ML workloads, and analytics use cases.
● Partner with Product, Engineering, and Business teams to translate requirements into robust technical solutions that align with organizational priorities.
● Champion data quality, monitoring, metadata management, and observability across the ecosystem.
● Lead initiatives to improve cost efficiency, data delivery SLAs, automation, and infrastructure scalability.
● Provide technical leadership on data modeling, orchestration, CI/CD for data workflows, and cloud-based architecture improvements.
Qualifications:
● Bachelor's degree in Engineering, Computer Science, or relevant field.
● 8+ years of relevant and recent experience in a Data Engineer role.
● 5+ years recent experience with Apache Spark and solid understanding of the fundamentals.
● Deep understanding of Big Data concepts and distributed systems.
● Demonstrated ability to design, review, and optimize scalable data architectures across ingestion.
● Strong coding skills with Scala and Python and the ability to quickly switch between them with ease.
● Advanced working SQL knowledge and experience working with a variety of relational databases such as Postgres and/or MySQL.
● Cloud experience with Databricks.
● Strong understanding of Delta Lake architecture and working with Parquet, JSON, CSV, and similar formats.
● Experience establishing and enforcing data engineering best practices, including CI/CD for data, orchestration and automation, and metadata management.
● Comfortable working in an Agile environment
● Machine Learning knowledge is a plus.
● Demonstrated ability to operate independently, take ownership of deliverables, and lead technical decisions.
● Excellent written and verbal communication skills in English.
● Experience supporting and working with cross-functional teams in a dynamic environment.
REPORTING: This position will report to Sr. Technical Manager or Director of Engineering as assigned by Management.
EMPLOYMENT TYPE: Full-Time, Permanent
SHIFT TIMINGS: 10:00 AM - 07:00 PM IST
Sr Software Engineer
Company Summary :
As the recognized global standard for project-based businesses, Deltek delivers software and information solutions to help organizations achieve their purpose. Our market leadership stems from the work of our diverse employees who are united by a passion for learning, growing and making a difference.
At Deltek, we take immense pride in creating a balanced, values-driven environment, where every employee feels included and empowered to do their best work. Our employees put our core values into action daily, creating a one-of-a-kind culture that has been recognized globally. Thanks to our incredible team, Deltek has been named one of America's Best Midsize Employers by Forbes, a Best Place to Work by Glassdoor, a Top Workplace by The Washington Post and a Best Place to Work in Asia by World HRD Congress. www.deltek.com
Business Summary :
The Deltek Engineering and Technology team builds best-in-class solutions to delight customers and meet their business needs. We are laser-focused on software design, development, innovation and quality. Our team of experts has the talent, skills and values to deliver products and services that are easy to use, reliable, sustainable and competitive. If you're looking for a safe environment where ideas are welcome, growth is supported and questions are encouraged – consider joining us as we explore the limitless opportunities of the software industry.
Position Responsibilities :
About the Role
We are looking for a skilled and motivated Senior Software Developer to join our team responsible for developing and maintaining a robust ERP solution used by approximately 400 customers and more than 30,000 users worldwide. The system is built using C# (.NET Core), leverages SQL Server for data management, and is hosted in the Microsoft Azure cloud.
This role offers the opportunity to work on a mission-critical product, contribute to architectural decisions, and help shape the future of our cloud-native ERP platform.
Key Responsibilities
- Design, develop, and maintain features and modules within the ERP system using C# (.NET Core)
- Optimize and manage SQL Server database interactions for performance and scalability
- Collaborate with cross-functional teams, including QA, DevOps, Product Management, and Support
- Participate in code reviews, architecture discussions, and technical planning
- Contribute to the adoption and improvement of CI/CD pipelines and cloud deployment practices
- Troubleshoot and resolve complex technical issues across the stack
- Ensure code quality, maintainability, and adherence to best practices
- Stay current with emerging technologies and recommend improvements where applicable
Qualifications
- Curiosity, passion, teamwork, and initiative
- Strong experience with C# and .NET Core in enterprise application development
- Solid understanding of SQL Server, including query optimization and schema design
- Experience with Azure cloud services (App Services, Azure SQL, Storage, etc.)
- Ability to utilize agentic AI as a development support, with a critical thinking attitude
- Familiarity with agile development methodologies and DevOps practices
- Ability to work independently and collaboratively in a fast-paced environment
- Excellent problem-solving and communication skills
- Master's degree in Computer Science or equivalent; 5+ years of relevant work experience
- Experience with ERP systems or other complex business applications is a plus
What We Offer
- A chance to work on a product that directly impacts thousands of users worldwide
- A collaborative and supportive engineering culture
- Opportunities for professional growth and technical leadership
- Competitive salary and benefits package
Job Title: PHP Coordinator / Laravel Developer
Experience: 4+ Years
Work Mode: Work From Home (WFH)
Working Days: 5 Days
Job Description:
We are looking for an experienced PHP Coordinator / Laravel Developer to join our team. The ideal candidate should have strong expertise in PHP and Laravel framework, along with the ability to coordinate and manage development (as Team Lead) tasks effectively.
Key Responsibilities:
- Develop, test, and maintain web applications using PHP and Laravel.
- Coordinate with team members to ensure timely project delivery.
- Write clean, secure, and efficient code.
- Troubleshoot, debug, and optimize existing applications.
- Collaborate with stakeholders to gather and analyze requirements.
Required Skills:
- Strong experience in PHP and Laravel framework.
- Good understanding of MySQL, RESTful APIs, and cloud platforms (AWS/Azure/GCP).
- Familiarity with front-end technologies (HTML, CSS, JavaScript).
- Excellent communication and coordination skills.
- Ability to work independently in a remote environment.
Tech Stack / Requirements:
- Experience required: 1–2 years at least
- Candidates must be from an IT/engineering background: B.E./B.Tech in Information Technology, Computer Science, or related fields, or B.Sc. IT, BCA, or equivalent.
- Strong understanding of JavaScript
- Experience with React Native / Expo
- Familiarity with SQL
- Exposure to REST APIs integration
- Fast learner with strong problem-solving & debugging skills
Responsibilities:
- Build & improve mobile app features using React Native / Expo
- Develop and maintain web features using React.js / Next.js
- Integrate APIs and ensure seamless user experiences across platforms
- Collaborate with backend & design teams for end-to-end development
- Debug & optimize performance across mobile and web
- Write clean, maintainable code and ship to production regularly
- Work closely with the founding team / CTO and contribute to product launches
Growth: Performance-based growth with significant hikes possible in the same or upcoming months.
Job Title : Informatica Cloud Developer / Migration Specialist
Experience : 6 to 10 Years
Location : Remote
Notice Period : Immediate
Job Summary :
We are looking for an experienced Informatica Cloud Developer with strong expertise in Informatica IDMC/IICS and experience in migrating from PowerCenter to Cloud.
The candidate will be responsible for designing, developing, and maintaining ETL workflows, data warehouses, and performing data integration across multiple systems.
Mandatory Skills :
Informatica IICS/IDMC, Informatica PowerCenter, ETL Development, SQL, Data Migration (PowerCenter to IICS), and Performance Tuning.
Key Responsibilities :
- Design, develop, and maintain ETL processes using Informatica IICS/IDMC.
- Work on migration projects from Informatica PowerCenter to IICS Cloud.
- Troubleshoot and resolve issues related to mappings, mapping tasks, and taskflows.
- Analyze business requirements and translate them into technical specifications.
- Conduct unit testing, performance tuning, and ensure data quality.
- Collaborate with cross-functional teams for data integration and reporting needs.
- Prepare and maintain technical documentation.
Required Skills :
- 4 to 5 years of hands-on experience in Informatica Cloud (IICS/IDMC).
- Strong experience with Informatica PowerCenter.
- Proficiency in SQL and data warehouse concepts.
- Good understanding of ETL performance tuning and debugging.
- Excellent communication and problem-solving skills.
Role & responsibilities
- Develop and maintain server-side applications using Go Lang.
- Design and implement scalable, secure, and maintainable RESTful APIs and microservices.
- Collaborate with front-end developers to integrate user-facing elements with server-side logic
- Optimize applications for performance, reliability, and scalability.
- Write clean, efficient, and reusable code that adheres to best practices.
Preferred candidate profile
- Minimum 5 years of working experience in Go Lang development.
- Proven experience in developing RESTful APIs and microservices.
- Familiarity with cloud platforms like AWS, GCP, or Azure.
- Familiarity with CI/CD pipelines and DevOps practices
Position: Senior Data Engineer
Overview:
We are seeking an experienced Senior Data Engineer to design, build, and optimize scalable data pipelines and infrastructure to support cross-functional teams and next-generation data initiatives. The ideal candidate is a hands-on data expert with strong technical proficiency in Big Data technologies and a passion for developing efficient, reliable, and future-ready data systems.
Reporting: Reports to the CEO or designated Lead as assigned by management.
Employment Type: Full-time, Permanent
Location: Remote (Pan India)
Shift Timings: 2:00 PM – 11:00 PM IST
Key Responsibilities:
- Design and develop scalable data pipeline architectures for data extraction, transformation, and loading (ETL) using modern Big Data frameworks.
- Identify and implement process improvements such as automation, optimization, and infrastructure re-design for scalability and performance.
- Collaborate closely with Engineering, Product, Data Science, and Design teams to resolve data-related challenges and meet infrastructure needs.
- Partner with machine learning and analytics experts to enhance system accuracy, functionality, and innovation.
- Maintain and extend robust data workflows and ensure consistent delivery across multiple products and systems.
Required Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or related field.
- 10+ years of hands-on experience in Data Engineering.
- 5+ years of recent experience with Apache Spark, with a strong grasp of distributed systems and Big Data fundamentals.
- Proficiency in Scala, Python, Java, or similar languages, with the ability to work across multiple programming environments.
- Strong SQL expertise and experience working with relational databases such as PostgreSQL or MySQL.
- Proven experience with Databricks and cloud-based data ecosystems.
- Familiarity with diverse data formats such as Delta Tables, Parquet, CSV, and JSON.
- Skilled in Linux environments and shell scripting for automation and system tasks.
- Experience working within Agile teams.
- Knowledge of Machine Learning concepts is an added advantage.
- Demonstrated ability to work independently and deliver efficient, stable, and reliable software solutions.
- Excellent communication and collaboration skills in English.
About the Organization:
We are a leading B2B data and intelligence platform specializing in high-accuracy contact and company data to empower revenue teams. Our technology combines human verification and automation to ensure exceptional data quality and scalability, helping businesses make informed, data-driven decisions.
What We Offer:
Our workplace embraces diversity, inclusion, and continuous learning. With a fast-paced and evolving environment, we provide opportunities for growth through competitive benefits including:
- Paid Holidays and Leaves
- Performance Bonuses and Incentives
- Comprehensive Medical Policy
- Company-Sponsored Training Programs
We are an Equal Opportunity Employer, committed to maintaining a workplace free from discrimination and harassment. All employment decisions are made based on merit, competence, and business needs.
Required Skills:
- 4+ years of experience designing, developing, and implementing enterprise-level, n-tier, software solutions.
- Proficiency with Microsoft C# is a must.
- In-depth experience with .NET framework and .NET Core.
- Knowledge of OOP, server technologies, and SOA is a must; 3+ years of microservice experience.
- Relevant experience with database design and SQL (Postgres is preferred).
- Experience with ORM tooling.
- Experience delivering software that is correct, stable, and security compliant.
- Basic understanding of common cloud platforms (good to have).
- Financial services experience is strongly preferred.
- Thorough understanding of XML/JSON and related technologies.
- Thorough understanding of unit, integration, and performance testing for APIs.
- Entrepreneurial spirit. You are self-directed, innovative, and biased towards action. You love to build new things and thrive in fast-paced environments.
- Excellent communication and interpersonal skills, with an emphasis on strong writing and analytical problem-solving.
Now Hiring: Tableau Developer (Banking Domain) 🚀
We’re looking for a Tableau pro with 6+ years of experience to design and optimize dashboards for Banking & Financial Services.
🔹 Design & optimize interactive Tableau dashboards for large banking datasets
🔹 Translate KPIs into scalable reporting solutions
🔹 Ensure compliance with regulations like KYC, AML, Basel III, PCI-DSS
🔹 Collaborate with business analysts, data engineers, and banking experts
🔹 Bring deep knowledge of SQL, data modeling, and performance optimization
🌍 Location: Remote
📊 Domain Expertise: Banking / Financial Services
✨ Preferred experience with cloud data platforms (AWS, Azure, GCP) & certifications in Tableau are a big plus!
Bring your data visualization skills to transform banking intelligence & compliance reporting.
About the Role
We are seeking motivated Data Engineering Interns to join our team remotely for a 3-month internship. This role is designed for students or recent graduates interested in working with data pipelines, ETL processes, and big data tools. You will gain practical experience in building scalable data solutions. While this is an unpaid internship, interns who successfully complete the program will receive a Completion Certificate and a Letter of Recommendation.
Responsibilities
- Assist in designing and building data pipelines for structured and unstructured data.
- Support ETL (Extract, Transform, Load) processes to prepare data for analytics.
- Work with databases (SQL/NoSQL) for data storage and retrieval.
- Help optimize data workflows for performance and scalability.
- Collaborate with data scientists and analysts to ensure data quality and consistency.
- Document workflows, schemas, and technical processes.
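As a gentle, hypothetical example of the kind of ETL work interns would assist with (the file, column, and table names are made up):

```python
import sqlite3

import pandas as pd

# Extract: read a raw CSV export
raw = pd.read_csv("signups_raw.csv", parse_dates=["signup_date"])

# Transform: drop duplicates, normalise emails, derive a signup month
clean = (
    raw.drop_duplicates(subset="email")
    .assign(
        email=lambda d: d["email"].str.strip().str.lower(),
        signup_month=lambda d: d["signup_date"].dt.strftime("%Y-%m"),
    )
)

# Load: write into a local SQLite table that analysts can query with SQL
with sqlite3.connect("warehouse.db") as conn:
    clean.to_sql("signups", conn, if_exists="replace", index=False)
    monthly = pd.read_sql(
        "SELECT signup_month, COUNT(*) AS n FROM signups GROUP BY signup_month", conn
    )

print(monthly)
```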
Requirements
- Strong interest in data engineering, databases, and big data systems.
- Basic knowledge of SQL and relational database concepts.
- Familiarity with Python, Java, or Scala for data processing.
- Understanding of ETL concepts and data pipelines.
- Exposure to cloud platforms (AWS, Azure, or GCP) is a plus.
- Familiarity with big data frameworks (Hadoop, Spark, Kafka) is an advantage.
- Good problem-solving skills and ability to work independently in a remote setup.
What You’ll Gain
- Hands-on experience in data engineering and ETL pipelines.
- Exposure to real-world data workflows.
- Mentorship and guidance from experienced engineers.
- Completion Certificate upon successful completion.
- Letter of Recommendation based on performance.
Internship Details
- Duration: 3 months
- Location: Remote (Work from Home)
- Stipend: Unpaid
- Perks: Completion Certificate + Letter of Recommendation