50+ Remote SQL Jobs in India
Job Summary
We are seeking an experienced Databricks Developer with strong skills in PySpark, SQL, and Python, and hands-on experience deploying data solutions on AWS (preferred) or Azure. The role involves designing, developing, and optimizing scalable data pipelines and analytics workflows on the Databricks platform.
Key Responsibilities
- Develop and optimize ETL/ELT pipelines using Databricks and PySpark.
- Build scalable data workflows on AWS (EC2, S3, Glue, Lambda, IAM) or Azure (ADF, ADLS, Synapse).
- Implement and manage Delta Lake (ACID, schema evolution, time travel).
- Write efficient, complex SQL for transformation and analytics.
- Build and support batch and streaming ingestion (Kafka, Kinesis, EventHub).
- Optimize Databricks clusters, jobs, notebooks, and PySpark performance.
- Collaborate with cross-functional teams to deliver reliable data solutions.
- Ensure data governance, security, and compliance.
- Troubleshoot pipelines and support CI/CD deployments.
Required Skills & Experience
- 4–8 years in Data Engineering / Big Data development.
- Strong hands-on experience with Databricks (clusters, jobs, workflows).
- Advanced PySpark and strong Python skills.
- Expert-level SQL (complex queries, window functions).
- Practical experience with AWS (preferred) or Azure cloud services.
- Experience with Delta Lake, Parquet, and data lake architectures.
- Familiarity with CI/CD tools (GitHub Actions, Azure DevOps, Jenkins).
- Good understanding of data modeling, optimization, and distributed systems.
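As a quick illustration of the window-function fluency the role expects, here is a self-contained sketch using Python's built-in sqlite3 (requires SQLite 3.25+; the sales table and figures are invented, but the same clauses work in Databricks SQL):

```python
# Hypothetical example: rank sales within a region and compute a regional total
# using window functions, against an in-memory SQLite database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount INTEGER);
    INSERT INTO sales VALUES
        ('north', 100), ('north', 300), ('south', 200), ('south', 50);
""")

# PARTITION BY splits the window per region; RANK orders within each partition.
rows = conn.execute("""
    SELECT region,
           amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk,
           SUM(amount) OVER (PARTITION BY region) AS region_total
    FROM sales
    ORDER BY region, rnk
""").fetchall()

for row in rows:
    print(row)
```

Each row carries both its per-region rank and the region's total without a self-join, which is the main practical payoff of window functions for analytics SQL.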
JOB TITLE: Senior Full Stack Developer (SDE-3)
LOCATION: Remote/Hybrid.
A LITTLE BIT ABOUT THE ROLE:
As a Full Stack Developer, you will be responsible for developing digital systems that deliver optimal end-to-end solutions to our business needs. The work will cover all aspects of software delivery, including working with staff, vendors, and outsourced contributors to build, release and maintain the product.
Fountane operates a scrum-based Agile delivery cycle, and you will be working within this. You will work with product owners, user experience, test, infrastructure, and operations professionals to build the most effective solutions.
WHAT YOU WILL BE DOING:
- Full-stack development on a multinational team on various products across different technologies and industries.
- Optimize the development process and identify continuing improvements.
- Monitor technology landscape, assess and introduce new technology. Own and communicate development processes and standards.
- The job title does not define or limit your duties, and you may be required to carry out other work within your abilities from time to time at our request. We reserve the right to introduce changes in line with technological developments which may impact your job duties or methods of working.
WHAT YOU WILL NEED TO BE GREAT IN THIS ROLE:
- 3+ years of full-stack development experience (combined back-end and front-end) building fast, reliable web and/or mobile applications.
- Experience with web frameworks (e.g., React, Angular, or Vue) and/or mobile development (e.g., React Native, NativeScript).
- Proficiency in at least one JavaScript framework or runtime such as React, Node.js, Angular (2.x or later), or jQuery.
- Ability to optimize product development by leveraging software development processes.
- Bachelor's degree, or equivalent work experience (minimum six years). Candidates with an Associate's degree must have at least four years of work experience.
- Fountane's current technology stack driving our digital products includes React.js, Node.js, React Native, Angular, Firebase, Bootstrap, MongoDB, Express, Hasura, GraphQL, Amazon Web Services (AWS), and Google Cloud Platform.
SOFT SKILLS:
- Collaboration - Ability to work in teams across the world
- Adaptability - situations are unexpected, and you need to be quick to adapt
- Open-mindedness - Expect to see things outside the ordinary
LIFE AT FOUNTANE:
- Fountane offers an environment where all members are supported, challenged, recognized & given opportunities to grow to their fullest potential.
- Competitive pay
- Health insurance
- Individual/team bonuses
- Employee stock ownership plan
- Fun/challenging variety of projects/industries
- Flexible workplace policy - remote/physical
- Flat organization - no micromanagement
- Individual contribution - set your deadlines
- Above all - culture that helps you grow exponentially.
Qualifications - No bachelor's degree required. Good communication skills are a must!
A LITTLE BIT ABOUT THE COMPANY:
Established in 2017, Fountane Inc is a Ventures Lab incubating and investing in new competitive technology businesses from scratch. Thus far, we’ve created half a dozen multi-million valuation companies in the US and a handful of sister ventures for large corporations, including Target, US Ventures, and Imprint Engine.
We’re a team of 80 strong from around the world that is radically open-minded and believes in excellence and respecting one another.
Summary:
We are seeking a highly skilled Python Backend Developer with proven expertise in FastAPI to join our team as a full-time contractor for 12 months. The ideal candidate will have 5+ years of experience in backend development, a strong understanding of API design, and the ability to deliver scalable, secure solutions. Knowledge of front-end technologies is an added advantage. Immediate joiners are preferred. This role requires full-time commitment—please apply only if you are not engaged in other projects.
Job Type:
Full-Time Contractor (12 months)
Location:
Remote / On-site (Jaipur preferred, as per project needs)
Experience:
5+ years in backend development
Key Responsibilities:
- Design, develop, and maintain robust backend services using Python and FastAPI.
- Implement and manage Prisma ORM for database operations.
- Build scalable APIs and integrate with SQL databases and third-party services.
- Deploy and manage backend services using Azure Function Apps and Microsoft Azure Cloud.
- Collaborate with front-end developers and other team members to deliver high-quality web applications.
- Ensure application performance, security, and reliability.
- Participate in code reviews, testing, and deployment processes.
Required Skills:
- Expertise in Python backend development with strong experience in FastAPI.
- Solid understanding of RESTful API design and implementation.
- Proficiency in SQL databases and ORM tools (preferably Prisma).
- Hands-on experience with Microsoft Azure Cloud and Azure Function Apps.
- Familiarity with CI/CD pipelines and containerization (Docker).
- Knowledge of cloud architecture best practices.
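To make the RESTful-API expectation concrete, here is a hedged, standard-library sketch of a JSON endpoint (a stand-in for the FastAPI service described above; the route and data are illustrative, not from the posting):

```python
# Illustrative GET /items/<id> endpoint using only the standard library.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

ITEMS = {1: {"id": 1, "name": "sample"}}  # in-memory stand-in for a database

class ItemHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # RESTful route: GET /items/<id> -> 200 with a JSON body, else 404.
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "items" and parts[1].isdigit():
            item = ITEMS.get(int(parts[1]))
            if item is not None:
                body = json.dumps(item).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)
                return
        self.send_response(404)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep request logging quiet

server = HTTPServer(("127.0.0.1", 0), ItemHandler)  # port 0 -> any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

with urllib.request.urlopen(
    f"http://127.0.0.1:{server.server_port}/items/1"
) as resp:
    status, payload = resp.status, json.loads(resp.read())
server.shutdown()
print(status, payload)
```

In FastAPI the same resource-per-URL, JSON-in/JSON-out shape is expressed declaratively with path-parameter functions; the routing and status-code discipline shown here is the transferable part.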
Added Advantage:
- Front-end development knowledge (React, Angular, or similar frameworks).
- Exposure to AWS/GCP cloud platforms.
- Experience with NoSQL databases.
Eligibility:
- Minimum 5 years of professional experience in backend development.
- Available for full-time engagement.
- Please refrain from applying if you are currently engaged in other projects; we require dedicated availability.
Role Overview
As a Senior SQL Developer, you’ll be responsible for data extracts, updating, and maintaining reports as requested by stakeholders. You’ll work closely with finance operations and developers to ensure data requests are appropriately managed.
Key Responsibilities
- Design, develop, and optimize complex SQL queries, stored procedures, functions, and tasks across multiple databases/schemas.
- Transform cost-intensive models from full refreshes to incremental loads based on upstream data.
- Help design proactive monitoring of data to catch data issues/data delays.
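The full-refresh-to-incremental responsibility above can be sketched in miniature with SQLite standing in for the warehouse (the schema and watermark column are invented; dbt's incremental materializations follow the same idea):

```python
# Hypothetical incremental load: append only rows newer than the target's
# high-water mark instead of truncating and reloading everything.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_events (id INTEGER, loaded_at INTEGER);
    CREATE TABLE target_events (id INTEGER, loaded_at INTEGER);
    INSERT INTO source_events VALUES (1, 10), (2, 20), (3, 30);
    INSERT INTO target_events VALUES (1, 10);  -- already loaded previously
""")

def incremental_load(conn):
    # The subquery computes the current high-water mark in the target;
    # only strictly newer source rows are pulled across.
    conn.execute("""
        INSERT INTO target_events
        SELECT id, loaded_at
        FROM source_events
        WHERE loaded_at > (SELECT COALESCE(MAX(loaded_at), 0)
                           FROM target_events)
    """)

incremental_load(conn)
count = conn.execute("SELECT COUNT(*) FROM target_events").fetchone()[0]
print(count)
```

Re-running the load is a no-op once the watermark has caught up, which is what makes the pattern cheaper than a full refresh on large upstream tables.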
Qualifications
- 5+ years of experience as a SQL developer, preferably in a B2C or tech environment.
- Ability to translate requirements into datasets.
- Understanding of dbt framework for transformations.
- Basic Git usage: branching and PR creation.
- Detail-oriented with strong organizational and time management skills.
- Ability to work cross-functionally and manage multiple projects simultaneously.
Bonus Points
- Experience with Snowflake and AWS data technologies.
- Experience with Python and containers (Docker)
Job description
We are seeking a highly skilled Senior Software Developer with proven experience in developing and scaling Education ERP solutions. The ideal candidate should have strong expertise in Node.js, PHP (Laravel), MySQL, and MongoDB, along with hands-on experience in implementing ERP modules such as HR, Exams, Inventory, Learning Management System (LMS), Admissions, Fee Management, and Finance.
Key Responsibilities
Design, develop, and maintain scalable Education ERP modules.
Work on end-to-end ERP features, including HR, Exam, Inventory, LMS, Admissions, Fees, and Finance.
Build and optimize REST APIs/GraphQL services and ensure seamless integrations.
Optimize system performance, scalability, and security for high-volume ERP usage.
Conduct code reviews, enforce coding standards, and mentor junior developers.
Stay updated with emerging technologies and recommend improvements for ERP solutions.
Required Skills & Qualifications
Strong expertise in Node.js and PHP (Laravel, Core PHP).
Proficiency with MySQL, MongoDB, PostgreSQL (database design & optimization).
Frontend knowledge: JavaScript, jQuery, HTML, CSS (React/Vue preferred).
Experience with REST APIs, GraphQL, third-party integrations (payment gateways, SMS, email).
Hands-on with Git/GitHub, Docker, CI/CD pipelines.
Familiarity with cloud platforms (AWS, Azure, GCP) is a plus.
6 years of professional development experience, with a minimum of 4 years in ERP systems.
Preferred Experience
Prior work in Education ERP domain.
Deep knowledge of HR, Exam, Inventory, LMS, Admissions, Fees & Finance modules.
Exposure to high-traffic enterprise applications.
Strong leadership, mentoring, and problem-solving abilities
Benefits:
Permanent Work From Home.
Role: Senior Data Engineer (Azure)
Experience: 5+ Years
Location: Anywhere in India
Work Mode: Remote
Notice Period: Immediate joiners or candidates serving notice period
Key Responsibilities:
- Data processing on Azure using ADF, Streaming Analytics, Event Hubs, Azure Databricks, Data Migration Services, and Data Pipelines
- Provisioning, configuring, and developing Azure solutions (ADB, ADF, ADW, etc.)
- Designing and implementing scalable data models and migration strategies
- Working on distributed big data batch or streaming pipelines (Kafka or similar)
- Developing data integration & transformation solutions for structured and unstructured data
- Collaborating with cross-functional teams for performance tuning and optimization
- Monitoring data workflows and ensuring compliance with governance and quality standards
- Driving continuous improvement through automation and DevOps practices
Mandatory Skills & Experience:
- 5–10 years of experience as a Data Engineer
- Strong proficiency in Azure Databricks, PySpark, Python, SQL, and Azure Data Factory
- Experience in Data Modelling, Data Migration, and Data Warehousing
- Good understanding of database structure principles and schema design
- Hands-on experience with MS SQL Server, Oracle, or similar RDBMS platforms
- Experience with DevOps tools (Azure DevOps, Jenkins, Airflow, Azure Monitor) — good to have
- Knowledge of distributed data processing and real-time streaming (Kafka/Event Hub)
- Familiarity with visualization tools like Power BI or Tableau
- Strong analytical, problem-solving, and debugging skills
- Self-motivated, detail-oriented, and capable of managing priorities effectively
About the Company
Hypersonix.ai is disrupting the e-commerce space with AI, ML and advanced decision capabilities to drive real-time business insights. Hypersonix.ai has been built ground up with new age technology to simplify the consumption of data for our customers in various industry verticals. Hypersonix.ai is seeking a well-rounded, hands-on product leader to help lead product management of key capabilities and features.
About the Role
We are looking for talented and driven Data Engineers at various levels to work with customers to build the data warehouse, analytical dashboards and ML capabilities as per customer needs.
Roles and Responsibilities
- Create and maintain optimal data pipeline architecture
- Assemble large, complex data sets that meet functional / non-functional business requirements; should write complex queries in an optimized way
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies
- Run ad-hoc analysis utilizing the data pipeline to provide actionable insights
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs
- Keep our data separated and secure across national boundaries through multiple data centers and AWS regions
- Work with analytics and data scientist team members and assist them in building and optimizing our product into an innovative industry leader
Requirements
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
- Strong analytic skills related to working with unstructured datasets
- Build processes supporting data transformation, data structures, metadata, dependency and workload management
- A successful history of manipulating, processing and extracting value from large disconnected datasets
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores
- Experience supporting and working with cross-functional teams in a dynamic environment
- We are looking for a candidate with 4+ years of experience in a Data Engineer role who holds a graduate degree in Computer Science or Information Technology, or has completed an MCA.
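The message-queuing and stream-processing knowledge listed above boils down to the producer/consumer pattern, sketched here with the standard-library queue standing in for Kafka or Kinesis (event shape is invented):

```python
# Toy producer/consumer pipeline: one thread publishes messages, another
# consumes them in order, with a sentinel signalling end of stream.
import queue
import threading

events = queue.Queue()
processed = []

def producer():
    for i in range(5):
        events.put({"event_id": i})   # publish a message to the queue
    events.put(None)                  # sentinel: no more messages

def consumer():
    while True:
        msg = events.get()
        if msg is None:               # stop cleanly on the sentinel
            break
        processed.append(msg["event_id"])

t_prod = threading.Thread(target=producer)
t_cons = threading.Thread(target=consumer)
t_prod.start()
t_cons.start()
t_prod.join()
t_cons.join()
print(processed)
```

Real brokers add partitioning, persistence, and consumer groups on top, but the decoupling of producers from consumers via a buffered queue is the same core idea.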
Position Overview: The Lead Software Architect - Python & Data Engineering is a senior technical leadership role responsible for designing and owning end-to-end architecture for data-intensive, AI/ML, and analytics platforms, while mentoring developers and ensuring technical excellence across the organization.
Key Responsibilities:
- Design end-to-end software architecture for data-intensive applications, AI/ML pipelines, and analytics platforms
- Evaluate trade-offs between competing technical approaches
- Define data models, API approach, and integration patterns across systems
- Create technical specifications and architecture documentation
- Lead by example through production-grade Python code and mentor developers on engineering fundamentals
- Conduct design and code reviews focused on architectural soundness
- Establish engineering standards, coding practices, and design patterns for the team
- Translate business requirements into technical architecture
- Collaborate with data scientists, analysts, and other teams to design integrated solutions
- Whiteboard and defend system design and architectural choices
- Take responsibility for system performance, reliability, and maintainability
- Identify and resolve architectural bottlenecks proactively
Required Skills:
- 8+ years of experience in software architecture and development
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
- Strong foundations in data structures, algorithms, and computational complexity
- Experience in system design for scale, including caching strategies, load balancing, and asynchronous processing
- 6+ years of Python development experience
- Deep knowledge of Django, Flask, or FastAPI
- Expert understanding of Python internals including GIL and memory management
- Experience with RESTful API design and event-driven architectures (Kafka, RabbitMQ)
- Proficiency in data processing frameworks such as Pandas, Apache Spark, and Airflow
- Strong SQL optimization and database design experience (PostgreSQL, MySQL, MongoDB)
- Experience with AWS, GCP, or Azure cloud platforms
- Knowledge of containerization (Docker) and orchestration (Kubernetes)
- Hands-on experience designing CI/CD pipelines
Preferred (Bonus) Skills:
- Experience deploying ML models to production (MLOps, model serving, monitoring)
- Understanding of ML system design including feature stores and model versioning
- Familiarity with ML frameworks such as scikit-learn, TensorFlow, and PyTorch
- Open-source contributions or technical blogging demonstrating architectural depth
- Experience with modern front-end frameworks for full-stack perspective
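Of the system-design skills this role names, caching is the easiest to show in a few lines; here is a minimal memoization sketch with functools.lru_cache (the "fetch" is simulated, not a real service call):

```python
# Illustrative caching strategy: memoize an expensive lookup so repeated
# calls with the same argument skip the slow round trip.
from functools import lru_cache

calls = {"count": 0}

@lru_cache(maxsize=128)
def fetch_profile(user_id: int) -> dict:
    calls["count"] += 1          # stands in for a slow DB/API round trip
    return {"user_id": user_id}

fetch_profile(1)
fetch_profile(1)                 # served from cache; no second round trip
fetch_profile(2)
print(calls["count"])
```

Production caches (Redis, CDN layers) add expiry and invalidation policy on top; the architectural question an interviewer probes is when a cached answer is allowed to be stale.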
Experience: 3+ years (Backend/Full-Stack)
Note: You will be the third engineer on the team. If you are comfortable with Java, Spring Boot, and cloud platforms, you will easily be able to pick up the following stack.
Key Requirements —
- Primary Stack: Experience with .NET
- Cloud: Solid understanding of cloud platforms (preferably Azure)
- Frontend/DevOps: Familiarity with React and DevOps practices
- Architecture: Strong grasp of microservices
- Technical Skills: Basic proficiency in scripting, databases, and Git
Compensation: competitive salary, based on experience and fit
Candidates must know the M365 collaboration environment: SharePoint Online, MS Teams, Exchange Online, Entra, and Purview. We need a developer with a strong understanding of data structures and problem solving, plus SQL, PowerShell, MS Teams app development, Python, Visual Basic, C#, JavaScript, Java, HTML, PHP, and C.
About Sun King
Sun King is the world’s leading off-grid solar energy company, delivering energy access to 1.8 billion people without reliable grid connections through innovative product design, fintech solutions, and field operations.
Key highlights:
- Connected over 20 million homes to solar power across Africa and Asia, adding 200,000 homes monthly.
- Affordable ‘pay-as-you-go’ financing model; after 1-2 years, customers own their solar equipment.
- Saved customers over $4 billion to date.
- Collect 650,000 daily payments via 28,000 field agents using mobile money systems.
- Products range from home lighting to high-energy appliances, with expansion into clean cooking, electric mobility, and entertainment.
With 2,800 staff across 12 countries, our team includes experts in various fields, all passionate about serving off-grid communities.
Diversity Commitment:
44% of our workforce are women, reflecting our commitment to gender diversity.
About the role:
The Backend Developer works remotely as part of the technology team to help Sun King’s EasyBuy business unit design and develop software to improve its field team operations.
What you will be expected to do
- Design and develop applications/systems based on wireframes and product requirements documents.
- Design and develop logical and physical data models to meet application requirements.
- Identify and resolve bottlenecks and bugs based on operational requirements.
- Perform unit tests on code to ensure robustness, including edge cases, usability, and general reliability.
- Write reusable and easily maintainable code following the principles of DRY (Don’t Repeat Yourself).
- Integrate existing tools and business systems, both in-house and external services, such as ticketing software and communication tools.
- Collaborate with team members and product managers to understand project requirements and contribute to the overall system design.
You might be a strong candidate if you have/are
- Development experience: 1-3 years of backend development, with strong problem-solving abilities and proficiency in data structures and algorithms.
- A strong grasp of object-oriented programming (OOP) principles and expertise in Core Java.
- Knowledge of SQL and MySQL or similar database management systems.
- Experience integrating web services using SOAP, REST, JSON, and XML.
- Familiarity with RESTful APIs for linking Android applications to backend services.
- Experience with version control systems like Git (preferred, not mandatory).
- Additional knowledge of web technologies such as HTML, CSS, and JavaScript, and frameworks like Spring or Hibernate, would be advantageous.
What we offer (in addition to compensation and statutory benefits):
- A platform for professional growth in a rapidly expanding, high-impact sector.
- Immerse in a collaborative culture, energized by employees of Sun King who are collectively motivated by fostering a transformative, sustainable venture.
- A genuinely global environment: Engage and learn alongside a diverse group from varied geographies and backgrounds.
- Tailored learning pathways through the Sun King Center for Leadership to elevate your leadership and managerial capabilities.
AccioJob is conducting a Walk-In Hiring Drive with PayTabs Global for the position of Java Backend Developer.
To apply, register and select your slot here: https://go.acciojob.com/yU7t3p
Required Skills: Java, DSA, OOP, Spring Boot, SQL
Eligibility:
- Degree: MTech./ME, BTech./BE, BCA, MCA
- Branch: IT, Computer Science/CSE/Other CS related branch, Electrical/Other electrical related branches
- Graduation Year: 2024, 2025, 2026
Work Details:
- First 3 months of internship will be Work From Home (WFH)
- After that, selected candidates must relocate to Chennai
- CTC: 3.5 LPA to 4 LPA
Evaluation Process:
Round 1: Offline Assessment at AccioJob Bangalore Centre, AccioJob Chennai Centre
Further Rounds (for shortlisted candidates only):
Technical Interview 1, Technical Interview 2, HR Discussion
Important Note: Bring your laptop & earphones for the test.
- Design and implement integration solutions using iPaaS tools.
- Collaborate with customers, product, engineering and business stakeholders to translate business requirements into robust and scalable integration processes.
- Develop and maintain SQL queries and scripts to facilitate data manipulation and integration.
- Utilize RESTful API design and consumption to ensure seamless data flow between various systems and applications.
- Lead the configuration, deployment, and ongoing management of integration projects.
- Troubleshoot and resolve technical issues related to integration solutions.
- Document integration processes and create user guides for internal and external users.
- Stay current with the latest developments in iPaaS technologies and best practices
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Minimum of 3 years’ experience in an integration engineering role with hands-on experience in an iPaaS tool, preferably Boomi.
- Proficiency in SQL and experience with database management and data integration patterns.
- Strong understanding of integration patterns and solutions, API design, and cloud-based technologies.
- Good understanding of RESTful APIs and integration.
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills, with the ability to work effectively in a team environment.
- Experience with various integration protocols (REST, SOAP, FTP, etc.) and data formats (JSON, XML, etc.).
Preferred Skills:
- Boomi (or other iPaaS) certifications
- Experience with Intapp's Integration Builder is highly desirable but not mandatory.
- Experience with cloud services like MS Azure.
- Knowledge of additional programming languages (e.g., .NET, Java) is advantageous.
What we offer:
- Competitive salary and benefits package.
- Dynamic and innovative work environment.
- Opportunities for professional growth and advancement.
Job Description:
Technical Lead – Full Stack
Experience: 8–12 years (strong candidates split roughly 50% Java / 50% React)
Location – Bangalore/Hyderabad
Interview Levels – 3 Rounds
Tech Stack: Java, Spring Boot, Microservices, React, SQL
Focus: Hands-on coding, solution design, team leadership, delivery ownership
Must-Have Skills (Depth)
Java (8+): Streams, concurrency, collections, JVM internals (GC), exception handling.
Spring Boot: Security, Actuator, Data/JPA, Feign/RestTemplate, validation, profiles, configuration management.
Microservices: API design, service discovery, resilience patterns (Hystrix/Resilience4j), messaging (Kafka/RabbitMQ) optional.
React: Hooks, component lifecycle, state management, error boundaries, testing (Jest/RTL).
SQL: Joins, aggregations, indexing, query optimization, transaction isolation, schema design.
Testing: JUnit/Mockito for backend; Jest/RTL/Cypress for frontend.
DevOps: Git, CI/CD, containers (Docker), familiarity with deployment environments.
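The indexing and query-optimization points in the SQL requirement above can be demonstrated directly: SQLite's query plan switches from a full table scan to an index search once a suitable index exists (table and index names here are invented):

```python
# Compare query plans before and after adding an index on the filter column.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER)")

plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42"
).fetchall()

conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")

plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42"
).fetchall()

# The last column of each plan row is a human-readable detail string:
# a full scan before the index, an index search after.
print(plan_before[-1][-1])
print(plan_after[-1][-1])
```

The same discipline (check the plan, index the predicate and join columns, re-check) carries over to MySQL's EXPLAIN and PostgreSQL's EXPLAIN ANALYZE.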
We are seeking a highly skilled Power Platform Developer with deep expertise in designing, developing, and deploying solutions using Microsoft Power Platform. The ideal candidate will have strong knowledge of Power Apps, Power Automate, Power BI, Power Pages, and Dataverse, along with integration capabilities across Microsoft 365, Azure, and third-party systems.
Key Responsibilities
- Solution Development:
- Design and build custom applications using Power Apps (Canvas & Model-Driven).
- Develop automated workflows using Power Automate for business process optimization.
- Create interactive dashboards and reports using Power BI for data visualization and analytics.
- Configure and manage Dataverse for secure data storage and modelling.
- Develop and maintain Power Pages for external-facing portals.
- Integration & Customization:
- Integrate Power Platform solutions with Microsoft 365, Dynamics 365, Azure services, and external APIs.
- Implement custom connectors and leverage Power Platform SDK for advanced scenarios.
- Utilize Azure Functions, Logic Apps, and REST APIs for extended functionality.
- Governance & Security:
- Apply best practices for environment management, ALM (Application Lifecycle Management), and solution deployment.
- Ensure compliance with security, data governance, and licensing guidelines.
- Implement role-based access control and manage user permissions.
- Performance & Optimization:
- Monitor and optimize app performance, workflow efficiency, and data refresh strategies.
- Troubleshoot and resolve technical issues promptly.
- Collaboration & Documentation:
- Work closely with business stakeholders to gather requirements and translate them into technical solutions.
- Document architecture, workflows, and processes for maintainability.
Required Skills & Qualifications
- Technical Expertise:
- Strong proficiency in Power Apps (Canvas & Model-Driven), Power Automate, Power BI, Power Pages, and Dataverse.
- Experience with Microsoft 365, Dynamics 365, and Azure services.
- Knowledge of JavaScript, TypeScript, C#, .NET, and Power Fx for custom development.
- Familiarity with SQL, DAX, and data modeling.
- Additional Skills:
- Understanding of ALM practices, solution packaging, and deployment pipelines.
- Experience with Git, Azure DevOps, or similar tools for version control and CI/CD.
- Strong problem-solving and analytical skills.
- Certifications (Preferred):
- Microsoft Certified: Power Platform Developer Associate.
- Microsoft Certified: Power Platform Solution Architect Expert.
Soft Skills
- Excellent communication and collaboration skills.
- Ability to work in agile environments and manage multiple priorities.
- Strong documentation and presentation abilities.
We are looking for an enthusiastic and dynamic individual to join Upland India as a Senior Software Engineer I (Backend) for our Panviva product. The individual will work with our global development team.
What would you do?
- Develop, review, test, and maintain application code
- Collaborate with other developers and product to fulfil objectives
- Troubleshoot and diagnose issues
- Take lead on tasks as needed
- Jump in and help the team deliver features when it is required
What are we looking for?
Experience
- 5+ years of experience in designing and implementing application architecture
- Back-end developer who enjoys solving problems
- Demonstrated experience with the .NET ecosystem (.NET Framework, ASP.NET, .NET Core) & SQL server
- Experience in building cloud-native applications (Azure)
- Must be skilled at writing quality, scalable, maintainable, testable code
Leadership Skills
- Strong communication skills
- Ability to mentor/lead junior developers
Primary Skills: The candidate must possess the following primary skills:
- Strong Back-end developer who enjoys solving problems
- Solid experience with .NET Core, SQL Server, and .NET design patterns: a strong understanding of OOP principles, .NET-specific implementations (DI, CQRS, Repository, and similar patterns), SOLID architectural principles, unit-testing tools, and debugging techniques
- Applying patterns to improve scalability and reduce technical debt
- Experience with refactoring legacy codebases using design patterns
- Real-World Problem Solving
- Ability to analyze a problem and choose the most suitable design pattern
- Experience balancing performance, readability, and maintainability
- Experience building modern, scalable, reliable applications on the MS Azure cloud including services such as:
- App Services
- Azure Service Bus/ Event Hubs
- Azure API Management Service
- Azure Bot Service
- Function/Logic Apps
- Azure key vault & Azure Configuration Service
- CosmosDB, Mongo DB
- Azure Search
- Azure Cognitive Services
Understanding Agile Methodology and Tool Familiarity
- Solid understanding of Agile development processes, including sprint planning, daily stand-ups, retrospectives, and backlog grooming
- Familiarity with Agile tools such as JIRA for tracking tasks, managing workflows, and collaborating across teams
- Experience working in cross-functional Agile teams and contributing to iterative development cycles
Secondary Skills: It would be advantageous if the candidate also has the following secondary skills:
- Experience with front-end React/jQuery/JavaScript, HTML, and CSS frameworks
- APM tools - experience with any tools such as Grafana, New Relic, CloudWatch, etc.
- Basic Understanding of AI models
- Python
About Upland
Upland Software (Nasdaq: UPLD) helps global businesses accelerate digital transformation with a powerful cloud software library that provides choice, flexibility, and value. Upland India is a fully owned subsidiary of Upland Software and headquartered in Bangalore. We are a remote-first company. Interviews and on-boarding are conducted virtually.
About Ven Analytics
At Ven Analytics, we don’t just crunch numbers — we decode them to uncover insights that drive real business impact. We’re a data-driven analytics company that partners with high-growth startups and enterprises to build powerful data products, business intelligence systems, and scalable reporting solutions. With a focus on innovation, collaboration, and continuous learning, we empower our teams to solve real-world business problems using the power of data.
Role Overview
We’re looking for a Power BI Data Analyst who is not just proficient in tools but passionate about building insightful, scalable, and high-performing dashboards. The ideal candidate should have strong fundamentals in data modeling, a flair for storytelling through data, and the technical skills to implement robust data solutions using Power BI, Python, and SQL.
Key Responsibilities
- Technical Expertise: Develop scalable, accurate, and maintainable data models using Power BI, with a clear understanding of Data Modeling, DAX, Power Query, and visualization principles.
- Programming Proficiency: Use SQL and Python for complex data manipulation, automation, and analysis.
- Business Problem Translation: Collaborate with stakeholders to convert business problems into structured data-centric solutions considering performance, scalability, and commercial goals.
- Hypothesis Development: Break down complex use-cases into testable hypotheses and define relevant datasets required for evaluation.
- Solution Design: Create wireframes, proof-of-concepts (POC), and final dashboards in line with business requirements.
- Dashboard Quality: Ensure dashboards meet high standards of data accuracy, visual clarity, performance, and support SLAs.
- Performance Optimization: Continuously enhance user experience by improving performance, maintainability, and scalability of Power BI solutions.
- Troubleshooting & Support: Quick resolution of access, latency, and data issues as per defined SLAs.
- Power BI Development: Use Power BI Desktop for report building and Power BI Service for distribution.
- Backend development: Develop optimized SQL queries that are easy to consume, maintain and debug.
- Version Control: Maintain strict version control by tracking change requests (CRs) and bug fixes, and keep both Prod and Dev dashboards maintained.
- Client Servicing: Engage with clients to understand their data needs, gather requirements, present insights, and ensure timely, clear communication throughout project cycles.
- Team Management: Lead and mentor a small team by assigning tasks, reviewing work quality, guiding technical problem-solving, and ensuring timely delivery of dashboards and reports.
Must-Have Skills
- Strong experience building robust data models in Power BI
- Hands-on expertise with DAX (complex measures and calculated columns)
- Proficiency in M Language (Power Query) beyond drag-and-drop UI
- Clear understanding of data visualization best practices (less fluff, more insight)
- Solid grasp of SQL and Python for data processing
- Strong analytical thinking and ability to craft compelling data stories
- Client servicing background
Good-to-Have (Bonus Points)
- Experience using DAX Studio and Tabular Editor
- Prior work in a high-volume data processing production environment
- Exposure to modern CI/CD practices or version control with BI tools
Why Join Ven Analytics?
- Be part of a fast-growing startup that puts data at the heart of every decision.
- Opportunity to work on high-impact, real-world business challenges.
- Collaborative, transparent, and learning-oriented work environment.
- Flexible work culture and focus on career development.
Candidates must know the M365 collaboration environment: SharePoint Online, MS Teams, Exchange Online, Entra, and Purview. We need a developer with a strong understanding of data structures and problem-solving, plus SQL, PowerShell, MS Teams app development, Python, Visual Basic, C#, JavaScript, Java, HTML, PHP, and C.
A strong understanding of the development lifecycle is a must, along with debugging skills, time management, business acumen, a positive attitude, and openness to continual growth.
The ability to code appropriate solutions will be tested in the interview.
Knowledge of a wide variety of Generative AI models
Conceptual understanding of how large language models work
Proficiency in coding languages for data manipulation (e.g., SQL) and machine learning & AI development (e.g., Python)
Experience with dashboarding tools such as Power BI and Tableau (beneficial but not essential)
We are looking for highly experienced Senior Java Developers who can architect, design, and deliver high-performance enterprise applications using Spring Boot and microservices. The role requires a strong understanding of distributed systems, scalability, and data consistency.
About Forbes Advisor
Forbes Digital Marketing Inc. is a high-growth digital media and technology company dedicated to helping consumers make confident, informed decisions about their money, health, careers, and everyday life.
We do this by combining data-driven content, rigorous product comparisons, and user-first design — all built on top of a modern, scalable platform. Our global teams bring deep expertise across journalism, product, performance marketing, data, and analytics.
The Role
We’re hiring a Data Scientist to help us unlock growth through advanced analytics and machine learning. This role sits at the intersection of marketing performance, product optimization, and decision science.
You’ll partner closely with Paid Media, Product, and Engineering to build models, generate insight, and influence how we acquire, retain, and monetize users. From campaign ROI to user segmentation and funnel optimization, your work will directly shape how we grow. This role is ideal for someone who thrives on business impact, communicates clearly, and wants to build reusable, production-ready insights — not just run one-off analyses.
What You’ll Do
Marketing & Revenue Modelling
• Own end-to-end modelling of LTV, user segmentation, retention, and marketing efficiency to inform media optimization and value attribution.
• Collaborate with Paid Media and RevOps to optimize SEM performance, predict high-value cohorts, and power strategic bidding and targeting.
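As a taste of this kind of work, a cohort retention curve and a crude LTV proxy can be sketched in a few lines of Pandas (the event log, ARPU figure, and column names below are purely hypothetical):

```python
import pandas as pd

# Hypothetical activity log: one row per user per active month (0 = signup month)
events = pd.DataFrame({
    "user_id": [1, 1, 1, 2, 2, 3, 3, 3, 4],
    "month":   [0, 1, 2, 0, 2, 0, 1, 2, 0],
})

# Retention curve: share of the signup cohort active in each subsequent month
cohort_size = events.loc[events["month"] == 0, "user_id"].nunique()
retention = events.groupby("month")["user_id"].nunique() / cohort_size

# Crude LTV proxy: expected active user-months times revenue per active month
arpu = 5.0  # assumed average revenue per active user-month
ltv_estimate = float(retention.sum() * arpu)
```

In practice this feeds media optimization by comparing LTV estimates against acquisition cost per cohort.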
Product & Growth Analytics
• Work closely with Product Insights and General Managers (GMs) to define core metrics, KPIs, and success frameworks for new launches and features.
• Conduct deep-dive analysis of user behaviour, funnel performance, and product engagement to uncover actionable insights.
• Monitor and explain changes in key product metrics, identifying root causes and business impact.
• Work closely with Data Engineering to design and maintain scalable data pipelines that support machine learning workflows, model retraining, and real-time inference.
Predictive Modelling & Machine Learning
• Build predictive models for conversion, churn, revenue, and engagement using regression, classification, or time-series approaches.
• Identify opportunities for prescriptive analytics and automation in key product and marketing workflows.
• Support development of reusable ML pipelines for production-scale use cases in product recommendation, lead scoring, and SEM planning.
Collaboration & Communication
• Present insights and recommendations to a variety of stakeholders — from ICs to executives — in a clear and compelling manner.
• Translate business needs into data problems, and complex findings into strategic action plans.
• Work cross-functionally with Engineering, Product, BI, and Marketing to deliver and deploy your work.
What You’ll Bring
Minimum Qualifications
• Bachelor’s degree in a quantitative field (Mathematics, Statistics, CS, Engineering, etc.).
• 4+ years in data science, growth analytics, or decision science roles.
• Strong SQL and Python skills (Pandas, Scikit-learn, NumPy).
• Hands-on experience with Tableau, Looker, or similar BI tools.
• Familiarity with LTV modelling, retention curves, cohort analysis, and media attribution.
• Experience with GA4, Google Ads, Meta, or other performance marketing platforms.
• Clear communication skills and a track record of turning data into decisions.
Nice to Have
• Experience with BigQuery and Google Cloud Platform (or equivalent).
• Familiarity with affiliate or lead-gen business models.
• Exposure to NLP, LLMs, embeddings, or agent-based analytics.
• Ability to contribute to model deployment workflows (e.g., using Vertex AI, Airflow, or Composer).
Why Join Us?
• Remote-first and flexible — work from anywhere in India with global exposure.
• Monthly long weekends (every third Friday off).
• Generous wellness stipends and parental leave.
• A collaborative team where your voice is heard and your work drives real impact.
• Opportunity to help shape the future of data science at one of the world’s most trusted brands.
Data Engineer – Validation & Quality
Responsibilities
- Build rule-based and statistical validation frameworks using Pandas / NumPy.
- Implement contradiction detection, reconciliation, and anomaly flagging.
- Design and compute confidence metrics for each evidence record.
- Automate schema compliance, sampling, and checksum verification across data sources.
- Collaborate with the Kernel to embed validation results into every output artifact.
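A rule-based and statistical validation pass of the kind described above can be sketched with Pandas/NumPy; the column names, thresholds, and sample data here are hypothetical:

```python
import numpy as np
import pandas as pd

def validate(df: pd.DataFrame) -> dict:
    """Run simple rule-based and statistical checks; return findings per rule."""
    issues = {}
    # Rule: required columns present (schema compliance)
    required = {"record_id", "amount"}
    issues["missing_columns"] = sorted(required - set(df.columns))
    # Rule: no duplicate record ids (reconciliation)
    issues["duplicate_ids"] = int(df["record_id"].duplicated().sum())
    # Statistical: flag amounts more than 2.5 population std devs from the mean
    z = (df["amount"] - df["amount"].mean()) / df["amount"].std(ddof=0)
    issues["anomalies"] = df.loc[np.abs(z) > 2.5, "record_id"].tolist()
    return issues

df = pd.DataFrame({
    "record_id": [1, 2, 3, 3, 4, 5, 6, 7, 8, 9],
    "amount":    [10, 11, 9, 9, 10, 12, 10, 11, 9, 500],  # 500 is an outlier
})
report = validate(df)
```

A confidence metric per record could then be derived by counting how many checks each record passes.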
Requirements
- 5+ years in data engineering, data quality, or MLOps validation.
- Strong SQL optimization and ETL background.
- Familiarity with data lineage, DQ frameworks, and regulatory standards (SOC 2 / GDPR).
Position: QA Engineer – Machine Learning Systems (5 - 7 years)
Location: Remote (Company in Mumbai)
Company: Big Rattle Technologies Private Limited
Immediate Joiners only.
Summary:
The QA Engineer will own quality assurance across the ML lifecycle—from raw data validation through feature engineering checks, model training/evaluation verification, batch prediction/optimization validation, and end-to-end (E2E) workflow testing. The role is hands-on with Python automation, data profiling, and pipeline test harnesses in Azure ML and Azure DevOps. Success means provably correct data, models, and outputs at production scale and cadence.
Key Responsibilities:
Test Strategy & Governance
- Define an ML-specific Test Strategy covering data quality KPIs, feature consistency checks, model acceptance gates (metrics + guardrails), and E2E run acceptance (timeliness, completeness, integrity).
- Establish versioned test datasets & golden baselines for repeatable regression of features, models, and optimizers.
Data Quality & Transformation
- Validate raw data extracts and landed data lake data: schema/contract checks, null/outlier thresholds, time-window completeness, duplicate detection, site/material coverage.
- Validate transformed/feature datasets: deterministic feature generation, leakage detection, drift vs. historical distributions, feature parity across runs (hash or statistical similarity tests).
- Implement automated data quality checks (e.g., Great Expectations/pytest + Pandas/SQL) executed in CI and AML pipelines.
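Automated checks of this kind are often written as plain pytest functions over Pandas frames so they can run in CI. A minimal sketch, with a made-up schema contract and thresholds standing in for real pipeline config:

```python
import pandas as pd

EXPECTED_SCHEMA = {"site_id": "int64", "material": "object", "price": "float64"}
MAX_NULL_RATE = 0.05  # illustrative threshold

def load_extract() -> pd.DataFrame:
    # Stand-in for reading the landed data-lake extract
    return pd.DataFrame({
        "site_id": [1, 2, 3, 4],
        "material": ["A", "B", "A", "C"],
        "price": [9.5, 10.0, None, 11.2],
    })

def test_schema_contract():
    df = load_extract()
    assert {c: str(t) for c, t in df.dtypes.items()} == EXPECTED_SCHEMA

def test_null_thresholds():
    df = load_extract()
    # price has 1 null in 4 rows (0.25), so it should be surfaced for triage
    null_rates = df.isna().mean()
    breaches = null_rates[null_rates > MAX_NULL_RATE].index.tolist()
    assert breaches == ["price"]

def test_no_duplicate_keys():
    df = load_extract()
    assert not df.duplicated(subset=["site_id", "material"]).any()
```

In a real pipeline the same assertions would typically be expressed as a Great Expectations suite or run as a pytest stage in the AML/CI job.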
Model Training & Evaluation
- Verify training inputs (splits, windowing, target leakage prevention) and hyperparameter configs per site/cluster.
- Automate metric verification (e.g., MAPE/MAE/RMSE, uplift vs. last model, stability tests) with acceptance thresholds and champion/challenger logic.
- Validate feature importance stability and sensitivity/elasticity sanity checks (price/volume monotonicity where applicable).
- Gate model registration/promotion in AML based on signed test artifacts and reproducible metrics.
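Automated metric gating boils down to computing the agreed metrics and comparing the challenger against thresholds and the current champion. A hedged sketch (the threshold value and sample numbers are illustrative only):

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """Compute MAE, RMSE, and MAPE for a batch of predictions."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    err = y_pred - y_true
    return {
        "mae": float(np.mean(np.abs(err))),
        "rmse": float(np.sqrt(np.mean(err ** 2))),
        "mape": float(np.mean(np.abs(err / y_true))),  # assumes no zero targets
    }

def promote(challenger, champion, max_mape=0.20):
    """Champion/challenger gate: accept only if within threshold AND better."""
    return challenger["mape"] <= max_mape and challenger["mape"] < champion["mape"]

y_true = [100, 120, 80, 90]
champion = regression_metrics(y_true, [110, 110, 90, 95])
challenger = regression_metrics(y_true, [102, 118, 82, 91])
decision = promote(challenger, champion)
```

Gating model registration then means running this comparison in the pipeline and only registering the challenger when the decision is positive.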
Predictions, Optimization & Guardrails
- Validate batch predictions: result shapes, coverage, latency, and failure handling.
- Test model optimization outputs and enforced guardrails: detect violations and prove idempotent writes to DB.
- Verify API push to third party system (idempotency keys, retry/backoff, delivery receipts).
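Verifying an idempotent push with retry/backoff usually means asserting that all retries reuse one idempotency key and that duplicate deliveries are no-ops. A minimal sketch against a fake endpoint (the endpoint class and payload are invented for illustration):

```python
import uuid

class FakeEndpoint:
    """Stand-in third-party API: fails N times, then dedupes by idempotency key."""
    def __init__(self, failures_before_success: int):
        self.failures_left = failures_before_success
        self.accepted = {}  # idempotency_key -> payload

    def push(self, key: str, payload: dict) -> bool:
        if self.failures_left > 0:
            self.failures_left -= 1
            raise ConnectionError("transient failure")
        self.accepted.setdefault(key, payload)  # duplicate keys are no-ops
        return True

def push_with_retry(endpoint, payload, max_attempts=5):
    key = str(uuid.uuid4())  # one key for the whole logical delivery
    delay = 0.0
    for _attempt in range(max_attempts):
        try:
            return endpoint.push(key, payload), key
        except ConnectionError:
            delay = max(0.1, delay * 2)  # exponential backoff (sleep omitted in sketch)
    raise RuntimeError("delivery failed after retries")

endpoint = FakeEndpoint(failures_before_success=2)
ok, key = push_with_retry(endpoint, {"site": 1, "price": 9.99})
```

A QA harness would additionally inject faults (e.g., permanent failure) and assert the dead-letter path fires.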
Pipelines & E2E
- Build pipeline test harnesses for AML pipelines (nightly data-gen, weekly training, prediction/optimization), including orchestrated synthetic runs and fault injection (missing slice, late competitor data, SB backlog).
- Run E2E tests from raw data store -> ADLS -> AML -> RDBMS -> APIM/Frontend, asserting freshness SLOs and audit event completeness (Event Hubs -> ADLS immutable).
Automation & Tooling
- Develop Python-based automated tests (pytest) for data checks, model metrics, and API contracts; integrate with Azure DevOps (pipelines, badges, gates).
- Implement data-driven test runners (parameterized by site/material/model-version) and store signed test artifacts alongside models in AML Registry.
- Create synthetic test data generators and golden fixtures to cover edge cases (price gaps, competitor shocks, cold starts).
Reporting & Quality Ops
- Publish weekly test reports and go/no-go recommendations for promotions; maintain a defect taxonomy (data vs. model vs. serving vs. optimization).
- Contribute to SLI/SLO dashboards (prediction timeliness, queue/DLQ, push success, data drift) used for release gates.
Required Skills (hands-on experience in the following):
- Python automation (pytest, pandas, NumPy), SQL (PostgreSQL/Snowflake), and CI/CD (Azure DevOps) for fully automated ML QA.
- Strong grasp of ML validation: leakage checks, proper splits, metric selection (MAE/MAPE/RMSE), drift detection, sensitivity/elasticity sanity checks.
- Experience testing AML pipelines (pipelines/jobs/components) and message-driven integrations (Service Bus/Event Hubs).
- API test skills (FastAPI/OpenAPI, contract tests, Postman/pytest-httpx) plus idempotency and retry patterns.
- Familiar with feature stores/feature engineering concepts and reproducibility.
- Solid understanding of observability (App Insights/Log Analytics) and auditability requirements.
Required Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or related field.
- 5–7+ years in QA with 3+ years focused on ML/Data systems (data pipelines + model validation).
- Certification in Azure Data or ML Engineer Associate is a plus.
Why should you join Big Rattle?
Big Rattle Technologies specializes in AI/ML Products and Solutions as well as Mobile and Web Application Development. Our clients include Fortune 500 companies. Over the past 13 years, we have delivered multiple projects for international and Indian clients from various industries like FMCG, Banking and Finance, Automobiles, Ecommerce, etc. We also specialize in Product Development for our clients.
Big Rattle Technologies Private Limited is ISO 27001:2022 certified and CyberGRX certified.
What We Offer:
- Opportunity to work on diverse projects for Fortune 500 clients.
- Competitive salary and performance-based growth.
- Dynamic, collaborative, and growth-oriented work environment.
- Direct impact on product quality and client satisfaction.
- 5-day hybrid work week.
- Certification reimbursement.
- Healthcare coverage.
How to Apply:
Interested candidates are invited to submit their resume detailing their experience. Please detail your work experience and the kinds of projects you have worked on, highlighting your contributions and accomplishments.
Bidgely is seeking an outstanding and deeply technical Principal Engineer / Sr. Principal Engineer / Architect to lead the architecture and evolution of our next-generation data and platform infrastructure. This is a senior IC role for someone who loves solving complex problems at scale, thrives in high-ownership environments, and influences engineering direction across teams.
You will be instrumental in designing scalable and resilient platform components that can handle trillions of data points, integrate machine learning pipelines, and support advanced energy analytics. As we evolve our systems for the future of clean energy, you will play a critical role in shaping the platform that powers all Bidgely products.
Responsibilities
- Architect & Design: Lead the end-to-end architecture of core platform components – from ingestion pipelines to ML orchestration and serving layers. Architect for scale (200Bn+ daily data points), performance, and flexibility.
- Technical Leadership: Act as a thought leader and trusted advisor for engineering teams. Review designs, guide critical decisions, and set high standards for software engineering excellence.
- Platform Evolution: Define and evolve the platform’s vision, making key choices in data processing, storage, orchestration, and cloud-native patterns.
- Mentorship: Coach senior engineers and staff on architecture, engineering best practices, and system thinking. Foster a culture of engineering excellence and continuous improvement.
- Innovation & Research: Evaluate and experiment with emerging technologies (e.g., event-driven architectures, AI infrastructure, new cloud-native tools) to stay ahead of the curve.
- Cross-functional Collaboration: Partner with Engineering Managers, Product Managers, and Data Scientists to align platform capabilities with product needs.
- Non-functional Leadership: Ensure systems are secure, observable, resilient, performant, and cost-efficient. Drive excellence in areas like compliance, DevSecOps, and cloud cost optimization.
- GenAI Integration: Explore and drive adoption of Generative AI to enhance developer productivity, platform intelligence, and automation of repetitive engineering tasks.
Requirements:
- 8+ years of experience in backend/platform architecture roles, ideally with experience at scale.
- Deep expertise in distributed systems, data engineering stacks (Kafka, Spark, HDFS, NoSQL DBs like Cassandra/ElasticSearch), and cloud-native infrastructure (AWS, GCP, or Azure).
- Proven ability to architect high-throughput, low-latency systems with batch + real-time processing.
- Experience designing and implementing DAG-based data processing and orchestration systems.
- Proficient in Java (Spring Boot, REST), and comfortable with infrastructure-as-code and CI/CD practices.
- Strong understanding of non-functional areas: security, scalability, observability, and compliance.
- Exceptional problem-solving skills and a data-driven approach to decision-making.
- Excellent communication and collaboration skills with the ability to influence at all levels.
- Prior experience working in a SaaS environment is a strong plus.
- Experience with GenAI tools or frameworks (e.g., LLMs, embedding models, prompt engineering, RAG, Copilot-like integrations) to accelerate engineering workflows or enhance platform intelligence is highly desirable.
Role Overview
We are seeking a ServiceNow Product Owner with deep expertise in ServiceNow modules (CSM, ITSM, HRSD) and strong scripting and data-handling skills.
This role focuses on translating real enterprise workflows into structured, data-driven AI training tasks, helping improve reasoning and understanding within AI systems. It is not a platform configuration or app development role — instead, it blends functional ServiceNow knowledge, prompt engineering, and data design to build the next generation of intelligent enterprise models.
Key Responsibilities
· Define decision frameworks and realistic scenarios for AI reinforcement learning based on ServiceNow workflows.
· Design scenario-driven tasks mirroring ServiceNow processes like case handling, SLA tracking, and IT incident management.
· Develop and validate structured data tasks in JSON, ensuring accuracy and clarity.
· Write natural language instructions aligned with ServiceNow’s business logic and workflows.
· Use SQL queries for validation and quality checks of task data.
· Apply prompt engineering techniques to guide model reasoning.
· Collaborate with peers to expand and document cross-domain scenarios (CSM, ITSM, HRSD).
· Create and maintain documentation of scenario patterns and best practices.
Required Experience
· 4–6 years of experience with ServiceNow (CSM, ITSM, HRSD).
· Deep understanding of cases, incidents, requests, SLAs, and knowledge management processes.
· Proven ability to design realistic enterprise scenarios mapping to ServiceNow operations.
· Exposure to AI model training workflows or structured data design is a plus.
Preferred Qualifications
· ServiceNow Certified System Administrator (CSA)
· ServiceNow Certified Implementation Specialist (CIS-ITSM / CSM / HRSD)
· Exposure to AI/ML workflows or model training data preparation.
· Excellent written and verbal communication skills, with client-facing experience.
Mandatory Skills: Scripting (JavaScript, Glide Script), JSON handling, SQL, ServiceNow modules (ITSM, CSM, HRSD), and prompt engineering.
Role Overview
We are seeking a Junior Developer with 1–3 years’ experience and strong foundations in Python, databases, and AI technologies. The ideal candidate will support the development of AI-powered solutions, focusing on LLM integration, prompt engineering, and database-driven workflows. This is a hands-on role with opportunities to learn and grow into advanced AI engineering responsibilities.
Key Responsibilities
- Develop, test, and maintain Python-based applications and APIs.
- Design and optimize prompts for Large Language Models (LLMs) to improve accuracy and performance.
- Work with JSON-based data structures for request/response handling.
- Integrate and manage PostgreSQL (pgSQL) databases, including writing queries and handling data pipelines.
- Collaborate with the product and AI teams to implement new features.
- Debug, troubleshoot, and optimize performance of applications and workflows.
- Stay updated on advancements in LLMs, AI frameworks, and generative AI tools.
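JSON-based request/response handling of the kind listed above mostly amounts to parsing, validating, and reshaping nested structures. A small sketch (the response shape and field names are made up for illustration, not a specific vendor API):

```python
import json

def extract_answer(raw: str) -> dict:
    """Parse a hypothetical LLM API response and pull out the fields we need."""
    payload = json.loads(raw)
    choice = payload["choices"][0]
    return {
        "text": choice["message"]["content"].strip(),
        "model": payload.get("model", "unknown"),
        "truncated": choice.get("finish_reason") == "length",
    }

# Simulated wire response
raw = json.dumps({
    "model": "demo-llm",
    "choices": [{"message": {"content": "  42  "}, "finish_reason": "stop"}],
})
result = extract_answer(raw)
```

The extracted dict is then what would be persisted to PostgreSQL or passed to downstream workflow steps.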
Required Skills & Qualifications
- Strong knowledge of Python (scripting, APIs, data handling).
- Basic understanding of Large Language Models (LLMs) and prompt engineering techniques.
- Experience with JSON data parsing and transformations.
- Familiarity with PostgreSQL or other relational databases.
- Ability to write clean, maintainable, and well-documented code.
- Strong problem-solving skills and eagerness to learn.
- Bachelor’s degree in Computer Science, Engineering, or related field (or equivalent practical experience).
Nice-to-Have (Preferred)
- Exposure to AI/ML frameworks (e.g., LangChain, Hugging Face, OpenAI APIs).
- Experience working in startups or fast-paced environments.
- Familiarity with version control (Git/GitHub) and cloud platforms (AWS, GCP, or Azure).
What We Offer
- Opportunity to work on cutting-edge AI applications in permitting & compliance.
- Collaborative, growth-focused, and innovation-driven work culture.
- Mentorship and learning opportunities in AI/LLM development.
- Competitive compensation with performance-based growth.
We are seeking a highly skilled and experienced Senior Full Stack Developer with 8+ years of experience to join our dynamic team. The ideal candidate will have a strong background in both front-end and back-end development, with expertise in .NET, Angular, TypeScript, Azure, SQL Server, Agile methodologies, and design patterns. Experience with DocuSign is a plus.
Responsibilities:
- Design, develop, and maintain web applications using .NET, Angular, and TypeScript.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Implement and maintain cloud-based solutions using Azure.
- Develop and optimize SQL Server databases.
- Follow Agile methodologies to manage project tasks and deliverables.
- Apply design patterns and best practices to ensure high-quality, maintainable code.
- Troubleshoot and resolve software defects and issues.
- Mentor and guide junior developers.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Full Stack Developer or similar role.
- Strong proficiency in .NET, Angular, and TypeScript.
- Experience with Azure cloud services.
- Proficient in SQL Server and database design.
- Familiarity with Agile methodologies and practices.
- Solid understanding of design patterns and software architecture principles.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork abilities.
- Experience with DocuSign is a plus.
What You’ll Be Doing:
● Own the architecture and roadmap for scalable, secure, and high-quality data pipelines and platforms.
● Lead and mentor a team of data engineers while establishing engineering best practices, coding standards, and governance models.
● Design and implement high-performance ETL/ELT pipelines using modern Big Data technologies for diverse internal and external data sources.
● Drive modernization initiatives, including re-architecting legacy systems to support next-generation data products, ML workloads, and analytics use cases.
● Partner with Product, Engineering, and Business teams to translate requirements into robust technical solutions that align with organizational priorities.
● Champion data quality, monitoring, metadata management, and observability across the ecosystem.
● Lead initiatives to improve cost efficiency, data delivery SLAs, automation, and infrastructure scalability.
● Provide technical leadership on data modeling, orchestration, CI/CD for data workflows, and cloud-based architecture improvements.
Qualifications:
● Bachelor's degree in Engineering, Computer Science, or relevant field.
● 8+ years of relevant and recent experience in a Data Engineer role.
● 5+ years recent experience with Apache Spark and a solid understanding of the fundamentals.
● Deep understanding of Big Data concepts and distributed systems.
● Demonstrated ability to design, review, and optimize scalable data architectures across ingestion.
● Strong coding skills in Scala and Python, with the ability to quickly switch between them with ease.
● Advanced working SQL knowledge and experience working with a variety of relational databases such as Postgres and/or MySQL.
● Cloud experience with Databricks.
● Strong understanding of Delta Lake architecture and working with Parquet, JSON, CSV, and similar formats.
● Experience establishing and enforcing data engineering best practices, including CI/CD for data, orchestration and automation, and metadata management.
● Comfortable working in an Agile environment.
● Machine Learning knowledge is a plus.
● Demonstrated ability to operate independently, take ownership of deliverables, and lead technical decisions.
● Excellent written and verbal communication skills in English.
● Experience supporting and working with cross-functional teams in a dynamic environment.
REPORTING: This position will report to the Sr. Technical Manager or Director of Engineering, as assigned by Management.
EMPLOYMENT TYPE: Full-Time, Permanent
SHIFT TIMINGS: 10:00 AM - 07:00 PM IST
Sr Software Engineer
Company Summary :
As the recognized global standard for project-based businesses, Deltek delivers software and information solutions to help organizations achieve their purpose. Our market leadership stems from the work of our diverse employees who are united by a passion for learning, growing and making a difference.
At Deltek, we take immense pride in creating a balanced, values-driven environment, where every employee feels included and empowered to do their best work. Our employees put our core values into action daily, creating a one-of-a-kind culture that has been recognized globally. Thanks to our incredible team, Deltek has been named one of America's Best Midsize Employers by Forbes, a Best Place to Work by Glassdoor, a Top Workplace by The Washington Post and a Best Place to Work in Asia by World HRD Congress. www.deltek.com
Business Summary :
The Deltek Engineering and Technology team builds best-in-class solutions to delight customers and meet their business needs. We are laser-focused on software design, development, innovation and quality. Our team of experts has the talent, skills and values to deliver products and services that are easy to use, reliable, sustainable and competitive. If you're looking for a safe environment where ideas are welcome, growth is supported and questions are encouraged – consider joining us as we explore the limitless opportunities of the software industry.
Position Responsibilities :
About the Role
We are looking for a skilled and motivated Senior Software Developer to join our team responsible for developing and maintaining a robust ERP solution used by approximately 400 customers and more than 30,000 users worldwide. The system is built using C# (.NET Core), leverages SQL Server for data management, and is hosted in the Microsoft Azure cloud.
This role offers the opportunity to work on a mission-critical product, contribute to architectural decisions, and help shape the future of our cloud-native ERP platform.
Key Responsibilities
- Design, develop, and maintain features and modules within the ERP system using C# (.NET Core)
- Optimize and manage SQL Server database interactions for performance and scalability
- Collaborate with cross-functional teams, including QA, DevOps, Product Management, and Support
- Participate in code reviews, architecture discussions, and technical planning
- Contribute to the adoption and improvement of CI/CD pipelines and cloud deployment practices
- Troubleshoot and resolve complex technical issues across the stack
- Ensure code quality, maintainability, and adherence to best practices
- Stay current with emerging technologies and recommend improvements where applicable
Qualifications
- Curiosity, passion, teamwork, and initiative
- Strong experience with C# and .NET Core in enterprise application development
- Solid understanding of SQL Server, including query optimization and schema design
- Experience with Azure cloud services (App Services, Azure SQL, Storage, etc.)
- Ability to utilize agentic AI as a development support, with a critical thinking attitude
- Familiarity with agile development methodologies and DevOps practices
- Ability to work independently and collaboratively in a fast-paced environment
- Excellent problem-solving and communication skills
- Master's degree in Computer Science or equivalent; 5+ years of relevant work experience
- Experience with ERP systems or other complex business applications is a plus
What We Offer
- A chance to work on a product that directly impacts thousands of users worldwide
- A collaborative and supportive engineering culture
- Opportunities for professional growth and technical leadership
- Competitive salary and benefits package

Position: Full Stack Developer (PHP CodeIgniter)
Company: Mayura Consultancy Services
Experience: 2 yrs
Location: Bangalore
Skills: HTML, CSS, Bootstrap, JavaScript, AJAX, jQuery, PHP, and CodeIgniter (CI)
Work Location: Work From Home (WFH)
Website: https://www.mayuraconsultancy.com/
Requirements :
- Prior experience in Full Stack Development using PHP CodeIgniter
Perks of Working with MCS :
- Contribute to Innovative Solutions: Join a dynamic team at the forefront of software development, contributing to innovative projects and shaping the technological solutions of the organization.
- Work with Clients from across the Globe: Collaborate with clients from around the world, gaining exposure to diverse cultures and industries, and contributing to the development of solutions that address the unique needs and challenges of global businesses.
- Complete Work From Home Opportunity: Enjoy the flexibility of working entirely from the comfort of your home, empowering you to manage your schedule and achieve a better work-life balance while coding innovative solutions for MCS.
- Opportunity to Work on Projects Developing from Scratch: Engage in projects from inception to completion, working on solutions developed from scratch and having the opportunity to make a significant impact on the design, architecture, and functionality of the final product.
- Diverse Projects: Be involved in a variety of development projects, including web applications, mobile apps, e-commerce platforms, and more, allowing you to showcase your versatility as a Full Stack Developer and expand your portfolio.
Joining MCS as a Full Stack Developer opens the door to a world where your technical skills can shine and grow, all while enjoying a supportive and dynamic work environment. We're not just building solutions; we're building the future—and you can be a key part of that journey.
Job Title: PHP Coordinator / Laravel Developer
Experience: 4+ Years
Work Mode: Work From Home (WFH)
Working Days: 5 Days
Job Description:
We are looking for an experienced PHP Coordinator / Laravel Developer to join our team. The ideal candidate should have strong expertise in PHP and the Laravel framework, along with the ability to coordinate and manage development tasks effectively as Team Lead.
Key Responsibilities:
- Develop, test, and maintain web applications using PHP and Laravel.
- Coordinate with team members to ensure timely project delivery.
- Write clean, secure, and efficient code.
- Troubleshoot, debug, and optimize existing applications.
- Collaborate with stakeholders to gather and analyze requirements.
Required Skills:
- Strong experience in PHP and Laravel framework.
- Good understanding of MySQL, RESTful APIs, and cloud platforms (AWS/Azure/GCP).
- Familiarity with front-end technologies (HTML, CSS, JavaScript).
- Excellent communication and coordination skills.
- Ability to work independently in a remote environment.
Tech Stack / Requirements:
- Experience required: at least 1–2 years
- Candidates must be from an IT engineering background (B.E./B.Tech in Information Technology, Computer Science, B.Sc. IT, BCA, or related fields).
- Strong understanding of JavaScript
- Experience with React Native / Expo
- Familiarity with SQL
- Exposure to REST APIs integration
- Fast learner with strong problem-solving & debugging skills
Responsibilities:
- Build & improve mobile app features using React Native / Expo
- Develop and maintain web features using React.js / Next.js
- Integrate APIs and ensure seamless user experiences across platforms
- Collaborate with backend & design teams for end-to-end development
- Debug & optimize performance across mobile and web
- Write clean, maintainable code and ship to production regularly
- Work closely with the founding team / CTO and contribute to product launches
Growth: Performance-based growth, with significant hikes possible in the coming months.
Job Title : Informatica Cloud Developer / Migration Specialist
Experience : 6 to 10 Years
Location : Remote
Notice Period : Immediate
Job Summary :
We are looking for an experienced Informatica Cloud Developer with strong expertise in Informatica IDMC/IICS and experience in migrating from PowerCenter to Cloud.
The candidate will be responsible for designing, developing, and maintaining ETL workflows, data warehouses, and performing data integration across multiple systems.
Mandatory Skills :
Informatica IICS/IDMC, Informatica PowerCenter, ETL Development, SQL, Data Migration (PowerCenter to IICS), and Performance Tuning.
Key Responsibilities :
- Design, develop, and maintain ETL processes using Informatica IICS/IDMC.
- Work on migration projects from Informatica PowerCenter to IICS Cloud.
- Troubleshoot and resolve issues related to mappings, mapping tasks, and taskflows.
- Analyze business requirements and translate them into technical specifications.
- Conduct unit testing, performance tuning, and ensure data quality.
- Collaborate with cross-functional teams for data integration and reporting needs.
- Prepare and maintain technical documentation.
Required Skills :
- 4 to 5 years of hands-on experience in Informatica Cloud (IICS/IDMC).
- Strong experience with Informatica PowerCenter.
- Proficiency in SQL and data warehouse concepts.
- Good understanding of ETL performance tuning and debugging.
- Excellent communication and problem-solving skills.
Role & responsibilities
- Develop and maintain server-side applications using Go.
- Design and implement scalable, secure, and maintainable RESTful APIs and microservices.
- Collaborate with front-end developers to integrate user-facing elements with server-side logic.
- Optimize applications for performance, reliability, and scalability.
- Write clean, efficient, and reusable code that adheres to best practices.
Preferred candidate profile
- Minimum 5 years of working experience in Go development.
- Proven experience in developing RESTful APIs and microservices.
- Familiarity with cloud platforms like AWS, GCP, or Azure.
- Familiarity with CI/CD pipelines and DevOps practices.
About the role:
The SDE 2 - Backend will work as part of the Digitization and Automation team to help Sun King design, develop, and implement intelligent, tech-enabled solutions to a wide variety of our business problems. We are looking for candidates with an affinity for technology and automation, curiosity about advancements in products, and strong coding skills for our in-house software development team.
What you will be expected to do:
- Design and build applications/systems based on wireframes and product requirements documents
- Design and develop conceptual and physical data models to meet application requirements.
- Identify and correct bottlenecks/bugs according to operational requirements
- Focus on scalability, performance, service robustness, and cost trade-offs.
- Create prototypes and proof-of-concepts for iterative development.
- Take complete ownership of projects (end to end) and their development cycle
- Mentoring and guiding team members
- Unit test code for robustness, including edge cases, usability and general reliability
- Integrate existing tools and business systems (in-house tools or business tools like ticketing software and communication tools) with external services
- Coordinate with the Product Manager, development team & business analysts
You might be a strong candidate if you have/are:
- Development experience: 3 – 5 years
- Should be very strong in problem-solving, data structures, and algorithms.
- Deep knowledge of OOP concepts and programming skills in Core Java and the Spring Boot Framework
- Strong Experience in SQL
- Experience in web service development and integration (SOAP, REST, JSON, XML)
- Understanding of code versioning tools (e.g., git)
- Experience in Agile/Scrum development process and tools
- Experience in Microservice architecture
- Hands-on experience in AWS RDS, EC2, S3 and deployments
Good to have:
- Knowledge of messaging systems (RabbitMQ, Kafka)
- Knowledge of Python
- Container-based application deployment (Docker or equivalent)
- Willing to learn new technologies and implement them in products
What Sun King offers:
- Professional growth in a dynamic, rapidly expanding, high-social-impact industry
- An open-minded, collaborative culture made up of enthusiastic colleagues who are driven by the challenge of innovation towards profound impact on people and the planet.
- A truly multicultural experience: you will have the chance to work with and learn from people from different geographies, nationalities, and backgrounds.
- Structured, tailored learning and development programs that help you become a better leader, manager, and professional through the Sun King Center for Leadership.
About Sun King
Sun King is a leading off-grid solar energy company providing affordable, reliable electricity to the 1.8 billion people living without grid access. Operating across Africa and Asia, Sun King has connected over 20 million homes, adding 200,000 homes monthly.
Through a ‘pay-as-you-go’ model, customers make small daily payments (as low as $0.11) via mobile money or cash, eventually owning their solar equipment and saving on costly kerosene or diesel. To date, Sun King products have saved customers over $4 billion.
With 28,000 field agents and embedded electronics that regulate usage based on payments, Sun King ensures seamless energy access. Its products range from home lighting and phone charging systems to solar inverters capable of powering high-energy appliances.
Sun King is expanding into clean cooking, electric mobility, and entertainment while serving a wide range of income segments.
The company employs 2,800 staff across 12 countries, with women representing 44% of the workforce, and expertise spanning product design, data science, logistics, sales, software, and operations.
Position: Senior Data Engineer
Overview:
We are seeking an experienced Senior Data Engineer to design, build, and optimize scalable data pipelines and infrastructure to support cross-functional teams and next-generation data initiatives. The ideal candidate is a hands-on data expert with strong technical proficiency in Big Data technologies and a passion for developing efficient, reliable, and future-ready data systems.
Reporting: Reports to the CEO or designated Lead as assigned by management.
Employment Type: Full-time, Permanent
Location: Remote (Pan India)
Shift Timings: 2:00 PM – 11:00 PM IST
Key Responsibilities:
- Design and develop scalable data pipeline architectures for data extraction, transformation, and loading (ETL) using modern Big Data frameworks.
- Identify and implement process improvements such as automation, optimization, and infrastructure re-design for scalability and performance.
- Collaborate closely with Engineering, Product, Data Science, and Design teams to resolve data-related challenges and meet infrastructure needs.
- Partner with machine learning and analytics experts to enhance system accuracy, functionality, and innovation.
- Maintain and extend robust data workflows and ensure consistent delivery across multiple products and systems.
Required Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or related field.
- 10+ years of hands-on experience in Data Engineering.
- 5+ years of recent experience with Apache Spark, with a strong grasp of distributed systems and Big Data fundamentals.
- Proficiency in Scala, Python, Java, or similar languages, with the ability to work across multiple programming environments.
- Strong SQL expertise and experience working with relational databases such as PostgreSQL or MySQL.
- Proven experience with Databricks and cloud-based data ecosystems.
- Familiarity with diverse data formats such as Delta Tables, Parquet, CSV, and JSON.
- Skilled in Linux environments and shell scripting for automation and system tasks.
- Experience working within Agile teams.
- Knowledge of Machine Learning concepts is an added advantage.
- Demonstrated ability to work independently and deliver efficient, stable, and reliable software solutions.
- Excellent communication and collaboration skills in English.
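The "strong SQL expertise" bullet above centers on exactly the kind of windowed analytics query interviewers probe. As a hedged illustration (not part of the posting), here is a window-function example using Python's bundled sqlite3 module; the `sales` table and its columns are hypothetical, and SQLite stands in for PostgreSQL/MySQL so the sketch is self-contained:

```python
import sqlite3

# Hypothetical table standing in for a production relational store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, day INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("north", 1, 100.0), ("north", 2, 150.0),
     ("south", 1, 80.0), ("south", 2, 120.0)],
)

# Running total per region, ordered by day -- a classic window function.
rows = conn.execute(
    """
    SELECT region, day, amount,
           SUM(amount) OVER (PARTITION BY region ORDER BY day) AS running_total
    FROM sales
    ORDER BY region, day
    """
).fetchall()

for r in rows:
    print(r)
```

The same `SUM(...) OVER (PARTITION BY ...)` syntax carries over to PostgreSQL and MySQL 8+.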
About the Organization:
We are a leading B2B data and intelligence platform specializing in high-accuracy contact and company data to empower revenue teams. Our technology combines human verification and automation to ensure exceptional data quality and scalability, helping businesses make informed, data-driven decisions.
What We Offer:
Our workplace embraces diversity, inclusion, and continuous learning. With a fast-paced and evolving environment, we provide opportunities for growth through competitive benefits including:
- Paid Holidays and Leaves
- Performance Bonuses and Incentives
- Comprehensive Medical Policy
- Company-Sponsored Training Programs
We are an Equal Opportunity Employer, committed to maintaining a workplace free from discrimination and harassment. All employment decisions are made based on merit, competence, and business needs.
Responsibilities
Develop and maintain web and backend components using Python, Node.js, and Zoho tools
Design and implement custom workflows and automations in Zoho
Perform code reviews to maintain quality standards and best practices
Debug and resolve technical issues promptly
Collaborate with teams to gather and analyze requirements for effective solutions
Write clean, maintainable, and well-documented code
Manage and optimize databases to support changing business needs
Contribute individually while mentoring and supporting team members
Adapt quickly to a fast-paced environment and meet expectations within the first month
Selection Process
1. HR Screening: Review of qualifications and experience
2. Online Technical Assessment: Test coding and problem-solving skills
3. Technical Interview: Assess expertise in web development, Python, Node.js, APIs, and Zoho
4. Leadership Evaluation: Evaluate team collaboration and leadership abilities
5. Management Interview: Discuss cultural fit and career opportunities
6. Offer Discussion: Finalize compensation and role specifics
Experience Required
2-4 years of relevant experience as a Zoho Developer
Proven ability to work as a self-starter and contribute individually
Strong technical and interpersonal skills to support team members effectively
Required Skills:
- 4+ years of experience designing, developing, and implementing enterprise-level, n-tier, software solutions.
- Proficiency with Microsoft C# is a must.
- In-depth experience with .NET framework and .NET Core.
- Knowledge of OOP, server technologies, and SOA is a must, plus 3+ years of microservices experience.
- Relevant experience with database design and SQL (Postgres is preferred).
- Experience with ORM tooling.
- Experience delivering software that is correct, stable, and security compliant.
- Basic understanding of common cloud platforms (good to have).
- Financial services experience is strongly preferred.
- Thorough understanding of XML/JSON and related technologies.
- Thorough understanding of unit, integration, and performance testing for APIs.
- Entrepreneurial spirit. You are self-directed, innovative, and biased towards action. You love to build new things and thrive in fast-paced environments.
- Excellent communication and interpersonal skills, with an emphasis on strong writing and analytical problem-solving.
Now Hiring: Tableau Developer (Banking Domain) 🚀
We’re looking for a Tableau pro with 6+ years of experience to design and optimize dashboards for Banking & Financial Services.
🔹 Design & optimize interactive Tableau dashboards for large banking datasets
🔹 Translate KPIs into scalable reporting solutions
🔹 Ensure compliance with regulations like KYC, AML, Basel III, PCI-DSS
🔹 Collaborate with business analysts, data engineers, and banking experts
🔹 Bring deep knowledge of SQL, data modeling, and performance optimization
🌍 Location: Remote
📊 Domain Expertise: Banking / Financial Services
✨ Preferred experience with cloud data platforms (AWS, Azure, GCP) & certifications in Tableau are a big plus!
Bring your data visualization skills to transform banking intelligence & compliance reporting.
About the Role
We are seeking motivated Data Engineering Interns to join our team remotely for a 3-month internship. This role is designed for students or recent graduates interested in working with data pipelines, ETL processes, and big data tools. You will gain practical experience in building scalable data solutions. While this is an unpaid internship, interns who successfully complete the program will receive a Completion Certificate and a Letter of Recommendation.
Responsibilities
- Assist in designing and building data pipelines for structured and unstructured data.
- Support ETL (Extract, Transform, Load) processes to prepare data for analytics.
- Work with databases (SQL/NoSQL) for data storage and retrieval.
- Help optimize data workflows for performance and scalability.
- Collaborate with data scientists and analysts to ensure data quality and consistency.
- Document workflows, schemas, and technical processes.
Requirements
- Strong interest in data engineering, databases, and big data systems.
- Basic knowledge of SQL and relational database concepts.
- Familiarity with Python, Java, or Scala for data processing.
- Understanding of ETL concepts and data pipelines.
- Exposure to cloud platforms (AWS, Azure, or GCP) is a plus.
- Familiarity with big data frameworks (Hadoop, Spark, Kafka) is an advantage.
- Good problem-solving skills and ability to work independently in a remote setup.
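The ETL concepts in the requirements above can be sketched in a few lines. This is an illustrative toy, not project code: the CSV payload, the `payments` table, and the cleaning rule are all hypothetical, and SQLite keeps the sketch self-contained:

```python
import csv
import io
import sqlite3

# Extract: a raw CSV feed (hypothetical payload; row 2 is missing its amount).
raw = "user_id,amount\n1,10.5\n2,\n3,7.25\n"

# Transform: parse and drop records with a missing amount.
rows = []
for rec in csv.DictReader(io.StringIO(raw)):
    if rec["amount"]:
        rows.append((int(rec["user_id"]), float(rec["amount"])))

# Load: write the cleaned records into a relational store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (user_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO payments VALUES (?, ?)", rows)

total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
print(total)
```

Production pipelines swap the in-memory pieces for real sources, sinks, and an orchestrator, but the extract-transform-load shape is the same.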
What You’ll Gain
- Hands-on experience in data engineering and ETL pipelines.
- Exposure to real-world data workflows.
- Mentorship and guidance from experienced engineers.
- Completion Certificate upon successful completion.
- Letter of Recommendation based on performance.
Internship Details
- Duration: 3 months
- Location: Remote (Work from Home)
- Stipend: Unpaid
- Perks: Completion Certificate + Letter of Recommendation
About Data Axle:
Data Axle Inc. has been an industry leader in data, marketing solutions, sales, and research for over 50 years in the USA. Data Axle now has an established strategic global centre of excellence in Pune. This centre delivers mission-critical data services to its global customers, powered by its proprietary cloud-based technology platform and by leveraging proprietary business and consumer databases.
Data Axle Pune is pleased to have achieved certification as a Great Place to Work!
Roles & Responsibilities:
We are looking for a Data Scientist to join the Data Science Client Services team to continue our success of identifying high quality target audiences that generate profitable marketing return for our clients. We are looking for experienced data science, machine learning and MLOps practitioners to design, build and deploy impactful predictive marketing solutions that serve a wide range of verticals and clients. The right candidate will enjoy contributing to and learning from a highly talented team and working on a variety of projects.
We are looking for a Senior Data Scientist who will be responsible for:
- Ownership of design, implementation, and deployment of machine learning algorithms in a modern Python-based cloud architecture
- Design or enhance ML workflows for data ingestion, model design, model inference and scoring
- Oversight on team project execution and delivery
- As a senior team member, establish peer-review guidelines for high-quality coding that support junior team members’ skill growth, cross-training, and team efficiency
- Visualize and publish model performance results and insights to internal and external audiences
Qualifications:
- Master’s degree in a relevant quantitative, applied field (Statistics, Econometrics, Computer Science, Mathematics, Engineering)
- Minimum of 3.5 years of work experience in the end-to-end lifecycle of ML model development and deployment into production within a cloud infrastructure (Databricks is highly preferred)
- Proven ability to manage the output of a small team in a fast-paced environment and to lead by example in the fulfilment of client requests
- Exhibit deep knowledge of core mathematical principles relating to data science and machine learning (ML Theory + Best Practices, Feature Engineering and Selection, Supervised and Unsupervised ML, A/B Testing, etc.)
- Proficiency in Python and SQL required; PySpark/Spark experience a plus
- Ability to conduct a productive peer review and maintain proper code structure in GitHub
- Proven experience developing, testing, and deploying various ML algorithms (neural networks, XGBoost, Bayes, and the like)
- Working knowledge of modern CI/CD methods
This position description is intended to describe the duties most frequently performed by an individual in this position. It is not intended to be a complete list of assigned duties but to describe a position level.
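For context on the model lifecycle the role above describes (train, score, evaluate), here is a deliberately tiny sketch using closed-form least squares on made-up data. Every value is hypothetical, and in practice this would be a real ML algorithm developed and deployed on a platform such as Databricks:

```python
# Train: fit a 1-D least-squares line to toy data (roughly y = 2x).
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]

n = len(xs)
mean_x, mean_y = sum(xs) / n, sum(ys) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x

# Score: generate predictions from the fitted model.
preds = [slope * x + intercept for x in xs]

# Evaluate: mean squared error on the training data.
mse = sum((p - y) ** 2 for p, y in zip(preds, ys)) / n
print(slope, intercept, mse)
```

The fit/score/evaluate loop generalizes directly to the neural networks and XGBoost models the posting mentions; only the estimator changes.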
Job Description: .NET + Angular Full Stack Developer
Position: Full Stack Developer (.NET + Angular)
Experience: 3 – 5 Years
About the Role
We are looking for a highly skilled .NET Angular Full Stack Developer to join our dynamic team. The ideal candidate should have strong expertise in both back-end and front-end development, hands-on experience with .NET Core and Angular, and a passion for building scalable, secure, and high-performance applications.
Key Responsibilities
- Design, develop, and maintain scalable, high-quality web applications using .NET Core 8, ASP.NET MVC, Web API, and Angular 13+.
- Build and integrate RESTful APIs and ensure seamless communication between front-end and back-end services.
- Develop, optimize, and maintain SQL Server (2012+) databases, ensuring high availability, performance, and reliability.
- Write complex stored procedures, functions, triggers, and perform query tuning and indexing for performance optimization.
- Work with Entity Framework/EF Core to implement efficient data access strategies.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Implement OAuth 2.0 authentication/authorization for secure access control.
- Write clean, testable, and maintainable code following Test-Driven Development (TDD) principles.
- Use GIT / TFVC for version control and collaborate using Azure DevOps Services for CI/CD pipelines.
- Participate in code reviews, troubleshoot issues, and optimize application performance.
- Stay updated with emerging technologies and recommend improvements to enhance system architecture.
Required Technical Skills
- 3+ years of experience in .NET development (C#, .NET Core 8, ASP.NET MVC, Web API).
- Strong experience in SQL Server development including:
- Query tuning, execution plan analysis, and performance optimization.
- Designing and maintaining indexes, partitioning strategies, and database normalization.
- Handling large datasets and optimizing stored procedures for scalability.
- Experience with SQL Profiler, Extended Events, and monitoring tools.
- Proficiency in Entity Framework / EF Core for ORM-based development.
- Familiarity with PostgreSQL and cross-database integration is a plus.
- Expertise in Angular 13+, HTML5, CSS, TypeScript, JavaScript, and Bootstrap.
- Experience with REST APIs development and integration.
- Knowledge of OAuth 2.0 and secure authentication methods.
- Hands-on experience with GIT/TFVC and Azure DevOps for source control and CI/CD pipelines.
- Basic knowledge of Node.js framework is a plus.
- Experience with unit testing frameworks like NUnit, MSTest, etc.
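The query-tuning and indexing bullets above describe a concrete, checkable skill: reading an execution plan before and after adding an index. As a hedged, engine-agnostic illustration, Python's sqlite3 and `EXPLAIN QUERY PLAN` stand in for SQL Server's execution-plan analysis; the `orders` table and index name are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(i, i % 100) for i in range(1000)],
)

query = "SELECT COUNT(*) FROM orders WHERE customer_id = 42"

# Plan before indexing: the engine must scan the whole table.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")

# Plan after indexing: the engine can do an index lookup instead.
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(before[0][-1])  # table scan
print(after[0][-1])   # index lookup
```

SQL Server's equivalents are the graphical/actual execution plans and DMVs, but the before/after discipline is the same.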
Soft Skills
- Strong problem-solving and analytical skills, particularly in debugging performance bottlenecks.
- Excellent communication and collaboration abilities.
- Ability to work independently and in a team environment.
- Attention to detail and a passion for writing clean, scalable, and optimized code.
Full Stack Engineer (Frontend Strong, Backend Proficient)
5-10 Years Experience
Contract: 6 months + extendable
Location: Remote
Technical Requirements
Frontend Expertise (Strong)
*Need at least 4 years in React web development, Node & AI.*
● Deep proficiency in React, Next.js, TypeScript
● Experience with state management (Redux, Context API)
● Frontend testing expertise (Jest, Cypress)
● Proven track record of achieving high Lighthouse performance scores
Backend Proficiency
● Solid experience with Node.js, NestJS (preferred), or ExpressJS
● Database management (SQL, NoSQL)
● Cloud technologies experience (AWS, Azure)
● Understanding of OpenAI and AI integration capabilities (bonus)
Full Stack Integration
● Excellent ability to manage and troubleshoot integration issues between frontend and backend systems
● Experience designing cohesive systems with proper separation of concerns
Real people. Real service.
At SupplyHouse.com, we value every individual team member and cultivate a community where people come first. Led by our core values of Generosity, Respect, Innovation, Teamwork, and GRIT, we’re dedicated to maintaining a supportive work environment that celebrates diversity and empowers everyone to reach their full potential. As an industry-leading e-commerce company specializing in HVAC, plumbing, heating, and electrical supplies since 2004, we strive to foster growth while providing the best possible experience for our customers.
Through an Employer of Record (EOR), we are looking for a new, remote Backend Engineer in India to join our growing IT Team. This individual will report into our Full Stack Team Lead and have the opportunity to work on impactful projects that enhance our e-commerce platform and internal operations, while honing your skills in backend and full stack development. If you’re passionate about creating user-friendly interfaces, building scalable systems, and contributing to innovative solutions in a collaborative and fun environment, we’d love to hear from you!
Role Type: Full-Time
Location: Remote from India
Schedule: Monday through Friday, 4:00 a.m. – 1:00 p.m. U.S. Eastern Time / 12:00 p.m. – 9:00 p.m. Indian Standard Time to ensure effective collaboration
Base Salary: $25,000 - $30,000 USD per year
Responsibilities:
- Collaborate with cross-functional teams to gather and refine requirements, ensuring alignment with business needs.
- Design, develop, test, deploy, and maintain scalable, high-performance software applications.
- Develop and enhance internal tools and applications to improve company operations.
- Ensure system reliability, optimize application performance, and implement best practices for scalability.
- Continuously improve existing codebases, conducting code reviews, and implementing modern practices.
- Stay up to date with emerging technologies, trends, and best practices in software development.
Requirements:
- Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field.
- 3+ years of hands-on experience in backend and/or full-stack development with a proven track record of delivering high-quality software.
Back-End Skills:
- Proficiency in Java and experience with back-end frameworks like Spring Boot.
- Strong understanding of database design, RDBMS concepts, and experience with SQL.
- Knowledge of RESTful API design and integration.
Development Lifecycle: Proven ability to contribute across the entire software development lifecycle, including planning, design, coding, testing, deployment, and maintenance.
Tools & Practices:
- Familiarity with version control systems, like Git, and CI/CD pipelines.
- Experience with agile development methodologies.
Additional Skills:
- Strong problem-solving and debugging capabilities.
- Ability to create reusable code libraries and write clean, maintainable code.
- Strong communication and collaboration skills to work effectively within a team and across departments.
- High-level proficiency of written and verbal communication in English.
Preferred Qualifications:
- Proficiency in HTML5, CSS3, JavaScript (ES6+), and responsive design principles.
- Expertise in modern JavaScript frameworks and libraries such as React, Angular, or Vue.js.
- Experience with cross-browser compatibility and performance optimization techniques.
- Experience working on Frontend responsibilities such as:
- Designing and implementing reusable, maintainable UI components and templates.
- Working closely with Designers to ensure technical feasibility and adherence to UI/UX design standards.
- Managing and updating promotional banners and site-wide templates to ensure timely execution of marketing initiatives.
Why work with us:
- We have awesome benefits – We offer a wide variety of benefits to help support you and your loved ones. These include: Comprehensive and affordable medical, dental, vision, and life insurance options; Competitive Provident Fund contributions; Paid casual and sick leave, plus country-specific holidays; Mental health support and wellbeing program; Company-provided equipment and one-time $250 USD work from home stipend; $750 USD annual professional development budget; Company rewards and recognition program; And more!
- We promote work-life balance – We value your time and encourage a healthy separation between your professional and personal life to feel refreshed and recharged. Look out for our 100% remote schedule and wellness initiatives!
- We support growth – We strive to innovate every day. In an exciting and evolving industry, we provide potential for career growth through our hands-on training, access to the latest technologies and tools, diversity and inclusion initiatives, opportunities for internal mobility, and professional development budget.
- We give back – We live and breathe our core value, Generosity, by giving back to the trades and organizations around the world. We make a difference through donation drives, employee-nominated contributions, support for DE&I organizations, and more.
- We listen – We value hearing from our employees. Everyone has a voice, and we encourage you to use it! We actively elicit feedback through our monthly town halls, regular 1:1 check-ins, and company-wide ideas form to incorporate suggestions and ensure our team enjoys coming to work every day.
Check us out and learn more at https://www.supplyhouse.com/our-company!
Additional Details:
- Remote employees are expected to work in a distraction-free environment. Personal devices, background noise, and other distractions should be kept to a minimum to avoid disrupting virtual meetings or business operations.
- SupplyHouse.com is an Equal Opportunity Employer, strongly values inclusion, and encourages individuals of all backgrounds and experiences to apply for this position.
- To ensure fairness, all application materials, assessments, and interview responses must reflect your own original work. The use of AI tools, plagiarism, or any uncredited assistance is not permitted at any stage of the hiring process and may result in disqualification. We appreciate your honesty and look forward to seeing your skills.
- We are committed to providing a safe and secure work environment and conduct thorough background checks on all potential employees in accordance with applicable laws and regulations.
- All emails from the SupplyHouse team will only be sent from an @supplyhouse.com email address. Please exercise caution if you receive an email from an alternate domain.
What is an Employer of Record (EOR)?
Through our partnership with Remote.com, a global Employer of Record (EOR), you can join SupplyHouse from home, while knowing your employment is handled compliantly and securely. Remote takes care of the behind-the-scenes details – like payroll, benefits, taxes, and local compliance – so you can focus on your work and career growth. Even though Remote manages these administrative functions, you’ll be a part of the SupplyHouse team: connected to our culture, collaborating with colleagues, and contributing to our shared success. This partnership allows us to welcome talented team members worldwide while ensuring you receive a best-in-class employee experience.
We seek a highly skilled and experienced Ruby on Rails Development Team Lead/Architect to join our dynamic team at Uphance. The ideal candidate will have proven expertise in leading and architecting RoR projects, focusing on building scalable, high-quality applications. This role requires a combination of technical leadership, mentorship, and a strong commitment to best practices in software development.
Job Type: Contract/Remote/Full-Time/Long-term
Responsibilities:
- Develop and maintain high-quality Ruby on Rails applications that meet our high-quality standards.
- Design, build, and maintain efficient, reusable, and reliable Ruby code.
- Utilise your expertise in Ruby on Rails to enhance the performance and reliability of our platform.
- Set the technical direction for the existing RoR project, including system architecture and technology stack decisions.
- Guide and mentor team members to enhance their technical skills and understanding of RoR best practices.
- Conduct code reviews to maintain high coding standards and ensure adherence to best practices.
- Optimise application performance, focusing on ActiveRecord queries and overall architecture.
- Tackle complex technical challenges and provide efficient solutions, particularly when specifications are unclear or incomplete.
- Establish and enforce testing protocols; write and guide the team in writing effective tests.
- Define and ensure consistent adherence to best practices, particularly in the context of large applications.
- Manage the development process using Agile methodologies, possibly acting as a Scrum Master if required.
- Work closely with product managers, designers, and other stakeholders to meet project requirements and timelines.
Technical Requirements and Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- Proven experience with Ruby on Rails, MySQL, HTML, and JavaScript (6+ years)
- Extensive experience with Ruby on Rails and familiarity with its best practices
- Proven track record of technical leadership and team management
- Strong problem-solving skills and the ability to address issues with incomplete specifications
- Proficiency in performance optimisation and software testing
- Experience with Agile development and Scrum practices
- Excellent mentoring and communication skills
- Experience with large-scale application development
- Application performance monitoring/tuning
General Requirements:
- Availability to work during IST working hours.
- High-speed Internet and the ability to join technical video meetings during business hours.
- Strong analytical and problem-solving skills and ability to work as part of multi-functional teams.
- Ability to collaborate and be a team player.
Why Uphance?
- Engage in Innovative Projects: Immerse yourself in cutting-edge projects that not only test your skills but also encourage the exploration of new design realms.
- AI-Integrated Challenges: Take on projects infused with AI, pushing the boundaries of your abilities and allowing for exploration in uncharted territories of software design and development.
- Flexible Work Environment: Whether you embrace the digital nomad lifestyle or prefer the comfort of your own space, Uphance provides the freedom to design and create from any corner of the globe.
- Inclusive Team Environment: Join a dynamic, international, and inclusive team that values and celebrates diverse ideas.
- Collaborative Team Dynamics: Become a part of a supportive and motivated team that shares in the celebration of collective successes.
- Recognition and Appreciation: Your accomplishments will be acknowledged and applauded regularly in our Recognition Rally.
Compensation:
Salary Range: INR 24 LPA to INR 32 LPA (Salary is not a constraint for the right candidate)
At Uphance, we value innovation, collaboration, and continuous learning. As part of our team, you'll have the opportunity to lead a group of talented RoR developers, contribute to exciting projects, and play a key role in our company's success. If you are passionate about Ruby on Rails and thrive in a leadership role, we would love to hear from you. Apply today and follow us on LinkedIn: https://www.linkedin.com/company/uphance
Must have skills:
1. GCP - GCS, Pub/Sub, Dataflow or Dataproc, BigQuery, Airflow/Composer, Python (preferred)/Java
2. ETL on GCP - building pipelines (Python/Java), scripting, best practices, and common challenges
3. Knowledge of batch and streaming data ingestion; ability to build end-to-end data pipelines on GCP
4. Knowledge of databases (SQL, NoSQL), on-premise and on-cloud; SQL vs. NoSQL trade-offs; types of NoSQL databases (at least 2 databases)
5. Data warehouse concepts - beginner to intermediate level
Role & Responsibilities:
● Work with business users and other stakeholders to understand business processes.
● Design and implement dimensional and fact tables.
● Identify and implement data transformation/cleansing requirements.
● Develop highly scalable, reliable, and high-performance data processing pipelines to extract, transform, and load data from various systems to the Enterprise Data Warehouse.
● Develop conceptual, logical, and physical data models with associated metadata, including data lineage and technical data definitions.
● Design, develop, and maintain ETL workflows and mappings using the appropriate data load technique.
● Provide research, high-level design, and estimates for data transformation and data integration from source applications to end-user BI solutions.
● Provide production support of ETL processes to ensure timely completion and availability of data in the data warehouse for reporting use.
● Analyze and resolve problems and provide technical assistance as necessary. Partner with the BI team to evaluate, design, and develop BI reports and dashboards according to functional specifications while maintaining data integrity and data quality.
● Work collaboratively with key stakeholders to translate business information needs into well-defined data requirements to implement the BI solutions.
● Leverage transactional data from ERP, CRM, and HRIS applications to model, extract, and transform it into reporting and analytics.
● Define and document the use of BI through user experiences/use cases and prototypes; test and deploy BI solutions.
● Develop and support data governance processes; analyze data to identify and articulate trends, patterns, outliers, and quality issues; continuously validate reports and dashboards and suggest improvements.
● Train business end-users, IT analysts, and developers.
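The dimensional-modelling responsibilities above (fact and dimension tables feeding BI rollups) can be sketched with Python's stdlib sqlite3. Table and column names are illustrative only; a real warehouse would use BigQuery or similar.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A minimal star schema: one dimension table, one fact table keyed to it.
cur.executescript("""
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT NOT NULL,
    category     TEXT NOT NULL
);
CREATE TABLE fact_sales (
    sale_id     INTEGER PRIMARY KEY,
    product_key INTEGER NOT NULL REFERENCES dim_product(product_key),
    sale_date   TEXT NOT NULL,
    amount      REAL NOT NULL
);
""")

cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware"),
                 (3, "Manual", "Books")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(1, 1, "2024-01-05", 100.0), (2, 2, "2024-01-06", 250.0),
                 (3, 3, "2024-01-06", 40.0)])

# A typical BI rollup: revenue by category, sliced via dimension attributes.
rows = cur.execute("""
    SELECT d.category, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_product d USING (product_key)
    GROUP BY d.category
    ORDER BY revenue DESC
""").fetchall()
print(rows)  # [('Hardware', 350.0), ('Books', 40.0)]
```

The point of the star layout is exactly this query shape: facts are summed, dimensions supply the grouping attributes.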
We are seeking a skilled SQL Developer to join our team. This role serves as a key bridge between insurance operations and technical solutions, ensuring business requirements are accurately translated into efficient system functionality. The SQL Developer will play a critical part in maintaining and enhancing underwriting software products and system integrations—helping deliver reliable, high-quality solutions to clients in the insurtech space.
The ideal candidate possesses strong SQL expertise, advanced data mapping capabilities, and hands-on experience working with APIs, JSON, XML, and other data exchange formats. Experience with insurance technology platforms, such as ConceptOne or similar underwriting systems, is preferred. In this role, you will regularly develop, maintain, and troubleshoot stored procedures and functions, perform data validation, support integration efforts across multiple systems, and configure insurance workflows. You will work closely with business analysts, underwriters, and technical teams to ensure smooth product updates and continuous improvement of system functionality.
What We’re Looking For:
- 3+ years of experience in a technical, insurance, or insurtech-focused role
- Strong proficiency in writing SQL, including complex queries, stored procedures, and performance tuning
- Expertise in data mapping, data validation, and reporting
- Experience working with APIs, JSON, XML, and system-to-system integrations
- Strong analytical and problem-solving skills with the ability to troubleshoot and optimize complex workflows
- Clear and effective communication skills, able to translate technical concepts for non-technical stakeholders
- Ability to work independently and manage multiple tasks in a fast-paced environment
- Keen attention to detail and commitment to delivering accurate, high-quality results
Bonus:
- Hands-on experience with underwriting or policy administration systems (e.g., ConceptOne or similar platforms)
- Familiarity with core insurance processes, such as policy issuance, endorsements, raters, claims, and reporting
- Experience with U.S. P&C (Property & Casualty) insurance
What You’ll Be Doing:
- Develop and optimize SQL stored procedures, functions, and triggers to support underwriting and compliance requirements
- Create and maintain reports, quote covers, and validations; map and configure forms, raters, and system workflows to ensure accurate data processing
- Set up, troubleshoot, and optimize underwriting platforms (ConceptOne/others) for performance and accuracy
- Manage integrations with APIs, JSON, and XML to connect external services and streamline data exchange
- Collaborate with BAs, QAs, and Developers to translate requirements, test outputs, and resolve issues
- Provide technical support and training to internal teams and clients to ensure effective system usage
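Much of the data-mapping work described above comes down to validating an inbound JSON payload and mapping it into relational tables. A minimal sketch with stdlib json and sqlite3 follows; the actual platform (ConceptOne or similar) would use SQL Server stored procedures, and all field names here are invented for illustration.

```python
import json
import sqlite3

REQUIRED = {"policy_number", "insured_name", "premium"}

def load_policy(cur: sqlite3.Cursor, payload: str) -> None:
    """Validate one inbound JSON policy record and map it into a table."""
    record = json.loads(payload)
    missing = REQUIRED - record.keys()
    if missing:
        raise ValueError(f"payload missing required fields: {sorted(missing)}")
    premium = float(record["premium"])  # raises ValueError on bad input
    if premium < 0:
        raise ValueError("premium must be non-negative")
    cur.execute(
        "INSERT INTO policy (policy_number, insured_name, premium) VALUES (?, ?, ?)",
        (record["policy_number"], record["insured_name"], premium),
    )

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute(
    "CREATE TABLE policy (policy_number TEXT PRIMARY KEY, insured_name TEXT, premium REAL)"
)
load_policy(cur, '{"policy_number": "P-1001", "insured_name": "Acme LLC", "premium": "1250.50"}')
print(cur.execute("SELECT * FROM policy").fetchall())
```

Rejecting bad payloads before the INSERT, rather than after, is what keeps downstream reports and compliance outputs trustworthy.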
Your Impact
- Build scalable backend services.
- Design, implement, and maintain databases, ensuring data integrity, security, and efficient retrieval.
- Implement the core logic that makes applications work, handling data processing, user requests, and system operations
- Contribute to the architecture and design of new product features
- Optimize systems for performance, scalability, and security
- Stay up-to-date with new technologies and frameworks, contributing to the advancement of software development practices
- Work closely with product managers and designers to turn ideas into reality and shape the product roadmap.
What skills do you need?
- 4+ years of experience in backend development, especially building robust APIs using Node.js, Express.js, and MySQL
- Strong command of JavaScript and understanding of its quirks and best practices
- Ability to think strategically when designing systems—not just how to build, but why
- Exposure to system design and interest in building scalable, high-availability systems
- Prior work on B2C applications with a focus on performance and user experience
- Ability to ensure applications handle increasing loads and maintain performance under heavy traffic
- Comfort working with complex queries for sophisticated data manipulation, analysis, and reporting
- Knowledge of Sequelize, MongoDB and AWS would be an advantage.
- Experience in optimizing backend systems for speed and scalability.
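As a sketch of the "complex queries" point above: window functions let one query answer "latest order per user, plus that user's lifetime spend" without self-joins. The role uses MySQL with Node.js; stdlib sqlite3 is used here only to keep the snippet self-contained, and the schema is invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (user_id INTEGER, placed_at TEXT, total REAL)")
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)", [
    (1, "2024-03-01", 50.0), (1, "2024-03-08", 80.0),
    (2, "2024-03-02", 20.0), (2, "2024-03-09", 35.0), (2, "2024-03-10", 15.0),
])

# Latest order per user plus running lifetime spend, in one pass:
# SUM() and ROW_NUMBER() window functions over the same partition.
rows = cur.execute("""
    SELECT user_id, placed_at, total, lifetime
    FROM (
        SELECT user_id, placed_at, total,
               SUM(total) OVER (PARTITION BY user_id) AS lifetime,
               ROW_NUMBER() OVER (PARTITION BY user_id
                                  ORDER BY placed_at DESC) AS rn
        FROM orders
    )
    WHERE rn = 1
    ORDER BY user_id
""").fetchall()
print(rows)
```

The same query runs on MySQL 8+ unchanged; avoiding the equivalent correlated-subquery form is a common performance win under heavy traffic.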

Backend Engineering Intern (Infrastructure Software) – Remote
Position Type: Internship (Full-Time or Part-Time)
Location: Remote
Duration: 12 weeks
Compensation: Unpaid
About the Role
We are seeking a motivated Backend Developer Intern to join our engineering team and contribute to building scalable, efficient, and secure backend services. This internship offers hands-on experience in API development, database management, and backend architecture, with guidance from experienced developers. You will work closely with cross-functional teams to deliver features that power our applications and improve user experience.
Responsibilities
- Assist in designing, developing, and maintaining backend services, APIs, and integrations.
- Collaborate with frontend engineers to support application functionality and data flow.
- Write clean, efficient, and well-documented code.
- Support database design, optimization, and query performance improvements.
- Participate in code reviews, debugging, and troubleshooting production issues.
- Assist with unit testing, integration testing, and ensuring system reliability.
- Work with cloud-based environments (e.g., AWS, Azure, GCP) to deploy and manage backend systems.
Requirements
- Currently pursuing or recently completed a degree in Computer Science, Software Engineering, or related field.
- Familiarity with one or more backend languages/frameworks (e.g., Node.js, Python/Django, Java/Spring Boot, Ruby on Rails).
- Understanding of RESTful APIs and/or GraphQL.
- Basic knowledge of relational and/or NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB).
- Familiarity with version control (Git/GitHub).
- Strong problem-solving skills and attention to detail.
- Ability to work independently in a remote, collaborative environment.
Preferred Skills (Nice to Have)
- Experience with cloud services (AWS Lambda, S3, EC2, etc.).
- Familiarity with containerization (Docker) and CI/CD pipelines.
- Basic understanding of authentication and authorization (OAuth, JWT).
- Interest in backend performance optimization and scalability.
What You’ll Gain
- Hands-on experience building backend systems for real-world applications.
- Exposure to industry-standard tools, workflows, and coding practices.
- Mentorship from experienced backend engineers.
- Opportunity to contribute to live projects impacting end users.
Springer Capital is a cross-border asset management firm specializing in real estate investment banking between China and the USA. We are offering a remote internship for aspiring data engineers interested in data pipeline development, data integration, and business intelligence.
The internship offers flexible start and end dates. A short quiz or technical task may be required as part of the selection process.
Responsibilities:
- Design, build, and maintain scalable data pipelines for structured and unstructured data sources
- Develop ETL processes to collect, clean, and transform data from internal and external systems
- Support integration of data into dashboards, analytics tools, and reporting systems
- Collaborate with data analysts and software developers to improve data accessibility and performance
- Document workflows and maintain data infrastructure best practices
- Assist in identifying opportunities to automate repetitive data tasks
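The "automate repetitive data tasks" item above often starts as small as this: normalising a messy CSV export before it reaches a dashboard. A minimal stdlib sketch, with invented column names:

```python
import csv
import io

def clean_rows(raw_csv: str) -> list[dict]:
    """Normalise a messy CSV export: trim whitespace, lowercase emails,
    and drop duplicate rows. Column names are illustrative."""
    seen = set()
    cleaned = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        record = {
            "name": row["name"].strip(),
            "email": row["email"].strip().lower(),
        }
        key = (record["name"], record["email"])
        if key in seen:
            continue  # skip rows that normalise to an already-seen record
        seen.add(key)
        cleaned.append(record)
    return cleaned

raw = "name,email\n Ada , ADA@EXAMPLE.COM\nAda,ada@example.com\nGrace,grace@example.com\n"
print(clean_rows(raw))
```

Scripted once, a cleanup like this replaces a recurring manual spreadsheet chore and makes the result reproducible.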
Job description:
6+ years of hands-on experience with both manual and automated testing, with a strong preference for experience using AccelQ on Salesforce and SAP platforms.
Proven expertise in Salesforce, particularly within the Sales Cloud module.
Proficient in writing complex SOQL and SQL queries for data validation and backend testing.
Extensive experience in designing and developing robust, reusable automated test scripts for Salesforce environments.
Highly skilled at early issue detection, with a deep understanding of backend configurations, process flows, and validation rules.
Should have a strong background in Salesforce testing, with hands-on experience in automation tools such as Selenium, Provar, or TestNG.
You will be responsible for creating and maintaining automated test scripts, executing test cases, identifying bugs, and ensuring the quality and reliability of Salesforce applications.
A solid understanding of Salesforce modules (Sales Cloud, Service Cloud, etc.) and APIs is essential.
Experience with CI/CD tools like Jenkins and version control systems like Git is preferred.
You will work closely with developers, business analysts, and stakeholders to define test strategies and improve the overall QA process.
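The backend data-validation work mentioned above (comparing what a SOQL query returns against another system's records) can be sketched as a simple reconciliation step. System names and fields here are illustrative, not the posting's actual setup.

```python
def reconcile(source: list[dict], target: list[dict], key: str) -> dict:
    """Compare records pulled from two backends and report what an
    automated validation step should flag: keys missing on either side
    and records whose fields disagree."""
    src = {r[key]: r for r in source}
    tgt = {r[key]: r for r in target}
    return {
        "missing_in_target": sorted(src.keys() - tgt.keys()),
        "unexpected_in_target": sorted(tgt.keys() - src.keys()),
        "mismatched": sorted(
            k for k in src.keys() & tgt.keys() if src[k] != tgt[k]
        ),
    }

# e.g. opportunities from a SOQL query vs. the records synced into SAP
salesforce = [{"Id": "001", "Stage": "Closed Won"}, {"Id": "002", "Stage": "Open"}]
sap = [{"Id": "001", "Stage": "Closed Won"}, {"Id": "003", "Stage": "Open"}]
print(reconcile(salesforce, sap, "Id"))
```

An automation suite would run a check like this after each sync and fail the build on any non-empty bucket.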