50+ Remote SQL Jobs in India
Apply to 50+ Remote SQL Jobs on CutShort.io. Find your next job, effortlessly. Browse SQL Jobs and apply today!

We’re looking for a dynamic and driven Data Analyst to join our team of technology enthusiasts. This role is crucial in transforming data into insights that support strategic decision-making and innovation within the insurance technology (InsurTech) space. If you’re passionate about working with data, understanding systems, and delivering value through analytics, we’d love to hear from you.
What We’re Looking For
- Proven experience working as a Data Analyst or in a similar analytical role
- 3+ years of experience in the field
- Strong command of SQL for querying and manipulating relational databases
- Experience with Power BI for building impactful dashboards and reports
- Familiarity with QlikView and Qlik Sense is a plus
- Ability to communicate findings clearly to technical and non-technical stakeholders
- Knowledge of Python or R for data manipulation is nice to have
- Bachelor’s degree in Computer Science, Statistics, Mathematics, Economics, or a related field
- Understanding of the insurance industry or InsurTech is a strong advantage
What You’ll Be Doing
- Delivering timely and insightful reports to support strategic decision-making
- Working extensively with Policy Administration System (PAS) data to uncover patterns and trends
- Ensuring data accuracy and consistency across reports and systems
- Collaborating with clients, underwriters, and brokers to translate business needs into data solutions
- Organizing and structuring datasets, contributing to data engineering workflows and pipelines
- Producing analytics to support business development and market strategy
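As a rough illustration of the SQL work this role involves, here is a minimal sketch of a trend query over a hypothetical policies table, using Python's built-in sqlite3 as a stand-in database. The table and column names are invented for illustration and are not taken from any real PAS schema.

```python
import sqlite3

# Hypothetical policies table; names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE policies (id INTEGER, product TEXT, premium REAL, issued_month TEXT);
INSERT INTO policies VALUES
  (1, 'motor', 1200.0, '2024-01'),
  (2, 'motor', 1350.0, '2024-02'),
  (3, 'home',   800.0, '2024-01'),
  (4, 'home',   950.0, '2024-02');
""")

# Monthly premium per product -- the kind of aggregation behind a trend report.
rows = conn.execute("""
    SELECT product, issued_month, SUM(premium) AS total_premium
    FROM policies
    GROUP BY product, issued_month
    ORDER BY product, issued_month
""").fetchall()

for product, month, total in rows:
    print(f"{product} {month}: {total:.2f}")
```

The same GROUP BY pattern scales to any relational backend; only the connection setup changes.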

We are seeking a highly skilled Fabric Data Engineer with strong expertise in Azure ecosystem to design, build, and maintain scalable data solutions. The ideal candidate will have hands-on experience with Microsoft Fabric, Databricks, Azure Data Factory, PySpark, SQL, and other Azure services to support advanced analytics and data-driven decision-making.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines using Microsoft Fabric and Azure data services.
- Implement data integration, transformation, and orchestration workflows with Azure Data Factory, Databricks, and PySpark.
- Work with stakeholders to understand business requirements and translate them into robust data solutions.
- Optimize performance and ensure data quality, reliability, and security across all layers.
- Develop and maintain data models, metadata, and documentation to support analytics and reporting.
- Collaborate with data scientists, analysts, and business teams to deliver insights-driven solutions.
- Stay updated with emerging Azure and Fabric technologies to recommend best practices and innovations.
Required Skills & Experience
- Proven experience as a Data Engineer with strong expertise in the Azure cloud ecosystem.
- Hands-on experience with:
- Microsoft Fabric
- Azure Databricks
- Azure Data Factory (ADF)
- PySpark & Python
- SQL (T-SQL/PL-SQL)
- Solid understanding of data warehousing, ETL/ELT processes, and big data architectures.
- Knowledge of data governance, security, and compliance within Azure.
- Strong problem-solving, debugging, and performance tuning skills.
- Excellent communication and collaboration abilities.
Preferred Qualifications
- Microsoft Certified: Fabric Analytics Engineer Associate / Azure Data Engineer Associate.
- Experience with Power BI, Delta Lake, and Lakehouse architecture.
- Exposure to DevOps, CI/CD pipelines, and Git-based version control.
Real people. Real service.
At SupplyHouse.com, we value every individual team member and cultivate a community where people come first. Led by our core values of Generosity, Respect, Innovation, Teamwork, and GRIT, we’re dedicated to maintaining a supportive work environment that celebrates diversity and empowers everyone to reach their full potential. As an industry-leading e-commerce company specializing in HVAC, plumbing, heating, and electrical supplies since 2004, we strive to foster growth while providing the best possible experience for our customers.
Through an Employer of Record (EOR), we are looking for a new, remote Backend Engineer in India to join our growing IT Team. This individual will report to our Full Stack Team Lead and have the opportunity to work on impactful projects that enhance our e-commerce platform and internal operations while honing their skills in backend and full stack development. If you’re passionate about creating user-friendly interfaces, building scalable systems, and contributing to innovative solutions in a collaborative and fun environment, we’d love to hear from you!
Role Type: Full-Time
Location: Remote from India
Schedule: Monday through Friday, 4:00 a.m. – 1:00 p.m. U.S. Eastern Time / 12:00 p.m. – 9:00 p.m. Indian Standard Time to ensure effective collaboration
Base Salary: $25,000 - $30,000 USD per year
Responsibilities:
- Collaborate with cross-functional teams to gather and refine requirements, ensuring alignment with business needs.
- Design, develop, test, deploy, and maintain scalable, high-performance software applications.
- Develop and enhance internal tools and applications to improve company operations.
- Ensure system reliability, optimize application performance, and implement best practices for scalability.
- Continuously improve existing codebases, conducting code reviews, and implementing modern practices.
- Stay up to date with emerging technologies, trends, and best practices in software development.
Requirements:
- Bachelor’s or Master’s degree in Computer Science, Software Engineering, or a related field.
- 3+ years of hands-on experience in backend and/or full-stack development with a proven track record of delivering high-quality software.
Back-End Skills:
- Proficiency in Java and experience with back-end frameworks like Spring Boot.
- Strong understanding of database design, RDBMS concepts, and experience with SQL.
- Knowledge of RESTful API design and integration.
Development Lifecycle: Proven ability to contribute across the entire software development lifecycle, including planning, design, coding, testing, deployment, and maintenance.
Tools & Practices:
- Familiarity with version control systems, like Git, and CI/CD pipelines.
- Experience with agile development methodologies.
Additional Skills:
- Strong problem-solving and debugging capabilities.
- Ability to create reusable code libraries and write clean, maintainable code.
- Strong communication and collaboration skills to work effectively within a team and across departments.
- High-level proficiency in written and verbal English communication.
Preferred Qualifications:
- Proficiency in HTML5, CSS3, JavaScript (ES6+), and responsive design principles.
- Expertise in modern JavaScript frameworks and libraries such as React, Angular, or Vue.js.
- Experience with cross-browser compatibility and performance optimization techniques.
- Experience working on Frontend responsibilities such as:
- Designing and implementing reusable, maintainable UI components and templates.
- Working closely with Designers to ensure technical feasibility and adherence to UI/UX design standards.
- Managing and updating promotional banners and site-wide templates to ensure timely execution of marketing initiatives.
Why work with us:
- We have awesome benefits – We offer a wide variety of benefits to help support you and your loved ones. These include: Comprehensive and affordable medical, dental, vision, and life insurance options; Competitive Provident Fund contributions; Paid casual and sick leave, plus country-specific holidays; Mental health support and wellbeing program; Company-provided equipment and one-time $250 USD work from home stipend; $750 USD annual professional development budget; Company rewards and recognition program; And more!
- We promote work-life balance – We value your time and encourage a healthy separation between your professional and personal life to feel refreshed and recharged. Look out for our 100% remote schedule and wellness initiatives!
- We support growth – We strive to innovate every day. In an exciting and evolving industry, we provide potential for career growth through our hands-on training, access to the latest technologies and tools, diversity and inclusion initiatives, opportunities for internal mobility, and professional development budget.
- We give back – We live and breathe our core value, Generosity, by giving back to the trades and organizations around the world. We make a difference through donation drives, employee-nominated contributions, support for DE&I organizations, and more.
- We listen – We value hearing from our employees. Everyone has a voice, and we encourage you to use it! We actively elicit feedback through our monthly town halls, regular 1:1 check-ins, and company-wide ideas form to incorporate suggestions and ensure our team enjoys coming to work every day.
Check us out and learn more at https://www.supplyhouse.com/our-company!
Additional Details:
- Remote employees are expected to work in a distraction-free environment. Personal devices, background noise, and other distractions should be kept to a minimum to avoid disrupting virtual meetings or business operations.
- SupplyHouse.com is an Equal Opportunity Employer, strongly values inclusion, and encourages individuals of all backgrounds and experiences to apply for this position.
- To ensure fairness, all application materials, assessments, and interview responses must reflect your own original work. The use of AI tools, plagiarism, or any uncredited assistance is not permitted at any stage of the hiring process and may result in disqualification. We appreciate your honesty and look forward to seeing your skills.
- We are committed to providing a safe and secure work environment and conduct thorough background checks on all potential employees in accordance with applicable laws and regulations.
- All emails from the SupplyHouse team will only be sent from an @supplyhouse.com email address. Please exercise caution if you receive an email from an alternate domain.
What is an Employer of Record (EOR)?
Through our partnership with Remote.com, a global Employer of Record (EOR), you can join SupplyHouse from home, while knowing your employment is handled compliantly and securely. Remote takes care of the behind-the-scenes details – like payroll, benefits, taxes, and local compliance – so you can focus on your work and career growth. Even though Remote manages these administrative functions, you’ll be a part of the SupplyHouse team: connected to our culture, collaborating with colleagues, and contributing to our shared success. This partnership allows us to welcome talented team members worldwide while ensuring you receive a best-in-class employee experience.

We seek a highly skilled and experienced Ruby on Rails Development Team Lead/Architect to join our dynamic team at Uphance. The ideal candidate will have proven expertise in leading and architecting RoR projects, focusing on building scalable, high-quality applications. This role requires a combination of technical leadership, mentorship, and a strong commitment to best practices in software development.
Job Type: Contract/Remote/Full-Time/Long-term
Responsibilities:
- Develop and maintain Ruby on Rails applications that meet our high-quality standards.
- Design, build, and maintain efficient, reusable, and reliable Ruby code.
- Utilise your expertise in Ruby on Rails to enhance the performance and reliability of our platform.
- Set the technical direction for the existing RoR project, including system architecture and technology stack decisions.
- Guide and mentor team members to enhance their technical skills and understanding of RoR best practices.
- Conduct code reviews to maintain high coding standards and ensure adherence to best practices.
- Optimise application performance, focusing on ActiveRecord queries and overall architecture.
- Tackle complex technical challenges and provide efficient solutions, particularly when specifications are unclear or incomplete.
- Establish and enforce testing protocols; write and guide the team in writing effective tests.
- Define and ensure consistent adherence to best practices, particularly in the context of large applications.
- Manage the development process using Agile methodologies, possibly acting as a Scrum Master if required.
- Work closely with product managers, designers, and other stakeholders to meet project requirements and timelines.
Technical Requirements and Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- Proven experience with Ruby on Rails, MySQL, HTML, and JavaScript (6+ years)
- Extensive experience with Ruby on Rails and familiarity with its best practices
- Proven track record of technical leadership and team management
- Strong problem-solving skills and the ability to address issues with incomplete specifications
- Proficiency in performance optimisation and software testing
- Experience with Agile development and Scrum practices
- Excellent mentoring and communication skills
- Experience with large-scale application development
- Application performance monitoring/tuning
General Requirements:
- Availability to work during IST working hours.
- High-speed Internet and the ability to join technical video meetings during business hours.
- Strong analytical and problem-solving skills and ability to work as part of multi-functional teams.
- Ability to collaborate and be a team player.
Why Uphance?
- Engage in Innovative Projects: Immerse yourself in cutting-edge projects that not only test your skills but also encourage the exploration of new design realms.
- AI-Integrated Challenges: Take on projects infused with AI, pushing the boundaries of your abilities and allowing for exploration in uncharted territories of software design and development.
- Flexible Work Environment: Whether you embrace the digital nomad lifestyle or prefer the comfort of your own space, Uphance provides the freedom to design and create from any corner of the globe.
- Inclusive Team Environment: Join a dynamic, international, and inclusive team that values and celebrates diverse ideas.
- Collaborative Team Dynamics: Become a part of a supportive and motivated team that shares in the celebration of collective successes.
- Recognition and Appreciation: Your accomplishments will be acknowledged and applauded regularly in our Recognition Rally.
Compensation:
Salary Range: INR 24 LPA to INR 32 LPA (Salary is not a constraint for the right candidate)
At Uphance, we value innovation, collaboration, and continuous learning. As part of our team, you'll have the opportunity to lead a group of talented RoR developers, contribute to exciting projects, and play a key role in our company's success. If you are passionate about Ruby on Rails and thrive in a leadership role, we would love to hear from you. Apply today and follow us on LinkedIn - https://www.linkedin.com/company/uphance !
Job Description
- 3-5 years of hands-on experience in manual testing involving functional, non-functional, regression, and integration testing in a structured environment.
- Exceptional communication skills.
- Minimum 1 year of work experience in data comparison testing.
- Experience in testing web-based applications.
- Ability to define the scope of testing.
- Experience in testing large-scale solutions integrating multiple source and target systems.
- Experience in API testing.
- Experience in database verification using SQL queries.
- Experience working in an Agile team.
- Availability to attend Agile ceremonies in UK hours.
- Good understanding of Data Migration projects is a plus.
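Data comparison testing, mentioned above, typically means reconciling records between a source and a target system. Here is a minimal sketch in Python, assuming two keyed extracts (the keys and records below are invented for illustration):

```python
# Hypothetical source/target extracts, e.g. pulled via SQL from the two systems.
source = {101: ("Alice", 250.00), 102: ("Bob", 75.50), 103: ("Cara", 10.00)}
target = {101: ("Alice", 250.00), 102: ("Bob", 75.00)}

def compare(source, target):
    """Classify every key as missing from target, extra in target, or mismatched."""
    missing = sorted(set(source) - set(target))
    extra = sorted(set(target) - set(source))
    mismatched = sorted(k for k in set(source) & set(target) if source[k] != target[k])
    return {"missing": missing, "extra": extra, "mismatched": mismatched}

result = compare(source, target)
print(result)
```

In practice the same three buckets (missing, extra, mismatched) drive the defect reports, whatever tooling produces them.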

Must have skills:
1. GCP - GCS, Pub/Sub, Dataflow or Dataproc, BigQuery, Airflow/Composer, Python (preferred)/Java
2. ETL on GCP Cloud - building pipelines (Python/Java) plus scripting, best practices, and common challenges
3. Knowledge of batch and streaming data ingestion; building end-to-end data pipelines on GCP
4. Knowledge of databases (SQL, NoSQL), on-premise and on-cloud; SQL vs. NoSQL trade-offs; types of NoSQL databases (at least 2)
5. Data warehouse concepts - beginner to intermediate level
Role & Responsibilities:
● Work with business users and other stakeholders to understand business processes.
● Design and implement dimension and fact tables.
● Identify and implement data transformation/cleansing requirements.
● Develop highly scalable, reliable, and high-performance data processing pipelines to extract, transform, and load data from various systems into the Enterprise Data Warehouse.
● Develop conceptual, logical, and physical data models with associated metadata, including data lineage and technical data definitions.
● Design, develop, and maintain ETL workflows and mappings using the appropriate data load technique.
● Provide research, high-level design, and estimates for data transformation and data integration from source applications to end-user BI solutions.
● Provide production support of ETL processes to ensure timely completion and availability of data in the data warehouse for reporting use.
● Analyze and resolve problems and provide technical assistance as necessary. Partner with the BI team to evaluate, design, and develop BI reports and dashboards according to functional specifications while maintaining data integrity and data quality.
● Work collaboratively with key stakeholders to translate business information needs into well-defined data requirements to implement the BI solutions.
● Leverage transactional information and data from ERP, CRM, and HRIS applications to model, extract, and transform into reporting and analytics.
● Define and document the use of BI through user experience/use cases and prototypes; test and deploy BI solutions.
● Develop and support data governance processes; analyze data to identify and articulate trends, patterns, outliers, and quality issues; continuously validate reports and dashboards and suggest improvements.
● Train business end-users, IT analysts, and developers.
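The dimension/fact modeling and reporting bullets above can be sketched with a toy star schema. This uses Python's sqlite3 as a stand-in rather than BigQuery, and all table and column names are illustrative:

```python
import sqlite3

# Minimal star-schema sketch: one dimension table and one fact table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, region TEXT);
CREATE TABLE fact_sales (customer_key INTEGER, amount REAL,
                         FOREIGN KEY (customer_key) REFERENCES dim_customer);
INSERT INTO dim_customer VALUES (1, 'Acme', 'APAC'), (2, 'Globex', 'EMEA');
INSERT INTO fact_sales VALUES (1, 500.0), (1, 250.0), (2, 100.0);
""")

# Typical reporting query: join the fact table to its dimension and aggregate.
rows = conn.execute("""
    SELECT d.region, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer d USING (customer_key)
    GROUP BY d.region ORDER BY d.region
""").fetchall()
print(rows)
```

The key design choice is that facts hold only keys and measures, while descriptive attributes live in dimensions, which keeps the fact table narrow and the joins cheap.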

About the Job
Data Analyst (Mid-Level)
Experience: 1–5 Years
Salary: Competitive
Preferred Notice Period: Immediate to 30 Days
Opportunity Type: Remote (Global)
Placement Type: Freelance/Contract
(Note: This is a requirement for one of TalentLo’s Clients)
Role Overview
We’re seeking experienced Data Analysts with 1–5 years of professional experience to help clients turn raw data into actionable insights. As part of our founding talent pool, you’ll work with global clients on projects that demand analytical depth, business understanding, and storytelling skills — enabling companies to make smarter, data-driven decisions.
Responsibilities
- Collect, clean, and analyze large datasets from multiple sources
- Build dashboards and visualizations to communicate insights effectively
- Conduct exploratory and statistical analysis to identify patterns and trends
- Translate business requirements into analytical models and reports
- Work with stakeholders to support data-driven decision-making
- Ensure data accuracy, integrity, and consistency across projects
Requirements
- Strong proficiency in SQL for querying and managing datasets
- Hands-on experience with visualization tools (Tableau, Power BI, Looker, or similar)
- Proficiency in Python or R for data analysis and automation
- Solid understanding of statistics, hypothesis testing, and data modeling
- Ability to translate complex data findings into clear business insights
- Familiarity with ETL processes and data pipelines (preferred but not mandatory)
How to Apply
- Create your profile on TalentLo’s platform → https://www.talentlo.com/signup
- Submit your GitHub, portfolio, or sample projects
- Take the required assessment and get qualified
- Get shortlisted & connect with the client
About TalentLo
TalentLo is a revolutionary talent platform connecting exceptional tech professionals with high-quality clients worldwide. We’re building a carefully curated pool of skilled experts to match with companies actively seeking specialized talent for impactful projects.
✨ If you’re ready to work on impactful data projects, collaborate with global teams, and advance your career — apply today!

About Us:
MyOperator is India’s leading Cloud Telephony platform, empowering 40,000+ businesses with smarter communication solutions. We are scaling our engineering team to build high-performing, reliable, and scalable web applications. We’re looking for a React.js Developer with strong expertise in frontend engineering who can take ownership of building pixel-perfect, user-friendly, and performant web applications. Exposure to backend (Node.js) is a plus.
Key Responsibilities
Frontend (React.js – Primary Focus):
- Build modern, responsive, and high-performance UIs using React.js.
- Implement state management using Redux, MobX, or similar libraries.
- Create and optimize React Hooks (built-in & custom).
- Write unit tests to ensure product quality and maintainability.
- Apply ES6+ features, Webpack, and other modern JS tooling.
- Diagnose and fix UI/UX performance bottlenecks.
- Debug and resolve cross-browser compatibility issues.
Backend (Node.js – Secondary):
- Basic ability to build and integrate RESTful APIs with Node.js.
- Familiarity with frameworks like Express.js or NestJS.
- Understanding of authentication, session handling, and caching.
Databases & Tools:
- Work with SQL databases (mandatory).
- Exposure to NoSQL databases and ORMs is a plus.
- Use Git for version control and collaborative coding.
Qualifications
- 3+ years of professional software development experience.
- 3+ years of proven experience with React.js.
- Solid understanding of JavaScript (ES6+), HTML5, CSS3.
- Strong knowledge of state management, hooks, and UI performance optimization.
- Good problem-solving skills with a focus on clean, maintainable code.
- Exposure to Node.js and backend concepts (good to have).
Good to Have
- Experience with TypeScript.
- Knowledge of Next.js for server-side rendering.
- Familiarity with REST APIs and basic backend integration.
- Strong debugging and browser performance optimization skills.
Why Join Us?
- Opportunity to specialize in React.js while working on impactful products.
- Collaborative environment with full ownership of features.
- Work with cutting-edge frontend technologies at scale.
- Competitive compensation and career growth opportunities.

About Ven Analytics
At Ven Analytics, we don’t just crunch numbers — we decode them to uncover insights that drive real business impact. We’re a data-driven analytics company that partners with high-growth startups and enterprises to build powerful data products, business intelligence systems, and scalable reporting solutions. With a focus on innovation, collaboration, and continuous learning, we empower our teams to solve real-world business problems using the power of data.
Role Overview
We’re looking for a Power BI Data Engineer who is not just proficient in tools but passionate about building insightful, scalable, and high-performing dashboards. The ideal candidate should have strong fundamentals in data modeling, a flair for storytelling through data, and the technical skills to implement robust data solutions using Power BI, Python, and SQL.
Key Responsibilities
- Technical Expertise: Develop scalable, accurate, and maintainable data models using Power BI, with a clear understanding of Data Modeling, DAX, Power Query, and visualization principles.
- Programming Proficiency: Use SQL and Python for complex data manipulation, automation, and analysis.
- Business Problem Translation: Collaborate with stakeholders to convert business problems into structured data-centric solutions considering performance, scalability, and commercial goals.
- Hypothesis Development: Break down complex use-cases into testable hypotheses and define relevant datasets required for evaluation.
- Solution Design: Create wireframes, proof-of-concepts (POC), and final dashboards in line with business requirements.
- Dashboard Quality: Ensure dashboards meet high standards of data accuracy, visual clarity, performance, and support SLAs.
- Performance Optimization: Continuously enhance user experience by improving performance, maintainability, and scalability of Power BI solutions.
- Troubleshooting & Support: Resolve access, latency, and data issues quickly, in line with defined SLAs.
Must-Have Skills
- Strong experience building robust data models in Power BI
- Hands-on expertise with DAX (complex measures and calculated columns)
- Proficiency in M Language (Power Query) beyond drag-and-drop UI
- Clear understanding of data visualization best practices (less fluff, more insight)
- Solid grasp of SQL and Python for data processing
- Strong analytical thinking and ability to craft compelling data stories
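As a rough sketch of the "SQL and Python for data processing" requirement above, here is a hypothetical pre-model cleanup step (deduplication and missing-value handling) of the kind often done before loading data into a Power BI model. The field names and rules are invented for illustration:

```python
# Illustrative raw extract; field names and values are hypothetical.
raw = [
    {"month": "2024-01", "revenue": "1,200"},
    {"month": "2024-01", "revenue": "1,200"},  # exact duplicate row
    {"month": "2024-02", "revenue": None},     # missing value
    {"month": "2024-03", "revenue": "900"},
]

def clean(rows):
    """Drop incomplete rows and exact duplicates; normalize revenue to float."""
    seen, out = set(), []
    for r in rows:
        if r["revenue"] is None:
            continue  # drop rows with missing measures
        key = (r["month"], r["revenue"])
        if key in seen:
            continue  # drop exact duplicates
        seen.add(key)
        out.append({"month": r["month"],
                    "revenue": float(r["revenue"].replace(",", ""))})
    return out

cleaned = clean(raw)
```

Doing this normalization upstream (in SQL, Python, or Power Query's M) keeps the Power BI model itself small and the DAX measures simple.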
Good-to-Have (Bonus Points)
- Experience using DAX Studio and Tabular Editor
- Prior work in a high-volume data processing production environment
- Exposure to modern CI/CD practices or version control with BI tools
Why Join Ven Analytics?
- Be part of a fast-growing startup that puts data at the heart of every decision.
- Opportunity to work on high-impact, real-world business challenges.
- Collaborative, transparent, and learning-oriented work environment.
- Flexible work culture and focus on career development.

We are seeking a Technical Analyst (TA) to join our team. The TA is a hybrid expert in insurance processes and technical solutions, bridging the gap between business needs and technology. In this role, you will ensure that underwriting software, forms, raters, and integrations work seamlessly, providing critical support to clients in the insurtech space.
This position requires strong SQL expertise, data mapping skills, and experience with APIs, JSON, XML, and insurance-specific systems. The ideal candidate has hands-on experience working with underwriting platforms (e.g., ConceptOne) and can configure, optimize, and troubleshoot complex insurance workflows.
Key Responsibilities
1. Data Mapping for Forms & Raters
- Map backend database fields to front-end user interfaces to ensure accurate data display.
- Configure digital insurance forms (applications, dec pages, endorsements, claims, schedules of locations, coverage summaries).
- Integrate raters with underwriting systems via APIs, XML, and spreadsheet mapping.
- Ensure rating outputs (e.g., premiums) reflect correct calculations and business logic.
2. Writing SQL Reports, Quote Covers, & Validations
- Develop and optimize SQL reports (e.g., BDX reports, renewal reports, loss runs, underwriting performance, compliance audits).
- Build and automate Quote Covers, ensuring accurate data extraction, rating logic, and business rule application.
- Configure validations to enforce business rules in underwriting platforms, reducing errors and improving data accuracy.
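The validations bullet above can be sketched as a small rule set applied to quote records. The field names and rules here are hypothetical, not ConceptOne's actual schema:

```python
# Hypothetical quote records; field names are illustrative only.
quotes = [
    {"quote_id": "Q1", "premium": 1200.0, "effective": "2024-01-01", "expiry": "2025-01-01"},
    {"quote_id": "Q2", "premium": -50.0,  "effective": "2024-03-01", "expiry": "2024-02-01"},
]

def validate(q):
    """Return the list of business-rule violations for one quote."""
    errors = []
    if q["premium"] <= 0:
        errors.append("premium must be positive")
    if q["effective"] >= q["expiry"]:  # ISO dates compare correctly as strings
        errors.append("effective date must precede expiry")
    return errors

report = {q["quote_id"]: validate(q) for q in quotes}
```

In an underwriting platform the same rules would typically live as SQL CHECK constraints or configured validations, so bad quotes are rejected at entry rather than found in reports.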
3. Configuring & Optimizing Underwriting Platforms (C1/Other)
- Set up and configure system components (user-defined tables, correspondence templates, invoice/check documents, workflows).
- Troubleshoot and optimize system performance (e.g., speeding up slow reports, fixing data flow issues).
- Collaborate with stakeholders to align configurations with underwriting and compliance needs.
4. API, JSON, and XML Integration
- Integrate underwriting systems with external APIs (e.g., driving records, risk assessment tools, carrier services).
- Structure and transform data in JSON/XML for smooth communication between systems.
- Manage third-party integrations to ensure seamless system connectivity and data accuracy.
Tools & Technologies
- SQL & SSMS (SQL Server Management Studio) – for queries, stored procedures, validations, and reporting.
- Adobe Acrobat Pro – for form design and field alignment.
- Infomaker (PowerBuilder) – for reporting, form development, and front-end data mapping.
- Mapping Utility Tools & Excel – for spreadsheet rating mapping and integration.
- APIs (REST/SOAP) – for external integrations using JSON and XML formats.
Collaboration & Workflow
- Work with Business Analysts (BAs) to translate business requirements into technical specifications.
- Collaborate with Quality Assurance (QAs) to test and refine system outputs, resolving bugs or issues.
- Partner with Developers when deeper coding or new feature development is required.
Qualifications
- Strong proficiency in SQL, data mapping, and reporting.
- Hands-on experience with underwriting systems (ConceptOne or similar).
- Familiarity with insurance processes (policy issuance, endorsements, raters, claims, reporting).
- Experience with APIs, JSON, XML, and system integrations.
- Problem-solving mindset with ability to troubleshoot and optimize workflows.
- Strong communication skills, capable of bridging technical and business perspectives.
Why Join Us?
As a Technical Analyst, you are the engineer of clarity within complex underwriting systems. You ensure that underwriters, MGAs, and carriers get accurate data, seamless integrations, and reliable reporting. This isn’t just a support role—it’s a critical function that transforms technology into operational performance, enabling our clients to save time, reduce errors, and unlock the full potential of their platforms.

Springer Capital is a cross-border asset management firm specializing in real estate investment banking between China and the USA. We are offering a remote internship for aspiring data engineers interested in data pipeline development, data integration, and business intelligence. The internship offers flexible start and end dates. A short quiz or technical task may be required as part of the selection process.
Responsibilities:
- Design, build, and maintain scalable data pipelines for structured and unstructured data sources.
- Develop ETL processes to collect, clean, and transform data from internal and external systems; support integration of data into dashboards, analytics tools, and reporting systems.
- Collaborate with data analysts and software developers to improve data accessibility and performance.
- Document workflows and maintain data infrastructure best practices.
- Assist in identifying opportunities to automate repetitive data tasks.
Please send your resume to talent@springer.capital

Your Impact
- Build scalable backend services.
- Design, implement, and maintain databases, ensuring data integrity, security, and efficient retrieval.
- Implement the core logic that makes applications work, handling data processing, user requests, and system operations
- Contribute to the architecture and design of new product features
- Optimize systems for performance, scalability, and security
- Stay up-to-date with new technologies and frameworks, contributing to the advancement of software development practices
- Work closely with product managers and designers to turn ideas into reality and shape the product roadmap.
What skills do you need?
- 4+ years of experience in backend development, especially building robust APIs using Node.js, Express.js, and MySQL
- Strong command of JavaScript and understanding of its quirks and best practices
- Ability to think strategically when designing systems—not just how to build, but why
- Exposure to system design and interest in building scalable, high-availability systems
- Prior work on B2C applications with a focus on performance and user experience
- Ensure that applications can handle increasing loads and maintain performance, even under heavy traffic
- Work with complex queries for performing sophisticated data manipulation, analysis, and reporting.
- Knowledge of Sequelize, MongoDB and AWS would be an advantage.
- Experience in optimizing backend systems for speed and scalability.


Position Responsibilities:
- Collaborate with the development team to maintain, enhance, and scale the product for enterprise use.
- Design and develop scalable, high-performance solutions using cloud technologies and containerization.
- Contribute to all phases of the development lifecycle, following SOLID principles and best practices.
- Write well-designed, testable, and efficient code with a strong emphasis on Test-Driven Development (TDD), ensuring comprehensive unit, integration, and performance testing.
- Ensure software designs comply with specifications and security best practices.
- Recommend changes to improve application architecture, maintainability, and performance.
- Develop and optimize database queries using T-SQL.
- Build and maintain microservices-based architecture where applicable.
- Prepare and produce software component releases.
- Develop and execute unit, integration, and performance tests.
- Support formal testing cycles and resolve test defects.
- Enhance software efficiency, maintainability, and scalability, with a strong focus on cloud-native solutions.
AI-Specific Responsibilities:
- Integrate AI-powered tools and frameworks to enhance code quality and development efficiency.
- Utilize AI-driven analytics to identify performance bottlenecks and optimize system performance.
- Implement AI-based security measures to proactively detect and mitigate potential threats.
- Leverage AI for automated testing and continuous integration/continuous deployment (CI/CD) processes.
- Guide the adoption and effective use of AI agents for automating repetitive development, deployment, and testing processes within the engineering team.
Qualifications:
- Bachelor’s degree in computer science, IT, or a related field.
- Highly proficient in ASP.NET Core (C#) and full-stack development.
- Experience developing REST APIs and microservices.
- Strong knowledge of cloud platforms (Azure preferred), including scalability and enterprise cloud.
- Experience with containerization.
- Proficiency in front-end technologies (JavaScript, HTML, CSS, Bootstrap, and UI frameworks).
- Strong database experience, particularly with T-SQL and relational database design.
- Advanced understanding of object-oriented programming (OOP) and SOLID principles.
- Experience with security best practices in web and API development.
- Knowledge of Agile SCRUM methodology and experience in collaborative environments.
- Experience with Test-Driven Development (TDD).
- Strong analytical skills, problem-solving abilities, and curiosity to explore new technologies.
- Ability to communicate effectively, including explaining technical concepts to non-technical stakeholders.
- High commitment to continuous learning, innovation, and improvement.
AI-Specific Qualifications:
- Proficiency in AI-driven development tools and platforms.
- Knowledge of AI-based security protocols and threat detection systems.
- Experience integrating GenAI or Agentic AI agents into full-stack workflows (e.g., using AI for code reviews, automated bug fixes, or system monitoring).
- Demonstrated proficiency with AI-assisted development tools and prompt engineering for code generation, testing, or documentation.
Work Schedule
- Mid-shift schedule (2 PM to 11 PM India Standard Time)

Job Description: Data Analyst
About the Role
We are seeking a highly skilled Data Analyst with strong expertise in SQL/PostgreSQL, Python (Pandas), Data Visualization, and Business Intelligence tools to join our team. The candidate will be responsible for analyzing large-scale datasets, identifying trends, generating actionable insights, and supporting business decisions across marketing, sales, operations, and customer experience.
Key Responsibilities
- Data Extraction & Management
- Write complex SQL queries in PostgreSQL to extract, clean, and transform large datasets.
- Ensure accuracy, reliability, and consistency of data across different platforms.
- Data Analysis & Insights
- Conduct deep-dive analyses to understand customer behavior, funnel drop-offs, product performance, campaign effectiveness, and sales trends.
- Perform cohort, LTV (lifetime value), retention, and churn analysis to identify opportunities for growth.
- Provide recommendations to improve conversion rates, average order value (AOV), and repeat purchase rates.
- Business Intelligence & Visualization
- Build and maintain interactive dashboards and reports using BI tools (e.g., Power BI, Metabase, or Looker).
- Create visualizations that simplify complex datasets for stakeholders and management.
- Python (Pandas)
- Use Python (Pandas, NumPy) for advanced analytics.
- Collaboration & Stakeholder Management
- Work closely with product, operations, and leadership teams to provide insights that drive decision-making.
- Communicate findings in a clear, concise, and actionable manner to both technical and non-technical stakeholders.
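Since the responsibilities above call out cohort and retention analysis, here is a small, self-contained Python sketch of the idea using invented order data (real work would usually use Pandas against warehouse tables):

```python
from collections import defaultdict

def cohort_retention(orders):
    """orders: list of (customer_id, month_index) tuples.

    A customer's cohort is the month of their first order; retention counts
    distinct customers from that cohort seen again N months later."""
    first_month = {}
    for cust, month in sorted(orders, key=lambda o: o[1]):
        first_month.setdefault(cust, month)  # earliest month wins
    buckets = defaultdict(set)
    for cust, month in orders:
        cohort = first_month[cust]
        buckets[(cohort, month - cohort)].add(cust)
    result = defaultdict(dict)
    for (cohort, offset), custs in buckets.items():
        result[cohort][offset] = len(custs)
    return dict(result)

# Toy data: customers a and b join in month 0; c joins in month 1.
orders = [("a", 0), ("b", 0), ("a", 1), ("c", 1), ("a", 2), ("c", 2)]
retention = cohort_retention(orders)
# retention[0] == {0: 2, 1: 1, 2: 1}  (month-0 cohort: 2 joined, 1 retained each month)
# retention[1] == {0: 1, 1: 1}
```

Dividing each offset's count by the cohort's size at offset 0 turns these counts into the familiar retention-rate triangle.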
Required Skills
- SQL/PostgreSQL
- Complex joins, window functions, CTEs, aggregations, query optimization.
- Python (Pandas & Analytics)
- Data wrangling, cleaning, transformations, exploratory data analysis (EDA).
- Libraries: Pandas, NumPy, Matplotlib, Seaborn
- Data Visualization & BI Tools
- Expertise in creating dashboards and reports using Metabase or Looker.
- Ability to translate raw data into meaningful visual insights.
- Business Intelligence
- Strong analytical reasoning to connect data insights with e-commerce KPIs.
- Experience in funnel analysis, customer journey mapping, and retention analysis.
- Analytics & E-commerce Knowledge
- Understanding of metrics like CAC, ROAS, LTV, churn, contribution margin.
- General Skills
- Strong communication and presentation skills.
- Ability to work cross-functionally in fast-paced environments.
- Problem-solving mindset with attention to detail.
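The SQL bar above centers on CTEs, window functions, and aggregations. The following minimal illustration combines all three; it runs through Python's built-in SQLite (version 3.25+, which supports window functions) purely so it is self-contained, and the table and values are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount REAL);
    INSERT INTO orders VALUES ('a', 100), ('a', 300), ('b', 50), ('b', 250);
""")

# A CTE aggregates spend per customer; a window function then ranks customers.
query = """
WITH totals AS (
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
)
SELECT customer, total,
       RANK() OVER (ORDER BY total DESC) AS spend_rank
FROM totals
ORDER BY spend_rank;
"""
rows = conn.execute(query).fetchall()
# rows == [('a', 400.0, 1), ('b', 300.0, 2)]
```

The same query runs unchanged on PostgreSQL, which the role names explicitly.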
Education: Bachelor’s degree in Data Science, Computer Science, or a related field

Business Summary
The Deltek Global Cloud team focuses on the delivery of first-class services and solutions for our customers. We are an innovative and dynamic team that is passionate about transforming the Deltek cloud services that power our customers' project success. Our diverse, global team works cross-functionally to make an impact on the business. If you want to work in a transformational environment, where education and training are encouraged, consider Deltek as the next step in your career!
Position Responsibilities
- Diagnose and resolve complex software issues, including operational challenges, performance bottlenecks, and system faults.
- Collaborate with Global SRE, Product Delivery, Product Engineering, and Support Services teams to ensure seamless operations.
- Maintain consistent service availability by proactively monitoring system stability and performance.
- Execute daily operational tasks such as database creation, schema management, restores, application configuration, and other system administration duties.
- Lead incident response efforts, manage major incident bridges, and contribute to post-incident reviews to drive continuous improvement and prevent recurrence.
- Develop and implement automation strategies to minimize manual work and enhance operational efficiency.
- Monitor system resource utilization, errors, and alert trends to support capacity planning.
- Document internal operational processes, procedures, and policies for consistency and knowledge sharing.
- Work within a 24/7 shift schedule to provide reliable coverage.
- Participate in maintenance activities and on-call rotations as needed.
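The automation and monitoring duties above can be sketched in miniature. This is an illustrative Python health-check, not Deltek's tooling; the log format and 20% alert threshold are hypothetical:

```python
from collections import Counter

ERROR_THRESHOLD = 0.2  # hypothetical alerting threshold: page at a 20% error rate

def error_rate_alert(log_lines, threshold=ERROR_THRESHOLD):
    """Parse 'LEVEL message' lines, compute the ERROR rate, and decide
    whether to alert — the kind of check an automation-first script replaces
    manual log review with."""
    levels = Counter(line.split(maxsplit=1)[0] for line in log_lines if line.strip())
    total = sum(levels.values())
    rate = levels.get("ERROR", 0) / total if total else 0.0
    return {"total": total, "error_rate": rate, "alert": rate >= threshold}

logs = [
    "INFO backup completed",
    "ERROR restore failed: disk full",
    "INFO schema migration applied",
    "ERROR job timed out",
]
status = error_rate_alert(logs)
# status["error_rate"] == 0.5 and status["alert"] is True
```

In practice the same logic usually lives inside a monitoring platform (e.g., Splunk or Prometheus alert rules) rather than a standalone script.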
Qualifications
- Bachelor’s degree in Computer Science or a related field, or equivalent experience.
- 5+ years of experience supporting large-scale enterprise applications and systems on public cloud infrastructure (AWS preferred).
- 5+ years of hands-on experience managing and operating enterprise-grade Windows or Linux production environments.
- 3+ years of experience applying an automation-first approach using configuration management tools and scripting (e.g., Bash, Python, PowerShell).
- Familiarity with Incident Management and ITIL service operations (ServiceNow experience preferred).
- Experience with monitoring and observability tools such as AppDynamics, Splunk, SolarWinds, DPA, Nagios, NewRelic, Grafana, or Prometheus.
- Proficiency in database management, including Oracle or Microsoft SQL Server administration.
- Strong knowledge of database query optimization and performance tuning.
- Passion for leveraging technology with a proactive, self-directed learning mindset.
- Detail-oriented, results-driven, and able to communicate effectively in English.
- Strong teamwork and collaboration skills, with the ability to solve problems across departments.


Backend Engineering Intern (Infrastructure Software) – Remote
Position Type: Internship (Full-Time or Part-Time)
Location: Remote
Duration: 12 weeks
Compensation: Unpaid
About the Role
We are seeking a motivated Backend Developer Intern to join our engineering team and contribute to building scalable, efficient, and secure backend services. This internship offers hands-on experience in API development, database management, and backend architecture, with guidance from experienced developers. You will work closely with cross-functional teams to deliver features that power our applications and improve user experience.
Responsibilities
- Assist in designing, developing, and maintaining backend services, APIs, and integrations.
- Collaborate with frontend engineers to support application functionality and data flow.
- Write clean, efficient, and well-documented code.
- Support database design, optimization, and query performance improvements.
- Participate in code reviews, debugging, and troubleshooting production issues.
- Assist with unit testing, integration testing, and ensuring system reliability.
- Work with cloud-based environments (e.g., AWS, Azure, GCP) to deploy and manage backend systems.
Requirements
- Currently pursuing or recently completed a degree in Computer Science, Software Engineering, or related field.
- Familiarity with one or more backend languages/frameworks (e.g., Node.js, Python/Django, Java/Spring Boot, Ruby on Rails).
- Understanding of RESTful APIs and/or GraphQL.
- Basic knowledge of relational and/or NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB).
- Familiarity with version control (Git/GitHub).
- Strong problem-solving skills and attention to detail.
- Ability to work independently in a remote, collaborative environment.
Preferred Skills (Nice to Have)
- Experience with cloud services (AWS Lambda, S3, EC2, etc.).
- Familiarity with containerization (Docker) and CI/CD pipelines.
- Basic understanding of authentication and authorization (OAuth, JWT).
- Interest in backend performance optimization and scalability.
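Because JWT-based authorization appears in the nice-to-haves, here is a hand-rolled HS256 JWT sketch in Python, showing what libraries such as PyJWT automate; the payload and secret are invented for illustration, and production code should use a maintained library:

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> str:
    """Base64url without padding, as the JWT spec (RFC 7519) requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload, secret):
    """Build header.payload.signature, signing with HMAC-SHA256."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = _b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify_jwt(token, secret):
    """Recompute the signature; return the claims only if it matches."""
    header, body, sig = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = _b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        return None  # tampered token or wrong secret
    padded = body + "=" * (-len(body) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))

token = sign_jwt({"sub": "intern-42"}, b"demo-secret")
claims = verify_jwt(token, b"demo-secret")
# claims == {"sub": "intern-42"}; verifying with the wrong secret yields None
```

Note the constant-time comparison (`hmac.compare_digest`), which avoids leaking signature bytes through timing differences.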
What You’ll Gain
- Hands-on experience building backend systems for real-world applications.
- Exposure to industry-standard tools, workflows, and coding practices.
- Mentorship from experienced backend engineers.
- Opportunity to contribute to live projects impacting end users.

Position Responsibilities :
- Lead application design and prototyping efforts, making key architectural decisions for new features and systems
- Design and implement complex database solutions using Oracle and Microsoft SQL Server, focusing on performance optimization and scalability
- Develop and maintain enterprise applications using Spring Framework and Spring Boot, ensuring best practices and patterns
- Create and enhance frontend applications using Angular, TypeScript, and modern JavaScript frameworks
- Design and implement RESTful web services, ensuring security and performance
- Lead database design and optimisation efforts, including writing complex SQL queries and stored procedures
- Provide senior-level technical support to QA staff and contribute to test strategy development
- Evaluate and prioritise development initiatives across projects
- Coordinate with global counterparts in development and testing
- Create technical documentation for internal team use
- Mentor junior developers in both backend and frontend technologies
- Review and optimise application performance across the full technology stack
- Lead troubleshooting efforts for complex technical issues across the application stack
- Guide architectural decisions for new features and system enhancements
Qualifications :
Technical Skills Required:
- Strong expertise in Java development (5+ years)
- Expert knowledge in:
- Spring Framework
- Spring Boot
- Maven
- RESTful Web Services
- Frontend Development:
- Angular
- JavaScript
- jQuery
- TypeScript
- Database expertise:
- Microsoft SQL Server
- Oracle SQL
- Advanced SQL query optimization
- Complex database design
- Advanced understanding of:
- Object-oriented programming
- Design patterns
- Microservices architecture
- Code refactoring principles
- Performance optimization
Education & Experience:
- Bachelor's Degree in Computer Science or related fields
- 5+ years of large-scale Web Application Development
- Enterprise development experience using Agile/Scrum methodology
- Expert knowledge of industry best practices
- Cross-browser compatibility development experience
Professional Skills:
- Strong analytical and debugging skills
- Ability to multitask in a fast-paced environment
- Strong interpersonal skills
- Excellent written and oral communication
- Willingness to work night shift
About the Role:
We are on the lookout for a dynamic Marketing Automation and Data Analytics Specialist, someone who is not only adept in marketing automation/operations but also possesses a keen expertise in data analytics and visualization. This role is tailor-made for individuals who are proficient with tools like Eloqua, Marketo, Salesforce Pardot, and Power BI.
As our Marketing Automation and Data Analytics Specialist, your responsibilities will span across managing and optimizing marketing automation systems and overseeing the migration and enhancement of data systems and dashboards. You will play a pivotal role in blending marketing strategies with data analytics, ensuring the creation of visually appealing and effective reports and dashboards. Collaborating closely with marketing teams, you will help in making data-driven decisions that propel the company forward.
We believe in fostering an environment where initiative and self-direction are valued. While you will receive the necessary guidance and support, the autonomy of your role is a testament to our trust in your abilities and professionalism.
Responsibilities:
- Manage and optimize marketing automation systems (Eloqua, Marketo, Salesforce Pardot) to map and improve business processes.
- Develop, audit, and enhance data systems, ensuring accuracy and efficiency in marketing efforts.
- Build and migrate interactive, visually appealing dashboards and reports.
- Develop and maintain reporting and analytics for marketing efforts, database health, lead scoring, and dashboard performance.
- Handle technical aspects of key marketing systems and integrate them with data visualization tools like Power BI.
- Review and improve existing SQL data sources for effective integration and analytics.
- Collaborate closely with sales, marketing, and analytics teams to define requirements, establish best practices, and ensure successful outcomes.
- Ensure all marketing data, dashboards, and reports are accurate and effectively meet business needs.
Ideal Candidate Qualities:
- Strong commitment to the role with a focus on long-term growth.
- Exceptional communication and collaboration skills across diverse teams.
- High degree of autonomy and ability to work effectively without micromanagement.
- Strong attention to detail and organization skills.
Qualifications:
- Hands-on experience with marketing automation systems and data analytics tools such as Eloqua, Marketo, Salesforce Pardot, and Power BI.
- Proven experience in data visualization and dashboard creation using Power BI.
- Experience with SQL, including building and optimizing queries.
- Knowledge of ABM and Intent Signaling technologies is a plus.
- Outstanding analytical skills with an ability to work with complex datasets.
- Familiarity with data collection, cleaning, and transformation processes.
Benefits:
- Work-from-home flexibility.
- Career advancement opportunities and professional development support.
- Supportive and collaborative team environment.
Hiring Process:
The hiring process at InEvolution is thoughtfully designed to ensure alignment between your career goals and our company's objectives. The process will include:
- Initial Phone Screening: A brief conversation to discuss your background and understand your career aspirations.
- Team Introduction Interview: Candidates who excel in the first round will engage in a deeper discussion with our team, providing insights into our work culture and the specificities of the role.
- Technical Assessment: In the final round, you will meet our Technical Director for an in-depth conversation about your technical skills and how these align with the demands of the role.
About the Role
We are looking for a motivated and detail-oriented Email Marketing Intern to join our marketing team. This is a hands-on opportunity to work with tools like MailChimp and Iterable, and gain real-world experience in executing and analyzing email campaigns. Ideal for someone with a background in marketing and prior internship experience in a similar field.
Key Responsibilities:
- Support the execution of email marketing campaigns using MailChimp and Iterable.
- Assist in segmenting audiences and setting up batch, nurture, and trigger-based campaigns.
- Collaborate with the analytics team to help track campaign performance and contribute to reporting.
- Work with cross-functional teams including Demand Gen, Product Marketing, and Marketing Operations to implement email marketing best practices.
What We're Looking For:
- Educational background in Marketing, Communications, or related fields.
- Prior internship or project experience in email marketing, CRM, or digital campaigns is a strong plus.
- Familiarity with email marketing tools like MailChimp or Iterable.
- Basic knowledge of HTML and content management systems.
- Ability to handle reporting for Email Marketing and General Ecommerce Marketing.
- Curiosity, attention to detail, and a willingness to learn.
Nice to Have (Not Mandatory)
- Experience creating performance or campaign reports.
- Exposure to SQL, CSS, or HTML for email customization


Role Objective
Develop business relevant, high quality, scalable web applications. You will be part of a dynamic AdTech team solving big problems in the Media and Entertainment Sector.
Roles & Responsibilities
* Application Design: Understand requirements from the user, create stories and be a part of the design team. Check designs, give regular feedback and ensure that the designs are as per user expectations.
* Architecture: Create scalable and robust system architecture. The design should be in line with the client infra. This could be on-prem or cloud (Azure, AWS or GCP).
* Development: You will be responsible for the development of the front-end and back-end. The application stack will comprise of (depending on the project) SQL, Django, Angular/React, HTML, CSS. Knowledge of GoLang and Big Data is a plus point.
* Deployment: Suggest and implement a deployment strategy that is scalable and cost-effective. Create a detailed resource architecture and get it approved. CI/CD deployment on IIS or Linux. Knowledge of Docker is a plus point.
* Maintenance: Maintaining development and production environments will be a key part of your job profile. This will also include troubleshooting, fixing bugs and suggesting ways for improving the application.
* Data Migration: In the case of database migration, you will be expected to suggest appropriate strategies and implementation plans.
* Documentation: Create a detailed document covering important aspects like HLD, Technical Diagram, Script Design, SOP etc.
* Client Interaction: You will be interacting with the client on a day-to-day basis and hence having good communication skills is a must.
Requirements
Education: B.Tech (Comp. Sc., IT) or equivalent
Experience: 3+ years of experience developing applications on Django, Angular/React, HTML and CSS
Behavioural Skills-
1. Clear and Assertive communication
2. Ability to comprehend the business requirement
3. Teamwork and collaboration
4. Analytical thinking
5. Time Management
6. Strong troubleshooting and problem-solving skills
Technical Skills-
1. Back-end and Front-end Technologies: Django, Angular/React, HTML and CSS.
2. Cloud Technologies: AWS, GCP, and Azure
3. Big Data Technologies: Hadoop and Spark
4. Containerized Deployment: Docker and Kubernetes is a plus.
5. Other: Understanding of Golang is a plus.


Ops Analysts/Sys Admin
Company Summary :
As the recognized global standard for project-based businesses, Deltek delivers software and information solutions to help organizations achieve their purpose. Our market leadership stems from the work of our diverse employees who are united by a passion for learning, growing and making a difference. At Deltek, we take immense pride in creating a balanced, values-driven environment, where every employee feels included and empowered to do their best work. Our employees put our core values into action daily, creating a one-of-a-kind culture that has been recognized globally. Thanks to our incredible team, Deltek has been named one of America's Best Midsize Employers by Forbes, a Best Place to Work by Glassdoor, a Top Workplace by The Washington Post and a Best Place to Work in Asia by World HRD Congress. www.deltek.com
External Job Title :
Systems Engineer 1
Position Responsibilities :
We are seeking a highly skilled and motivated Systems Engineer to join our team. Beyond a strong technical background, excellent problem-solving abilities, and a collaborative mindset, the ideal candidate will be a self-starter with a high level of initiative and a passion for experimentation. This role requires someone who thrives in a fast-paced environment and is eager to take on new challenges.
- Technical Skills:
Must Have Skills:
- PHP
- SQL; Relational Database Concepts
- At least one scripting language (Python, PowerShell, Bash, UNIX, etc.)
- Experience with Learning and Utilizing APIs
Nice to Have Skills:
- Experience with AI Initiatives & exposure of GenAI and/or Agentic AI projects
- Microsoft Power Apps
- Microsoft Power BI
- Atomic
- Snowflake
- Cloud-Based Application Development
- Gainsight
- Salesforce
- Soft Skills:
Must Have Skills:
- Flexible Mindset for Solution Development
- Independent and Self-Driven; Autonomous
- Investigative; drives toward resolving Root Cause of Stakeholder needs instead of treating Symptoms; Critical Thinker
- Collaborative mindset to drive best results
Nice to Have Skills:
- Business Acumen (Very Nice to Have)
- Responsibilities:
- Develop and maintain system solutions to meet stakeholder needs.
- Collaborate with team members and stakeholders to ensure effective communication and teamwork.
- Independently drive projects and tasks to completion with minimal supervision.
- Investigate and resolve root causes of issues, applying critical thinking to develop effective solutions.
- Adapt to changing requirements and maintain a flexible approach to solution development.
Qualifications :
- A college degree in Computer Science, Software Engineering, Information Science or a related field is required
- Minimum 2-3 years of programming experience with PHP, Power BI or Snowflake, Python, and API integration.
- Proven experience in system engineering or a related field.
- Strong technical skills in the required areas.
- Excellent problem-solving and critical thinking abilities.
- Ability to work independently and as part of a team.
- Strong communication and collaboration skills.



Remote Job Opportunity
Job Title: Data Scientist
Contract Duration: 6 months+
Location: Offshore (India)
Work Time: 3 pm to 12 am
Must have 4+ Years of relevant experience.
Job Summary:
We are seeking an AI Data Scientist with a strong foundation in machine learning, deep learning, and statistical modeling to design, develop, and deploy cutting-edge AI solutions.
The ideal candidate will have expertise in building and optimizing AI models, with a deep understanding of both statistical theory and modern AI techniques. You will work on high-impact projects, from prototyping to production, collaborating with engineers, researchers, and business stakeholders to solve complex problems using AI.
Key Responsibilities:
Research, design, and implement machine learning and deep learning models for predictive and generative AI applications.
Apply advanced statistical methods to improve model robustness and interpretability.
Optimize model performance through hyperparameter tuning, feature engineering, and ensemble techniques.
Perform large-scale data analysis to identify patterns, biases, and opportunities for AI-driven automation.
Work closely with ML engineers to validate, train, and deploy the models.
Stay updated with the latest research and developments in AI and machine learning to ensure innovative and cutting-edge solutions.
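The hyperparameter-tuning responsibility above can be shown in miniature. This is a toy grid search over the regularization strength of 1-D ridge regression; the data and grid values are invented, and real work would typically use cross-validation and a library such as scikit-learn:

```python
def fit_ridge_1d(xs, ys, lam):
    """Closed-form 1-D ridge regression (no intercept): w = sum(x*y) / (sum(x^2) + lam)."""
    return sum(x * y for x, y in zip(xs, ys)) / (sum(x * x for x in xs) + lam)

def mse(w, xs, ys):
    """Mean squared error of predictions w*x against targets y."""
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

# Hypothetical train/validation splits with a noisy y ~ 2x relationship.
train_x, train_y = [1, 2, 3, 4], [2.1, 3.9, 6.2, 7.8]
val_x, val_y = [5, 6], [10.1, 11.9]

# Grid search: pick the regularization strength with the lowest validation MSE.
grid = [0.0, 0.1, 1.0, 10.0]
best_lam = min(
    grid,
    key=lambda lam: mse(fit_ridge_1d(train_x, train_y, lam), val_x, val_y),
)
# On this clean toy data, no regularization wins: best_lam == 0.0
```

The same select-by-validation-score pattern generalizes to any hyperparameter (tree depth, learning rate, dropout), which is what tuning frameworks automate at scale.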
Qualifications & Skills:
Education: PhD or Master's degree in Statistics, Mathematics, Computer Science, or a related field.
Experience:
4+ years of experience in machine learning and deep learning, with expertise in algorithm development and optimization.
Proficiency in SQL, Python, and visualization tools (e.g., Power BI).
Experience in developing mathematical models for business applications, preferably in finance, trading, image-based AI, biomedical modeling, or recommender systems.
Strong communication skills to interact effectively with both technical and non-technical stakeholders.
Excellent problem-solving skills with the ability to work independently and as part of a team.
Job Title: Salesforce QA Engineer
Experience: 6+ Years
Location: Bangalore - Hybrid (Manyata Tech Park)
Job description:
6+ years of hands-on experience with both manual and automated testing, with a strong preference for experience using AccelQ on Salesforce and SAP platforms.
Proven expertise in Salesforce particularly within the Sales Cloud module.
Proficient in writing complex SOQL and SQL queries for data validation and backend testing.
Extensive experience in designing and developing robust, reusable automated test scripts for Salesforce environments.
Highly skilled at early issue detection, with a deep understanding of backend configurations, process flows and validation rules.
Should have a strong background in Salesforce testing, with hands-on experience in automation tools such as Selenium, Provar, or TestNG.
You will be responsible for creating and maintaining automated test scripts, executing test cases, identifying bugs, and ensuring the quality and reliability of Salesforce applications.
A solid understanding of Salesforce modules (Sales Cloud, Service Cloud, etc.) and APIs is essential.
Experience with CI/CD tools like Jenkins and version control systems like Git is preferred.
You will work closely with developers, business analysts, and stakeholders to define test strategies and improve the overall QA process.
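The backend-testing duties above center on checking report output against the database with SQL. A minimal sketch of that pattern follows, with sqlite3 standing in for the real backend; in a Salesforce project the source query would be SOQL issued through an API client, and the `opportunities` table and its figures here are hypothetical.

```python
# Sketch: validating a report figure against a backend SQL aggregate.
# sqlite3 is a stand-in; table/column names and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE opportunities (id INTEGER PRIMARY KEY, stage TEXT, amount REAL);
    INSERT INTO opportunities (stage, amount) VALUES
        ('Closed Won', 1000.0), ('Closed Won', 2500.0), ('Prospecting', 400.0);
""")

def validate_report_total(conn, reported_total, stage):
    """Compare a figure taken from a report against the backend aggregate."""
    (db_total,) = conn.execute(
        "SELECT COALESCE(SUM(amount), 0) FROM opportunities WHERE stage = ?",
        (stage,),
    ).fetchone()
    return abs(db_total - reported_total) < 0.01

print(validate_report_total(conn, 3500.0, "Closed Won"))  # -> True
print(validate_report_total(conn, 3000.0, "Closed Won"))  # -> False
```

The same comparison scales to row-level checks by diffing full result sets instead of a single aggregate.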
AccioJob is conducting a Walk-In Hiring Drive with FloBiz for the position of Backend Intern.
To apply, register and select your slot here: https://go.acciojob.com/dkfKBz
Required Skills: SQL, RestAPI, OOPs, DSA
Eligibility:
- Degree: BTech./BE, BCA, BSc.
- Branch: Computer Science/CSE/Other CS related branch, IT
- Graduation Year: 2025, 2026
Work Details:
- Work Location: (Remote)
- CTC: ₹12 LPA to ₹15 LPA
Evaluation Process:
Round 1: Offline Assessment at AccioJob Skill Centres in Noida, Pune, Chennai, Hyderabad, and Bangalore
Further Rounds (for shortlisted candidates only):
Profile & Background Screening Round, Technical Interview Round 1, Technical Interview Round 2, Cultural Fit Round
Important Note: Bring your laptop & earphones for the test.
Register here: https://go.acciojob.com/dkfKBz
Or apply in seconds — straight from our brand-new app!
https://go.acciojob.com/L6rH7C
Position Overview: We are looking for an experienced and highly skilled Senior Data Engineer to join our team and help design, implement, and optimize data systems that support high-end analytical solutions for our clients. As a customer-centric Data Engineer, you will work closely with clients to understand their business needs and translate them into robust, scalable, and efficient technical solutions. You will be responsible for end-to-end data modelling, integration workflows, and data transformation processes while ensuring security, privacy, and compliance. In this role, you will also leverage the latest advancements in artificial intelligence, machine learning, and large language models (LLMs) to deliver high-impact solutions that drive business success. The ideal candidate will have a deep understanding of data infrastructure, optimization techniques, and cost-effective data management.
Key Responsibilities:
• Customer Collaboration:
– Partner with clients to gather and understand their business requirements, translating them into actionable technical specifications.
– Act as the primary technical consultant to guide clients through data challenges and deliver tailored solutions that drive value.
• Data Modeling & Integration:
– Design and implement scalable, efficient, and optimized data models to support business operations and analytical needs.
– Develop and maintain data integration workflows to seamlessly extract, transform, and load (ETL) data from various sources into data repositories.
– Ensure smooth integration between multiple data sources and platforms, including cloud and on-premise systems.
• Data Processing & Optimization:
– Develop, optimize, and manage data processing pipelines to enable real-time and batch data processing at scale.
– Continuously evaluate and improve data processing performance, optimizing for throughput while minimizing infrastructure costs.
• Data Governance & Security:
– Implement and enforce data governance policies and best practices, ensuring data security, privacy, and compliance with relevant industry regulations (e.g., GDPR, HIPAA).
– Collaborate with security teams to safeguard sensitive data and maintain privacy controls across data environments.
• Cross-Functional Collaboration:
– Work closely with data engineers, data scientists, and business analysts to ensure that the data architecture aligns with organizational objectives and delivers actionable insights.
– Foster collaboration across teams to streamline data workflows and optimize solution delivery.
• Leveraging Advanced Technologies:
– Utilize AI, machine learning models, and large language models (LLMs) to automate processes, accelerate delivery, and provide smart, data-driven solutions to business challenges.
– Identify opportunities to apply cutting-edge technologies to improve the efficiency, speed, and quality of data processing and analytics.
• Cost Optimization:
– Proactively manage infrastructure and cloud resources to optimize throughput while minimizing operational costs.
– Make data-driven recommendations to reduce infrastructure overhead and increase efficiency without sacrificing performance.
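The extract-transform-load workflow described under Data Modeling & Integration can be sketched in a few lines. This is a schematic only, with a hypothetical in-memory CSV source and sqlite target; production pipelines of the kind this role describes would run on tools such as Airflow, ADF, or Spark.

```python
# Minimal ETL sketch: extract from CSV, transform, load into a database.
# Source data, table names, and cleaning rules are hypothetical.
import csv
import io
import sqlite3

raw_csv = "order_id,amount\n1,100.50\n2,\n3,75.25\n"  # stand-in for a source file

def extract(text):
    """Read CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Drop rows with missing amounts and cast fields to proper types."""
    return [(int(r["order_id"]), float(r["amount"])) for r in rows if r["amount"]]

def load(conn, rows):
    """Write the cleaned rows to the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(conn, transform(extract(raw_csv)))
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())  # -> (2, 175.75)
```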
Qualifications:
• Experience:
– Proven experience (5+ years) as a Data Engineer or similar role, designing and implementing data solutions at scale.
– Strong expertise in data modelling, data integration (ETL), and data transformation processes.
– Experience with cloud platforms (AWS, Azure, Google Cloud) and big data technologies (e.g., Hadoop, Spark).
• Technical Skills:
– Advanced proficiency in SQL, data modelling tools (e.g., Erwin, PowerDesigner), and data integration frameworks (e.g., Apache NiFi, Talend).
– Strong understanding of data security protocols, privacy regulations, and compliance requirements.
– Experience with data storage solutions (e.g., data lakes, data warehouses, NoSQL, relational databases).
• AI & Machine Learning Exposure:
– Familiarity with leveraging AI and machine learning technologies (e.g., TensorFlow, PyTorch, scikit-learn) to optimize data processing and analytical tasks.
–Ability to apply advanced algorithms and automation techniques to improve business processes.
• Soft Skills:
– Excellent communication skills to collaborate with clients, stakeholders, and cross-functional teams.
– Strong problem-solving ability with a customer-centric approach to solution design.
– Ability to translate complex technical concepts into clear, understandable terms for non-technical audiences.
• Education:
– Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Science, or a related field (or equivalent practical experience).
LIFE AT FOUNTANE:
- Fountane offers an environment where all members are supported, challenged, recognized & given opportunities to grow to their fullest potential.
- Competitive pay
- Health insurance for spouses, kids, and parents.
- PF/ESI or equivalent
- Individual/team bonuses
- Employee stock ownership plan
- Fun/challenging variety of projects/industries
- Flexible workplace policy - remote/physical
- Flat organization - no micromanagement
- Individual contribution - set your deadlines
- Above all - culture that helps you grow exponentially!
A LITTLE BIT ABOUT THE COMPANY:
Established in 2017, Fountane Inc is a Ventures Lab incubating and investing in new competitive technology businesses from scratch. Thus far, we’ve created half a dozen multi-million valuation companies in the US and a handful of sister ventures for large corporations, including Target, US Ventures, and Imprint Engine.
We’re a team of 120+ strong from around the world who are radically open-minded and believe in excellence, respecting one another, and pushing our boundaries further than ever before.
We are looking for a passionate and experienced Business Analyst Trainer to join our training team. This role involves delivering high-quality training programs on business analysis tools, methodologies, and best practices, both in-person and online.
Job Title : Solution Architect – Denodo
Experience : 10+ Years
Location : Remote / Work from Home
Notice Period : Immediate joiners preferred
Job Overview :
We are looking for an experienced Solution Architect – Denodo to lead the design and implementation of data virtualization solutions. In this role, you will work closely with cross-functional teams to ensure our data architecture aligns with strategic business goals. The ideal candidate will bring deep expertise in Denodo, strong technical leadership, and a passion for driving data-driven decisions.
Mandatory Skills : Denodo, Data Virtualization, Data Architecture, SQL, Data Modeling, ETL, Data Integration, Performance Optimization, Communication Skills.
Key Responsibilities :
- Architect and design scalable data virtualization solutions using Denodo.
- Collaborate with business analysts and engineering teams to understand requirements and define technical specifications.
- Ensure adherence to best practices in data governance, performance, and security.
- Integrate Denodo with diverse data sources and optimize system performance.
- Mentor and train team members on Denodo platform capabilities.
- Lead tool evaluations and recommend suitable data integration technologies.
- Stay updated with emerging trends in data virtualization and integration.
Required Qualifications :
- Bachelor’s degree in Computer Science, IT, or a related field.
- 10+ Years of experience in data architecture and integration.
- Proven expertise in Denodo and data virtualization frameworks.
- Strong proficiency in SQL and data modeling.
- Hands-on experience with ETL processes and data integration tools.
- Excellent communication, presentation, and stakeholder management skills.
- Ability to lead technical discussions and influence architectural decisions.
- Denodo or data architecture certifications are a strong plus.

.NET + Angular Full Stack Developer (4–5 Years Experience)
Location: Pune/Remote
Experience Required: 4 to 5 years
Communication: Fluent English (verbal & written)
Technology: .NET, Angular
Only immediate joiners who can start on 21st July should apply.
Job Overview
We are seeking a skilled and experienced Full Stack Developer with strong expertise in .NET (C#) and Angular to join our dynamic team in Pune. The ideal candidate will have hands-on experience across the full development stack, a strong understanding of relational databases and SQL, and the ability to work independently with clients. Experience in microservices architecture is a plus.
Key Responsibilities
- Design, develop, and maintain modern web applications using .NET Core / .NET Framework and Angular
- Write clean, scalable, and maintainable code for both backend and frontend components
- Interact directly with clients for requirement gathering, demos, sprint planning, and issue resolution
- Work closely with designers, QA, and other developers to ensure high-quality product delivery
- Perform regular code reviews, ensure adherence to coding standards, and mentor junior developers if needed
- Troubleshoot and debug application issues and provide timely solutions
- Participate in discussions on architecture, design patterns, and technical best practices
Must-Have Skills
✅ Strong hands-on experience with .NET Core / .NET Framework (Web API, MVC)
✅ Proficiency in Angular (Component-based architecture, RxJS, State Management)
✅ Solid understanding of RDBMS and SQL (preferably with SQL Server)
✅ Familiarity with Entity Framework or Dapper
✅ Strong knowledge of RESTful API design and integration
✅ Version control using Git
✅ Excellent verbal and written communication skills
✅ Ability to work in a client-facing role and handle discussions independently
Good-to-Have / Optional Skills
Understanding or experience in Microservices Architecture
Exposure to CI/CD pipelines, unit testing frameworks, and cloud environments (e.g., Azure or AWS)

Primary skill set: QA Automation, Python, BDD, SQL
As Senior Data Quality Engineer you will:
- Evaluate product functionality and create test strategies and test cases to assess product quality.
- Work closely with the on-shore and the offshore team.
- Validate multiple reports against databases by running medium-to-complex SQL queries.
- Develop a solid understanding of automation objects and integrations across various platforms and applications.
- Work as an individual contributor, exploring opportunities to improve performance and articulating the importance and advantages of proposed improvements to management.
- Integrate with SCM infrastructure to establish a continuous build and test cycle using CI/CD tools.
- Be comfortable working in Linux/Windows environments and hybrid infrastructure models hosted on cloud platforms.
- Establish processes and tooling to maintain automation scripts and generate regular test reports.
- Conduct peer reviews to provide feedback and ensure test scripts are flawless.
Core/Must have skills:
- Excellent understanding of and hands-on experience in ETL/DWH testing, preferably with Databricks, paired with Python experience.
- Hands-on experience with SQL (analytical functions and complex queries), along with knowledge of using SQL client utilities effectively.
- Clear and crisp communication and commitment to deliverables.
- Experience in Big Data testing will be an added advantage.
- Knowledge of Spark, Scala, Hive/Impala, and Python will be an added advantage.
Good to have skills:
- Test automation using BDD/Cucumber/TestNG combined with strong hands-on Java and Selenium experience, especially working experience with WebDriver.IO.
- Ability to effectively articulate technical challenges and solutions
- Work experience in qTest, Jira, WebDriver.IO
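As an illustration of the analytical-function SQL called out above, here is a windowed ranking query of the kind used when validating a report against the database. sqlite3 stands in for the real warehouse, and the `sales` table and its data are hypothetical.

```python
# Sketch: a window-function query for report validation.
# sqlite3 is a stand-in; the table and figures are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, rep TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('East', 'A', 500), ('East', 'B', 900),
        ('West', 'C', 700), ('West', 'D', 300);
""")

# Rank reps within each region -- then compare against the report's ranking.
rows = conn.execute("""
    SELECT region, rep,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
    ORDER BY region, rnk
""").fetchall()
print(rows)  # -> [('East', 'B', 1), ('East', 'A', 2), ('West', 'C', 1), ('West', 'D', 2)]
```

Note that window functions require SQLite 3.25 or later; any reasonably recent Python build satisfies this.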
Position : Business Analyst
Experience : 5+ Years
Location : Remote
Notice Period : Immediate Joiners Preferred (or candidates serving 10–15 days’ notice)
Interview Mode : Virtual
Job Description :
We are seeking an experienced Business Analyst with a strong background in requirements gathering, functional documentation, and stakeholder management, particularly in the US Healthcare payer domain.
Mandatory Skills :
Business Analysis, US Healthcare Payer Domain, Requirement Gathering, User Stories, Gap & Impact Analysis, Azure DevOps/TFS, SQL, UML Modeling, SDLC/STLC, System Testing, UAT, Strong Communication Skills.
Key Responsibilities :
- Analyze and understand complex business and functional requirements.
- Translate business needs into detailed User Stories, functional and technical specifications.
- Conduct gap analysis and impact assessment for new and existing product features.
- Create detailed documentation including scope, project plans, and secure stakeholder approvals.
- Support System Testing and User Acceptance Testing (UAT) from a functional perspective.
- Prepare and maintain release notes, end-user documentation, training materials, and process flows.
- Serve as a liaison between business and technical teams, ensuring cross-functional alignment.
- Assist with sprint planning, user story tracking, and status updates using Azure DevOps / TFS.
- Write and execute basic SQL queries for data validation and analysis.
Required Skills :
- Minimum 5 years of experience as a Business Analyst.
- Strong analytical, problem-solving, and communication skills.
- Solid understanding of Project Life Cycle, STLC, and UML modeling.
- Prior experience in US Healthcare payer domain is mandatory.
- Familiarity with tools like Azure DevOps / TFS.
- Ability to work with urgency, manage priorities, and maintain attention to detail.
- Strong team collaboration and stakeholder management.

🚀 We're Urgently Hiring – Node.js Backend Development Intern
Join our backend team as an intern and get hands-on experience building scalable, real-world applications with Node.js, Firebase, and AWS.
📍 Remote / Onsite
📅 Duration: 2 Months
🔧 What You’ll Work On:
Backend development using Node.js
Firebase, SQL & NoSQL database management
RESTful API integration
Deployment on AWS infrastructure
Support Services Analyst
Company Summary :
As the recognized global standard for project-based businesses, Deltek delivers software and information solutions to help organizations achieve their purpose. Our market leadership stems from the work of our diverse employees who are united by a passion for learning, growing and making a difference. At Deltek, we take immense pride in creating a balanced, values-driven environment, where every employee feels included and empowered to do their best work. Our employees put our core values into action daily, creating a one-of-a-kind culture that has been recognized globally. Thanks to our incredible team, Deltek has been named one of America's Best Midsize Employers by Forbes, a Best Place to Work by Glassdoor, a Top Workplace by The Washington Post and a Best Place to Work in Asia by World HRD Congress. www.deltek.com
Business Summary :
Deltek’s award-winning Support Services team provides best-in-class assistance to Deltek’s customers across the world via phone, chat, and email. Our team comprises a group of diverse, collaborative, and passionate professionals who come from varying industries, backgrounds, and professions. Our diversity and passion are our strength, so however you identify and whatever background you bring, we invite you to explore our team as a potential next step in your career!
External Job Title :
Support Services Analyst
Position Responsibilities :
- Serve as the second level of support for all customer queries.
- Resolve queries and issues, ensuring that all assigned requests are addressed within the SLA or escalated further as required.
- Take ownership of issues from start through to successful resolution, following the escalation process to speed up resolution.
- Work effectively and efficiently in partnership with other departments to prevent delays in resolution.
- Apply technology in multiple ways to configure the product and help customers implement Replicon's products.
- Develop a solid understanding of product limits and suggest ways of improving the product.
- Logically understand the concepts of other SaaS-based products for integration requests.
- Be multi-skilled across all three mediums (phone, chat, and email).
Qualifications :
- Any Bachelor's Degree
- At least 2 years of experience in software application support and/or infrastructure support
- Basic understanding of Web technology, basic networking & hardware knowledge, and software applications
- Excellent communication skills - verbal, written, listening skills and interpersonal skills.
- Ability to communicate in a tactful, courteous manner and to deal with and resolve complex situations in a professional manner
- Ability to handle multiple tasks/projects simultaneously and effectively work individually or in a team environment
- Open to work in a 24/7 support environment

Job Title : Technical Architect
Experience : 8 to 12+ Years
Location : Trivandrum / Kochi / Remote
Work Mode : Remote flexibility available
Notice Period : Immediate to max 15 days (30 days with negotiation possible)
Summary :
We are looking for a highly skilled Technical Architect with expertise in Java Full Stack development, cloud architecture, and modern frontend frameworks (Angular). This is a client-facing, hands-on leadership role, ideal for technologists who enjoy designing scalable, high-performance, cloud-native enterprise solutions.
🛠 Key Responsibilities :
- Architect scalable and high-performance enterprise applications.
- Hands-on involvement in system design, development, and deployment.
- Guide and mentor development teams in architecture and best practices.
- Collaborate with stakeholders and clients to gather and refine requirements.
- Evaluate tools, processes, and drive strategic technical decisions.
- Design microservices-based solutions deployed over cloud platforms (AWS/Azure/GCP).
✅ Mandatory Skills :
- Backend : Java, Spring Boot, Python
- Frontend : Angular (at least 2 years of recent hands-on experience)
- Cloud : AWS / Azure / GCP
- Architecture : Microservices, EAI, MVC, Enterprise Design Patterns
- Data : SQL / NoSQL, Data Modeling
- Other : Client handling, team mentoring, strong communication skills
➕ Nice to Have Skills :
- Mobile technologies (Native / Hybrid / Cross-platform)
- DevOps & Docker-based deployment
- Application Security (OWASP, PCI DSS)
- TOGAF familiarity
- Test-Driven Development (TDD)
- Analytics / BI / ML / AI exposure
- Domain knowledge in Financial Services or Payments
- 3rd-party integration tools (e.g., MuleSoft, BizTalk)
⚠️ Important Notes :
- Only candidates from outside Hyderabad/Telangana and non-JNTU graduates will be considered.
- Candidates must be serving notice or joinable within 30 days.
- Client-facing experience is mandatory.
- Java Full Stack candidates are highly preferred.
🧭 Interview Process :
- Technical Assessment
- Two Rounds – Technical Interviews
- Final Round




Responsibilities
Develop and maintain web and backend components using Python, Node.js, and Zoho tools
Design and implement custom workflows and automations in Zoho
Perform code reviews to maintain quality standards and best practices
Debug and resolve technical issues promptly
Collaborate with teams to gather and analyze requirements for effective solutions
Write clean, maintainable, and well-documented code
Manage and optimize databases to support changing business needs
Contribute individually while mentoring and supporting team members
Adapt quickly to a fast-paced environment and meet expectations within the first month
Leadership Opportunities
Lead and mentor junior developers in the team
Drive projects independently while collaborating with the broader team
Act as a technical liaison between the team and stakeholders to deliver effective solutions
Selection Process
1. HR Screening: Review of qualifications and experience
2. Online Technical Assessment: Test coding and problem-solving skills
3. Technical Interview: Assess expertise in web development, Python, Node.js, APIs, and Zoho
4. Leadership Evaluation: Evaluate team collaboration and leadership abilities
5. Management Interview: Discuss cultural fit and career opportunities
6. Offer Discussion: Finalize compensation and role specifics
Experience Required
5–7 years of relevant experience as a Software Developer
Proven ability to work as a self-starter and contribute individually
Strong technical and interpersonal skills to support team members effectively
Solution Engineer
Primary Responsibilities
● Serve as the primary resource during the client implementation/onboarding phase
● Identify, document, and define customer business and technical needs
● Develop clear user documentation, instructions, and standard procedures
● Deliver training sessions on solution administration and usage
● Participate in customer project calls and serve as a subject matter expert on solutions
● Coordinate tasks across internal and client project teams, ensuring accountability and progress tracking
● Perform hands-on configuration, scripting, data imports, testing, and knowledge transfer activities
● Translate business requirements into technical specifications for product configuration or enhancements
● Collaborate with global team members across multiple time zones, including the U.S., India, and China
● Build and maintain strong customer relationships to gather and validate requirements
● Contribute to the development of implementation best practices and suggest improvements to processes
● Execute other tasks and duties as assigned
Note: Salary offered will depend on the candidate's qualifications and experience.
Required Skills & Experience
● Proven experience leading software implementation projects from presales through delivery
● Strong organizational skills with the ability to manage multiple detailed and interdependent tasks
● 2–5 years of experience in JavaScript and web development, including prior implementation work in a software company
● Proficiency in some or all of the following:
○ JavaScript, PascalScript, MS SQL Script, RESTful APIs, Azure, Postman
○ Embarcadero RAD Studio, Delphi
○ Basic SQL and debugging
○ SMS integration and business intelligence tools
● General knowledge of database structures and data migration processes
● Familiarity with project management tools and methodologies
● Strong interpersonal skills with a focus on client satisfaction and relationship-building
● Self-starter with the ability to work productively in a remote, distributed team environment
● Experience in energy efficiency retrofits, construction, or utility demand-side management is a plus

Responsibilities:
● Technical Leadership:
○ Architect and design complex software systems
○ Lead the development team in implementing software solutions
○ Ensure adherence to coding standards and best practices
○ Conduct code reviews and provide constructive feedback
○ Troubleshoot and resolve technical issues
● Project Management:
○ Collaborate with project managers to define project scope and requirements
○ Estimate project timelines and resource needs
○ Track project progress and ensure timely delivery
○ Manage risks and identify mitigation strategies
● Team Development:
○ Mentor and coach junior developers
○ Foster a collaborative and supportive team environment
○ Conduct performance evaluations and provide feedback
○ Identify training and development opportunities for team members
● Innovation:
○ Stay abreast of emerging technologies and industry trends
○ Evaluate and recommend new technologies for adoption
○ Encourage experimentation and innovation within the team
Qualifications
● Experience:
○ 12+ years of experience in software development
○ 4+ years of experience in a leadership role
○ Proven track record of delivering successful software projects
● Skills:
○ Strong proficiency in the C# programming language
○ Good knowledge of Java for reporting
○ Strong SQL skills, along with Microsoft Azure
○ Expertise in software development methodologies (e.g., Agile, Scrum)
○ Excellent problem-solving and analytical skills
○ Strong communication and interpersonal skills
○ Ability to work independently and as part of a team


POSITION:
Senior Data Engineer
The Senior Data Engineer will be responsible for building and extending our data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys working with big data and building systems from the ground up.
You will collaborate with our software engineers, database architects, data analysts and data scientists to ensure our data delivery architecture is consistent throughout the platform. You must be self-directed and comfortable supporting the data needs of multiple teams, systems and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives.
What You’ll Be Doing:
● Design and build parts of our data pipeline architecture for extraction, transformation, and loading of data from a wide variety of data sources using the latest Big Data technologies.
● Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
● Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
● Work with machine learning, data, and analytics experts to drive innovation, accuracy and greater functionality in our data system.
Qualifications:
● Bachelor's degree in Engineering, Computer Science, or relevant field.
● 10+ years of relevant and recent experience in a Data Engineer role.
● 5+ years recent experience with Apache Spark and solid understanding of the fundamentals.
● Deep understanding of Big Data concepts and distributed systems.
● Strong coding skills with Scala, Python, Java and/or other languages and the ability to quickly switch between them with ease.
● Advanced working SQL knowledge and experience working with a variety of relational databases such as Postgres and/or MySQL.
● Cloud experience with Databricks.
● Experience working with data stored in many formats including Delta Tables, Parquet, CSV and JSON.
● Comfortable working in a Linux shell environment and writing scripts as needed.
● Comfortable working in an Agile environment.
● Machine Learning knowledge is a plus.
● Must be capable of working independently and delivering stable, efficient and reliable software.
● Excellent written and verbal communication skills in English.
● Experience supporting and working with cross-functional teams in a dynamic environment.
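As a toy illustration of moving records between two of the formats listed above (CSV and JSON) using only the standard library; at the scale this role targets, the equivalent step would run in Spark over Delta or Parquet tables, and the sample data here is invented.

```python
# Tiny format-conversion sketch: CSV records serialized as JSON.
# Stand-in for a Spark job over Delta/Parquet; data is hypothetical.
import csv
import io
import json

csv_text = "user_id,country\n1,IN\n2,US\n"
records = list(csv.DictReader(io.StringIO(csv_text)))  # rows as dicts
as_json = json.dumps(records)
print(as_json)  # -> [{"user_id": "1", "country": "IN"}, {"user_id": "2", "country": "US"}]
```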
REPORTING: This position will report to our CEO or any other Lead as assigned by Management.
EMPLOYMENT TYPE: Full-Time, Permanent
LOCATION: Remote (Pan India)
SHIFT TIMINGS: 2:00 PM to 11:00 PM IST
WHO WE ARE:
SalesIntel is the top revenue intelligence platform on the market. Our combination of automation and researchers allows us to reach 95% data accuracy for all our published contact data, while continuing to scale up our number of contacts. We currently have more than 5 million human-verified contacts, another 70 million plus machine processed contacts, and the highest number of direct dial contacts in the industry. We guarantee our accuracy with our well-trained research team that re-verifies every direct dial number, email, and contact every 90 days. With the most comprehensive contact and company data and our excellent customer service, SalesIntel has the best B2B data available. For more information, please visit – www.salesintel.io
WHAT WE OFFER: SalesIntel’s workplace is all about diversity. Different countries and cultures are represented in our workforce. We are growing at a fast pace and our work environment is constantly evolving with changing times. We motivate our team to better themselves by offering all the good stuff you’d expect like Holidays, Paid Leaves, Bonuses, Incentives, Medical Policy and company paid Training Programs.
SalesIntel is an Equal Opportunity Employer. We prohibit discrimination and harassment of any type and offer equal employment opportunities to employees and applicants without regard to race, color, religion, sex, sexual orientation, gender identity or expression, pregnancy, age, national origin, disability status, genetic information, protected veteran status, or any other characteristic protected by law.


Knowledge of the Gen AI technology ecosystem, including top-tier LLMs, prompt engineering, development frameworks such as LlamaIndex and LangChain, LLM fine-tuning, and experience in architecting RAGs and other LLM-based solutions for enterprise use cases.
1. Strong proficiency in programming languages like Python and SQL.
2. 3+ years of experience in predictive/prescriptive analytics, including machine learning algorithms (supervised and unsupervised), deep learning algorithms, and artificial neural networks such as regression, classification, ensemble models, RNN, LSTM, and GRU.
3. 2+ years of experience in NLP, text analytics, Document AI, OCR, sentiment analysis, entity recognition, and topic modeling.
4. Proficiency in LangChain and open LLM frameworks to perform summarization, classification, named entity recognition, and question answering.
5. Proficiency in generative techniques: prompt engineering, vector DBs, and LLMs such as OpenAI, LlamaIndex, and Azure OpenAI; open-source LLMs will be important.
6. Hands-on experience in GenAI technology areas including RAG architecture, fine-tuning techniques, inferencing frameworks, etc.
7. Familiarity with big data technologies/frameworks.
8. Sound knowledge of Microsoft Azure.
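The RAG architecture mentioned above reduces to a retrieval step plus prompt assembly. A bare-bones sketch follows, in which a bag-of-words vector is a deliberately crude stand-in for a real embedding model (in practice you would use an embedding API and a vector DB), and the documents are invented.

```python
# Bare-bones RAG retrieval sketch: embed, rank by cosine similarity,
# prepend the best match to the prompt. The bag-of-words "embedding"
# is a toy stand-in for a real model; documents are hypothetical.
import math
import re
from collections import Counter

def embed(text):
    """Toy embedding: lowercase term counts (stand-in for a real model)."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

docs = [
    "Invoices are processed within five business days.",
    "The claims team reviews each submission manually.",
    "Our office is closed on public holidays.",
]

def retrieve(query, k=1):
    """Return the k documents most similar to the query."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

question = "how many days until invoices are processed"
context = retrieve(question)[0]
prompt = f"Answer using this context:\n{context}\n\nQuestion: {question}"
print(context)  # -> Invoices are processed within five business days.
```

The `prompt` string is what would be sent to the LLM; swapping the toy pieces for a real embedding model and vector store gives the production shape of the pipeline.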

Profile: Senior PHP Developer
Experience: 10+ Years
Mode: Remote
Required Skills:
- PHP (10+ years) & Symfony framework (5+ years)
- Team leadership experience (3+ years)
- OOP, design patterns, RESTful APIs
- Database optimization (MySQL/PostgreSQL)
- Git, CI/CD, testing frameworks
- Excellent communication skills
Responsibilities:
- Lead PHP/Symfony development
- Mentor team members
- Ensure code quality through reviews
- Collaborate with stakeholders
- Manage sprint cycles
- Optimize application performance

We are looking for a highly skilled Senior Software Engineer with over 5 years of experience in full stack development using React.js and Node.js. As a senior member of our engineering team, you’ll take ownership of complex technical challenges, influence architecture decisions, mentor junior developers, and contribute to high-impact products.
Key Responsibilities:
Design, build, and maintain scalable web applications using React.js (frontend) and Node.js (backend).
Architect robust, secure, and scalable backend APIs and frontend components.
Collaborate closely with Product Managers, Designers, and DevOps to deliver end-to-end features.
Conduct code reviews, enforce best practices, and guide junior developers.
Optimize application performance, scalability, and responsiveness.
Troubleshoot, debug, and upgrade existing systems.
Stay current with new technologies and advocate for continuous improvement.
Required Qualifications:
Bachelor’s or Master’s degree in Computer Science, Engineering, or related field.
5+ years of experience in full stack development.
Strong expertise in React.js and related libraries (Redux, Hooks, etc.).
In-depth experience with Node.js, Express.js, and RESTful APIs.
Proficiency with JavaScript/TypeScript and modern frontend tooling (Webpack, Babel, etc.).
Experience with relational and NoSQL databases (e.g., PostgreSQL, MongoDB).
Solid understanding of CI/CD, testing (Jest, Mocha), and version control (Git).
Familiarity with cloud services (AWS/GCP/Azure) and containerization (Docker, Kubernetes) is a plus.
Excellent communication and problem-solving skills.
Nice to Have:
Experience with microservices architecture.
Knowledge of GraphQL.
Exposure to serverless computing.
Prior experience working in Agile/Scrum teams.


Title: Data Engineer II (Remote – India/Portugal)
Exp: 4-8 Years
CTC: up to 30 LPA
Required Skills & Experience:
- 4+ years in data engineering or backend software development
- Experience with AI/ML is important
- Expert in SQL and data modeling
- Strong Python, Java, or Scala coding skills
- Experience with Snowflake, Databricks, AWS (S3, Lambda)
- Background in relational and NoSQL databases (e.g., Postgres)
- Familiar with Linux shell and systems administration
- Solid grasp of data warehouse concepts and real-time processing
- Excellent troubleshooting, documentation, and QA mindset
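As a rough illustration of the SQL and data-modeling expertise the list above calls for, the sketch below builds a small relational model (one-to-many orders per customer) and runs an analytical query over it. SQLite is used for portability; all table and column names are hypothetical.

```python
import sqlite3

# Hypothetical schema for illustration: one-to-many orders per customer.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT NOT NULL)")
conn.execute("""
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(id),
        total REAL NOT NULL
    )
""")
conn.execute("INSERT INTO customers VALUES (1, 'Acme')")
conn.execute("INSERT INTO orders VALUES (10, 1, 99.5), (11, 1, 0.5)")

# A typical analytical query over the model: per-customer order counts and revenue.
row = conn.execute("""
    SELECT c.name, COUNT(o.id) AS n_orders, SUM(o.total) AS revenue
    FROM customers c JOIN orders o ON o.customer_id = c.id
    GROUP BY c.id
""").fetchone()
print(row)  # ('Acme', 2, 100.0)
```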
If interested, kindly share your updated CV at 82008 31681

Role: GCP Data Engineer
Notice Period: Immediate Joiners
Experience: 5+ years
Location: Remote
Company: Deqode
About Deqode
At Deqode, we work with next-gen technologies to help businesses solve complex data challenges. Our collaborative teams build reliable, scalable systems that power smarter decisions and real-time analytics.
Key Responsibilities
- Build and maintain scalable, automated data pipelines using Python, PySpark, and SQL.
- Work on cloud-native data infrastructure using Google Cloud Platform (BigQuery, Cloud Storage, Dataflow).
- Implement clean, reusable transformations using DBT and Databricks.
- Design and schedule workflows using Apache Airflow.
- Collaborate with data scientists and analysts to ensure downstream data usability.
- Optimize pipelines and systems for performance and cost-efficiency.
- Follow best software engineering practices: version control, unit testing, code reviews, CI/CD.
- Manage and troubleshoot data workflows in Linux environments.
- Apply data governance and access control via Unity Catalog or similar tools.
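The pipeline responsibilities above can be sketched in miniature: the snippet below stages raw rows and materializes a clean aggregate table, which is the same shape of transformation a DBT model or PySpark job would express at scale. SQLite stands in for BigQuery here, and the table and column names are invented for illustration.

```python
import sqlite3

# Hypothetical table and column names for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (user_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?)",
    [(1, 10.0), (1, 5.0), (2, 7.5)],
)

# A clean, reusable transformation, analogous to a DBT model:
# aggregate raw events into one row per user.
conn.execute("""
    CREATE TABLE user_totals AS
    SELECT user_id, SUM(amount) AS total_amount, COUNT(*) AS n_events
    FROM raw_events
    GROUP BY user_id
""")

rows = conn.execute(
    "SELECT user_id, total_amount, n_events FROM user_totals ORDER BY user_id"
).fetchall()
print(rows)  # [(1, 15.0, 2), (2, 7.5, 1)]
```

In a real deployment this SELECT would live in a versioned DBT model or a PySpark job, scheduled by Airflow, with unit tests and code review as the list above prescribes.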
Required Skills & Experience
- Strong hands-on experience with PySpark, Spark SQL, and Databricks.
- Solid understanding of GCP services (BigQuery, Cloud Functions, Dataflow, Cloud Storage).
- Proficiency in Python for scripting and automation.
- Expertise in SQL and data modeling.
- Experience with DBT for data transformations.
- Working knowledge of Airflow for workflow orchestration.
- Comfortable with Linux-based systems for deployment and troubleshooting.
- Familiar with Git for version control and collaborative development.
- Understanding of data pipeline optimization, monitoring, and debugging.

Job Description:
As a Tally Developer, your main responsibility will be to develop custom solutions in Tally using TDL as per the customer requirements. You will work closely with clients, business analysts, Senior developers, and other stakeholders to understand their requirements and translate them into effective Tally-based solutions.
Responsibilities:
Collaborate with business analysts and senior developers/project managers to gather and analyze client requirements.
Design, develop, and customize Tally-based software solutions to meet the specific requirements of clients.
Write efficient and well-documented code in Tally Definition Language (TDL) to extend the functionality of Tally software.
Follow the Software Development Life Cycle including requirements gathering, design, coding, testing, and deployment.
Troubleshoot and debug issues related to Tally customization, data import/export, and software integrations.
Provide technical support and assistance to clients and end-users in utilizing and troubleshooting Tally-based software solutions.
Stay updated with the latest features and updates in Tally software to leverage new functionalities in solution development.
Adhere to coding standards, documentation practices, and quality assurance processes.
Requirements:
Any Degree. Relevant work experience may be considered in place of a degree.
Experience in Tally development and customization for projects using Tally Definition Language (TDL).
Hands-on experience in Tally and implementation of its features.
Familiarity with database systems, data structures, and SQL for efficient data management and retrieval.
Strong problem-solving skills and attention to detail.
Good communication and teamwork abilities.
Continuous learning mindset to keep up with advancements in Tally software and related technologies.
Key Skills Required:
TDL (Tally Definition Language), Tally, Excel, XML/JSON.
Good to have Basic Skills:
Database like MS SQL, MySQL
API Integration.
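Since Tally exchanges data as XML and the skill list above includes XML/JSON and API integration, here is a hedged sketch of converting a JSON record into Tally-style XML. The element names are hypothetical; Tally's real import envelope (ENVELOPE/HEADER/BODY) is considerably more elaborate than this.

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical element names; Tally's real import envelope
# (ENVELOPE/HEADER/BODY) is more elaborate than this sketch.
def voucher_to_xml(record: dict) -> str:
    voucher = ET.Element("VOUCHER")
    for key, value in record.items():
        ET.SubElement(voucher, key.upper()).text = str(value)
    return ET.tostring(voucher, encoding="unicode")

record = json.loads('{"date": "2024-04-01", "party": "Acme Ltd", "amount": 1500}')
xml_payload = voucher_to_xml(record)
print(xml_payload)
# <VOUCHER><DATE>2024-04-01</DATE><PARTY>Acme Ltd</PARTY><AMOUNT>1500</AMOUNT></VOUCHER>
```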
Work Experience: Minimum 2 years and maximum 7 years
Interested candidates may WhatsApp their CV to TRIPLE NINE ZERO NINE THREE DOUBLE ONE DOUBLE FOUR.
Please answer the questions below:
Do you have knowledge of Tally Definition Language?
How many years of experience do you have with TDL?

The Consultant / Senior Consultant – Adobe Campaign is a technical role that involves providing consulting advice and support to clients implementing the Adobe Campaign solution, along with any technical advisory required afterwards. This is a client-facing role: the consultant liaises with the client, understands their technical and business requirements, and then implements Adobe Campaign so that the client gets the most value out of the solution. The consultant's main objective is to drive successful delivery and maintain a high level of customer satisfaction.
What you need to succeed
• Expertise and experience in SQL (Oracle / SQL Server / PostgreSQL)
• Programming experience (JavaScript / Java / VB / C# / PHP)
• Knowledge on Web Technologies like HTML, CSS would be a plus
• Good communication skills to ensure effective customer interactions, communications, and documentation
• Self-starter - Organized and highly motivated
• Fast learner, ability to learn new technologies/languages
• Knowledge of HTML DOM manipulation and page load events a plus
• Project Management skills a plus
• Ability to develop creative solutions to problems
• Able to multi-task in a dynamic environment
• Able to work independently with minimal supervision
• Experience leading team members will be a plus
Adobe is an equal opportunity/affirmative action employer. We welcome and encourage diversity in the workplace.

Proficient in Looker Actions, Looker dashboarding, data entry in Looker, LookML, SQL queries, BigQuery, Looker Studio, and GCP.
Remote Working
2 pm to 12 am IST or
10:30 AM to 7:30 PM IST
Sunday to Thursday
Responsibilities:
● Create and maintain LookML code, which defines data models, dimensions, measures, and relationships within Looker.
● Develop reusable LookML components to ensure consistency and efficiency in report and dashboard creation.
● Build and customize dashboards, incorporating data visualizations such as charts and graphs to present insights effectively.
● Write complex SQL queries when necessary to extract and manipulate data from underlying databases, and optimize those queries for performance.
● Connect Looker to various data sources, including databases, data warehouses, and external APIs.
● Identify and address bottlenecks that affect report and dashboard loading times, and optimize Looker performance by tuning queries, caching strategies, and exploring indexing options.
● Configure user roles and permissions within Looker to control access to sensitive data, and implement data security best practices, including row-level and field-level security.
● Develop custom applications or scripts that interact with Looker's API for automation and integration with other tools and systems.
● Use version control systems (e.g., Git) to manage LookML code changes and collaborate with other developers.
● Provide training and support to business users, helping them navigate and use Looker effectively.
● Diagnose and resolve technical issues related to Looker, data models, and reports.
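The query-optimization responsibility above can be shown concretely: the snippet below compares the planner's strategy for a filtered query before and after adding an index. SQLite stands in for BigQuery here (BigQuery's optimizer relies on partitioning and clustering rather than indexes), and the table and data are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE page_views (user_id INTEGER, viewed_at TEXT)")
conn.executemany(
    "INSERT INTO page_views VALUES (?, ?)",
    [(i % 100, f"2024-01-{i % 28 + 1:02d}") for i in range(1000)],
)

query = "SELECT COUNT(*) FROM page_views WHERE user_id = 42"

# Without an index, the planner must scan the whole table.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# An index on the filtered column lets the planner seek directly.
conn.execute("CREATE INDEX idx_page_views_user ON page_views (user_id)")
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(before[0][-1])  # e.g. a SCAN over page_views
print(after[0][-1])   # e.g. a SEARCH using idx_page_views_user
```

The same habit, that is, inspecting the plan before and after a change, carries over to tuning Looker-generated SQL against the warehouse.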
Skills Required:
● Experience in Looker's modeling language, LookML, including data models, dimensions, and measures.
● Strong SQL skills for writing and optimizing database queries across different SQL databases (GCP/BQ preferable)
● Knowledge of data modeling best practices
● Proficient in BigQuery, billing data analysis, GCP billing, unit costing, and invoicing, with the ability to recommend cost optimization strategies.
● Previous experience in Finops engagements is a plus
● Proficiency in ETL processes for data transformation and preparation.
● Ability to create effective data visualizations and reports using Looker’s dashboard tools.
● Ability to optimize Looker performance by fine-tuning queries, caching strategies, and indexing.
● Familiarity with related tools and technologies, such as data warehousing (e.g., BigQuery ), data transformation tools (e.g., Apache Spark), and scripting languages (e.g., Python).